Sample records for proper statistical analysis

  1. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimate of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR-based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
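
As a companion to the first design described above (an external calibration curve fitted by simple linear regression, with Ct values of a control and a putative transgenic event compared against it), the sketch below shows the basic arithmetic in Python. The dilution series, Ct values, and function names are illustrative assumptions, not data or code from the paper.

```python
# A minimal sketch of the external-standard-curve design: fit Ct vs. log10
# template amount by simple linear regression, then compare a putative
# transgenic event against a single-copy control. Values are illustrative.
import numpy as np
from scipy import stats

log10_template = np.log10([1e5, 1e4, 1e3, 1e2, 1e1])       # serial dilutions
ct_standard = np.array([18.1, 21.4, 24.8, 28.2, 31.5])      # measured Ct values

slope, intercept, r, p, se = stats.linregress(log10_template, ct_standard)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100% amplification efficiency

def template_amount(ct):
    """Back-calculate template amount from a Ct value via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

ct_control, ct_sample = 25.0, 24.0        # single-copy control vs. putative event
copy_ratio = template_amount(ct_sample) / template_amount(ct_control)
print(f"slope={slope:.2f}, E={efficiency:.2f}, estimated copy ratio={copy_ratio:.2f}")
```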

  2. Study/experimental/research design: much more than statistics.

    PubMed

    Knight, Kenneth L

    2010-01-01

    The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes "Methods" sections hard to read and understand. To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and the multiple (and different) analyses of a single data set, data collection is very different than statistical design. Thus, both a study design and a statistical design are necessary. Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results.

  3. Study/Experimental/Research Design: Much More Than Statistics

    PubMed Central

    Knight, Kenneth L.

    2010-01-01

    Abstract Context: The purpose of study, experimental, or research design in scientific manuscripts has changed significantly over the years. It has evolved from an explanation of the design of the experiment (ie, data gathering or acquisition) to an explanation of the statistical analysis. This practice makes “Methods” sections hard to read and understand. Objective: To clarify the difference between study design and statistical analysis, to show the advantages of a properly written study design on article comprehension, and to encourage authors to correctly describe study designs. Description: The role of study design is explored from the introduction of the concept by Fisher through modern-day scientists and the AMA Manual of Style. At one time, when experiments were simpler, the study design and statistical design were identical or very similar. With the complex research that is common today, which often includes manipulating variables to create new variables and the multiple (and different) analyses of a single data set, data collection is very different than statistical design. Thus, both a study design and a statistical design are necessary. Advantages: Scientific manuscripts will be much easier to read and comprehend. A proper experimental design serves as a road map to the study methods, helping readers to understand more clearly how the data were obtained and, therefore, assisting them in properly analyzing the results. PMID:20064054

  4. LAKE DATA ANALYSIS AND NUTRIENT BUDGET MODELING

    EPA Science Inventory

    Several quantitative methods that may be useful for lake trophic quality management planning are discussed and illustrated. An emphasis is placed on scientific methods in research, data analysis, and modeling. Proper use of statistical methods is also stressed, along with conside...

  5. OH maser proper motions in Cepheus A

    NASA Astrophysics Data System (ADS)

    Migenes, V.; Cohen, R. J.; Brebner, G. C.

    1992-02-01

    MERLIN measurements made between 1982 and 1989 reveal proper motions of OH masers in the source Cepheus A. The proper motions are typically a few milliarcsec per year, and are mainly directed away from the central H II regions. Statistical analysis of the data suggests an expansion time-scale of some 300 yr. The distance of the source implied by the proper motions is 320 (+140/-80) pc, assuming that the expansion is isotropic. The proper motions can be reconciled with the larger distance of 730 pc which is generally accepted, provided that the masers are moving at large angles to the line of sight. The expansion time-scale agrees with that of the magnetic field decay recently reported by Cohen et al. (1990).

  6. A crash course on data analysis in asteroseismology

    NASA Astrophysics Data System (ADS)

    Appourchaux, Thierry

    2014-02-01

    In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.

  7. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  8. The prior statistics of object colors.

    PubMed

    Koenderink, Jan J

    2010-02-01

    The prior statistics of object colors is of much interest because extensive statistical investigations of reflectance spectra reveal highly non-uniform structure in color space common to several very different databases. This common structure is due to the visual system rather than to the statistics of environmental structure. Analysis involves an investigation of the proper sample space of spectral reflectance factors and of the statistical consequences of the projection of spectral reflectances on the color solid. Even in the case of reflectance statistics that are translationally invariant with respect to the wavelength dimension, the statistics of object colors is highly non-uniform. The qualitative nature of this non-uniformity is due to trichromacy.

  9. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Different adaptive and anisotropic filters are included for statistical analysis. Statistical parameters such as Signal-to-Noise Ratio (SNR), Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are calculated for performance measurement. One more important aspect is that there may be blurring during speckle noise removal, so it is preferred that the filter be able to enhance edges during noise removal.
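
The performance metrics named above (SNR, PSNR, RMSE) are simple to compute; the sketch below evaluates them for a reference frame and a filtered version. The arrays are random placeholders rather than echocardiographic data.

```python
# A minimal sketch of the listed metrics computed between a reference image and
# a filtered version; the arrays here are stand-ins for echocardiographic frames.
import numpy as np

rng = np.random.default_rng(7)
reference = rng.uniform(0, 255, size=(128, 128))
filtered = reference + rng.normal(0, 5, size=reference.shape)   # placeholder for a filtered frame

mse = np.mean((reference - filtered) ** 2)
rmse = np.sqrt(mse)
snr = 10 * np.log10(np.sum(reference ** 2) / np.sum((reference - filtered) ** 2))
psnr = 10 * np.log10(255.0 ** 2 / mse)

print(f"RMSE={rmse:.2f}, SNR={snr:.2f} dB, PSNR={psnr:.2f} dB")
```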

  10. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    ERIC Educational Resources Information Center

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…

  11. Primer of statistics in dental research: part I.

    PubMed

    Shintani, Ayumi

    2014-01-01

    Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal among many dental researchers, in part due to statistical authorities' limitations in explaining statistical principles to health researchers without elaborating complex mathematical concepts. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics to conduct EBD, with intuitive examples. Part I of the series includes the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow to cover the remaining topics, including (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.

  12. On the Occurrence of Wide Binaries in the Local Disk and Halo Populations

    NASA Astrophysics Data System (ADS)

    Hartman, Zachary; Lepine, Sebastien

    2018-01-01

    We present results from our search for wide binaries in the SUPERBLINK+GAIA all-sky catalog of 2.8 million high proper motion stars (μ > 40 mas/yr). Through a Bayesian analysis of common proper motion pairs, we have identified highly probable wide binary/multiple systems based on statistics of their proper motion differences and angular separations. Using a reduced proper motion diagram, we determine whether these wide binaries are part of the young disk, old disk, or Galactic halo population. We examine the relative occurrence rate for very wide companions in these respective populations. All groups are found to contain a significant number of wide binary systems, with about 1 percent of the stars in each group having pairs with separations >1,000 AU.
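
For readers unfamiliar with the reduced proper motion diagram mentioned above, the quantity involved is H = m + 5 log10(μ) + 5, with μ in arcsec/yr, which acts as a stand-in for absolute magnitude when no parallax is available. The tiny sketch below, with made-up numbers, only illustrates that formula and is not taken from the authors' pipeline.

```python
# Reduced proper motion from apparent magnitude and proper motion (illustrative values).
import numpy as np

def reduced_proper_motion(apparent_mag, mu_mas_per_yr):
    """H = m + 5*log10(mu [arcsec/yr]) + 5, with mu supplied in mas/yr."""
    return apparent_mag + 5.0 * np.log10(mu_mas_per_yr / 1000.0) + 5.0

print(reduced_proper_motion(12.0, 150.0))   # a typical high-proper-motion star
```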

  13. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
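
A minimal Python sketch of Horn's parallel analysis, one of the two procedures the SPSS and SAS programs above implement: components are retained while their observed eigenvalues exceed a high percentile of eigenvalues from random data of the same dimensions. The data matrix and the 95th-percentile threshold are illustrative choices, not the authors' code.

```python
# Parallel analysis: compare observed correlation-matrix eigenvalues with the
# 95th percentile of eigenvalues from random data of the same size.
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_vars, n_sims = 300, 10, 500
X = rng.normal(size=(n_cases, n_vars))            # replace with a real cases-by-variables matrix

obs_eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]

rand_eig = np.empty((n_sims, n_vars))
for i in range(n_sims):
    R = rng.normal(size=(n_cases, n_vars))
    rand_eig[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]

threshold = np.percentile(rand_eig, 95, axis=0)
n_components = int(np.sum(obs_eig > threshold))
print("Components to retain:", n_components)
```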

  14. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images from the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
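
Proper orthogonal decomposition of a stack of 2-D maps reduces, in its simplest snapshot form, to a singular value decomposition; the sketch below shows that skeleton on synthetic data. It is a generic illustration under stated assumptions, not the analysis pipeline used in the paper.

```python
# Snapshot POD via SVD: columns of the data matrix are flattened snapshots,
# left singular vectors are the spatial modes. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_snapshots, ny, nx = 50, 64, 64
snapshots = rng.normal(size=(n_snapshots, ny, nx))    # replace with temperature maps

X = snapshots.reshape(n_snapshots, -1).T              # columns = flattened snapshots
X -= X.mean(axis=1, keepdims=True)                    # remove the temporal mean at each pixel

U, S, Vt = np.linalg.svd(X, full_matrices=False)
energy = S**2 / np.sum(S**2)                          # energy captured by each POD mode
modes = U.T.reshape(-1, ny, nx)                       # spatial POD modes

print("Fraction of energy in the first 5 modes:", energy[:5].sum())
```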

  15. 76 FR 52533 - Personnel Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... financial impact on agencies. Another commenter stated that the OPM's Enterprise Human Resource Integration..., statistical analysis, and raw data used to justify the rule and the human capital cost increase to implement... activities that are properly considered functions of agency human resources offices and thus ensure that an...

  16. Statistical Analysis for the Solomon Four-Group Design. Research Report 99-06.

    ERIC Educational Resources Information Center

    van Engelenburg, Gijsbert

    The Solomon four-group design (R. Solomon, 1949) is a very useful experimental design to investigate the main effect of a pretest and the interaction of pretest and treatment. Although the design was proposed half a century ago, no proper data analysis techniques have been available. This paper describes how data from the Solomon four-group design…

  17. 75 FR 81999 - Notice of Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    ... comments which: (1) Evaluate whether the proposed collection of information is necessary for the proper...) Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information... study will use descriptive statistics and regression analysis to study how student outcomes and school...

  18. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  19. Actuarial analysis of surgical results: rationale and method.

    PubMed

    Grunkemeier, G L; Starr, A

    1977-11-01

    The use of time-related methods of statistical analysis is essential for valid evaluation of the long-term results of a surgical procedure. Accurate comparison of two procedures or two prosthetic devices is possible only when the length of follow-up is properly accounted for. The purpose of this report is to make the technical aspects of the actuarial, or life table, method easily accessible to the surgeon, with emphasis on the motivation for and the rationale behind it. This topic is illustrated in terms of heart valve prostheses, a field that is rapidly developing. Both the authors and readers of articles must be aware that controversies surrounding the relative merits of various prosthetic designs or operative procedures can be settled only if proper time-related methods of analysis are utilized.
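
A minimal sketch of the actuarial (life-table) calculation the authors advocate: cumulative survival is built up from per-interval conditional probabilities, with patients withdrawn during an interval counted as at risk for half of it. The interval counts are invented for illustration.

```python
# Actuarial life-table estimate with the standard half-interval adjustment for
# withdrawals; (deaths, withdrawals alive) per yearly interval are illustrative.
intervals = [(5, 10), (4, 12), (3, 15), (2, 20)]
at_risk = 200.0
survival = 1.0

for year, (deaths, withdrawn) in enumerate(intervals, start=1):
    effective_n = at_risk - withdrawn / 2.0        # withdrawals count as half exposed
    p_survive = 1.0 - deaths / effective_n
    survival *= p_survive
    print(f"Year {year}: cumulative survival = {survival:.3f}")
    at_risk -= deaths + withdrawn
```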

  20. Modelling night-time ecosystem respiration by a constrained source optimization method

    Treesearch

    Chun-Tai Lai; Gabriel Katul; John Butnor; David Ellsworth; Ram Oren

    2002-01-01

    One of the main challenges to quantifying ecosystem carbon budgets is properly quantifying the magnitude of night-time ecosystem respiration. Inverse Lagrangian dispersion analysis provides a promising approach to addressing such a problem when measured mean CO2 concentration profiles and nocturnal velocity statistics are available. An inverse...

  1. Proper joint analysis of summary association statistics requires the adjustment of heterogeneity in SNP coverage pattern.

    PubMed

    Zhang, Han; Wheeler, William; Song, Lei; Yu, Kai

    2017-07-07

    As meta-analysis results published by consortia of genome-wide association studies (GWASs) become increasingly available, many association summary statistics-based multi-locus tests have been developed to jointly evaluate multiple single-nucleotide polymorphisms (SNPs) to reveal novel genetic architectures of various complex traits. The validity of these approaches relies on the accurate estimate of z-score correlations at considered SNPs, which in turn requires knowledge on the set of SNPs assessed by each study participating in the meta-analysis. However, this exact SNP coverage information is usually unavailable from the meta-analysis results published by GWAS consortia. In the absence of the coverage information, researchers typically estimate the z-score correlations by making oversimplified coverage assumptions. We show through real studies that such a practice can generate highly inflated type I errors, and we demonstrate the proper way to incorporate correct coverage information into multi-locus analyses. We advocate that consortia should make SNP coverage information available when posting their meta-analysis results, and that investigators who develop analytic tools for joint analyses based on summary data should pay attention to the variation in SNP coverage and adjust for it appropriately. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  2. Bayesian models: A statistical primer for ecologists

    USGS Publications Warehouse

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. The book presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.

  3. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396
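
A generic illustration of the ingredients discussed above (prior, likelihood, posterior, and the influence of the prior), using a beta-binomial example that is not taken from the article; the counts and priors are arbitrary.

```python
# Beta-binomial updating: how different priors shift the posterior for a proportion.
from scipy import stats

successes, trials = 14, 20                       # observed data (illustrative)
priors = {"flat Beta(1,1)": (1, 1), "skeptical Beta(2,8)": (2, 8)}

for name, (a, b) in priors.items():
    post = stats.beta(a + successes, b + trials - successes)   # conjugate posterior
    lo, hi = post.ppf([0.025, 0.975])
    print(f"{name}: posterior mean={post.mean():.2f}, 95% credible interval=({lo:.2f}, {hi:.2f})")
```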

  4. Common Scientific and Statistical Errors in Obesity Research

    PubMed Central

    George, Brandon J.; Beasley, T. Mark; Brown, Andrew W.; Dawson, John; Dimova, Rositsa; Divers, Jasmin; Goldsby, TaShauna U.; Heo, Moonseong; Kaiser, Kathryn A.; Keith, Scott; Kim, Mimi Y.; Li, Peng; Mehta, Tapan; Oakes, J. Michael; Skinner, Asheley; Stuart, Elizabeth; Allison, David B.

    2015-01-01

    We identify 10 common errors and problems in the statistical analysis, design, interpretation, and reporting of obesity research and discuss how they can be avoided. The 10 topics are: 1) misinterpretation of statistical significance, 2) inappropriate testing against baseline values, 3) excessive and undisclosed multiple testing and “p-value hacking,” 4) mishandling of clustering in cluster randomized trials, 5) misconceptions about nonparametric tests, 6) mishandling of missing data, 7) miscalculation of effect sizes, 8) ignoring regression to the mean, 9) ignoring confirmation bias, and 10) insufficient statistical reporting. We hope that discussion of these errors can improve the quality of obesity research by helping researchers to implement proper statistical practice and to know when to seek the help of a statistician. PMID:27028280

  5. Turbulent Flow Over Large Roughness Elements: Effect of Frontal and Plan Solidity on Turbulence Statistics and Structure

    NASA Astrophysics Data System (ADS)

    Placidi, M.; Ganapathisubramani, B.

    2018-04-01

    Wind-tunnel experiments were carried out on fully-rough boundary layers with large roughness (δ/h ≈ 10, where h is the height of the roughness elements and δ is the boundary-layer thickness). Twelve different surface conditions were created by using LEGO™ bricks of uniform height. Six cases are tested for a fixed plan solidity (λ_P) with variations in frontal density (λ_F), while the other six cases have varying λ_P for fixed λ_F. Particle image velocimetry and floating-element drag-balance measurements were performed. The current results complement those contained in Placidi and Ganapathisubramani (J Fluid Mech 782:541-566, 2015), extending the previous analysis to the turbulence statistics and spatial structure. Results indicate that mean velocity profiles in defect form agree with Townsend's similarity hypothesis with varying λ_F; however, the agreement is worse for cases with varying λ_P. The streamwise and wall-normal turbulent stresses, as well as the Reynolds shear stresses, show a lack of similarity across most examined cases. This suggests that the critical height of the roughness for which outer-layer similarity holds depends not only on the height of the roughness, but also on the local wall morphology. A new criterion based on shelter solidity, defined as the sheltered plan area per unit wall-parallel area, which is similar to the 'effective shelter area' in Raupach and Shaw (Boundary-Layer Meteorol 22:79-90, 1982), is found to capture the departure of the turbulence statistics from outer-layer similarity. Despite this lack of similarity reported in the turbulence statistics, proper orthogonal decomposition analysis, as well as two-point spatial correlations, show that some form of universal flow structure is present, as all cases exhibit virtually identical proper orthogonal decomposition mode shapes and correlation fields. Finally, reduced models based on proper orthogonal decomposition reveal that the small scales of the turbulence play a significant role in assessing outer-layer similarity.

  6. Synthetic data sets for the identification of key ingredients for RNA-seq differential analysis.

    PubMed

    Rigaill, Guillem; Balzergue, Sandrine; Brunaud, Véronique; Blondet, Eddy; Rau, Andrea; Rogier, Odile; Caius, José; Maugis-Rabusseau, Cathy; Soubigou-Taconnat, Ludivine; Aubourg, Sébastien; Lurin, Claire; Martin-Magniette, Marie-Laure; Delannoy, Etienne

    2018-01-01

    Numerous statistical pipelines are now available for the differential analysis of gene expression measured with RNA-sequencing technology. Most of them are based on similar statistical frameworks after normalization, differing primarily in the choice of data distribution, mean and variance estimation strategy and data filtering. We propose an evaluation of the impact of these choices when few biological replicates are available through the use of synthetic data sets. This framework is based on real data sets and allows the exploration of various scenarios differing in the proportion of non-differentially expressed genes. Hence, it provides an evaluation of the key ingredients of the differential analysis, free of the biases associated with the simulation of data using parametric models. Our results show the relevance of a proper modeling of the mean by using linear or generalized linear modeling. Once the mean is properly modeled, the impact of the other parameters on the performance of the test is much less important. Finally, we propose to use the simple visualization of the raw P-value histogram as a practical evaluation criterion of the performance of differential analysis methods on real data sets. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
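
The visual diagnostic proposed at the end of the abstract is easy to reproduce: plot the histogram of raw P-values and check for a roughly uniform background with a spike near zero. The sketch below uses simulated p-values, not the authors' synthetic RNA-seq data.

```python
# Raw P-value histogram as a sanity check for a differential analysis:
# a flat background (null genes) plus a spike near zero (differential genes).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(6)
p_null = rng.uniform(size=9000)                  # non-differential genes
p_alt = rng.beta(0.5, 10.0, size=1000)           # differential genes pile up near 0
p_values = np.concatenate([p_null, p_alt])

plt.hist(p_values, bins=40, edgecolor="black")
plt.xlabel("raw P-value")
plt.ylabel("number of genes")
plt.title("Raw P-value histogram as a sanity check")
plt.show()
```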

  7. [Statistical validity of the Mexican Food Security Scale and the Latin American and Caribbean Food Security Scale].

    PubMed

    Villagómez-Ornelas, Paloma; Hernández-López, Pedro; Carrasco-Enríquez, Brenda; Barrios-Sánchez, Karina; Pérez-Escamilla, Rafael; Melgar-Quiñónez, Hugo

    2014-01-01

    This article validates the statistical consistency of two food security scales: the Mexican Food Security Scale (EMSA) and the Latin American and Caribbean Food Security Scale (ELCSA). Validity tests were conducted in order to verify that both scales were consistent instruments, conformed by independent, properly calibrated and adequately sorted items, arranged in a continuum of severity. The following tests were developed: sorting of items; Cronbach's alpha analysis; parallelism of prevalence curves; Rasch models; and sensitivity analysis through hypothesis tests of mean differences. The tests showed that both scales meet the required attributes and are robust statistical instruments for food security measurement. This is relevant given that the lack-of-access-to-food indicator, included in multidimensional poverty measurement in Mexico, is calculated with EMSA.
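
One of the consistency checks mentioned above, Cronbach's alpha, can be computed directly from a respondents-by-items matrix; the sketch below uses simulated 0/1 responses rather than EMSA or ELCSA data.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
import numpy as np

rng = np.random.default_rng(8)
ability = rng.normal(size=500)                                        # shared latent trait
items = (ability[:, None] + rng.normal(0, 1, size=(500, 8)) > 0).astype(float)

k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```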

  8. [Analysis the epidemiological features of 3,258 patients with allergic rhinitis in Yichang City].

    PubMed

    Chen, Bo; Zhang, Zhimao; Pei, Zhi; Chen, Shihan; Du, Zhimei; Lan, Yan; Han, Bei; Qi, Qi

    2015-02-01

    To investigate the epidemiological features of patients with allergic rhinitis (AR) in Yichang city and put forward effective prevention and control measures, data on allergic rhinitis in the city proper from 2010 to 2013 were collected, entered into a database, and analyzed statistically. In recent years, the number of AR patients in this area has increased year by year. Spring and winter were the peak seasons of onset. The patients were predominantly young men. There were statistically significant differences by age, area, and gender (P < 0.01). Allergy history and related diseases differed significantly in gender composition (P < 0.05). Allergens and the degree of positivity differed significantly by gender and age structure (P < 0.01). Health education, environmental improvement, changing harmful habits, timely medical care, and standardized treatment are needed.

  9. Secular Extragalactic Parallax and Geometric Distances with Gaia Proper Motions

    NASA Astrophysics Data System (ADS)

    Paine, Jennie; Darling, Jeremiah K.

    2018-06-01

    The motion of the Solar System with respect to the cosmic microwave background (CMB) rest frame creates a well measured dipole in the CMB, which corresponds to a linear solar velocity of about 78 AU/yr. This motion causes relatively nearby extragalactic objects to appear to move compared to more distant objects, an effect that can be measured in the proper motions of nearby galaxies. An object at 1 Mpc and perpendicular to the CMB apex will exhibit a secular parallax, observed as a proper motion, of 78 µas/yr. The relatively large peculiar motions of galaxies make the detection of secular parallax challenging for individual objects. Instead, a statistical parallax measurement can be made for a sample of objects with proper motions, where the global parallax signal is modeled as an E-mode dipole that diminishes linearly with distance. We present preliminary results of applying this model to a sample of nearby galaxies with Gaia proper motions to detect the statistical secular parallax signal. The statistical measurement can be used to calibrate the canonical cosmological “distance ladder.”
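
The quoted numbers follow from the small-angle relation: a baseline of 1 AU subtends 1 arcsec at 1 pc, so a solar drift of about 78 AU/yr produces about 78 µas/yr of apparent motion for an object 1 Mpc away perpendicular to the apex. A small check, with the CMB dipole velocity taken as roughly 370 km/s:

```python
# Verify the order-of-magnitude numbers quoted in the abstract.
AU_KM = 1.495978707e8          # km per astronomical unit
YEAR_S = 3.1557e7              # seconds per Julian year

v_sun_kms = 370.0              # solar velocity w.r.t. the CMB, km/s (approximate)
v_sun_au_per_yr = v_sun_kms * YEAR_S / AU_KM
print(f"Solar motion: {v_sun_au_per_yr:.1f} AU/yr")          # ~78 AU/yr

d_mpc = 1.0
# 1 AU at 1 Mpc subtends 1 microarcsecond, by the definition of the parsec.
parallax_uas_per_yr = v_sun_au_per_yr / d_mpc
print(f"Secular parallax at {d_mpc} Mpc: {parallax_uas_per_yr:.1f} uas/yr")
```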

  10. An Intuitive Graphical Approach to Understanding the Split-Plot Experiment

    ERIC Educational Resources Information Center

    Robinson, Timothy J.; Brenneman, William A.; Myers, William R.

    2009-01-01

    While split-plot designs have received considerable attention in the literature over the past decade, there seems to be a general lack of intuitive understanding of the error structure of these designs and the resulting statistical analysis. Typically, students learn the proper error terms for testing factors of a split-plot design via "expected…

  11. A polynomial-chaos-expansion-based building block approach for stochastic analysis of photonic circuits

    NASA Astrophysics Data System (ADS)

    Waqas, Abi; Melati, Daniele; Manfredi, Paolo; Grassi, Flavia; Melloni, Andrea

    2018-02-01

    The Building Block (BB) approach has recently emerged in photonics as a suitable strategy for the analysis and design of complex circuits. Each BB can be foundry related and contains a mathematical macro-model of its functionality. As is well known, statistical variations in fabrication processes can have a strong effect on their functionality and ultimately affect the yield. In order to predict the statistical behavior of the circuit, proper analysis of the effects of uncertainties is crucial. This paper presents a method to build a novel class of Stochastic Process Design Kits for the analysis of photonic circuits. The proposed design kits directly store the information on the stochastic behavior of each building block in the form of a generalized-polynomial-chaos-based augmented macro-model obtained by properly exploiting stochastic collocation and Galerkin methods. Using this approach, we demonstrate that the augmented macro-models of the BBs can be calculated once and stored in a BB (foundry dependent) library and then used for the analysis of any desired circuit. The main advantage of this approach, shown here for the first time in photonics, is that the stochastic moments of an arbitrary photonic circuit can be evaluated by a single simulation only, without the need for repeated simulations. The accuracy and the significant speed-up with respect to the classical Monte Carlo analysis are verified by means of a classical photonic circuit example with multiple uncertain variables.

  12. An Attempt at Quantifying Factors that Affect Efficiency in the Management of Solid Waste Produced by Commercial Businesses in the City of Tshwane, South Africa

    PubMed Central

    Worku, Yohannes; Muchie, Mammo

    2012-01-01

    Objective. The objective was to investigate factors that affect the efficient management of solid waste produced by commercial businesses operating in the city of Pretoria, South Africa. Methods. Data was gathered from 1,034 businesses. Efficiency in solid waste management was assessed by using a structural time-based model designed for evaluating efficiency as a function of the length of time required to manage waste. Data analysis was performed using statistical procedures such as frequency tables, Pearson's chi-square tests of association, and binary logistic regression analysis. Odds ratios estimated from logistic regression analysis were used for identifying key factors that affect efficiency in the proper disposal of waste. Results. The study showed that 857 of the 1,034 businesses selected for the study (83%) were found to be efficient enough with regards to the proper collection and disposal of solid waste. Based on odds ratios estimated from binary logistic regression analysis, efficiency in the proper management of solid waste was significantly influenced by 4 predictor variables. These 4 influential predictor variables are lack of adherence to waste management regulations, wrong perception, failure to provide customers with enough trash cans, and operation of businesses by employed managers, in a decreasing order of importance. PMID:23209483

  13. Generalized linear models and point count data: statistical considerations for the design and analysis of monitoring studies

    Treesearch

    Nathaniel E. Seavy; Suhel Quader; John D. Alexander; C. John Ralph

    2005-01-01

    The success of avian monitoring programs to effectively guide management decisions requires that studies be efficiently designed and data be properly analyzed. A complicating factor is that point count surveys often generate data with non-normal distributional properties. In this paper we review methods of dealing with deviations from normal assumptions, and we focus...

  14. Statistical learning and selective inference.

    PubMed

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
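
A toy simulation of the cherry-picking problem the authors describe (not their selective-inference machinery): scanning many null predictors and reporting the smallest naive p-value makes "significant" findings appear in most runs even though nothing is real.

```python
# Selection without adjustment: the best-looking of many null associations
# frequently has a naive p-value below 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_obs, n_predictors, n_repeats = 100, 50, 2000
naive_p_of_best = []

for _ in range(n_repeats):
    X = rng.normal(size=(n_obs, n_predictors))
    y = rng.normal(size=n_obs)                 # y is pure noise: no true association
    pvals = np.array([stats.pearsonr(X[:, j], y)[1] for j in range(n_predictors)])
    naive_p_of_best.append(pvals.min())        # p-value of the best-looking predictor

frac_sig = np.mean(np.array(naive_p_of_best) < 0.05)
print(f"Fraction of runs where the 'best' null predictor has p<0.05: {frac_sig:.2f}")
```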

  15. Statistical flaws in design and analysis of fertility treatment studies on cryopreservation raise doubts on the conclusions

    PubMed Central

    van Gelder, P.H.A.J.M.; Nijs, M.

    2011-01-01

    Decisions about pharmacotherapy are being taken by medical doctors and authorities based on comparative studies on the use of medications. In studies on fertility treatments in particular, the methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some of the typical statistical flaws, illustrated by a number of example studies which have been published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care. PMID:24753877

  16. Statistical flaws in design and analysis of fertility treatment studies on cryopreservation raise doubts on the conclusions.

    PubMed

    van Gelder, P H A J M; Nijs, M

    2011-01-01

    Decisions about pharmacotherapy are being taken by medical doctors and authorities based on comparative studies on the use of medications. In studies on fertility treatments in particular, the methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some of the typical statistical flaws, illustrated by a number of example studies which have been published in peer reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care.

  17. A scaling procedure for the response of an isolated system with high modal overlap factor

    NASA Astrophysics Data System (ADS)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system: it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses in order to increase the frequency band of validity of the discrete or continuous coordinates model, through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with statistical energy analysis (SEA) and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by standard SEA.

  18. An Assessment of Oral Hygiene in 7-14-Year-Old Children undergoing Orthodontic Treatment.

    PubMed

    Krupińska-Nanys, Magdalena; Zarzecka, Joanna

    2015-01-01

    The study focused on the increased risk of dental plaque accumulation among children undergoing orthodontic treatment, in consideration of individual hygiene and dietary habits. The study was conducted among 91 children aged 7-14, including 47 girls and 44 boys. The main outcomes assessed were the API index, plaque pH, the DMF index, and proper hygiene and dietary habits. Statistical analysis was performed in a Microsoft Office Excel spreadsheet and STATISTICA statistical software. The average API index among the children wearing a removable appliance was 9 (SD = 13), and among children without appliances was 16 (SD = 21). The DMF index for patients using appliances was 5 (SD = 3) and for those without appliances was 4 (SD = 2). The average plaque pH was 6 for children with appliances (SD = 0.9) and 6.2 without (SD = 0.3). In patients in whom there is a higher risk of dental plaque accumulating, correct oral hygiene supported by regular visits to the dentist is one of the best ways to control dental caries. In the fight against caries, the most effective and only approach is to promote awareness of the problem, foster proper hygiene and nutritional habits, and educate children from a very young age in how to maintain proper oral hygiene.

  19. Six Guidelines for Interesting Research.

    PubMed

    Gray, Kurt; Wegner, Daniel M

    2013-09-01

    There are many guides on proper psychology, but far fewer on interesting psychology. This article presents six guidelines for interesting research. The first three-Phenomena First, Be Surprising, and Grandmothers, Not Scientists-suggest how to choose your research question; the last three-Be The Participant, Simple Statistics, and Powerful Beginnings-suggest how to answer your research question and offer perspectives on experimental design, statistical analysis, and effective communication. These guidelines serve as reminders that replicability is necessary but not sufficient for compelling psychological science. Interesting research considers subjective experience; it listens to the music of the human condition. © The Author(s) 2013.

  20. Studies on the psychosomatic functioning of ill-health according to Eastern and Western medicine. 1. Visual observation of the sublingual vein for early detection of vital energy stagnation and blood stasis.

    PubMed

    Takeichi, M; Sato, T

    1999-01-01

    Computer-assisted image analyses were performed on the tongue color of 95 medical students without a previous history of blood stasis-related conditions to clarify the mutual relationship of the color of the tongue proper, the coating, and the sublingual vein. The location of the measurement for the tongue proper was the underside of the tongue, and the location of the measurement for the tongue coating was the upper surface of the tongue. A linear correlation analysis showed a correlation for each of the different positions for the non-normalized red value and the normalized blue value. This analysis also demonstrated a statistically significant relationship between the tongue proper and the sublingual vein using Red-Green-Blue components and normalized Red-Green-Blue components (r = +0.670 to +0.817, p < 0.0001). The most significant correlation between the tongue proper and the sublingual vein was found for the normalized red value and the normalized Red-Green-Blue values, which minimized the range of the standard error of the mean (r = +0.745, p < 0.0001), although non-normalized blue had the highest correlation coefficient. Therefore, it seems reasonable to select those normalized red values for the comparison in tongue color analysis. The correlation of color between the sublingual vein and the tongue proper strongly suggests that inspection with the naked eye of the sublingual vein is useful for the early detection of vital energy stagnation and blood stasis. Also, because of its close relation to sustained chronic stress, changes in the sublingual vein might be available as one physiological parameter of a stress reaction.

  1. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
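
To make the most common error concrete, the sketch below contrasts uncorrected pairwise t-tests used as post hoc comparisons with a one-way ANOVA followed by Tukey's HSD. The three groups are simulated stand-ins, not data from the surveyed studies, and the sketch does not address the separate repeated-measures issue the authors also flag.

```python
# Uncorrected pairwise t-tests (the error) vs. omnibus ANOVA + Tukey HSD post hoc.
from itertools import combinations
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(3)
groups = {"sham": rng.normal(10, 2, 8),
          "injury": rng.normal(8, 2, 8),
          "treated": rng.normal(9, 2, 8)}

# Inappropriate: independent t-tests on every pair with no correction
for a, b in combinations(groups, 2):
    t, p = stats.ttest_ind(groups[a], groups[b])
    print(f"uncorrected t-test {a} vs {b}: p={p:.3f}")

# More defensible: omnibus ANOVA, then Tukey-adjusted pairwise comparisons
f, p_anova = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: p={p_anova:.3f}")
scores = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(scores, labels, alpha=0.05))
```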

  2. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  3. Statistical Models for Averaging of the Pump–Probe Traces: Example of Denoising in Terahertz Time-Domain Spectroscopy

    NASA Astrophysics Data System (ADS)

    Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem

    2018-05-01

    In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on the experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in the fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.

  4. Application of Linear Mixed-Effects Models in Human Neuroscience Research: A Comparison with Pearson Correlation in Two Auditory Electrophysiology Studies.

    PubMed

    Koerner, Tess K; Zhang, Yang

    2017-02-27

    Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining strengths between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers.
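
A minimal sketch of the comparison discussed above: a single Pearson correlation that treats repeated measures as independent versus a linear mixed-effects model with a random intercept per subject (here via statsmodels' mixedlm). The long-format data are simulated and the variable names are placeholders, not the study's measures.

```python
# Pearson correlation ignoring repeated measures vs. LME with random subject intercepts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(4)
n_subj, n_cond = 20, 3
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_cond),
    "condition": np.tile(["quiet", "low_noise", "high_noise"], n_subj),
    "neural": rng.normal(size=n_subj * n_cond),
})
subject_baseline = np.repeat(rng.normal(0, 1, n_subj), n_cond)        # between-subject offsets
df["behavior"] = 0.5 * df["neural"] + subject_baseline + rng.normal(0, 1, len(df))

# Naive: one correlation across all rows, treating repeated measures as independent
r, p = stats.pearsonr(df["neural"], df["behavior"])
print(f"Pearson r={r:.2f}, p={p:.3f}")

# Mixed-effects: fixed effects for predictor and condition, random subject intercept
model = smf.mixedlm("behavior ~ neural + condition", df, groups=df["subject"])
print(model.fit().summary())
```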

  5. Statistical analysis of ultrasonic measurements in concrete

    NASA Astrophysics Data System (ADS)

    Chiang, Chih-Hung; Chen, Po-Chih

    2002-05-01

    Stress wave techniques such as measurements of ultrasonic pulse velocity are often used to evaluate concrete quality in structures. For proper interpretation of measurement results, the dependence of pulse transit time on the average acoustic impedance and the material homogeneity along the sound path need to be examined. Semi-direct measurement of pulse velocity could be more convenient than through-transmission measurement, since it is not necessary to assess both sides of concrete floors or walls. A novel measurement scheme is proposed and verified based on statistical analysis. It is shown that semi-direct measurements are very effective for gathering a large amount of pulse velocity data from concrete reference specimens. The variability of the measurements is comparable with that reported by the American Concrete Institute using either break-off or pullout tests.

  6. Toward improved analysis of concentration data: Embracing nondetects.

    PubMed

    Shoari, Niloofar; Dubé, Jean-Sébastien

    2018-03-01

    Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
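
A minimal sketch of the contrast at issue: substituting nondetects with half the detection limit versus a censored maximum-likelihood fit, here for a lognormal model using only scipy. The detection limit and sample are simulated for illustration; the article's recommendations cover a wider range of frequentist and Bayesian options.

```python
# Substitution at DL/2 vs. left-censored maximum likelihood for a lognormal model.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(5)
true = rng.lognormal(mean=1.0, sigma=1.0, size=200)   # simulated "true" concentrations
dl = 2.0                                              # detection limit
detected = true >= dl
obs = true[detected]                                  # only these would be reported

# Substitution estimate (bias-prone): replace nondetects with DL/2
sub = np.where(detected, true, dl / 2.0)
print("substitution mean:", sub.mean())

# Censored MLE: detects contribute the density, nondetects the CDF at the DL
def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    ll_obs = stats.norm.logpdf(np.log(obs), mu, sigma) - np.log(obs)   # lognormal log-density
    ll_cens = stats.norm.logcdf(np.log(dl), mu, sigma) * np.sum(~detected)
    return -(ll_obs.sum() + ll_cens)

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print("censored-MLE mean:", np.exp(mu_hat + sigma_hat**2 / 2))
```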

  7. Published GMO studies find no evidence of harm when corrected for multiple comparisons.

    PubMed

    Panchin, Alexander Y; Tuzhikov, Alexander I

    2017-03-01

    A number of widely debated research articles claiming possible technology-related health concerns have influenced the public opinion on genetically modified food safety. We performed a statistical reanalysis and review of experimental data presented in some of these studies and found that, quite often and in contradiction with the authors' conclusions, the data actually provide weak evidence of harm that cannot be differentiated from chance. In our opinion, the problem of statistically unaccounted multiple comparisons has led to some of the most cited anti-genetically modified organism health claims in history. We hope this analysis puts the original results of these studies into proper context.
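
A minimal sketch of the kind of adjustment the reanalysis relies on: correcting a family of p-values for multiple comparisons, shown here with Bonferroni and Benjamini-Hochberg via statsmodels. The p-values are made up for illustration.

```python
# Adjust a family of p-values for multiple comparisons.
import numpy as np
from statsmodels.stats.multitest import multipletests

p_values = np.array([0.002, 0.01, 0.03, 0.04, 0.20, 0.45, 0.70])

for method in ("bonferroni", "fdr_bh"):
    reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method=method)
    print(method, np.round(p_adj, 3), "reject:", reject)
```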

  8. Truncated Linear Statistics Associated with the Eigenvalues of Random Matrices II. Partial Sums over Proper Time Delays for Chaotic Quantum Dots

    NASA Astrophysics Data System (ADS)

    Grabsch, Aurélien; Majumdar, Satya N.; Texier, Christophe

    2017-06-01

    Invariant ensembles of random matrices are characterized by the distribution of their eigenvalues {λ_1, …, λ_N}. We study the distribution of truncated linear statistics of the form \tilde{L} = \sum_{i=1}^{p} f(λ_i) with p < N.

  9. Mechanical Characterization of Polysilicon MEMS: A Hybrid TMCMC/POD-Kriging Approach.

    PubMed

    Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano

    2018-04-17

    Microscale uncertainties related to the geometry and morphology of polycrystalline silicon films, constituting the movable structures of micro electro-mechanical systems (MEMS), were investigated through a joint numerical/experimental approach. An on-chip testing device was designed and fabricated to deform a compliant polysilicon beam. In previous studies, we showed that the scattering in the input–output characteristics of the device can be properly described only if statistical features related to the morphology of the columnar polysilicon film and to the etching process adopted to release the movable structure are taken into account. In this work, a high fidelity finite element model of the device was used to feed a transitional Markov chain Monte Carlo (TMCMC) algorithm for the estimation of the unknown parameters governing the aforementioned statistical features. To reduce the computational cost of the stochastic analysis, a synergy of proper orthogonal decomposition (POD) and kriging interpolation was adopted. Results are reported for a batch of nominally identical tested devices, in terms of measurement error-affected probability distributions of the overall Young’s modulus of the polysilicon film and of the overetch depth.
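
    As a minimal sketch of the model-reduction step (not the authors' implementation), a proper orthogonal decomposition can be obtained from the SVD of a snapshot matrix; the snapshot data below are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(6)
        snapshots = rng.normal(size=(1000, 50))        # 1000 DOFs x 50 simulation snapshots (synthetic)
        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        k = int(np.searchsorted(energy, 0.99) + 1)     # smallest basis capturing 99% of the energy
        pod_basis = U[:, :k]
        print("retained POD modes:", k)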

  10. Statistical Analysis of Japanese Structural Damage Data

    DTIC Science & Technology

    1977-01-01

    buildings and no ready correlation between I-beam and lattice work columns could be established. The complete listing of the buildings contained in the final...subclassification efforts in this structure class. Of the 90 buildings in the data base, two have such light lattice work steel columns that they would...more properly be classified as Very Light Steel Frame Buildings; six have concrete panel walls; two have lattice steel columns that are filled with

  11. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis

    PubMed Central

    Lin, Johnny; Bentler, Peter M.

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne’s asymptotically distribution-free method and Satorra–Bentler’s mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for these methods in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra–Bentler’s statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby’s study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic. PMID:23144511

  12. Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain

    PubMed Central

    Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young

    2010-01-01

    Background Statistical analysis is essential for obtaining objective reliability in medical research. However, many medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention of improving the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and the errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). Errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only to applying statistical procedures but also to the reviewing process in order to improve the value of the articles. PMID:20552071

  13. Radio Measurements of the Stellar Proper Motions in the Core of the Orion Nebula Cluster

    NASA Astrophysics Data System (ADS)

    Dzib, Sergio A.; Loinard, Laurent; Rodríguez, Luis F.; Gómez, Laura; Forbrich, Jan; Menten, Karl M.; Kounkel, Marina A.; Mioduszewski, Amy J.; Hartmann, Lee; Tobin, John J.; Rivera, Juana L.

    2017-01-01

    Using multi-epoch Very Large Array observations, covering a time baseline of 29.1 years, we have measured the proper motions of 88 young stars with compact radio emission in the core of the Orion Nebula Cluster (ONC) and the neighboring BN/KL region. Our work increases the number of young stars with measured proper motion at radio frequencies by a factor of 2.5 and enables us to perform a better statistical analysis of the kinematics of the region than was previously possible. Most stars (79 out of 88) have proper motions consistent with a Gaussian distribution centered on $\overline{\mu_\alpha\cos\delta} = 1.07 \pm 0.09$ mas yr$^{-1}$ and $\overline{\mu_\delta} = -0.84 \pm 0.16$ mas yr$^{-1}$, with velocity dispersions of $\sigma_\alpha = 1.08 \pm 0.07$ mas yr$^{-1}$ and $\sigma_\delta = 1.27 \pm 0.15$ mas yr$^{-1}$. We looked for organized movements of these stars but found no clear indication of radial expansion/contraction or rotation. The remaining nine stars in our sample show peculiar proper motions that differ from the mean proper motions of the ONC by more than 3σ. One of these stars, V 1326 Ori, could have been expelled from the Orion Trapezium 7000 years ago. Two could be related to the multi-stellar disintegration in the BN/KL region, in addition to the previously known sources BN, I and n. The others either have high uncertainties (so their anomalous proper motions are not firmly established) or could be foreground objects.

  14. 75 FR 37839 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-30

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Systems, Bureau of Labor Statistics, Room 4080, 2 Massachusetts Avenue, NE....

  15. 75 FR 5346 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Clearance Officer, Division of Management Systems, Bureau of Labor Statistics, Room 4080, 2 Massachusetts...

  16. 77 FR 36296 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Nora Kincaid, BLS Clearance Officer, Division of Management Systems, Bureau of Labor Statistics, Room...

  17. 78 FR 41958 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Nora Kincaid, BLS Clearance Officer, Division of Management Systems, Bureau of Labor Statistics, Room...

  18. 76 FR 71076 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting comments on the proposed extension of the ``BLS Occupational Safety and Health Statistics (OSHS...

  19. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    PubMed

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuities of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity, without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. Robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time prohibitive given that vascular trees have thousands of segments and bifurcations, so that interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
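
    A minimal sketch of one common vessel-enhancement building block (not the authors' pipeline), using the Frangi vesselness filter on a bundled sample retina image in place of angiographic data:

        import numpy as np
        from skimage import data, filters

        image = data.retina()[::4, ::4, 1].astype(float)          # downsampled green channel; vessels appear dark
        vesselness = filters.frangi(image, sigmas=range(1, 10, 2), black_ridges=True)
        mask = vesselness > filters.threshold_otsu(vesselness)    # crude binary vessel map
        print("fraction of pixels labelled as vessel:", round(float(mask.mean()), 3))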

  20. Quantitative investigation of inappropriate regression model construction and the importance of medical statistics experts in observational medical research: a cross-sectional study.

    PubMed

    Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka

    2018-05-05

    To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics. The frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of using only variables that were significant in univariate analysis was estimated to occur in 6.4% of publications (95% CI 4.8% to 8.5%). This was observed in 1.1% of the publications with a medical statistics expert (hereinafter 'expert') as the first author, in 3.5% if an expert was included as a coauthor and in 12.2% if experts were not involved. In publications where the number of cases was 50 or less and the study did not include experts, inappropriate algorithm usage was observed in a high proportion (20.2%). The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further, nation-level analysis showed that the involvement of experts and the implementation of unfavourable multivariate analysis are negatively associated (R=-0.652). Based on the results of this study, the benefit of participation of medical statistics experts is obvious. Experts should be involved for proper confounding adjustment and interpretation of statistical models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. GIS-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for the proper allocation of human resources and the overall planning of strategic industries. This study provides empirical evidence regarding the use of geocoding and spatial analysis for the distribution and employment status of college graduates based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the number of enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by the statistical analysis suggest that the specialty graduates majored in has an important impact on the number of graduates employed and on the number of graduates engaged in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  2. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    NASA Astrophysics Data System (ADS)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, was selected for statistical analysis. Daily rainfall data for a period of 30 years were used to understand the normal, deficit, excess and seasonal rainfall of the selected circle headquarters. Further, the various plotting position formulae available were used to evaluate the return period of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and to create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The scientific results and the analysis paved the way to determine the proper onset and withdrawal of the monsoon, which were used for land preparation and sowing.
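
    A minimal sketch of the return-period step with hypothetical annual totals, using the Weibull plotting position (one of the plotting position formulae referred to above):

        import numpy as np

        annual_rain = np.array([812, 950, 1100, 640, 890, 1020, 760, 980, 1210, 700])   # mm, hypothetical
        sorted_rain = np.sort(annual_rain)[::-1]       # descending order
        n = len(sorted_rain)
        rank = np.arange(1, n + 1)
        exceedance_prob = rank / (n + 1)               # Weibull plotting position
        return_period = 1.0 / exceedance_prob          # years
        for r, t in zip(sorted_rain, return_period):
            print(f"{r:5d} mm  ->  T = {t:5.2f} yr")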

  3. ON THE FERMI-GBM EVENT 0.4 s AFTER GW150914

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiner, J.; Yu, H.-F.; Burgess, J. M.

    In view of the recent report by Connaughton et al., we analyze continuous time-tagged event (TTE) data of the Fermi Gamma-ray Burst Monitor (GBM) around the time of the gravitational-wave event GW150914. We find that, after proper accounting for low-count statistics, the GBM transient event at 0.4 s after GW150914 is likely not due to an astrophysical source, but consistent with a background fluctuation, removing the tension between the INTEGRAL/ACS non-detection and GBM. Additionally, reanalysis of other short GRBs shows that without proper statistical modeling the fluence of faint events is over-predicted, as verified for some joint GBM-ACS detections of short GRBs. We detail the statistical procedure to correct these biases. As a result, faint short GRBs, verified by ACS detections, with significances in the broadband light curve even smaller than that of the GBM-GW150914 event are recovered as proper non-zero sources, while the GBM-GW150914 event is consistent with zero fluence.
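
    A minimal sketch of the low-count point (not the authors' likelihood analysis): with few counts, the chance of a background fluctuation should come from Poisson statistics rather than a Gaussian approximation. The counts below are hypothetical.

        from scipy import stats

        background_rate = 3.2     # expected background counts in the search window (hypothetical)
        observed_counts = 9       # observed counts (hypothetical)
        p_fluctuation = stats.poisson.sf(observed_counts - 1, background_rate)   # P(N >= observed | background)
        print(f"P(N >= {observed_counts} | background) = {p_fluctuation:.4f}")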

  4. Standard deviation and standard error of the mean.

    PubMed

    Lee, Dong Kyu; In, Junyong; Lee, Sangseok

    2015-06-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage of the SD and the SEM in the medical literature. Because the processes of calculating the SD and the SEM involve different statistical inferences, each of them has its own meaning. The SD is the dispersion of data in a normal distribution; in other words, the SD indicates how accurately the mean represents the sample data. The meaning of the SEM, however, involves statistical inference based on the sampling distribution: the SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either the SD or the SEM can be applied to describe data and statistical results, one should be aware of the appropriate use of each. We aim to elucidate the distinctions between the SD and the SEM and to provide proper usage guidelines for both when summarizing data and describing statistical results.
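
    A minimal sketch with hypothetical sample data, showing the two quantities side by side:

        import numpy as np

        x = np.array([5.1, 4.8, 5.6, 5.0, 4.7, 5.3, 5.2])   # hypothetical measurements
        sd = x.std(ddof=1)                                   # sample standard deviation
        sem = sd / np.sqrt(len(x))                           # standard error of the mean
        print(f"mean = {x.mean():.2f}, SD = {sd:.2f}, SEM = {sem:.2f}")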

  5. Standard deviation and standard error of the mean

    PubMed Central

    In, Junyong; Lee, Sangseok

    2015-01-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage of the SD and the SEM in the medical literature. Because the processes of calculating the SD and the SEM involve different statistical inferences, each of them has its own meaning. The SD is the dispersion of data in a normal distribution; in other words, the SD indicates how accurately the mean represents the sample data. The meaning of the SEM, however, involves statistical inference based on the sampling distribution: the SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either the SD or the SEM can be applied to describe data and statistical results, one should be aware of the appropriate use of each. We aim to elucidate the distinctions between the SD and the SEM and to provide proper usage guidelines for both when summarizing data and describing statistical results. PMID:26045923

  6. 76 FR 6161 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-03

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... comments to Carol Rowan, BLS Clearance Officer, Division of Management Systems, Bureau of Labor Statistics...

  7. 77 FR 27798 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-11

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... comments to Carol Rowan, BLS Clearance Officer, Division of Management Systems, Bureau of Labor Statistics...

  8. 76 FR 71075 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Statistics, Room 4080, 2 Massachusetts Avenue NE., Washington, DC 20212. Written comments also may be...

  9. 76 FR 60930 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Proposed Collection, Comment Request ACTION: Notice... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting comments concerning the proposed extension of the ``Mass Layoff Statistics Program.'' A copy of the...

  10. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts with estimating reliability and later addresses validity. Factor analysis is a way to determine the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and helps in the selection of the items with the best performance. For an acceptable factor analysis, it is necessary to follow certain steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
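
    A minimal sketch (not from the paper) of an exploratory factor analysis on simulated item responses, to illustrate inspecting the estimated loadings:

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(5)
        latent = rng.normal(size=(300, 2))                   # two underlying factors
        loadings = rng.uniform(0.4, 0.9, size=(2, 8))        # 8 scale items
        items = latent @ loadings + rng.normal(0, 0.5, size=(300, 8))

        fa = FactorAnalysis(n_components=2).fit(items)
        print(np.round(fa.components_, 2))                   # estimated loadings (factors x items)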

  11. Assessing statistical differences between parameters estimates in Partial Least Squares path modeling.

    PubMed

    Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten

    2018-01-01

    Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as parametric and non-parametric approaches in PLS multi-group analysis only allow assessing differences between parameters that are estimated for different subpopulations, the study at hand introduces a technique that also allows assessing whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer in particular to a reduced version of the well-established technology acceptance model.
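
    A minimal sketch of the general idea (not the authors' PLS-specific procedure): a percentile bootstrap for the difference between two coefficients estimated from the same sample, using simple OLS slopes as stand-ins for path coefficients.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)
        data = np.column_stack([y, x1, x2])

        def coef_difference(sample):
            yy, a, b = sample[:, 0], sample[:, 1], sample[:, 2]
            X = np.column_stack([np.ones(len(yy)), a, b])
            beta = np.linalg.lstsq(X, yy, rcond=None)[0]
            return beta[1] - beta[2]                         # difference between the two slopes

        boot = [coef_difference(data[rng.integers(0, n, n)]) for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"95% bootstrap CI for the difference: [{lo:.3f}, {hi:.3f}]")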

  12. [Practical aspects regarding sample size in clinical research].

    PubMed

    Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S

    1996-01-01

    Knowledge of the right sample size lets us judge whether the results published in medical papers are based on a suitable design and support a proper conclusion according to the statistical analysis. To estimate the sample size we must consider the type I error, the type II error, the variance, the size of the effect, and the significance and power of the test. To decide which mathematical formula should be used, we must define what kind of study we have, that is, whether it is a prevalence study, a study of mean values or a comparative study. In this paper we explain some basic topics of statistics and we describe four simple examples of sample size estimation.
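
    A minimal sketch of one standard sample-size formula for comparing two means, n = 2(z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2 per group; the values below are hypothetical.

        import math
        from scipy import stats

        alpha, power = 0.05, 0.80
        sigma, delta = 10.0, 5.0                      # assumed common SD and minimal relevant difference
        z_a = stats.norm.ppf(1 - alpha / 2)
        z_b = stats.norm.ppf(power)
        n_per_group = 2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2
        print("n per group ~", math.ceil(n_per_group))   # about 63 with these values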

  13. Application of Linear Mixed-Effects Models in Human Neuroscience Research: A Comparison with Pearson Correlation in Two Auditory Electrophysiology Studies

    PubMed Central

    Koerner, Tess K.; Zhang, Yang

    2017-01-01

    Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining the strength of association between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages of, as well as the necessity for, applying mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers. PMID:28264422
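
    A minimal sketch (not the authors' models) of a linear mixed-effects regression with a random intercept per subject, fitted to simulated repeated-measures data:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        subject = np.repeat(np.arange(20), 4)                          # 20 listeners x 4 conditions
        condition = np.tile(["quiet", "low", "mid", "high"], 20)
        amplitude = rng.normal(5, 1, size=subject.size)                # hypothetical neural measure
        behavior = (0.6 * amplitude + rng.normal(0, 1, 20)[subject]    # subject-specific baseline
                    + rng.normal(0, 0.5, subject.size))
        df = pd.DataFrame({"subject": subject, "condition": condition,
                           "amplitude": amplitude, "behavior": behavior})

        # Random intercept per subject accounts for the repeated measures within listeners
        lme = smf.mixedlm("behavior ~ amplitude + C(condition)", data=df, groups=df["subject"]).fit()
        print(lme.summary())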

  14. Benefits of statistical molecular design, covariance analysis, and reference models in QSAR: a case study on acetylcholinesterase

    NASA Astrophysics Data System (ADS)

    Andersson, C. David; Hillgren, J. Mikael; Lindgren, Cecilia; Qian, Weixing; Akfur, Christine; Berg, Lotta; Ekström, Fredrik; Linusson, Anna

    2015-03-01

    Scientific disciplines such as medicinal and environmental chemistry, pharmacology, and toxicology deal with questions related to the effects small organic compounds exert on biological targets and to the compounds' physicochemical properties responsible for these effects. A common strategy in this endeavor is to establish structure-activity relationships (SARs). The aim of this work was to illustrate the benefits of performing a statistical molecular design (SMD) and a proper statistical analysis of the molecules' properties before SAR and quantitative structure-activity relationship (QSAR) analysis. Our SMD followed by synthesis yielded a set of inhibitors of the enzyme acetylcholinesterase (AChE) that had very few inherent dependencies between the substructures in the molecules. If such dependencies exist, they cause severe errors in SAR interpretation and in predictions by QSAR models, and leave a set of molecules less suitable for future decision-making. In our study, the SAR and QSAR models could show which molecular substructures and physicochemical features were advantageous for AChE inhibition. Finally, the QSAR model was used for the prediction of AChE inhibition by an external prediction set of molecules. The accuracy of these predictions was assessed by statistical significance tests and by comparisons to simple but relevant reference models.

  15. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide to researchers novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages. PMID:21487489

  16. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information Communication Technology (ICT) tools provide to researchers novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages.

  17. Improved Diagnostic Accuracy of SPECT Through Statistical Analysis and the Detection of Hot Spots at the Primary Sensorimotor Area for the Diagnosis of Alzheimer Disease in a Community-Based Study: "The Osaki-Tajiri Project".

    PubMed

    Kaneta, Tomohiro; Nakatsuka, Masahiro; Nakamura, Kei; Seki, Takashi; Yamaguchi, Satoshi; Tsuboi, Masahiro; Meguro, Kenichi

    2016-01-01

    SPECT is an important diagnostic tool for dementia. Recently, statistical analysis of SPECT has been commonly used in dementia research. In this study, we evaluated the accuracy of visual SPECT evaluation and/or statistical analysis for the diagnosis (Dx) of Alzheimer disease (AD) and other forms of dementia in our community-based study, "The Osaki-Tajiri Project." Eighty-nine consecutive outpatients with dementia were enrolled and underwent brain perfusion SPECT with 99mTc-ECD. The diagnostic accuracy of SPECT was tested using 3 methods: visual inspection (SPECT Dx), an automated diagnostic tool using statistical analysis with the easy Z-score imaging system (eZIS Dx), and visual inspection plus eZIS (integrated Dx). Integrated Dx showed the highest sensitivity, specificity, and accuracy, whereas eZIS was the second most accurate method. We also observed a higher than expected rate of SPECT images indicating false-negative cases of AD. Among these, 50% showed hypofrontality and were diagnosed as frontotemporal lobar degeneration. These cases typically showed regional "hot spots" in the primary sensorimotor cortex (ie, a sensorimotor hot spot sign), which we determined were associated with AD rather than frontotemporal lobar degeneration. We concluded that diagnostic ability was improved by the integrated use of visual assessment and statistical analysis. In addition, the detection of a sensorimotor hot spot sign was useful for detecting AD when hypofrontality is present and improved the ability to properly diagnose AD.

  18. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved over the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical approaches to analysing the primary ocular measure. We compared our findings with the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
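
    A minimal sketch (not from the paper) of one way to respect the intereye correlation: treat the patient as a cluster in a generalized estimating equation (GEE) model rather than analysing the two eyes as independent observations. The data are simulated.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n_patients = 100
        patient = np.repeat(np.arange(n_patients), 2)                       # two eyes per patient
        treated = rng.integers(0, 2, n_patients)[patient]
        iop = (16 + 2 * treated + rng.normal(0, 1.5, n_patients)[patient]   # shared patient effect
               + rng.normal(0, 1, 2 * n_patients))
        df = pd.DataFrame({"patient": patient, "treated": treated, "iop": iop})

        gee = smf.gee("iop ~ treated", groups="patient", data=df,
                      cov_struct=sm.cov_struct.Exchangeable()).fit()
        print(gee.summary())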

  19. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals, or bins, than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimates of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  20. Understanding the Sampling Distribution and the Central Limit Theorem.

    ERIC Educational Resources Information Center

    Lewis, Charla P.

    The sampling distribution is a common source of misuse and misunderstanding in the study of statistics. The sampling distribution, underlying distribution, and the Central Limit Theorem are all interconnected in defining and explaining the proper use of the sampling distribution of various statistics. The sampling distribution of a statistic is…

  1. Cutting efficiency of Reciproc and WaveOne reciprocating instruments.

    PubMed

    Plotino, Gianluca; Giansiracusa Rubini, Alessio; Grande, Nicola M; Testarelli, Luca; Gambarini, Gianluca

    2014-08-01

    The aim of the present study was to evaluate the cutting efficiency of 2 new reciprocating instruments, Reciproc and WaveOne. Twenty-four new Reciproc R25 and 24 new WaveOne Primary files were activated by using a torque-controlled motor (Silver Reciproc) and divided into 4 groups (n = 12): group 1, Reciproc activated by the Reciproc ALL program; group 2, Reciproc activated by the WaveOne ALL program; group 3, WaveOne activated by the Reciproc ALL program; and group 4, WaveOne activated by the WaveOne ALL program. The device used for the cutting test consisted of a main frame to which a mobile plastic support for the handpiece is connected and a stainless steel block containing a Plexiglas block (inPlexiglass, Rome, Italy) against which the cutting efficiency of the instruments was tested. The length of the block cut in 1 minute was measured in a computerized program with a precision of 0.1 mm. Means and standard deviations of each group were calculated, and data were statistically analyzed with 1-way analysis of variance and the Bonferroni test (P < .05). Reciproc R25 displayed greater cutting efficiency than WaveOne Primary for both movements used (P < .05); in particular, Reciproc instruments used with their proper reciprocating motion presented statistically significantly higher cutting efficiency than WaveOne instruments used with their proper reciprocating motion (P < .05). There was no statistically significant difference between the 2 movements for either instrument (P > .05). Reciproc instruments demonstrated statistically higher cutting efficiency than WaveOne instruments. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  2. Analysis of defect structure in silicon. Characterization of SEMIX material. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.

    1983-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848 - 13C. Important correlations were obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using a quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.

  3. Soil erosion assessment and its correlation with landslide events using remote sensing data and GIS: a case study at Penang Island, Malaysia.

    PubMed

    Pradhan, Biswajeet; Chaudhari, Amruta; Adinarayana, J; Buchroithner, Manfred F

    2012-01-01

    In this paper, an attempt has been made to assess, forecast and observe the dynamics of soil erosion using the universal soil loss equation (USLE) method at Penang Island, Malaysia. Multi-source (map-, space- and ground-based) datasets were used to obtain both the static and dynamic factors of the USLE, and an integrated analysis was carried out in the raster format of a GIS. A landslide location map was generated on the basis of image-element interpretation from aerial photos, satellite data and field observations and was used to validate soil erosion intensity in the study area. Further, a statistical frequency ratio analysis was carried out in the study area for correlation purposes. The results of the statistical correlation showed a satisfactory agreement between the prepared USLE-based soil erosion map and landslide events/locations, with the two being directly proportional to each other. Prognostic analysis of soil erosion helps user agencies and decision makers design proper conservation planning programs to reduce soil erosion. Temporal statistics on soil erosion amid the dynamic and rapid development of Penang Island indicate the co-existence and balance of the ecosystem.
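
    A minimal sketch of the USLE computation itself, A = R * K * LS * C * P, with hypothetical factor values for a single raster cell:

        R, K, LS, C, P = 1200.0, 0.30, 1.8, 0.25, 0.9   # erosivity, erodibility, slope length/steepness, cover, practice
        A = R * K * LS * C * P                          # annual soil loss (e.g. t ha^-1 yr^-1)
        print(f"estimated annual soil loss: {A:.1f}")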

  4. Considerations in the statistical analysis of clinical trials in periodontitis.

    PubMed

    Imrey, P B

    1986-05-01

    Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, causes some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple-site data from the same patient for group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple-site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are offered.

  5. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming’s many instructional visits to Japan. He wrote the Guide...to Quality Control which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes...job data are essential for making a proper evaluation.( Ishikawa , p. 14) The gathering of data and its subsequent analysis are the foundation of

  6. Proper muscle layer damage affects ulcer healing after gastric endoscopic submucosal dissection.

    PubMed

    Horikawa, Yohei; Mimori, Nobuya; Mizutamari, Hiroya; Kato, Yuhei; Shimazu, Kazuhiro; Sawaguchi, Masayuki; Tawaraya, Shin; Igarashi, Kimihiro; Okubo, Syunji

    2015-11-01

    Endoscopic submucosal dissection (ESD) is the established therapy for superficial gastrointestinal neoplasms. However, management of the artificial ulcers associated with ESD has become important, and the relationship between ulcer healing factors and treatment is still unclear. We aimed to evaluate the ESD-related artificial ulcer reduction rate at 4 weeks to assess the factors associated with ulcer healing after ESD, which may lead to optimal treatment. Between January 2009 and December 2013, a total of 375 lesions fulfilled the expanded criteria for ESD. We defined an ulcer reduction rate <90% as group (A), the poor-healing group, and a rate ≥90% as group (B), the well-healing group. After exclusion, 328 lesions were divided into the two groups and analyzed. The two groups were compared based on clinicopathological/endoscopic features, concomitant drugs, and treatment. The ulcer reduction rate was significantly correlated with factors related to the ESD procedure (i.e., procedure time, submucosal fibrosis, and injury of the proper muscle layer) in univariate analysis. Multivariate logistic regression analysis showed that submucosal fibrosis (F2) (P = 0.03; OR, 16.46; 95% CI, 1.31-206.73) and injury of the proper muscle layer (P = 0.01; OR, 4.27; 95% CI, 2.04-8.92) were statistically significant predictors of delayed healing. This single-center retrospective study indicated that ESD-induced artificial ulcer healing was affected by submucosal fibrosis and injury of the proper muscle layer. Therefore, the preferable pharmacotherapy can be determined on completion of the ESD procedure. © 2015 The Authors Digestive Endoscopy © 2015 Japan Gastroenterological Endoscopy Society.

  7. Statistical results on restorative dentistry experiments: effect of the interaction between main variables

    PubMed Central

    CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi

    2010-01-01

    The interpretation of statistical analysis is a critical part of scientific research. When more than one main variable is studied in a research project, the effect of the interaction between those variables is fundamental to the discussion of the experiments. However, some doubts can arise when the p-value of the interaction is greater than the significance level. Objective To determine the most adequate interpretation for factorial experiments with p-values of the interaction slightly higher than the significance level. Materials and methods The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results Different findings were observed between the two analyses, and the results of the studies became more coherent when the significant interaction was used. Conclusion The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analysis carefully in order to discuss the findings of their experiments properly. PMID:20857003
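
    A minimal sketch (not the original analyses) of a two-factor ANOVA in which the interaction p-value is inspected before interpreting the main effects; the data are simulated placeholders.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        adhesive = np.repeat(["A", "B"], 20)
        curing = np.tile(np.repeat(["light", "chemical"], 10), 2)
        bond = 20 + 2 * (adhesive == "B") + 1.5 * (curing == "light") + rng.normal(0, 2, 40)
        df = pd.DataFrame({"adhesive": adhesive, "curing": curing, "bond": bond})

        model = smf.ols("bond ~ adhesive * curing", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))          # the adhesive:curing row is the interaction term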

  8. Machine learning to analyze images of shocked materials for precise and accurate measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dresselhaus-Cooper, Leora; Howard, Marylesa; Hock, Margaret C.

    A supervised machine learning algorithm, called locally adaptive discriminant analysis (LADA), has been developed to locate boundaries between identifiable image features that have varying intensities. LADA is an adaptation of image segmentation, which includes techniques that find the positions of image features (classes) using statistical intensity distributions for each class in the image. In order to place a pixel in the proper class, LADA considers the intensity at that pixel and the distribution of intensities in local (nearby) pixels. This paper presents the use of LADA to provide, with statistical uncertainties, the positions and shapes of features within ultrafast images of shock waves. We demonstrate the ability to locate image features including crystals, density changes associated with shock waves, and material jetting caused by shock waves. This algorithm can analyze images that exhibit a wide range of physical phenomena because it does not rely on comparison to a model. LADA enables analysis of images from shock physics with statistical rigor independent of underlying models or simulations.

  9. On the Distribution of Orbital Poles of Milky Way Satellites

    NASA Astrophysics Data System (ADS)

    Palma, Christopher; Majewski, Steven R.; Johnston, Kathryn V.

    2002-01-01

    In numerous studies of the outer Galactic halo some evidence for accretion has been found. If the outer halo did form in part or wholly through merger events, we might expect to find coherent streams of stars and globular clusters following orbits similar to those of their parent objects, which are assumed to be present or former Milky Way dwarf satellite galaxies. We present a study of this phenomenon by assessing the likelihood of potential descendant ``dynamical families'' in the outer halo. We conduct two analyses: one that involves a statistical analysis of the spatial distribution of all known Galactic dwarf satellite galaxies (DSGs) and globular clusters, and a second, more specific analysis of those globular clusters and DSGs for which full phase space dynamical data exist. In both cases our methodology is appropriate only to members of descendant dynamical families that retain nearly aligned orbital poles today. Since the Sagittarius dwarf (Sgr) is considered a paradigm for the type of merger/tidal interaction event for which we are searching, we also undertake a case study of the Sgr system and identify several globular clusters that may be members of its extended dynamical family. In our first analysis, the distribution of possible orbital poles for the entire sample of outer (Rgc>8 kpc) halo globular clusters is tested for statistically significant associations among globular clusters and DSGs. Our methodology for identifying possible associations is similar to that used by Lynden-Bell & Lynden-Bell, but we put the associations on a more statistical foundation. Moreover, we study the degree of possible dynamical clustering among various interesting ensembles of globular clusters and satellite galaxies. Among the ensembles studied, we find the globular cluster subpopulation with the highest statistical likelihood of association with one or more of the Galactic DSGs to be the distant, outer halo (Rgc>25 kpc), second-parameter globular clusters. The results of our orbital pole analysis are supported by the great circle cell count methodology of Johnston, Hernquist, & Bolte. The space motions of the clusters Pal 4, NGC 6229, NGC 7006, and Pyxis are predicted to be among those most likely to show the clusters to be following stream orbits, since these clusters are responsible for the majority of the statistical significance of the association between outer halo, second-parameter globular clusters and the Milky Way DSGs. In our second analysis, we study the orbits of the 41 globular clusters and six Milky Way-bound DSGs having measured proper motions to look for objects with both coplanar orbits and similar angular momenta. Unfortunately, the majority of globular clusters with measured proper motions are inner halo clusters that are less likely to retain memory of their original orbit. Although four potential globular cluster/DSG associations are found, we believe three of these associations involving inner halo clusters to be coincidental. While the present sample of objects with complete dynamical data is small and does not include many of the globular clusters that are more likely to have been captured by the Milky Way, the methodology we adopt will become increasingly powerful as more proper motions are measured for distant Galactic satellites and globular clusters, and especially as results from the Space Interferometry Mission (SIM) become available.

  10. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
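
    A minimal sketch (not from the paper) of two common external-validation summaries for a binary risk model, discrimination (AUC) and calibration, computed on simulated predictions and outcomes:

        import numpy as np
        from sklearn.metrics import roc_auc_score
        from sklearn.calibration import calibration_curve

        rng = np.random.default_rng(4)
        predicted_risk = rng.uniform(0.01, 0.6, 500)            # model predictions on the external cohort (simulated)
        outcome = rng.binomial(1, predicted_risk)               # simulated outcomes consistent with the risks

        print("AUC:", round(roc_auc_score(outcome, predicted_risk), 3))
        obs, pred = calibration_curve(outcome, predicted_risk, n_bins=5)
        for o, p in zip(obs, pred):
            print(f"predicted {p:.2f}  observed {o:.2f}")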

  11. NCES Handbook of Survey Methods. NCES 2011-609

    ERIC Educational Resources Information Center

    Burns, Shelley, Ed.; Wang, Xiaolei, Ed.; Henning, Alexandra, Ed.

    2011-01-01

    Since its inception, the National Center for Education Statistics (NCES) has been committed to the practice of documenting its statistical methods for its customers and of seeking to avoid misinterpretation of its published data. The reason for this policy is to assure customers that proper statistical standards and techniques have been observed,…

  12. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism"; the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs are formulated which more accurately characterize the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten times as long* as the latter. The filter correction step also furnishes a statistically rigorous *prediction error* which appears in the likelihood ratios for scoring the association of one report or observation to another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.

  13. Statistical Tools And Artificial Intelligence Approaches To Predict Fracture In Bulk Forming Processes

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, R.; Ingarao, G.; Fonti, V.

    2007-05-01

    The crucial task in the prevention of ductile fracture is the availability of a tool for predicting the occurrence of such defects. The technical literature reports extensive investigation of this topic, with contributions from many authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function which depends on stress and strain paths; ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback in the use of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e. it works very well for certain processes but may give poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is a tool of general reliability, i.e. one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence or absence from a set of stress and strain path data. The proposed approach is based on the utilization of experimental data available, for a given material, on fracture occurrence in different processes. In more detail, the approach consists of analysing experimental tests in which fracture occurs, followed by numerical simulations of those processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build up a proper data set, which was utilized both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method were compared with those obtained from an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated in forming processes characterized by complex fracture mechanics.
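
    As a rough illustration of the statistical route described here, the sketch below trains a simple classifier to predict fracture occurrence/absence from two stress-strain path features. The features (mean stress triaxiality, effective strain) and the synthetic data are assumptions for illustration only; they are not the paper's data set or its actual statistical tool.

```python
# Sketch: predicting fracture occurrence/absence from stress-strain path
# features. The feature set and the synthetic data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
triaxiality = rng.uniform(-0.3, 0.8, n)          # mean stress triaxiality
eff_strain  = rng.uniform(0.1, 1.5, n)           # effective plastic strain
# Toy rule: higher triaxiality and strain make fracture more likely.
p_fracture = 1 / (1 + np.exp(-(4 * triaxiality + 2 * eff_strain - 3)))
fracture = rng.binomial(1, p_fracture)

X = np.column_stack([triaxiality, eff_strain])
clf = LogisticRegression()
scores = cross_val_score(clf, X, fracture, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```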

  14. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Risk adjustment data validation standards. 153.350... validation standards. (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...

  15. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Risk adjustment data validation standards. 153.350... validation standards. (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...

  16. 40 CFR 91.512 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...

  17. pROC: an open-source package for R and S+ to analyze and compare ROC curves.

    PubMed

    Robin, Xavier; Turck, Natacha; Hainard, Alexandre; Tiberti, Natalia; Lisacek, Frédérique; Sanchez, Jean-Charles; Müller, Markus

    2011-03-17

    Receiver operating characteristic (ROC) curves are useful tools to evaluate classifiers in biomedical and bioinformatics applications. However, conclusions are often reached through inconsistent use or insufficient statistical analysis. To support researchers in their ROC curves analysis we developed pROC, a package for R and S+ that contains a set of tools displaying, analyzing, smoothing and comparing ROC curves in a user-friendly, object-oriented and flexible interface. With data previously imported into the R or S+ environment, the pROC package builds ROC curves and includes functions for computing confidence intervals, statistical tests for comparing total or partial area under the curve or the operating points of different classifiers, and methods for smoothing ROC curves. Intermediary and final results are visualised in user-friendly interfaces. A case study based on published clinical and biomarker data shows how to perform a typical ROC analysis with pROC. pROC is a package for R and S+ specifically dedicated to ROC analysis. It proposes multiple statistical tests to compare ROC curves, and in particular partial areas under the curve, allowing proper ROC interpretation. pROC is available in two versions: in the R programming language or with a graphical user interface in the S+ statistical software. It is accessible at http://expasy.org/tools/pROC/ under the GNU General Public License. It is also distributed through the CRAN and CSAN public repositories, facilitating its installation.
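
    pROC itself is an R/S+ package; as a language-neutral illustration of one of the comparisons it offers, the sketch below runs a bootstrap test of the difference in AUC between two classifiers in Python. The scores and labels are synthetic placeholders, and the bootstrap procedure is a generic one rather than pROC's exact implementation.

```python
# Sketch: bootstrap comparison of two classifiers' AUCs, analogous in spirit to
# pROC's ROC-comparison tests. Scores and labels are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 300
y = rng.binomial(1, 0.4, n)
score_a = y * 1.0 + rng.normal(0, 1.0, n)     # stronger marker
score_b = y * 0.6 + rng.normal(0, 1.0, n)     # weaker marker

obs_diff = roc_auc_score(y, score_a) - roc_auc_score(y, score_b)

boot_diffs = []
for _ in range(2000):
    idx = rng.integers(0, n, n)               # resample cases with replacement
    if y[idx].min() == y[idx].max():          # need both classes present
        continue
    boot_diffs.append(roc_auc_score(y[idx], score_a[idx])
                      - roc_auc_score(y[idx], score_b[idx]))
ci = np.percentile(np.array(boot_diffs), [2.5, 97.5])
print(f"AUC difference: {obs_diff:.3f}, 95% bootstrap CI: [{ci[0]:.3f}, {ci[1]:.3f}]")
```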

  18. 40 CFR 90.712 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...

  19. The VMC survey. XXVIII. Improved measurements of the proper motion of the Galactic globular cluster 47 Tucanae

    NASA Astrophysics Data System (ADS)

    Niederhofer, Florian; Cioni, Maria-Rosa L.; Rubele, Stefano; Schmidt, Thomas; Bekki, Kenji; de Grijs, Richard; Emerson, Jim; Ivanov, Valentin D.; Oliveira, Joana M.; Petr-Gotzens, Monika G.; Ripepi, Vincenzo; Sun, Ning-Chen; van Loon, Jacco Th.

    2018-05-01

    We use deep multi-epoch point-spread function (PSF) photometry taken with the Visible and Infrared Survey Telescope for Astronomy (VISTA) to measure and analyze the proper motions of stars within the Galactic globular cluster 47 Tucanae (47 Tuc, NGC 104). The observations are part of the ongoing near-infrared VISTA survey of the Magellanic Cloud system (VMC). The data analyzed in this study correspond to one VMC tile, which covers a total sky area of 1.77 deg². Absolute proper motions with respect to 9070 background galaxies are calculated from a linear regression model applied to the positions of stars in 11 epochs in the Ks filter. The data extend over a total time baseline of about 17 months. We found an overall median proper motion of the stars within 47 Tuc of (μαcos(δ), μδ) = (+5.89 ± 0.02 (statistical) ± 0.13 (systematic), -2.14 ± 0.02 (statistical) ± 0.08 (systematic)) mas yr⁻¹, based on the measurements of 35 000 individual sources between 5' and 42' from the cluster center. We compared our result to the proper motions from the newest US Naval Observatory CCD Astrograph Catalog (UCAC5), which includes data from the Gaia data release 1. Selecting cluster members (∼2700 stars), we found a median proper motion of (μαcos(δ), μδ) = (+5.30 ± 0.03 (statistical) ± 0.70 (systematic), -2.70 ± 0.03 (statistical) ± 0.70 (systematic)) mas yr⁻¹. Comparing the results with measurements in the literature, we found that the values derived from the VMC data are consistent with the UCAC5 result, and are close to measurements obtained using the Hubble Space Telescope. We combined our proper motion results with radial velocity measurements from the literature and reconstructed the orbit of 47 Tuc, finding that the cluster is on an orbit with a low ellipticity and is confined within the inner 7.5 kpc of the Galaxy. We show that the use of an increased time baseline in combination with PSF-determined stellar centroids in crowded regions significantly improves the accuracy of the method. In future works, we will apply the methods described here to more VMC tiles to study in detail the kinematics of the Magellanic Clouds. Based on observations made with VISTA at the Paranal Observatory under program ID 179.B-2003.
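
    The core of the measurement is a per-star linear fit of position against epoch, whose slope is the proper motion. The sketch below shows that step for a single coordinate with synthetic epochs, positions, and errors; it is not the VMC pipeline.

```python
# Sketch: estimating a proper motion by fitting a straight line to a star's
# measured positions over several epochs. Epochs and positions are synthetic.
import numpy as np

rng = np.random.default_rng(3)
epochs_yr = np.linspace(0.0, 17.0 / 12.0, 11)           # 11 epochs over ~17 months
true_pm_mas_yr = 5.9                                    # assumed proper motion (mas/yr)
pos_mas = true_pm_mas_yr * epochs_yr + rng.normal(0, 3.0, epochs_yr.size)

# Least-squares slope = proper motion; the covariance gives its standard error.
coeffs, cov = np.polyfit(epochs_yr, pos_mas, deg=1, cov=True)
pm_hat, pm_err = coeffs[0], np.sqrt(cov[0, 0])
print(f"proper motion: {pm_hat:.2f} +/- {pm_err:.2f} mas/yr")
```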

  20. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.

  1. Aspergillosis

    MedlinePlus

  2. Candidiasis

    MedlinePlus

  3. Blastomycosis

    MedlinePlus

  4. Histoplasmosis

    MedlinePlus

  5. The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis

    NASA Astrophysics Data System (ADS)

    Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali

    2018-04-01

    The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT, as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.
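
    A minimal sketch of the entropy-based choice of decomposition level is given below. It substitutes an ordinary discrete wavelet transform from PyWavelets for the MODWT used in the study and runs on a synthetic series, so the wavelet, the series, and the minimum-entropy rule are all assumptions for illustration.

```python
# Sketch: selecting a wavelet decomposition level with a Shannon-entropy
# criterion. An ordinary DWT stands in for the MODWT; the series is synthetic.
import numpy as np
import pywt

rng = np.random.default_rng(7)
t = np.arange(1024)
series = np.sin(2 * np.pi * t / 64) + 0.5 * rng.normal(size=t.size)

def shannon_entropy(coeffs):
    """Shannon entropy of the normalized energy across all coefficients."""
    c = np.concatenate(coeffs)
    p = c ** 2 / np.sum(c ** 2)
    p = p[p > 0]
    return -(p * np.log(p)).sum()

max_level = pywt.dwt_max_level(len(series), pywt.Wavelet("db4").dec_len)
entropies = {L: shannon_entropy(pywt.wavedec(series, "db4", level=L))
             for L in range(1, max_level + 1)}
best = min(entropies, key=entropies.get)     # minimum-entropy level
print("entropy by level:", {k: round(v, 2) for k, v in entropies.items()})
print("selected level:", best)
```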

  6. On proper linearization, construction and analysis of the Boyle-van't Hoff plots and correct calculation of the osmotically inactive volume.

    PubMed

    Katkov, Igor I

    2011-06-01

    The Boyle-van't Hoff (BVH) law of physics has been widely used in cryobiology for calculation of the key osmotic parameters of cells and optimization of cryo-protocols. The proper use of linearization of the Boyle-van't Hoff relationship for the osmotically inactive volume (v(b)) has been discussed in a rigorous way in (Katkov, Cryobiology, 2008, 57:142-149). Nevertheless, scientists in the field have continued to use inappropriate methods of linearization (and curve fitting) of the BVH data, plotting of the BVH line, and calculation of v(b). Here, we discuss the sources of incorrect linearization of the BVH relationship using concrete examples from recent publications, analyze the properties of the correct BVH line (which is unique for a given v(b)), provide appropriate statistical formulas for calculation of v(b) from the experimental data, and propose simple instructions (a standard operating procedure, SOP) for proper normalization of the data, appropriate linearization and construction of the BVH plots, and correct calculation of v(b). The possible sources of non-linear behavior or poor fit of the data to the proper BVH line, such as active water and/or solute transport, which can result in large discrepancies between the hyperosmotic and hypoosmotic parts of the BVH plot, are also discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
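
    In the normalized form commonly used in cryobiology, the BVH line is v = v(b) + (1 - v(b))·x, where v is the cell volume normalized to its isotonic value and x is the ratio of isotonic to actual osmolality, so the line necessarily passes through (1, 1) and only v(b) is free. The sketch below fits that single parameter by constrained least squares on synthetic data; it illustrates the idea, not the SOP proposed in the paper.

```python
# Sketch: estimating the osmotically inactive volume v_b from Boyle-van't Hoff
# data by fitting v = v_b + (1 - v_b) * x, with x = pi_iso / pi. Data are synthetic.
import numpy as np

true_vb = 0.35
x = np.array([0.5, 0.75, 1.0, 1.5, 2.0])                 # pi_iso / pi
rng = np.random.default_rng(5)
v = true_vb + (1 - true_vb) * x + rng.normal(0, 0.01, x.size)

# Least squares for the single free parameter v_b:
#   v - x = v_b * (1 - x)  =>  v_b = sum((v - x)(1 - x)) / sum((1 - x)^2)
vb_hat = np.sum((v - x) * (1 - x)) / np.sum((1 - x) ** 2)
print(f"estimated osmotically inactive volume v_b = {vb_hat:.3f}")
```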

  7. ON THE CONNECTION OF THE APPARENT PROPER MOTION AND THE VLBI STRUCTURE OF COMPACT RADIO SOURCES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moor, A.; Frey, S.; Lambert, S. B.

    2011-06-15

    Many of the compact extragalactic radio sources that are used as fiducial points to define the celestial reference frame are known to have proper motions detectable with long-term geodetic/astrometric very long baseline interferometry (VLBI) measurements. These changes can be as high as several hundred microarcseconds per year for certain objects. When imaged with VLBI at milliarcsecond (mas) angular resolution, these sources (radio-loud active galactic nuclei) typically show structures dominated by a compact, often unresolved 'core' and a one-sided 'jet'. The positional instability of compact radio sources is believed to be connected with changes in their brightness distribution structure. For the first time, we test this assumption in a statistical sense on a large sample rather than on only individual objects. We investigate a sample of 62 radio sources for which reliable long-term time series of astrometric positions as well as detailed 8 GHz VLBI brightness distribution models are available. We compare the characteristic direction of their extended jet structure and the direction of their apparent proper motion. We present our data and analysis method, and conclude that there is indeed a correlation between the two characteristic directions. However, there are cases where the ∼1-10 mas scale VLBI jet directions are significantly misaligned with respect to the apparent proper motion direction.

  8. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of the RNA-seq experiment and the wide dynamic range of transcription it measures make it an attractive technology for whole transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, add complexity to experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER.
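
    The sketch below conveys the general idea of simulation-based power assessment for a single gene with negative binomial counts. It is a generic Python stand-in for illustration, not the R package PROPER, and the mean, dispersion, fold change, and test used here are all assumptions.

```python
# Sketch: simulation-based power estimate for detecting a two-fold expression
# change between two groups, with negative binomial counts (illustrative values).
import numpy as np
from scipy import stats

def power_by_simulation(n_per_group, base_mean=100.0, fold_change=2.0,
                        dispersion=0.2, alpha=0.05, n_sim=2000, seed=0):
    rng = np.random.default_rng(seed)
    # NB parameterized by mean mu and dispersion phi: variance = mu + phi*mu^2,
    # so numpy's (n, p) parameters are n = 1/phi and p = n / (n + mu).
    def nb(mu, size):
        n = 1.0 / dispersion
        return rng.negative_binomial(n, n / (n + mu), size=size)

    hits = 0
    for _ in range(n_sim):
        a = nb(base_mean, n_per_group)
        b = nb(base_mean * fold_change, n_per_group)
        # Simple t-test on log counts as a stand-in for a NB GLM test.
        _, p = stats.ttest_ind(np.log1p(a), np.log1p(b))
        hits += p < alpha
    return hits / n_sim

for n in (3, 5, 10):
    print(f"n = {n} per group -> estimated power {power_by_simulation(n):.2f}")
```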

  9. PHAST: Protein-like heteropolymer analysis by statistical thermodynamics

    NASA Astrophysics Data System (ADS)

    Frigori, Rafael B.

    2017-06-01

    PHAST is a software package written in standard Fortran, with MPI and CUDA extensions, able to efficiently perform parallel multicanonical Monte Carlo simulations of single or multiple heteropolymeric chains, as coarse-grained models for proteins. The outcome data can be straightforwardly analyzed within its microcanonical Statistical Thermodynamics module, which allows for computing the entropy, caloric curve, specific heat and free energies. As a case study, we investigate the aggregation of heteropolymers bioinspired by Aβ25-33 fragments and their cross-seeding with IAPP20-29 isoforms. Excellent parallel scaling is observed, even under numerically difficult first-order-like phase transitions, which are properly described by the built-in, fully reconfigurable force fields. The package is also free and open source, which should motivate users to readily adapt it to specific purposes.

  10. Phase locking route behind complex periodic windows in a forced oscillator

    NASA Astrophysics Data System (ADS)

    Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei

    2013-09-01

    Chaotic systems have complex reactions to an external driving force; even in cases with low-dimensional oscillators, the routes to synchronization are diverse. We proposed a stroboscope-based method for analyzing driven chaotic systems in their phase space. From two statistical quantities generated from the time series, we could determine the system state and the driving behavior simultaneously. We demonstrated our method on a driven bi-stable system, which showed complex periodic windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through the chaotic state could be found. Periodic windows could also be identified and the circumstances under which they occurred distinguished. The statistical results were supported by conditional Lyapunov exponent analysis, showing the method's power in analyzing unknown time series.

  11. Random heteropolymers preserve protein function in foreign environments

    NASA Astrophysics Data System (ADS)

    Panganiban, Brian; Qiao, Baofu; Jiang, Tao; DelRe, Christopher; Obadia, Mona M.; Nguyen, Trung Dac; Smith, Anton A. A.; Hall, Aaron; Sit, Izaac; Crosby, Marquise G.; Dennis, Patrick B.; Drockenmuller, Eric; Olvera de la Cruz, Monica; Xu, Ting

    2018-03-01

    The successful incorporation of active proteins into synthetic polymers could lead to a new class of materials with functions found only in living systems. However, proteins rarely function under the conditions suitable for polymer processing. On the basis of an analysis of trends in protein sequences and characteristic chemical patterns on protein surfaces, we designed four-monomer random heteropolymers to mimic intrinsically disordered proteins for protein solubilization and stabilization in non-native environments. The heteropolymers, with optimized composition and statistical monomer distribution, enable cell-free synthesis of membrane proteins with proper protein folding for transport and enzyme-containing plastics for toxin bioremediation. Controlling the statistical monomer distribution in a heteropolymer, rather than the specific monomer sequence, affords a new strategy to interface with biological systems for protein-based biomaterials.

  12. Multivariate meta-analysis: potential and promise.

    PubMed

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-09-10

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day 'Multivariate meta-analysis' event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Earth Observation System Flight Dynamics System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Tracewell, David

    2016-01-01

    This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: collection and calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.

  14. Simple Statistics: - Summarized!

    ERIC Educational Resources Information Center

    Blai, Boris, Jr.

    Statistics is an essential tool for making proper judgements and decisions. It is concerned with probability distribution models, testing of hypotheses, significance tests and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median and mode. A second…

  15. Fungal Diseases

    MedlinePlus

  16. Mucormycosis (Zygomycosis)

    MedlinePlus

  17. Valley Fever (Coccidioidomycosis)

    MedlinePlus

  18. Symptoms of Aspergillosis

    MedlinePlus

  19. Fungal Diseases: Ringworm

    MedlinePlus

  20. 22 CFR 505.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... record keeping functions; exercise of control over and hence responsibility and accountability for... proper and necessary uses even if any such uses occur infrequently. (h) Statistical record. A record in a system of records maintained for statistical research or reporting purposes only and not used in whole or...

  1. Impact of Eliminating Anchor Items Flagged from Statistical Criteria on Test Score Classifications in Common Item Equating

    ERIC Educational Resources Information Center

    Karkee, Thakur; Choi, Seung

    2005-01-01

    Proper maintenance of a scale established in the baseline year would assure the accurate estimation of growth in subsequent years. Scale maintenance is especially important when the state performance standards must be preserved for future administrations. To ensure proper maintenance of a scale, the selection of anchor items and evaluation of…

  2. Statistical Analyses of Femur Parameters for Designing Anatomical Plates.

    PubMed

    Wang, Lin; He, Kunjin; Chen, Zhengming

    2016-01-01

    Femur parameters are key prerequisites for scientifically designing anatomical plates. Meanwhile, individual differences in femurs present a challenge to designing well-fitting anatomical plates. Therefore, to design anatomical plates more scientifically, analyses of femur parameters with statistical methods were performed in this study. The specific steps were as follows. First, taking eight anatomical femur parameters as variables, 100 femur samples were classified into three classes with factor analysis and Q-type cluster analysis. Second, based on the mean parameter values of the three classes of femurs, three sizes of average anatomical plates corresponding to the three classes of femurs were designed. Finally, based on Bayes discriminant analysis, a new femur could be assigned to the proper class. Thereafter, the average anatomical plate suitable for that new femur was selected from the three available sizes of plates. Experimental results showed that the classification of femurs was quite reasonable based on the anatomical aspects of the femurs. For instance, three sizes of condylar buttress plates were designed. Meanwhile, 20 new femurs were assigned to their proper classes, and suitable condylar buttress plates were then determined and selected for them.
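
    The pipeline sketched below mirrors the sequence described (dimension reduction, clustering into three classes, then discriminant assignment of a new sample) on synthetic data. The parameter values, the use of KMeans for the Q-type clustering step, and linear discriminant analysis as the Bayes-type classifier are illustrative assumptions rather than the study's exact procedure.

```python
# Sketch: cluster synthetic "femur parameter" vectors into three classes,
# then assign a new sample to a class with a discriminant model.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(11)
X = rng.normal(size=(100, 8)) + rng.choice([0.0, 1.5, 3.0], size=(100, 1))

factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(X)
classes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(factors)

# Discriminant model (a Gaussian/Bayes-type classifier) for new femurs.
lda = LinearDiscriminantAnalysis().fit(X, classes)
new_femur = rng.normal(size=(1, 8)) + 1.5
print("assigned class:", int(lda.predict(new_femur)[0]))
```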

  3. Direct labeling of serum proteins by fluorescent dye for antibody microarray.

    PubMed

    Klimushina, M V; Gumanova, N G; Metelskaya, V A

    2017-05-06

    Analysis of serum proteome by antibody microarray is used to identify novel biomarkers and to study signaling pathways including protein phosphorylation and protein-protein interactions. Labeling of serum proteins is important for optimal performance of the antibody microarray. Proper choice of fluorescent label and optimal concentration of protein loaded on the microarray ensure good quality of imaging that can be reliably scanned and processed by the software. We have optimized direct serum protein labeling using fluorescent dye Arrayit Green 540 (Arrayit Corporation, USA) for antibody microarray. Optimized procedure produces high quality images that can be readily scanned and used for statistical analysis of protein composition of the serum. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Treatment and Outcomes of Histoplasmosis

    MedlinePlus

  5. Symptoms of Valley Fever (Coccidioidomycosis)

    MedlinePlus

  6. Treatment for Valley Fever (Coccidioidomycosis)

    MedlinePlus

  7. Treatment and Outcomes of Aspergillosis

    MedlinePlus

  8. Evidence, temperature, and the laws of thermodynamics.

    PubMed

    Vieland, Veronica J

    2014-01-01

    A primary purpose of statistical analysis in genetics is the measurement of the strength of evidence for or against hypotheses. As with any type of measurement, a properly calibrated measurement scale is necessary if we want to be able to meaningfully compare degrees of evidence across genetic data sets, across different types of genetic studies and/or across distinct experimental modalities. In previous papers in this journal and elsewhere, my colleagues and I have argued that geneticists ought to care about the scale on which statistical evidence is measured, and we have proposed the Kelvin temperature scale as a template for a context-independent measurement scale for statistical evidence. Moreover, we have claimed that, mathematically speaking, evidence and temperature may be one and the same thing. On first blush, this might seem absurd. Temperature is a property of systems following certain laws of nature (in particular, the 1st and 2nd Law of Thermodynamics) involving very physical quantities (e.g., energy) and processes (e.g., mechanical work). But what do the laws of thermodynamics have to do with statistical systems? Here I address that question. © 2014 S. Karger AG, Basel.

  9. Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.

    PubMed

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  10. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    PubMed Central

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786

  11. Oral cancer associated with chronic mechanical irritation of the oral mucosa.

    PubMed

    Piemonte, E; Lazos, J; Belardinelli, P; Secchi, D; Brunotto, M; Lanfranchi-Tizeira, H

    2018-03-01

    Most of the studies dealing with Chronic Mechanical Irritation (CMI) and Oral Cancer (OC) have considered prosthetic and dental variables only separately, and CMI functional factors were not registered. Thus, the aim of this study was to assess OC risk in individuals with dental, prosthetic and functional CMI. We also examined CMI presence in relation to tumor size. A case-control study was carried out from 2009 to 2013. The study group comprised squamous cell carcinoma cases; the control group comprised patients seeking dental treatment in the same institution. 153 patients were studied (study group n=53, control group n=100). CMI reproducibility displayed a correlation coefficient of 1 (p<0.0001). Bivariate analysis showed statistically significant associations for all variables (age, gender, tobacco and alcohol consumption, and CMI). Multivariate analysis showed statistical significance for age, alcohol, and CMI, but not for gender or tobacco. The relationship of CMI with tumor size showed no statistically significant differences. CMI could be regarded as a risk factor for oral cancer. In individuals with other OC risk factors, proper treatment of the mechanically injurious factors (dental, prosthetic and functional) could be an important measure to reduce the risk of oral cancer.
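
    A multivariable analysis of this kind is typically a logistic regression whose exponentiated coefficients are adjusted odds ratios. The sketch below fits such a model on synthetic case-control data with hypothetical age, alcohol, and CMI covariates; it does not use or reproduce the study's data.

```python
# Sketch: multivariable logistic regression reporting adjusted odds ratios
# with 95% confidence intervals on synthetic case-control data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 150
df = pd.DataFrame({
    "age": rng.normal(60, 10, n),
    "alcohol": rng.binomial(1, 0.4, n),
    "cmi": rng.binomial(1, 0.5, n),
})
logit_true = -6 + 0.05 * df["age"] + 0.8 * df["alcohol"] + 1.0 * df["cmi"]
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

X = sm.add_constant(df[["age", "alcohol", "cmi"]])
fit = sm.Logit(df["case"], X).fit(disp=False)
odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": odds_ratios, "2.5%": ci[0], "97.5%": ci[1]}))
```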

  12. SAGITTARIUS STREAM THREE-DIMENSIONAL KINEMATICS FROM SLOAN DIGITAL SKY SURVEY STRIPE 82

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koposov, Sergey E.; Belokurov, Vasily; Evans, N. Wyn

    2013-04-01

    Using multi-epoch observations of the Stripe 82 region from the Sloan Digital Sky Survey (SDSS), we measure precise statistical proper motions of the stars in the Sagittarius (Sgr) stellar stream. The multi-band photometry and SDSS radial velocities allow us to efficiently select Sgr members and thus enhance the proper-motion precision to ∼0.1 mas yr⁻¹. We measure separately the proper motion of a photometrically selected sample of the main-sequence turn-off stars, as well as spectroscopically selected Sgr giants. The data allow us to determine the proper motion separately for the two Sgr streams in the south found in Koposov et al. Together with the precise velocities from SDSS, our proper motions provide exquisite constraints of the three-dimensional motions of the stars in the Sgr streams.

  13. Aerosol, a health hazard during ultrasonic scaling: A clinico-microbiological study.

    PubMed

    Singh, Akanksha; Shiva Manjunath, R G; Singla, Deepak; Bhattacharya, Hirak S; Sarkar, Arijit; Chandra, Neeraj

    2016-01-01

    Ultrasonic scaling is a routinely used treatment to remove plaque and calculus from tooth surfaces. These scalers use water as a coolant, which is splattered during the vibration of the tip. The splatter, when mixed with the patient's saliva and plaque, renders the aerosol highly infectious and acts as a major risk factor for transmission of disease. Despite the necessary protection, the operator might sometimes become infected because of the infectious nature of the splatter. The aim was to evaluate the aerosol contamination produced during ultrasonic scaling with the help of microbiological analysis. This clinico-microbiological study consisted of twenty patients. Two agar plates were used for each patient; the first was kept at the center of the operatory room 20 min before the treatment, while the second agar plate was kept 40 cm away from the patient's chest during the treatment. Both agar plates were sent for microbiological analysis. The statistical analysis was done with the help of STATA 11.0 (StataCorp. 2013. Stata Statistical Software, Release 13. College Station, TX: StataCorp LP, 4905 Lakeway Drive, College Station, Texas, USA), and P < 0.001 was considered statistically significant. The results for bacterial count were highly significant when compared before and during the treatment. Gram staining showed the presence of Staphylococcus and Streptococcus species in high numbers. The aerosols and splatters produced during dental procedures have the potential to spread infection to dental personnel. Therefore, proper precautions should be taken to minimize the risk of infection to the operator.
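
    Because each patient contributes one plate before treatment and one during, the natural comparison is a paired test on the per-patient counts. The sketch below runs a Wilcoxon signed-rank test on synthetic colony counts; the counts and their distributions are assumptions, not the study's data, and the study's actual test may differ.

```python
# Sketch: paired comparison of colony-forming-unit counts on plates exposed
# before versus during treatment for the same patients (synthetic counts).
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
n_patients = 20
cfu_before = rng.poisson(5, n_patients)                 # baseline room contamination
cfu_during = cfu_before + rng.poisson(40, n_patients)   # added aerosol load

stat, p = stats.wilcoxon(cfu_before, cfu_during)
print(f"median before: {np.median(cfu_before)}, during: {np.median(cfu_during)}")
print(f"Wilcoxon signed-rank p = {p:.4f}")
```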

  14. Identification of Microorganisms by High Resolution Tandem Mass Spectrometry with Accurate Statistical Significance

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo

    2016-02-01

    Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  15. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
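
    As one concrete way to attach a frequentist interval to a simulation-estimated failure probability, the sketch below computes an exact Clopper-Pearson binomial confidence interval and shows how it tightens as the number of Monte Carlo runs grows. The true failure probability and run counts are arbitrary placeholders, and the report's two proposed methods may differ from this particular interval.

```python
# Sketch: Clopper-Pearson confidence interval for a failure probability
# estimated from N Monte Carlo load-vs-capacity comparisons.
import numpy as np
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    """Exact two-sided confidence interval for a binomial proportion."""
    a = (1 - conf) / 2
    lo = 0.0 if k == 0 else beta.ppf(a, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - a, k + 1, n - k)
    return lo, hi

rng = np.random.default_rng(23)
p_true = 0.01                                   # placeholder failure probability
for n_sim in (100, 1000, 10000):
    failures = rng.binomial(n_sim, p_true)
    lo, hi = clopper_pearson(failures, n_sim)
    print(f"N={n_sim:6d}: p_hat={failures / n_sim:.4f}, 95% CI=({lo:.4f}, {hi:.4f})")
```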

  16. Decrease Hospital Spending: There's an App for That! A Retrospective Analysis of Implementation of a Mobile Resident Handbook on Hospital Costs and Disposition.

    PubMed

    Holtkamp, Matthew D

    2017-10-01

    Patient care involves time-sensitive decisions. Matching a patient's presenting condition with possible diagnoses requires proper assessment and diagnostic tests. Timely access to necessary information leads to improved patient care, better outcomes, and decreased costs. This study evaluated objective outcomes of the implementation of a novel Resident Handbook Application (RHAP) for smart phones. The RHAP included tools necessary to make proper assessments and to order appropriate tests. The RHAP's effectiveness was assessed using the Military Health System Military Mart database. This database includes patient-specific aggregate data, including diagnosis, patient demographics, itemized cost, hospital days, and disposition status. Multivariable analysis was used to compare the periods before and after RHAP implementation, controlling for patient demographics and diagnosis. Internal medicine admission data were used as a control group. There was a statistically significant decrease in laboratory costs and a strong trend toward a statistically significant decrease in the cost of radiology performed after implementation of RHAP (p values of <0.02 and <0.07, respectively). There was also a decrease in hospital days (from 3.66 to 3.30 days), in total cost per admission (from $18,866 to $16,305), and in cost per hospital day per patient (from $5,140 to $4,936). During the same time period, the control group showed no change or showed increases in these areas. The use of the RHAP resulted in cost decreases in a variety of areas and a decrease in hospital bed days without any apparent negative effect upon patient outcomes or disposition status.

  17. Multivariate meta-analysis: Potential and promise

    PubMed Central

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-01-01

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day ‘Multivariate meta-analysis’ event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21268052

  18. 22 CFR 505.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... individual's name or personal identifier, such as a social security number. (g) Routine use. With respect to... proper and necessary uses even if any such uses occur infrequently. (h) Statistical record. A record in a system of records maintained for statistical research or reporting purposes only and not used in whole or...

  19. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part I: framework and fundamentals.

    PubMed

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    Virgin olive oil is the only food product for which sensory analysis is regulated to classify it in different quality categories. To harmonize the results of the sensorial method, the use of standards or reference materials is crucial. The stability of sensory reference materials is required to enable their suitable control, aiming to confirm that their specific target values are maintained on an ongoing basis. Currently, such stability is monitored by means of sensory analysis and the sensory panels are in the paradoxical situation of controlling the standards that are devoted to controlling the panels. In the present study, several approaches based on similarity analysis are exploited. For each approach, the specific methodology to build a proper multivariate control chart to monitor the stability of the sensory properties is explained and discussed. The normalized Euclidean and Mahalanobis distances, the so-called nearness and hardiness indices respectively, have been defined as new similarity indices to range the values from 0 to 1. Also, the squared mean from Hotelling's T²-statistic and Q²-statistic has been proposed as another similarity index. © 2018 Society of Chemical Industry.
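
    For context, the sketch below builds the most common multivariate control chart, a Hotelling's T² chart, on synthetic sensory-attribute scores: a set of in-control reference measurements fixes the mean and covariance, and new batches are flagged against an F-based control limit. The attribute values, sample sizes, and limit level are assumptions; the paper's own nearness, hardiness, and combined T²/Q² indices are not reproduced here.

```python
# Sketch: Hotelling's T^2 control chart for monitoring new batches against a
# reference set (synthetic sensory-attribute scores).
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(29)
p, m = 3, 40                                   # attributes, reference samples
reference = rng.normal([5.0, 3.0, 2.0], 0.3, size=(m, p))
mu = reference.mean(axis=0)
S_inv = np.linalg.inv(np.cov(reference, rowvar=False))

# Usual phase-II limit for individual multivariate observations.
ucl = (p * (m + 1) * (m - 1)) / (m * (m - p)) * f_dist.ppf(0.99, p, m - p)

new_batches = rng.normal([5.0, 3.0, 2.0], 0.3, size=(5, p))
new_batches[-1] += 1.0                         # simulate a drifted batch
for i, x in enumerate(new_batches, 1):
    d = x - mu
    t2 = d @ S_inv @ d
    flag = "OUT OF CONTROL" if t2 > ucl else "in control"
    print(f"batch {i}: T^2 = {t2:6.2f} (UCL {ucl:.2f}) -> {flag}")
```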

  20. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis

    PubMed Central

    Tabelow, Karsten; König, Reinhard; Polzehl, Jörg

    2016-01-01

    Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
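
    To make the baseline concrete, the sketch below computes the conventional moving-window proportion of correct responses, attaching a Wilson confidence interval to each window, for a simulated learner whose true success probability rises smoothly. The window size, trial counts, and learner model are assumptions, and the study's improved trial-by-trial estimators are not implemented here.

```python
# Sketch: moving-window learning-curve estimate with Wilson confidence
# intervals for a simulated learner.
import numpy as np

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    phat = k / n
    denom = 1 + z**2 / n
    centre = (phat + z**2 / (2 * n)) / denom
    half = z * np.sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

rng = np.random.default_rng(31)
n_trials, window = 200, 20
p_true = 0.2 + 0.7 / (1 + np.exp(-(np.arange(n_trials) - 100) / 15))
responses = rng.binomial(1, p_true)

for start in range(0, n_trials - window + 1, 40):
    k = responses[start:start + window].sum()
    lo, hi = wilson_ci(k, window)
    print(f"trials {start:3d}-{start + window - 1:3d}: "
          f"p_hat = {k / window:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```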

  1. The Design and Analysis of Salmonid Tagging Studies in the Columbia Basin: Volume II: Experiment Salmonid Survival with Combined PIT-CWT Tagging.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Ken

    1997-06-01

    Experimental designs to estimate the effect of transportation on the survival and return rates of Columbia River system salmonids are discussed along with statistical modeling techniques. Besides transportation, river flow and dam spill are necessary components in the design and analysis; otherwise, questions as to the effects of reservoir drawdowns and increased dam spill may never be satisfactorily answered. Four criteria for comparing different experimental designs are: (1) feasibility, (2) clarity of results, (3) scope of inference, and (4) time to learn. In this report, alternative designs for conducting experimental manipulations of smolt tagging studies to study the effects of river operations such as flow levels, spill fractions, and transporting outmigrating salmonids around dams in the Columbia River system are presented. The principles of study design discussed in this report have broad implications for the many studies proposed to investigate both smolt and adult survival relationships. The concepts are illustrated for the case of the design and analysis of smolt transportation experiments. The merits of proposed transportation studies should be measured relative to these principles of proper statistical design and analysis.

  2. Regulatory considerations in the design of comparative observational studies using propensity scores.

    PubMed

    Yue, Lilly Q

    2012-01-01

    In the evaluation of medical products, including drugs, biological products, and medical devices, comparative observational studies could play an important role when properly conducted randomized, well-controlled clinical trials are infeasible for ethical or practical reasons. However, various biases could be introduced at every stage and into every aspect of an observational study, and consequently the interpretation of the resulting statistical inference would be of concern. While statistical techniques do exist for addressing some of the challenging issues, often based on propensity score methodology, these statistical tools have probably not been as widely employed in prospectively designing observational studies as they should be. There are also times when they are implemented in an unscientific manner, such as performing propensity score model selection on a dataset that already includes the outcome data, so that the integrity of the observational study design and the interpretability of the outcome analysis results could be compromised. In this paper, regulatory considerations on prospective study design using propensity scores are shared and illustrated with hypothetical examples.
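
    The sketch below illustrates the design-stage use of propensity scores emphasized here: the score is estimated from baseline covariates only, without touching outcomes, and treated subjects are then matched 1:1 to controls on the logit of the score. The covariates, treatment model, and greedy matching rule are illustrative assumptions, not a regulatory recommendation.

```python
# Sketch: propensity scores from baseline covariates only, followed by greedy
# 1:1 nearest-neighbor matching on the logit of the score (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(37)
n = 400
age = rng.normal(65, 10, n)
severity = rng.normal(0, 1, n)
treated = rng.binomial(1, 1 / (1 + np.exp(-(0.04 * (age - 65) + 0.8 * severity))))

X = np.column_stack([age, severity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
logit_ps = np.log(ps / (1 - ps))

controls = list(np.where(treated == 0)[0])
pairs = []
for t in np.where(treated == 1)[0]:
    j = min(controls, key=lambda c: abs(logit_ps[t] - logit_ps[c]))
    pairs.append((t, j))
    controls.remove(j)                     # match without replacement
    if not controls:
        break
gaps = [abs(logit_ps[a] - logit_ps[b]) for a, b in pairs]
print(f"matched pairs: {len(pairs)}, mean |logit PS gap|: {np.mean(gaps):.3f}")
```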

  3. Wavelet methodology to improve single unit isolation in primary motor cortex cells

    PubMed Central

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.

    2016-01-01

    The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root-mean squared, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
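
    As a small illustration of one wavelet/threshold combination of the kind evaluated above, the sketch below denoises a synthetic spike-bearing trace with a Daubechies 4 decomposition, the fixed-form (universal) threshold, and soft thresholding, using PyWavelets. The signal, noise level, and decomposition depth are assumptions, and the paper's best-performing scheme was minimax rather than the fixed-form threshold.

```python
# Sketch: wavelet denoising of a synthetic spike train with the universal
# threshold and soft thresholding.
import numpy as np
import pywt

rng = np.random.default_rng(41)
n = 1024
signal = np.zeros(n)
for pos in (100, 400, 700):                      # three synthetic "spikes"
    signal[pos:pos + 20] += np.hanning(20) * 5.0
noisy = signal + rng.normal(0, 1.0, n)

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from finest level
thresh = sigma * np.sqrt(2 * np.log(n))          # fixed-form (universal) threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                 for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[:n]

print(f"noisy RMS error:    {np.sqrt(np.mean((noisy - signal) ** 2)):.3f}")
print(f"denoised RMS error: {np.sqrt(np.mean((denoised - signal) ** 2)):.3f}")
```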

  4. Breath Analysis as a Potential and Non-Invasive Frontier in Disease Diagnosis: An Overview

    PubMed Central

    Pereira, Jorge; Porto-Figueira, Priscilla; Cavaco, Carina; Taunk, Khushman; Rapole, Srikanth; Dhakne, Rahul; Nagarajaram, Hampapathalu; Câmara, José S.

    2015-01-01

    Currently, a small number of diseases, particularly cardiovascular (CVDs), oncologic (ODs), neurodegenerative (NDDs), chronic respiratory diseases, as well as diabetes, form a severe burden to most of the countries worldwide. Hence, there is an urgent need for development of efficient diagnostic tools, particularly those enabling reliable detection of diseases at their early stages, preferably using non-invasive approaches. Breath analysis is a non-invasive approach relying only on the characterisation of volatile composition of the exhaled breath (EB) that in turn reflects the volatile composition of the bloodstream and airways and therefore the status and condition of the whole organism metabolism. Advanced sampling procedures (solid-phase and needle traps microextraction) coupled with modern analytical technologies (proton transfer reaction mass spectrometry, selected ion flow tube mass spectrometry, ion mobility spectrometry, e-noses, etc.) allow the characterisation of EB composition to an unprecedented level. However, a key challenge in EB analysis is the proper statistical analysis and interpretation of the large and heterogeneous datasets obtained from EB research. There is no standard statistical framework/protocol yet available in literature that can be used for EB data analysis towards discovery of biomarkers for use in a typical clinical setup. Nevertheless, EB analysis has immense potential towards development of biomarkers for the early diagnosis of diseases. PMID:25584743

  5. Breath analysis as a potential and non-invasive frontier in disease diagnosis: an overview.

    PubMed

    Pereira, Jorge; Porto-Figueira, Priscilla; Cavaco, Carina; Taunk, Khushman; Rapole, Srikanth; Dhakne, Rahul; Nagarajaram, Hampapathalu; Câmara, José S

    2015-01-09

    Currently, a small number of diseases, particularly cardiovascular (CVDs), oncologic (ODs), neurodegenerative (NDDs), chronic respiratory diseases, as well as diabetes, form a severe burden to most of the countries worldwide. Hence, there is an urgent need for development of efficient diagnostic tools, particularly those enabling reliable detection of diseases at their early stages, preferably using non-invasive approaches. Breath analysis is a non-invasive approach relying only on the characterisation of volatile composition of the exhaled breath (EB) that in turn reflects the volatile composition of the bloodstream and airways and therefore the status and condition of the whole organism metabolism. Advanced sampling procedures (solid-phase and needle traps microextraction) coupled with modern analytical technologies (proton transfer reaction mass spectrometry, selected ion flow tube mass spectrometry, ion mobility spectrometry, e-noses, etc.) allow the characterisation of EB composition to an unprecedented level. However, a key challenge in EB analysis is the proper statistical analysis and interpretation of the large and heterogeneous datasets obtained from EB research. There is no standard statistical framework/protocol yet available in literature that can be used for EB data analysis towards discovery of biomarkers for use in a typical clinical setup. Nevertheless, EB analysis has immense potential towards development of biomarkers for the early diagnosis of diseases.

  6. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
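
    The principal-component summary described above can be sketched in a few lines: given an ensemble of plausible effective-area curves, the mean curve and a handful of leading components capture most of the calibration variability, and new plausible curves can be drawn from a few component scores. The Python sketch below uses synthetic curves in place of the actual Chandra calibration products; all array names and numbers are illustrative assumptions.

    import numpy as np

    # Hypothetical ensemble: rows are plausible effective-area curves sampled on a
    # common energy grid (e.g., from calibration simulations).
    rng = np.random.default_rng(0)
    energy = np.linspace(0.3, 8.0, 500)                      # keV grid (illustrative)
    base = 400 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)  # nominal effective area
    ensemble = base + rng.normal(scale=5.0, size=(200, energy.size)).cumsum(axis=1) * 0.1

    # Principal component summary of the calibration ensemble.
    mean_curve = ensemble.mean(axis=0)
    centered = ensemble - mean_curve
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)
    k = np.searchsorted(np.cumsum(explained), 0.95) + 1      # components for ~95% variance

    def draw_effective_area():
        """Draw one plausible effective-area curve from the PCA summary."""
        coeffs = rng.normal(size=k) * (s[:k] / np.sqrt(len(ensemble) - 1))
        return mean_curve + coeffs @ Vt[:k]

    print(f"{k} components capture ~95% of calibration variance")
    print("one sampled curve, mean area:", round(float(draw_effective_area().mean()), 1))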

  7. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to total budget and budget outlays over time analysis was conducted. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in each specific year and their 1989 dollar equivalent. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
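
    The core of the approach, fitting a Weibull spending profile in percentage-of-budget space and then inverting it into a schedule, can be illustrated with a short sketch. The yearly spending fractions below are made up for illustration and are not the GSFC data used in the report; only the Weibull form and the inversion step follow the description above.

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative cumulative spending fractions (share of total budget) at the end of
    # each project year; real values would come from the historical project data.
    years = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
    spent_fraction = np.array([0.05, 0.18, 0.42, 0.68, 0.85, 0.95, 1.00])

    def weibull_cdf(t, scale, shape):
        """Weibull cumulative distribution used as the spending profile."""
        return 1.0 - np.exp(-(t / scale) ** shape)

    # Nonlinear regression of the spending profile (working in fraction-of-budget space,
    # which is what makes the probabilistic interpretation valid).
    (scale, shape), _ = curve_fit(weibull_cdf, years, spent_fraction, p0=(3.0, 2.0))

    def years_to_reach(fraction):
        """Invert the fitted profile: time at which a given budget fraction is spent."""
        return scale * (-np.log(1.0 - fraction)) ** (1.0 / shape)

    print(f"scale={scale:.2f} yr, shape={shape:.2f}, 90% spent by year {years_to_reach(0.9):.1f}")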

  8. Basic biostatistics for post-graduate students

    PubMed Central

    Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.

    2012-01-01

    Statistical methods are important for drawing valid conclusions from the obtained data. This article provides background information related to fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests useful for the analysis of different types of observations. Parameters such as the normal distribution, calculation of sample size, level of significance, null hypothesis, indices of variability, and different tests are explained in detail with suitable examples. Using these guidelines, postgraduate students should be able to classify the distribution of their data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
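
    As a minimal illustration of the kind of basic test covered by such a primer, the following sketch runs a two-sample t-test of the null hypothesis of equal group means at the 5% significance level; the blood-pressure values are simulated purely for the example.

    import numpy as np
    from scipy import stats

    # Hypothetical data: systolic blood pressure (mmHg) in two treatment groups.
    rng = np.random.default_rng(1)
    group_a = rng.normal(loc=120, scale=10, size=30)
    group_b = rng.normal(loc=126, scale=10, size=30)

    # Two-sample t-test of the null hypothesis that the group means are equal.
    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, reject H0 at 5%: {p_value < 0.05}")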

  9. Awareness, Attitude, and Knowledge of Basic Life Support among Medical, Dental, and Nursing Faculties and Students in the University Hospital.

    PubMed

    Sangamesh, N C; Vidya, K C; Pathi, Jugajyoti; Singh, Arpita

    2017-01-01

    To assess the awareness, attitude, and knowledge about basic life support (BLS) among medical, dental, and nursing students and faculties, and to propose the inclusion of BLS skills in the undergraduate (UG) academic curriculum. Recognition, prevention, and effective management of life-threatening emergencies are the responsibility of health-care professionals. These situations can be successfully managed with proper knowledge and training in BLS skills. These life-saving maneuvers can be taught through structured resuscitation programs, which are lacking in the academic curriculum. A questionnaire study consisting of 20 questions was conducted among 659 participants in the Kalinga Institute of Dental Sciences, Kalinga Institute of Medical Sciences, KIIT University. Medical junior residents, BDS faculties, interns, nursing faculties, and 3rd-year and final-year UG students from both medical and dental colleges were chosen. The statistical analysis was carried out using SPSS software version 20.0 (Armonk, NY: IBM Corp). After collecting the data, the values were statistically analyzed and tabulated. Statistical analysis was performed using the Mann-Whitney U-test. Results with P < 0.05 were considered statistically significant. Participants were aware of BLS and showed a positive attitude toward it, whereas knowledge about BLS was lacking, with statistically significant P values. By introducing BLS regularly into the academic curriculum and through routine hands-on workshops, all health-care providers should become well versed in BLS skills for effectively managing life-threatening emergencies.

  10. Nonparametric Residue Analysis of Dynamic PET Data With Application to Cerebral FDG Studies in Normals.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A

    2009-06-01

    Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis, critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue, tracer flux, flow, etc., resulting from a parametric (the standard two-compartment model of Phelps et al. 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach, but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
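
    A rough sketch of the nonparametric idea, not the authors' implementation, is shown below: the tissue curve is written as a discrete convolution of the arterial input with a residue expanded in cubic B-splines, and the spline coefficients are obtained by least squares with a second-difference roughness penalty. All curves, knot choices, and the penalty weight are synthetic placeholders.

    import numpy as np
    from scipy.interpolate import BSpline

    dt = 2.0                                   # seconds per frame (illustrative)
    t = np.arange(0, 600, dt)                  # scan time grid
    cp = (t / 30.0) * np.exp(-t / 60.0)        # synthetic arterial input function
    true_residue = 0.7 * np.exp(-t / 200.0) + 0.3 * np.exp(-t / 20.0)
    tissue = dt * np.array([np.sum(cp[: i + 1][::-1] * true_residue[: i + 1]) for i in range(t.size)])
    tissue += np.random.default_rng(0).normal(scale=0.02 * tissue.max(), size=t.size)

    # Cubic B-spline basis for the residue on a coarse knot grid.
    k = 3
    knots = np.concatenate(([t[0]] * k, np.linspace(t[0], t[-1], 12), [t[-1]] * k))
    n_basis = len(knots) - k - 1
    basis = np.column_stack([BSpline(knots, np.eye(n_basis)[i], k)(t) for i in range(n_basis)])

    # Discrete convolution operator: tissue ~ dt * L @ (basis @ beta), L lower-triangular in cp.
    L = np.array([[cp[i - j] if i >= j else 0.0 for j in range(t.size)] for i in range(t.size)])
    A = dt * L @ basis

    # Second-difference roughness penalty (regularization) on the spline coefficients.
    D = np.diff(np.eye(n_basis), n=2, axis=0)
    lam = 10.0
    beta, *_ = np.linalg.lstsq(np.vstack([A, np.sqrt(lam) * D]),
                               np.concatenate([tissue, np.zeros(D.shape[0])]), rcond=None)
    residue_hat = basis @ beta
    print(f"estimated R(0) ~ {residue_hat[0]:.2f}, R at end of scan ~ {residue_hat[-1]:.2f}")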

  11. Analyzing Dyadic Sequence Data—Research Questions and Implied Statistical Models

    PubMed Central

    Fuchs, Peter; Nussbeck, Fridtjof W.; Meuwly, Nathalie; Bodenmann, Guy

    2017-01-01

    The analysis of observational data is often seen as a key approach to understanding dynamics in romantic relationships but also in dyadic systems in general. Statistical models for the analysis of dyadic observational data are not commonly known or applied. In this contribution, selected approaches to dyadic sequence data will be presented with a focus on models that can be applied when sample sizes are of medium size (N = 100 couples or less). Each of the statistical models is motivated by an underlying potential research question; the most important model results are presented and linked to the research question. The following research questions and models are compared with respect to their applicability using a hands-on approach: (I) Is there an association between a particular behavior by one and the reaction by the other partner? (Pearson Correlation); (II) Does the behavior of one member trigger an immediate reaction by the other? (aggregated logit models; multi-level approach; basic Markov model); (III) Is there an underlying dyadic process, which might account for the observed behavior? (hidden Markov model); and (IV) Are there latent groups of dyads, which might account for observing different reaction patterns? (mixture Markov; optimal matching). Finally, recommendations for researchers to choose among the different models, issues of data handling, and advice on properly applying the statistical models in empirical research are given (e.g., in a new R package “DySeq”). PMID:28443037
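
    For research question II, the simplest building block is a first-order transition table from one partner's behavior to the other's immediate reaction. The sketch below uses an invented two-code scheme ('pos'/'neg') purely for illustration and is not the DySeq implementation.

    import numpy as np
    from collections import Counter

    # Hypothetical coded dyadic sequence: at each talk turn we record the pair
    # (behavior of partner A, immediate reaction of partner B).
    sequence = [("pos", "pos"), ("pos", "neg"), ("neg", "neg"), ("pos", "pos"),
                ("neg", "pos"), ("neg", "neg"), ("pos", "pos"), ("pos", "pos")]

    states = ["pos", "neg"]
    counts = Counter(sequence)

    # Row-normalized contingency table: P(reaction of B | behavior of A),
    # i.e. a simple "does A's behavior trigger B's reaction" summary.
    transition = np.zeros((2, 2))
    for i, a in enumerate(states):
        for j, b in enumerate(states):
            transition[i, j] = counts[(a, b)]
    transition /= transition.sum(axis=1, keepdims=True)
    print(transition)   # rows: A's behavior, columns: B's immediate reaction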

  12. OB Stars and Cepheids From the Gaia TGAS Catalogue: Test of their Distances and Proper Motions

    NASA Astrophysics Data System (ADS)

    Bobylev, Vadim V.; Bajkova, Anisa T.

    2017-12-01

    We consider young distant stars from the Gaia TGAS catalog. These are 250 classical Cepheids and 244 OB stars located at distances up to 4 kpc from the Sun. These stars are used to determine the Galactic rotation parameters using both trigonometric parallaxes and proper motions of the TGAS stars. The considered stars have relative parallax errors less than 200%. Following the well-known statistical approach, we assume that the kinematic parameters found from the line-of-sight velocities Vr are less dependent on distance errors than those found from the velocity components Vl. From the values of the first derivative of the Galactic rotation angular velocity, Ω'0, found from the analysis of the velocities Vr and Vl separately, the scale factor of the distances is determined. We find that from the sample of Cepheids the TGAS distance scale should be reduced by 3%, while from the sample of OB stars, on the contrary, the scale should be increased by 9%.

  13. Selection and Reporting of Statistical Methods to Assess Reliability of a Diagnostic Test: Conformity to Recommended Methods in a Peer-Reviewed Journal

    PubMed Central

    Park, Ji Eun; Han, Kyunghwa; Sung, Yu Sub; Chung, Mi Sun; Koo, Hyun Jung; Yoon, Hee Mang; Choi, Young Jun; Lee, Seung Soo; Kim, Kyung Won; Shin, Youngbin; An, Suah; Cho, Hyo-Min

    2017-01-01

    Objective To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Materials and Methods Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Results Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Conclusion Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology is necessary. PMID:29089821
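
    As an illustration of one of the statistics the survey focuses on, a weighted kappa with quadratic weights for two raters on an ordinal scale can be computed directly from the cross-tabulation; the ratings below are invented for the example.

    import numpy as np

    def quadratic_weighted_kappa(r1, r2, n_categories):
        """Weighted kappa (quadratic weights) for two raters on an ordinal scale."""
        observed = np.zeros((n_categories, n_categories))
        for a, b in zip(r1, r2):
            observed[a, b] += 1
        observed /= observed.sum()
        expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
        i, j = np.indices((n_categories, n_categories))
        disagreement = (i - j) ** 2 / (n_categories - 1) ** 2
        return 1.0 - (disagreement * observed).sum() / (disagreement * expected).sum()

    # Hypothetical ordinal ratings (0-3) from two readers of the same images.
    reader1 = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
    reader2 = [0, 1, 2, 3, 3, 1, 1, 2, 2, 1]
    print(f"weighted kappa = {quadratic_weighted_kappa(reader1, reader2, 4):.2f}")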

  14. Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index

    PubMed Central

    Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy

    2012-01-01

    Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ-secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124

  15. Data Analysis and Data Mining: Current Issues in Biomedical Informatics

    PubMed Central

    Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.

    2011-01-01

    Summary Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916

  16. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
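
    The generic recurrence-network step, before the density-enhancing point reduction that is specific to RDE-CN, can be sketched as follows: threshold the pairwise distance matrix of a time series into a recurrence (adjacency) matrix and compute simple network statistics. The logistic-map series and the threshold choice below are illustrative assumptions.

    import numpy as np

    def logistic_map(r=4.0, x0=0.4, n=500):
        """Generate a chaotic logistic-map time series."""
        x = np.empty(n)
        x[0] = x0
        for i in range(1, n):
            x[i] = r * x[i - 1] * (1 - x[i - 1])
        return x

    x = logistic_map()
    eps = 0.1 * np.std(x)                                   # recurrence threshold
    dist = np.abs(x[:, None] - x[None, :])                  # pairwise distances
    adjacency = (dist < eps).astype(int)
    np.fill_diagonal(adjacency, 0)                          # no self-loops

    # Simple network statistics used to characterize the dynamics.
    degree = adjacency.sum(axis=1)
    density = adjacency.sum() / (len(x) * (len(x) - 1))
    print(f"mean degree = {degree.mean():.1f}, edge density = {density:.3f}")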

  17. Identifying city PV roof resource based on Gabor filter

    NASA Astrophysics Data System (ADS)

    Ruhang, Xu; Zhilin, Liu; Yong, Huang; Xiaoyu, Zhang

    2017-06-01

    To identify a city’s PV roof resources, the area and ownership distribution of residential buildings in an urban district should be assessed. To achieve this assessment, analysing remote sensing data is a promising approach. Urban building roof area estimation is a major topic in remote sensing image information extraction. There are normally three ways to solve this problem. The first is pixel-based analysis, which relies on mathematical morphology or statistical methods; the second is object-based analysis, which is able to combine semantic information and expert knowledge; the third is a signal-processing method. This paper presents a Gabor-filter-based method. The results show that the method is fast and achieves proper accuracy.
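
    A minimal sketch of the filtering idea, assuming a grayscale tile and a hand-built Gabor kernel rather than the exact filter bank used in the paper, is shown below; oriented, regular roof texture tends to produce strong responses somewhere in the orientation bank.

    import numpy as np
    from scipy.ndimage import convolve

    def gabor_kernel(theta, wavelength=8.0, sigma=4.0, gamma=0.5, size=21):
        """Real-valued Gabor kernel oriented at angle theta (radians)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
        return envelope * np.cos(2 * np.pi * xr / wavelength)

    # Hypothetical grayscale remote-sensing tile (replace with a real image array).
    rng = np.random.default_rng(0)
    image = rng.random((128, 128))

    # Filter-bank response: maximum magnitude over several orientations.
    responses = [np.abs(convolve(image, gabor_kernel(t))) for t in np.deg2rad([0, 45, 90, 135])]
    roof_score = np.max(responses, axis=0)
    candidate_mask = roof_score > np.percentile(roof_score, 90)
    print("candidate roof pixels:", int(candidate_mask.sum()))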

  18. Soil moisture assimilation using a modified ensemble transform Kalman filter with water balance constraint

    NASA Astrophysics Data System (ADS)

    Wu, Guocan; Zheng, Xiaogu; Dan, Bo

    2016-04-01

    Shallow soil moisture observations are assimilated into the Common Land Model (CoLM) to estimate the soil moisture in different layers. The forecast error is inflated to improve the accuracy of the analysis state, and a water balance constraint is adopted to reduce the water budget residual in the assimilation procedure. The experimental results illustrate that the adaptive forecast error inflation can reduce the analysis error, while the proper inflation layer can be selected based on the -2log-likelihood function of the innovation statistic. The water balance constraint substantially reduces the water budget residual at a low cost in assimilation accuracy. The assimilation scheme can potentially be applied to assimilate remote sensing data.

  19. Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; Sharma, Isha; Kuruganti, Teja

    In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
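
    The frequency-content calculation can be sketched as follows: take the one-sided power spectrum of the mean-removed PV output and compute the fraction of variability at frequencies at or below 1 mHz. The day of one-second data below is synthetic; the real analysis used a year of measurements.

    import numpy as np

    fs = 1.0                                     # sampling rate: 1 sample per second
    t = np.arange(0, 24 * 3600)                  # one day of seconds
    rng = np.random.default_rng(0)
    diurnal = np.clip(np.sin(np.pi * t / (24 * 3600)), 0, None)        # slow daily shape
    clouds = rng.normal(scale=0.05, size=t.size).cumsum() * 1e-3       # slow drift
    flicker = rng.normal(scale=0.02, size=t.size)                      # fast fluctuations
    power = np.clip(diurnal + clouds + flicker, 0, None)

    # One-sided power spectrum of the fluctuating (mean-removed) signal.
    freqs = np.fft.rfftfreq(power.size, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(power - power.mean())) ** 2

    low = spectrum[(freqs > 0) & (freqs <= 1e-3)].sum()      # at or below 1 mHz (~15 min and slower)
    total = spectrum[freqs > 0].sum()
    print(f"fraction of variability below 1 mHz: {low / total:.1%}")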

  20. Do's and don'ts in Fourier analysis of steady-state potentials.

    PubMed

    Bach, M; Meigen, T

    1999-01-01

    Fourier analysis is a powerful tool in signal analysis that can be very fruitfully applied to steady-state evoked potentials (flicker ERG, pattern ERG, VEP, etc.). However, there are some inherent assumptions in the underlying discrete Fourier transform (DFT) that are not necessarily fulfilled in typical electrophysiological recording and analysis conditions. Furthermore, engineering software packages may be ill-suited and/or may not fully exploit the information of steady-state recordings. Specifically:
    * In the case of steady-state stimulation we know more about the stimulus than in standard textbook situations (exact frequency, phase stability), so 'windowing' and calculation of the 'periodogram' are not necessary.
    * It is mandatory to choose an integer relationship between sampling rate and frame rate when employing a raster-based CRT stimulator.
    * The analysis interval must comprise an exact integer number (e.g., 10) of stimulus periods.
    * The choice of the number of stimulus periods per analysis interval needs a wise compromise: a high number increases the frequency resolution, but makes artifact removal difficult; a low number 'spills' noise into the response frequency.
    * There is no need to feel tied to a power-of-two number of data points as required by standard FFT; 'resampling' is an easy and efficient alternative.
    * Proper estimates of noise-corrected Fourier magnitude and statistical significance can be calculated that take into account the non-linear superposition of signal and noise.
    These aspects are developed in an intuitive approach with examples using both simulations and recordings. Proper use of Fourier analysis of our electrophysiological records will reduce recording time and/or increase the reliability of physiologic or pathologic interpretations.
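
    Two of the points above, choosing an analysis window that contains an exact integer number of stimulus periods and correcting the response magnitude for noise estimated at neighbouring frequencies, are illustrated in the following sketch with a simulated steady-state response. The stimulus frequency, noise level, and two-bin noise estimate are assumptions made for illustration, and the noise estimate is deliberately crude.

    import numpy as np

    fs = 1000.0                      # sampling rate (Hz), integer multiple of the stimulus rate
    f_stim = 8.0                     # stimulus frequency (Hz)
    n_periods = 10                   # exact integer number of periods in the window
    n_samples = int(round(n_periods * fs / f_stim))
    t = np.arange(n_samples) / fs

    rng = np.random.default_rng(0)
    signal = 2.0 * np.sin(2 * np.pi * f_stim * t) + rng.normal(scale=1.5, size=n_samples)

    spectrum = np.fft.rfft(signal) / n_samples * 2            # single-sided amplitudes
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    k = n_periods                                             # response lands exactly in bin k

    response = np.abs(spectrum[k])
    noise = np.mean(np.abs(spectrum[[k - 1, k + 1]]))         # noise from adjacent bins
    corrected = np.sqrt(max(response ** 2 - noise ** 2, 0.0)) # noise-corrected magnitude
    print(f"bin {k} is {freqs[k]:.1f} Hz: raw = {response:.2f}, noise ~ {noise:.2f}, corrected ~ {corrected:.2f}")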

  1. Properties of different selection signature statistics and a new strategy for combining them.

    PubMed

    Ma, Y; Ding, X; Qanbari, S; Weigend, S; Zhang, Q; Simianer, H

    2015-11-01

    Identifying signatures of recent or ongoing selection is of high relevance in livestock population genomics. From a statistical perspective, determining a proper testing procedure and combining various test statistics is challenging. On the basis of extensive simulations in this study, we discuss the statistical properties of eight different established selection signature statistics. In the considered scenario, we show that a reasonable power to detect selection signatures is achieved with high marker density (>1 SNP/kb) as obtained from sequencing, while rather small sample sizes (~15 diploid individuals) appear to be sufficient. Most selection signature statistics such as composite likelihood ratio and cross population extended haplotype homozygosity have the highest power when fixation of the selected allele is reached, while integrated haplotype score has the highest power when selection is ongoing. We suggest a novel strategy, called de-correlated composite of multiple signals (DCMS), to combine different statistics for detecting selection signatures while accounting for the correlation between the different selection signature statistics. When examined with simulated data, DCMS consistently has a higher power than most of the single statistics and shows a reliable positional resolution. We illustrate the new statistic on the established selective sweep around the lactase gene in human HapMap data, providing further evidence of the reliability of this new statistic. Then, we apply it to scan selection signatures in two chicken samples with diverse skin color. Our analysis suggests that a set of well-known genes such as BCO2, MC1R, ASIP and TYR were involved in the divergent selection for this trait.
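
    The combination idea can be sketched in the spirit of DCMS, though not necessarily with the exact weighting used by the authors: convert each statistic to a -log10 empirical p-value per SNP and down-weight statistics in proportion to their total correlation with the others. The simulated statistics below are placeholders.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_snps, n_stats = 5000, 4
    raw = rng.normal(size=(n_snps, n_stats))      # placeholder selection statistics
    raw[:, 1] += 0.8 * raw[:, 0]                  # make two statistics correlated

    # Convert each statistic to an empirical one-sided p-value via its rank, then to -log10 p.
    ranks = stats.rankdata(-raw, axis=0)
    neglog_p = -np.log10(ranks / (n_snps + 1))

    # Weight each statistic by the inverse of its total absolute correlation with the others.
    corr = np.corrcoef(raw, rowvar=False)
    weights = 1.0 / (1.0 + (np.abs(corr).sum(axis=0) - 1.0))
    dcms_like = neglog_p @ weights                # composite score per SNP
    print("top candidate SNP index:", int(np.argmax(dcms_like)))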

  2. A pilot investigation on impact of participation in a long-term follow-up clinic (LTFU) on breast cancer and cardiovascular screening among women who received chest radiation for Hodgkin lymphoma.

    PubMed

    Baxstrom, K; Peterson, B A; Lee, C; Vogel, R I; Blaes, A H

    2018-02-07

    Women treated with chest radiation for Hodgkin lymphoma (HL) are at significantly increased risk of breast cancer and cardiovascular disease. HL survivors are recommended to have annual dual screening with mammogram (MMG) and breast magnetic resonance imaging (MRI). They are also recommended to undergo echocardiogram (echo) 5 years after completion of radiation. We performed a pilot study to characterize the women who are and are not receiving proper dual screening for breast cancer and baseline echo, and to examine the impact of an LTFU clinic consultation on screening. A retrospective chart review of 114 women treated for HL at University of Minnesota (UMN) between 1993 and 2009 was performed. Demographics, disease and treatment history (age at diagnosis, stage, radiation dose and field, chemotherapy, recurrence) were assessed, as well as screening practices (MMG, MRI, both and echo), participation in LTFU clinic, and recommendations from providers. Data were summarized in yes/no (y/n) format; statistical analysis was performed using chi-squared and Fisher's exact tests. Breast cancer and cardiovascular screening outcomes were compared by participation in the LTFU clinic (y/n) using Fisher's exact tests. P values < 0.05 were considered statistically significant. Forty-one of 114 women met inclusion criteria and had follow-up data for analysis. Median age at diagnosis was 29 years; 67.6% were diagnosed at stage IIa. Median dose of radiation was 3570 cGy. 56.1% participated in the LTFU clinic at the UMN. 36.6% had dual screening with both MMG and MRI, 41.5% had screening with only MMG, and 19.5% had no screening performed. Women were more likely to have dual screening if they were seen in LTFU clinic vs not seen in LTFU clinic (52.2 vs 16.7%, p = 0.02). 67.5% of women were screened with echo; women were also more likely to have screening with echo if seen in LTFU clinic vs not seen (86.4 vs 44.4%, p = 0.007). Many women are not getting the proper dual screening for breast cancer despite their increased risk, with only 36.6% of our study sample getting dual screening. Having a consultation in an LTFU clinic increases dual screening for breast cancer and echo screening for cardiovascular disease. Proper screening allows for detection of secondary breast cancer at earlier stages where treatment can be local therapy. Diagnosing CV disease early could allow for proper preventative treatment or intervention.

  3. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Sacks, David B.; Yu, Yi-Kuo

    2018-06-01

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  4. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry.

    PubMed

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo

    2018-06-05

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  5. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could be the result of the traditional approach of comparing a haplotype against the remaining ones, while they can still be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms suggests that the studies reporting meta-analysis of haplotypes contain approximately half of the included studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  6. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
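
    The segment-and-fit idea can be sketched roughly as follows, with the caveat that the published SFT procedure contains additional steps not reproduced here: tile the image, relate per-segment statistics, treat low-intensity segments consistent with the fitted trend as background, and derive the signal threshold from the background statistics. The image, tile size, and cutoffs below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    image = rng.normal(loc=100, scale=5, size=(512, 512))          # synthetic background
    image[200:260, 300:380] += 60                                  # a bright "spot" (signal)

    seg = 32                                                        # segment edge length
    tiles = image.reshape(512 // seg, seg, 512 // seg, seg).swapaxes(1, 2).reshape(-1, seg, seg)
    means = tiles.mean(axis=(1, 2))
    stds = tiles.std(axis=(1, 2))

    # Fit the mean-vs-std trend; background segments sit near the trend at low intensity.
    slope, intercept = np.polyfit(means, stds, 1)
    resid = np.abs(stds - (slope * means + intercept))
    background = (resid < np.median(resid)) & (means < np.percentile(means, 75))
    threshold = means[background].mean() + 3 * stds[background].mean()

    signal_mask = image > threshold
    print(f"threshold = {threshold:.1f}, signal pixels = {int(signal_mask.sum())}")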

  7. Segment and Fit Thresholding: A New Method for Image Analysis Applied to Microarray and Immunofluorescence Data

    PubMed Central

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.

    2016-01-01

    Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978

  8. Proper and Paradigmatic Metonymy as a Lens for Characterizing Student Conceptions of Distributions and Sampling

    ERIC Educational Resources Information Center

    Noll, Jennifer; Hancock, Stacey

    2015-01-01

    This research investigates what students' use of statistical language can tell us about their conceptions of distribution and sampling in relation to informal inference. Prior research documents students' challenges in understanding ideas of distribution and sampling as tools for making informal statistical inferences. We know that these…

  9. Technical Considerations on Scanning and Image Analysis for Amyloid PET in Dementia.

    PubMed

    Akamatsu, Go; Ohnishi, Akihito; Aita, Kazuki; Ikari, Yasuhiko; Yamamoto, Yasuji; Senda, Michio

    2017-01-01

    Brain imaging techniques, such as computed tomography (CT), magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), and positron emission tomography (PET), can provide essential and objective information for the early and differential diagnosis of dementia. Amyloid PET is especially useful to evaluate the amyloid-β pathological process as a biomarker of Alzheimer's disease. This article reviews critical points about technical considerations on the scanning and image analysis methods for amyloid PET. Each amyloid PET agent has its own proper administration instructions and recommended uptake time, scan duration, and the method of image display and interpretation. In addition, we have introduced general scanning information, including subject positioning, reconstruction parameters, and quantitative and statistical image analysis. We believe that this article could make amyloid PET a more reliable tool in clinical study and practice.

  10. Multifractal analysis of mobile social networks

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Zhang, Zifeng; Deng, Yufan

    2017-09-01

    As Wireless Fidelity (Wi-Fi)-enabled handheld devices have become widely used, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have also been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. Moreover, when the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. For further research, we introduce a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.

  11. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    PubMed

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the Quantized correlation coefficient (QCC), which is much less dependent on the amount of signal. This involves discretization of the data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
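
    A simplified version of the quantization step, not the exact published procedure, is sketched below: rank each replicate into quantile bins, merge the lowest bins into a single background group, and compute Pearson's correlation on the quantized values. The two replicates and the bin counts are simulated assumptions.

    import numpy as np
    from scipy import stats

    def qcc_like(x, y, n_quantiles=20, background_quantiles=15):
        """Quantize into quantile bins, merge the lowest bins as background,
        then take Pearson's r on the quantized values (simplified sketch)."""
        def quantize(v):
            q = np.ceil(stats.rankdata(v) / len(v) * n_quantiles)
            return np.where(q <= background_quantiles, 0, q - background_quantiles)
        return stats.pearsonr(quantize(x), quantize(y))[0]

    # Two hypothetical ChIP-chip replicates: mostly background noise plus shared enrichment.
    rng = np.random.default_rng(0)
    background = rng.normal(size=(2, 10000))
    enriched = rng.normal(loc=3, size=500)
    rep1 = np.concatenate([background[0], enriched + rng.normal(scale=0.5, size=500)])
    rep2 = np.concatenate([background[1], enriched + rng.normal(scale=0.5, size=500)])

    print(f"Pearson r = {stats.pearsonr(rep1, rep2)[0]:.2f}, QCC-like = {qcc_like(rep1, rep2):.2f}")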

  12. Effectiveness of Various Methods of Teaching Proper Inhaler Technique.

    PubMed

    Axtell, Samantha; Haines, Seena; Fairclough, Jamie

    2017-04-01

    To compare the effectiveness of 4 different instructional interventions in training proper inhaler technique. Randomized, noncrossover trial. Health fair and indigent clinic. Inhaler-naive adult volunteers who spoke and read English. Subjects were assigned to complete the following: (1) read a metered dose inhaler (MDI) package insert pamphlet, (2) watch a Centers for Disease Control and Prevention (CDC) video demonstrating MDI technique, (3) watch a YouTube video demonstrating MDI technique, or (4) receive direct instruction of MDI technique from a pharmacist. Inhaler use competency (completion of all 7 prespecified critical steps). Of the 72 subjects, 21 (29.2%) demonstrated competent inhaler technique. A statistically significant difference between pharmacist direct instruction and the remaining interventions, both combined (P < .0001) and individually (P ≤ .03), was evident. No statistically significant difference was detected among the remaining 3 intervention groups. Critical steps most frequently omitted or improperly performed were exhaling before inhalation and holding of breath after inhalation. A 2-minute pharmacist counseling session is more effective than other interventions in successfully educating patients on proper inhaler technique. Pharmacists can play a pivotal role in reducing the implications of improper inhaler use.

  13. Environmentally safe areas and routes in the Baltic proper using Eulerian tracers.

    PubMed

    Höglund, A; Meier, H E M

    2012-07-01

    In recent years, the shipping of environmentally hazardous cargo has increased considerably in the Baltic proper. In this study, a large number of hypothetical oil spills with an idealized, passive tracer are simulated. From the tracer distributions, statistical measures are calculated to optimize the quantity of tracer from a spill that would stay at sea as long as possible. Increased time may permit action to be taken against the spill before the oil reaches environmentally vulnerable coastal zones. The statistical measures are used to calculate maritime routes with maximum probability that an oil spill will stay at sea as long as possible. Under these assumptions, ships should follow routes that are located south of Bornholm instead of the northern routes in use currently. Our results suggest that the location of the optimal maritime routes depends on the season, although interannual variability is too large to identify statistically significant changes. Copyright © 2012. Published by Elsevier Ltd.

  14. Proteomic Workflows for Biomarker Identification Using Mass Spectrometry — Technical and Statistical Considerations during Initial Discovery

    PubMed Central

    Orton, Dennis J.; Doucette, Alan A.

    2013-01-01

    Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high throughput sample characterization by mass spectrometry (MS), being capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, as well as improper statistical analysis of the resulting data. This review will discuss in detail the importance of experimental design and provide some insight into the overall workflow required for biomarker identification experiments. Proper balance between the degree of biological vs. technical replication is required for confident biomarker identification. PMID:28250400

  15. Standardized data collection to build prediction models in oncology: a prototype for rectal cancer.

    PubMed

    Meldolesi, Elisa; van Soest, Johan; Damiani, Andrea; Dekker, Andre; Alitto, Anna Rita; Campitelli, Maura; Dinapoli, Nicola; Gatta, Roberto; Gambacorta, Maria Antonietta; Lanzotti, Vito; Lambin, Philippe; Valentini, Vincenzo

    2016-01-01

    Advances in diagnostic and treatment technology are responsible for a remarkable transformation of internal medicine, with the establishment of the new idea of personalized medicine. Inter- and intra-patient tumor heterogeneity, together with the complexity of clinical outcomes and treatment toxicity, justifies the effort to develop predictive models for decision support systems. However, the number of evaluated variables, coming from multiple disciplines (oncology, computer science, bioinformatics, statistics, genomics, and imaging, among others), can be very large, making traditional statistical analysis difficult to exploit. Automated data-mining processes and machine learning approaches can be a solution for organizing the massive amount of data and trying to unravel important interactions. The purpose of this paper is to describe a strategy to collect and analyze data properly for decision support, and to introduce the concept of an 'umbrella protocol' within the framework of 'rapid learning healthcare'.

  16. Replication, lies and lesser-known truths regarding experimental design in environmental microbiology.

    PubMed

    Lennon, Jay T

    2011-06-01

    A recent analysis revealed that most environmental microbiologists neglect replication in their science (Prosser, 2010). Of all peer-reviewed papers published during 2009 in the field's leading journals, slightly more than 70% lacked replication when it came to analyzing microbial community data. The paucity of replication is viewed as an 'endemic' and 'embarrassing' problem that amounts to 'bad science', or worse yet, as the title suggests, lying (Prosser, 2010). Although replication is an important component of experimental design, it is possible to do good science without replication. There are various quantitative techniques - some old, some new - that, when used properly, will allow environmental microbiologists to make strong statistical conclusions from experimental and comparative data. Here, I provide examples where unreplicated data can be used to test hypotheses and yield novel information in a statistically robust manner. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.

  17. In pursuit of a science of agriculture: the role of statistics in field experiments.

    PubMed

    Parolini, Giuditta

    2015-09-01

    Since the beginning of the twentieth century statistics has reshaped the experimental cultures of agricultural research taking part in the subtle dialectic between the epistemic and the material that is proper to experimental systems. This transformation has become especially relevant in field trials and the paper will examine the British agricultural institution, Rothamsted Experimental Station, where statistical methods nowadays popular in the planning and analysis of field experiments were developed in the 1920s. At Rothamsted statistics promoted randomisation over systematic arrangements, factorisation over one-question trials, and emphasised the importance of the experimental error in assessing field trials. These changes in methodology transformed also the material culture of agricultural science, and a new body, the Field Plots Committee, was created to manage the field research of the agricultural institution. Although successful, the vision of field experimentation proposed by the Rothamsted statisticians was not unproblematic. Experimental scientists closely linked to the farming community questioned it in favour of a field research that could be more easily understood by farmers. The clash between the two agendas reveals how the role attributed to statistics in field experimentation defined different pursuits of agricultural research, alternately conceived of as a scientists' science or as a farmers' science.

  18. eSACP - a new Nordic initiative towards developing statistical climate services

    NASA Astrophysics Data System (ADS)

    Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine

    2015-04-01

    The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, state-of-the-art statistical techniques to quantify the uncertainty, and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly on the consequences of our changing climate to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.

  19. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War the nitrate concentrations in European water bodies changed significantly as the result of increased nitrogen fertilizer use and changes in land use. However, in the last decades, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have been slowly decreasing. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series, linear models such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models. The analysis showed that, based on the value of the residual sum of squares (RSS), SETAR and MSW models described both time series better than models of the ARMA class. In most cases the relative improvement of SETAR models over first-order AR models was low, ranging between 1% and 4%, with the exception of the three-regime model for the River Stour time series, where the improvement was 48.9%. In comparison, the relative improvement of MSW models was between 44.6% and 52.5% for two-regime and from 60.4% to 75% for three-regime models. However, the visual assessment of models plotted against the original datasets showed that, despite a higher value of RSS, some ARMA models could describe the analyzed time series better than AR, MA and SETAR models with lower values of RSS. In both datasets MSW models provided a very good visual fit, describing most of the extreme values.
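
    A two-regime SETAR(1) fit reduces to a grid search over candidate thresholds with ordinary least squares within each regime; comparing its residual sum of squares with a plain AR(1) mirrors the comparison reported above. The nitrate-like series in this sketch is simulated, not the Ouse or Stour data, and the threshold grid is an arbitrary choice.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 400
    y = np.zeros(n)
    for i in range(1, n):                       # simulate a two-regime process
        if y[i - 1] < 0.0:
            y[i] = 0.2 + 0.8 * y[i - 1] + rng.normal(scale=0.3)
        else:
            y[i] = -0.1 + 0.3 * y[i - 1] + rng.normal(scale=0.3)

    x, target = y[:-1], y[1:]                   # lag-1 regressor and response

    def ols_rss(xr, yr):
        """Residual sum of squares of an intercept + lag-1 OLS fit."""
        X = np.column_stack([np.ones_like(xr), xr])
        beta, rss, *_ = np.linalg.lstsq(X, yr, rcond=None)
        return rss[0] if rss.size else np.sum((yr - X @ beta) ** 2)

    ar1_rss = ols_rss(x, target)

    best = (np.inf, None)
    for thr in np.quantile(x, np.linspace(0.15, 0.85, 50)):   # candidate thresholds
        low, high = x < thr, x >= thr
        rss = ols_rss(x[low], target[low]) + ols_rss(x[high], target[high])
        if rss < best[0]:
            best = (rss, thr)

    print(f"AR(1) RSS = {ar1_rss:.1f}, SETAR RSS = {best[0]:.1f} at threshold {best[1]:.2f}")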

  20. The Optical Gravitational Lensing Experiment. The Catalog of Stellar Proper Motions toward the Magellanic Clouds

    NASA Astrophysics Data System (ADS)

    Poleski, R.; Soszyński, I.; Udalski, A.; Szymański, M. K.; Kubiak, M.; Pietrzyński, G.; Wyrzykowski, Ł.; Ulaczyk, K.

    2012-03-01

    We present a catalog of over 6.2 million stars with measured proper motions. All these stars are observed in the direction of the Magellanic Clouds within the brightness range 12

  1. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  2. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms

    PubMed Central

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-01-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing. PMID:24567836

  3. The use of statistical tools in field testing of putative effects of genetically modified plants on nontarget organisms.

    PubMed

    Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F

    2013-08-01

    To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches - for example, analysis of variance (ANOVA) - are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing.
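    Since both versions of this review recommend generalized linear models for count data from blocked field trials, a minimal sketch is given below. It assumes a hypothetical plot-level table field_trial_counts.csv with columns count, treatment and block; it is not code from the review itself.

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical field-trial table: one row per plot with a nontarget-organism count,
      # a treatment label ('GM' or 'non-GM') and a block identifier.
      df = pd.read_csv("field_trial_counts.csv")

      # Poisson GLM with a log link; the blocks enter as a factor
      poisson_fit = smf.glm("count ~ treatment + C(block)", data=df,
                            family=sm.families.Poisson()).fit()

      # Negative binomial family as the usual fallback for overdispersed counts
      nb_fit = smf.glm("count ~ treatment + C(block)", data=df,
                       family=sm.families.NegativeBinomial()).fit()

      print(poisson_fit.summary())
      print(nb_fit.summary())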

  4. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2006-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  5. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
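    The statistical treatment of fiber breakage referred to in these two records can be pictured with a short sketch: per-fiber strengths drawn from a two-parameter Weibull distribution, a common model for fiber strength scatter. The parameters, array shapes and applied stress below are illustrative assumptions and are not values from the FEAMAC studies.

      import numpy as np

      rng = np.random.default_rng(42)

      # Illustrative Weibull parameters for fiber strength (not taken from the papers)
      weibull_modulus = 10.0     # controls the scatter of fiber strength
      char_strength = 3500.0     # characteristic strength, MPa

      n_points = 200             # structural-scale integration points (hypothetical)
      n_fibers = 50              # fibers resolved within each integration point

      # Independent strength draw for every fiber at every integration point, so that
      # failure can initiate away from structural-level stress risers.
      strengths = char_strength * rng.weibull(weibull_modulus, size=(n_points, n_fibers))

      local_stress = 3000.0      # MPa, illustrative applied fiber stress
      failed = local_stress > strengths
      print("fraction of fibers failed:", failed.mean())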

  6. Wavelet methodology to improve single unit isolation in primary motor cortex cells.

    PubMed

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2015-05-15

    The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single-unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. Copyright © 2015. Published by Elsevier B.V.
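    A minimal sketch of wavelet-based denoising in the spirit of this methodology, using PyWavelets with the Daubechies 4 mother wavelet and soft thresholding. The fixed-form (universal) threshold is used here for brevity, whereas the paper's best-performing rule was minimax, and the input array is only a stand-in for a real extracellular recording.

      import numpy as np
      import pywt

      # Stand-in for an extracellular recording (replace with real data)
      rng = np.random.default_rng(1)
      x = rng.normal(0.0, 1.0, 4096)

      wavelet = "db4"                                  # Daubechies 4
      coeffs = pywt.wavedec(x, wavelet, level=5)

      # Noise level from the finest detail coefficients (median absolute deviation)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745
      thr = sigma * np.sqrt(2.0 * np.log(len(x)))      # fixed-form (universal) threshold

      denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
      x_denoised = pywt.waverec(denoised, wavelet)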

  7. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of the basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to give the student practice with nearly all of the methods presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  8. Conducting tests for statistically significant differences using forest inventory data

    Treesearch

    James A. Westfall; Scott A. Pugh; John W. Coulston

    2013-01-01

    Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...

  9. An All-Sky Search for Wide Binaries in the SUPERBLINK Proper Motion Catalog

    NASA Astrophysics Data System (ADS)

    Hartman, Zachary; Lepine, Sebastien

    2017-01-01

    We present initial results from an all-sky search for Common Proper Motion (CPM) binaries in the SUPERBLINK all-sky proper motion catalog of 2.8 million stars with proper motions greater than 40 mas/yr, which has been recently enhanced with data from the GAIA mission. We initially search the SUPERBLINK catalog for pairs of stars with angular separations up to 1 degree and proper motion difference less than 40 mas/yr. In order to determine which of these pairs are real binaries, we develop a Bayesian analysis to calculate probabilities of true companionship based on a combination of proper motion magnitude, angular separation, and proper motion differences. The analysis reveals that the SUPERBLINK catalog most likely contains ~40,000 genuine common proper motion binaries. We provide initial estimates of the distances and projected physical separations of these wide binaries.
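    A minimal sketch of the initial pair-selection step described above (angular separation under 1 degree, proper motion difference under 40 mas/yr), assuming a hypothetical catalog file with columns ra, dec, pmra and pmdec; the Bayesian companionship probabilities themselves are not reproduced here.

      import numpy as np
      import astropy.units as u
      from astropy.coordinates import SkyCoord, search_around_sky

      # Hypothetical catalog columns: RA, Dec in degrees; proper motions in mas/yr
      ra, dec, pmra, pmdec = np.loadtxt("superblink_subset.txt", unpack=True)

      coords = SkyCoord(ra=ra * u.deg, dec=dec * u.deg)

      # All pairs within 1 degree on the sky (keep each pair once, i < j)
      i, j, sep2d, _ = search_around_sky(coords, coords, seplimit=1 * u.deg)
      keep = i < j
      i, j, sep2d = i[keep], j[keep], sep2d[keep]

      # Proper motion difference cut of 40 mas/yr
      dpm = np.hypot(pmra[i] - pmra[j], pmdec[i] - pmdec[j])
      candidates = dpm < 40.0
      print(candidates.sum(), "candidate common proper motion pairs")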

  10. Total costs of injury from accidents in the home and during education, sports and leisure activities: estimates for Norway with assessment of uncertainty.

    PubMed

    Veisten, Knut; Nossum, Ase; Akhtar, Juned

    2009-07-01

    Injury accidents occurring in the home or during educational, sports or leisure activities were estimated from samples of hospital data, combined with fatality data from vital statistics. The uncertainty of the estimated figures was assessed in a simulation-based analysis. Total economic costs to society from injuries and fatalities due to such accidents were estimated at approximately NOK 150 billion per year. The estimated costs reveal the scale of the public health problem and provide arguments for the establishment of a proper injury register for the identification of preventive measures to reduce the costs to society.

  11. S-matrix analysis of the baryon electric charge correlation

    NASA Astrophysics Data System (ADS)

    Lo, Pok Man; Friman, Bengt; Redlich, Krzysztof; Sasaki, Chihiro

    2018-03-01

    We compute the correlation of the net baryon number with the electric charge (χBQ) for an interacting hadron gas using the S-matrix formulation of statistical mechanics. The observable χBQ is particularly sensitive to the details of the pion-nucleon interaction, which are consistently incorporated in the current scheme via the empirical scattering phase shifts. Comparing to the recent lattice QCD studies in the (2 + 1)-flavor system, we find that the natural implementation of interactions and the proper treatment of resonances in the S-matrix approach lead to an improved description of the lattice data over that obtained in the hadron resonance gas model.
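    For orientation, the correlation studied here is conventionally written (in the standard lattice-QCD normalization, which this record does not spell out, so the exact form is an assumption) as the mixed second derivative of the scaled pressure with respect to the reduced baryon and electric charge chemical potentials, evaluated at vanishing chemical potentials:

      \chi_{BQ} \;=\; \left.\frac{\partial^{2}(p/T^{4})}{\partial(\mu_{B}/T)\,\partial(\mu_{Q}/T)}\right|_{\mu_{B}=\mu_{Q}=0}

    In the S-matrix (Beth-Uhlenbeck) formulation, the interacting two-body contribution to such susceptibilities is weighted by the energy derivative of the empirical scattering phase shifts, which is how the pion-nucleon interaction enters the calculation.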

  12. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism.

    PubMed

    Vesterinen, Hanna M; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-04-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication.

  13. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism

    PubMed Central

    Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-01-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication. PMID:21157472

  14. ArraySolver: an algorithm for colour-coded graphical display and Wilcoxon signed-rank statistics for comparing microarray gene expression data.

    PubMed

    Khan, Haseeb Ahmad

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann-Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform.

  15. ArraySolver: An Algorithm for Colour-Coded Graphical Display and Wilcoxon Signed-Rank Statistics for Comparing Microarray Gene Expression Data

    PubMed Central

    2004-01-01

    The massive surge in the production of microarray data poses a great challenge for proper analysis and interpretation. In recent years numerous computational tools have been developed to extract meaningful interpretation of microarray gene expression data. However, a convenient tool for two-group comparison of microarray data is still lacking, and users have to rely on commercial statistical packages that might be costly and require special skills, in addition to extra time and effort for transferring data from one platform to another. Various statistical methods, including the t-test, analysis of variance, Pearson test and Mann–Whitney U test, have been reported for comparing microarray data, whereas the utilization of the Wilcoxon signed-rank test, which is an appropriate test for two-group comparison of gene expression data, has largely been neglected in microarray studies. The aim of this investigation was to build an integrated tool, ArraySolver, for colour-coded graphical display and comparison of gene expression data using the Wilcoxon signed-rank test. The results of software validation showed similar outputs with ArraySolver and SPSS for large datasets, whereas the former program appeared to be more accurate for 25 or fewer pairs (n ≤ 25), suggesting its potential application in analysing molecular signatures that usually contain small numbers of genes. The main advantages of ArraySolver are easy data selection, convenient report format, accurate statistics and the familiar Excel platform. PMID:18629036
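    Both versions of this record center on the Wilcoxon signed-rank test for paired two-group comparison; a minimal sketch with SciPy follows, using small hypothetical expression values in place of a real microarray export (ArraySolver itself is an Excel-based tool and is not reproduced here).

      import numpy as np
      from scipy.stats import wilcoxon

      # Hypothetical paired expression values for a small gene signature (n <= 25 pairs)
      group_a = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.6, 2.2, 3.0])
      group_b = np.array([2.8, 3.9, 2.5, 3.6, 3.2, 3.1, 2.9, 3.8])

      stat, p_value = wilcoxon(group_a, group_b)   # paired, non-parametric comparison
      print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")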

  16. DIGE Analysis of Human Tissues.

    PubMed

    Gelfi, Cecilia; Capitanio, Daniele

    2018-01-01

    Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed to not interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, once a proper labeling protocol is followed. The two-dye or three-dye systems can be adopted and their choice depends on specific applications. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generates robust quantitative results making data validation by independent technologies successful.

  17. P values are only an index to evidence: 20th- vs. 21st-century statistical science.

    PubMed

    Burnham, K P; Anderson, D R

    2014-03-01

    Early statistical methods focused on pre-data probability statements (i.e., data as random variables) such as P values; these are not really inferences, nor are P values evidential. Statistical science clung to these principles throughout much of the 20th century as a wide variety of methods were developed for special cases. Looking back, it is clear that the underlying paradigm (i.e., testing and P values) was weak. As Kuhn (1970) suggests, new paradigms have taken the place of earlier ones: this is a goal of good science. New methods have been developed and older methods extended, and these allow proper measures of strength of evidence and multimodel inference. It is time to move forward with sound theory and practice for the difficult practical problems that lie ahead. Given data, the useful foundation shifts to post-data probability statements such as model probabilities (Akaike weights) or related quantities such as odds ratios and likelihood intervals. These new methods allow formal inference from multiple models in the a priori set. These quantities are properly evidential. The past century was aimed at finding the "best" model and making inferences from it. The goal in the 21st century is to base inference on all the models weighted by their model probabilities (model averaging). Estimates of precision can include model selection uncertainty, leading to variances conditional on the model set. The 21st century will be about the quantification of information, proper measures of evidence, and multimodel inference. Nelder (1999:261) concludes, "The most important task before us in developing statistical science is to demolish the P-value culture, which has taken root to a frightening extent in many areas of both pure and applied science and technology".
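    The model probabilities (Akaike weights) and model averaging advocated here follow directly from AIC differences within the a priori model set; a minimal sketch with hypothetical AIC values and parameter estimates is shown below.

      import numpy as np

      # Hypothetical AIC values for the a priori model set
      aic = np.array([1012.4, 1009.8, 1015.1, 1010.6])

      delta = aic - aic.min()                # AIC differences
      weights = np.exp(-0.5 * delta)
      weights /= weights.sum()               # Akaike weights: model probabilities
      print(np.round(weights, 3))

      # Model-averaged estimate of a parameter estimated under every model (hypothetical)
      theta_hat = np.array([0.42, 0.47, 0.39, 0.45])
      print("model-averaged estimate:", float(np.sum(weights * theta_hat)))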

  18. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. Copyright © 2015 Elsevier Ltd. All rights reserved.
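    The study's fully Bayesian before-after analysis rests on a negative binomial crash model fitted with a data-augmentation algorithm, which is not reproduced here. The sketch below instead uses a simpler conjugate gamma-Poisson formulation with hypothetical crash counts and exposures, just to illustrate how a posterior for a before-after crash modification factor can be simulated.

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical wet-weather crash counts and exposure (segment-years)
      crashes_before, exposure_before = 38, 60.0
      crashes_after, exposure_after = 30, 45.0

      # Conjugate Gamma(0.5, 0.5) prior on the crash rate per segment-year
      a0, b0 = 0.5, 0.5
      rate_before = rng.gamma(a0 + crashes_before, 1.0 / (b0 + exposure_before), 100_000)
      rate_after = rng.gamma(a0 + crashes_after, 1.0 / (b0 + exposure_after), 100_000)

      # Posterior of the crash modification factor (after / before)
      cmf = rate_after / rate_before
      print("posterior mean CMF:", cmf.mean())
      print("P(CMF < 1), i.e. probability of a safety benefit:", (cmf < 1).mean())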

  19. The Digital Shoreline Analysis System (DSAS) Version 4.0 - An ArcGIS extension for calculating shoreline change

    USGS Publications Warehouse

    Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan

    2009-01-01

    The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
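    Two of the four DSAS rate-of-change methods, the end point rate and simple linear regression, reduce to short calculations once a transect's shoreline intersection distances and survey dates are known. A minimal sketch with hypothetical values follows; DSAS itself is an ArcGIS extension and is not reproduced here.

      import numpy as np
      from scipy import stats

      # Hypothetical distances (m) from the baseline for one transect, by survey year
      years = np.array([1960.0, 1978.0, 1994.0, 2005.0, 2016.0])
      dist_m = np.array([102.0, 96.5, 90.2, 86.1, 81.4])

      # End point rate: net movement divided by time between oldest and youngest shoreline
      epr = (dist_m[-1] - dist_m[0]) / (years[-1] - years[0])

      # Simple linear regression rate, with its standard error and correlation coefficient
      res = stats.linregress(years, dist_m)
      print(f"EPR = {epr:.2f} m/yr, LRR = {res.slope:.2f} m/yr "
            f"(stderr = {res.stderr:.2f}, r = {res.rvalue:.2f})")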

  20. Statistically optimal estimation of Greenland Ice Sheet mass variations from GRACE monthly solutions using an improved mascon approach

    NASA Astrophysics Data System (ADS)

    Ran, J.; Ditmar, P.; Klees, R.; Farahani, H. H.

    2018-03-01

    We present an improved mascon approach to transform monthly spherical harmonic solutions based on GRACE satellite data into mass anomaly estimates in Greenland. The GRACE-based spherical harmonic coefficients are used to synthesize gravity anomalies at satellite altitude, which are then inverted into mass anomalies per mascon. The limited spectral content of the gravity anomalies is properly accounted for by applying a low-pass filter as part of the inversion procedure to make the functional model spectrally consistent with the data. The full error covariance matrices of the monthly GRACE solutions are properly propagated using the law of covariance propagation. Using numerical experiments, we demonstrate the importance of a proper data weighting and of the spectral consistency between functional model and data. The developed methodology is applied to process real GRACE level-2 data (CSR RL05). The obtained mass anomaly estimates are integrated over five drainage systems, as well as over entire Greenland. We find that the statistically optimal data weighting reduces random noise by 35-69%, depending on the drainage system. The obtained mass anomaly time-series are de-trended to eliminate the contribution of ice discharge and are compared with de-trended surface mass balance (SMB) time-series computed with the Regional Atmospheric Climate Model (RACMO 2.3). We show that when using a statistically optimal data weighting in GRACE data processing, the discrepancies between GRACE-based estimates of SMB and modelled SMB are reduced by 24-47%.
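    The statistically optimal weighting described above amounts to generalized least squares with the full error covariance of the synthesized gravity anomalies. A minimal dense-matrix sketch follows, with hypothetical file names standing in for the design matrix, data and covariance; the actual processing chain, including the low-pass filtering step, is not reproduced.

      import numpy as np

      # Hypothetical inputs:
      #   A  - design matrix mapping mascon mass anomalies to gravity anomalies at altitude
      #   y  - synthesized gravity anomalies for one month
      #   Cy - full error covariance of y, obtained from the covariance of the spherical
      #        harmonic coefficients via the law of covariance propagation
      A = np.load("design_matrix.npy")
      y = np.load("gravity_anomalies.npy")
      Cy = np.load("anomaly_covariance.npy")

      W = np.linalg.inv(Cy)                      # statistically optimal weights
      N = A.T @ W @ A                            # normal matrix
      x_hat = np.linalg.solve(N, A.T @ W @ y)    # mascon mass anomaly estimates
      Cx_hat = np.linalg.inv(N)                  # formal covariance of the estimates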

  1. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    PubMed Central

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

    Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was with standard statistics based on the null hypothesis, and separation test for early phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Results of statistical analysis using the separation test and intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  2. A quantitative analysis of statistical power identifies obesity endpoints for improved in vivo preclinical study design

    PubMed Central

    Selimkhanov, Jangir; Thompson, W. Clayton; Guo, Juen; Hall, Kevin D.; Musante, Cynthia J.

    2017-01-01

    The design of well-powered in vivo preclinical studies is a key element in building knowledge of disease physiology for the purpose of identifying and effectively testing potential anti-obesity drug targets. However, as a result of the complexity of the obese phenotype, there is limited understanding of the variability within and between study animals of macroscopic endpoints such as food intake and body composition. This, combined with limitations inherent in the measurement of certain endpoints, presents challenges to study design that can have significant consequences for an anti-obesity program. Here, we analyze a large, longitudinal study of mouse food intake and body composition during diet perturbation to quantify the variability and interaction of key metabolic endpoints. To demonstrate how conclusions can change as a function of study size, we show that a simulated pre-clinical study properly powered for one endpoint may lead to false conclusions based on secondary endpoints. We then propose guidelines for endpoint selection and study size estimation under different conditions to facilitate proper power calculation for a more successful in vivo study design. PMID:28392555
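    A minimal sketch of the kind of endpoint-dependent sample size calculation this study motivates, using statsmodels' two-sample t-test power solver with hypothetical effect sizes; the study's own variability estimates are not reproduced here.

      import numpy as np
      from statsmodels.stats.power import TTestIndPower

      # Hypothetical standardized effect sizes (Cohen's d) for two endpoints
      endpoint_effects = {"food_intake": 1.0, "fat_mass": 0.4}

      solver = TTestIndPower()
      for endpoint, d in endpoint_effects.items():
          n_per_group = solver.solve_power(effect_size=d, alpha=0.05, power=0.8)
          print(f"{endpoint}: ~{int(np.ceil(n_per_group))} animals per group")
      # A study sized only for the stronger endpoint is underpowered for the weaker one.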

  3. Earth Observing System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Hejduk, Matthew D.

    2016-01-01

    The purpose of covariance realism is to properly size a primary object's covariance in order to add validity to the calculation of the probability of collision. The covariance realism technique in this paper consists of three parts: collection/calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics. An empirical cumulative distribution function (ECDF) Goodness-of-Fit (GOF) method is employed to determine if a covariance is properly sized by comparing the empirical distribution of Mahalanobis distance calculations to the hypothesized parent 3-DoF chi-squared distribution. To realistically size a covariance for collision probability calculations, this study uses a state noise compensation algorithm that adds process noise to the definitive epoch covariance to account for uncertainty in the force model. Process noise is added until the GOF tests pass a group significance level threshold. The results of this study indicate that when outliers attributed to persistently high or extreme levels of solar activity are removed, the aforementioned covariance realism compensation method produces a tuned covariance with up to 80 to 90% of the covariance propagation timespan passing the GOF tests (against a 60% minimum passing threshold), a quite satisfactory and useful result.
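    The core of the realism check, squared Mahalanobis distances compared against a 3-DoF chi-squared parent through an empirical-CDF goodness-of-fit test, can be sketched as below. The file names and the use of a Kolmogorov-Smirnov statistic (one possible ECDF GOF test) are assumptions for illustration.

      import numpy as np
      from scipy import stats

      # Hypothetical inputs per covariance propagation point:
      #   dx - (n, 3) differences between propagated and definitive positions
      #   P  - (n, 3, 3) propagated position covariances
      dx = np.load("state_differences.npy")
      P = np.load("propagated_covariances.npy")

      # Squared Mahalanobis distance at each propagation point
      m2 = np.einsum("ni,nij,nj->n", dx, np.linalg.inv(P), dx)

      # Empirical-CDF goodness-of-fit test against the hypothesized 3-DoF chi-squared parent
      ks_stat, p_value = stats.kstest(m2, stats.chi2(df=3).cdf)
      print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
      # A small p-value indicates the covariance is not realistically sized; add process
      # noise and repeat until the test passes the chosen significance threshold.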

  4. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748

  5. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  6. Data Mining CMMSs: How to Convert Data into Knowledge.

    PubMed

    Fennigkoh, Larry; Nanney, D Courtney

    2018-01-01

    Although the healthcare technology management (HTM) community has decades of accumulated medical device-related maintenance data, little knowledge has been gleaned from these data. Finding and extracting such knowledge requires the use of the well-established, but admittedly somewhat foreign to HTM, application of inferential statistics. This article sought to provide a basic background on inferential statistics and describe a case study of their application, limitations, and proper interpretation. The research question associated with this case study involved examining the effects of ventilator preventive maintenance (PM) labor hours, age, and manufacturer on needed unscheduled corrective maintenance (CM) labor hours. The study sample included more than 21,000 combined PM inspections and CM work orders on 2,045 ventilators from 26 manufacturers during a five-year period (2012-16). A multiple regression analysis revealed that device age, manufacturer, and accumulated PM inspection labor hours all influenced the amount of CM labor significantly (P < 0.001). In essence, CM labor hours increased with increasing PM labor. However, and despite the statistical significance of these predictors, the regression analysis also indicated that ventilator age, manufacturer, and PM labor hours only explained approximately 16% of all variability in CM labor, with the remainder (84%) caused by other factors that were not included in the study. As such, the regression model obtained here is not suitable for predicting ventilator CM labor hours.
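    A minimal sketch of the regression described in this case study, assuming a hypothetical CMMS extract ventilator_work_orders.csv with one row per ventilator and columns cm_hours, pm_hours, age_years and manufacturer:

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical CMMS extract: accumulated labor hours per ventilator
      df = pd.read_csv("ventilator_work_orders.csv")

      model = smf.ols("cm_hours ~ age_years + pm_hours + C(manufacturer)", data=df).fit()
      print(model.summary())
      # An R-squared near 0.16, as in the study, means most CM-labor variability
      # is driven by factors outside the model.
      print("R-squared:", round(model.rsquared, 3))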

  7. Very Low-mass Stars and Brown Dwarfs in Upper Scorpius Using Gaia DR1: Mass Function, Disks, and Kinematics

    NASA Astrophysics Data System (ADS)

    Cook, Neil J.; Scholz, Aleks; Jayawardhana, Ray

    2017-12-01

    Our understanding of the brown dwarf population in star-forming regions is dependent on knowing distances and proper motions and therefore will be improved through the Gaia space mission. In this paper, we select new samples of very low-mass objects (VLMOs) in Upper Scorpius using UKIDSS colors and optimized proper motions calculated using Gaia DR1. The scatter in proper motions from VLMOs in Upper Scorpius is now (for the first time) dominated by the kinematic spread of the region itself, not by the positional uncertainties. With age and mass estimates updated using Gaia parallaxes for early-type stars in the same region, we determine masses for all VLMOs. Our final most complete sample includes 453 VLMOs of which ~125 are expected to be brown dwarfs. The cleanest sample is comprised of 131 VLMOs, with ~105 brown dwarfs. We also compile a joint sample from the literature that includes 415 VLMOs, out of which 152 are likely brown dwarfs. The disk fraction among low-mass brown dwarfs (M < 0.05 M⊙) is substantially higher than in more massive objects, indicating that disks around low-mass brown dwarfs survive longer than in low-mass stars overall. The mass function for 0.01 < M < 0.1 M⊙ is consistent with the Kroupa Initial Mass Function. We investigate the possibility that some “proper motion outliers” have undergone a dynamical ejection early in their evolution. Our analysis shows that the color-magnitude cuts used when selecting samples introduce strong bias into the population statistics due to varying levels of contamination and completeness.

  8. The pituitary gland under infrared light - in search of a representative spectrum for homogeneous regions.

    PubMed

    Banas, A; Banas, K; Furgal-Borzych, A; Kwiatek, W M; Pawlicki, B; Breese, M B H

    2015-04-07

    The pituitary gland is a small but vital organ in the human body. It is located at the base of the brain and is often described as the master gland due to its multiple functions. The pituitary gland secretes and stores hormones, such as the thyroid-stimulating hormone (TSH), adrenocorticotropic hormone (ACTH), growth hormone (hGH), prolactin, gonadotropins, and luteinizing hormones, as well as the antidiuretic hormone (ADH). A proper diagnosis of pituitary disorders is of utmost importance as this organ participates in regulating a variety of body functions. Typical histopathological analysis provides much valuable information, but it gives no insight into the biochemical background of the changes that occur within the gland. One approach that could be used to evaluate the biochemistry of tissue sections obtained from pituitary disorders is Fourier Transform Infra-Red (FTIR) spectromicroscopy. In order to collect diagnostically valuable information, large areas of tissue must be investigated. This work focuses on obtaining a unique and representative FTIR spectrum characteristic of one type of cell architecture within a sample. The idea presented is based on using hierarchical cluster analysis (HCA) for data evaluation to search for uniform patterns within samples from the perspective of FTIR spectra. The results obtained demonstrate that FTIR spectromicroscopy, combined with proper statistical evaluation, can be treated as a complementary method for histopathological analysis and ipso facto can increase the sensitivity and specificity for detecting various disorders not only for the pituitary gland, but also for other human tissues.
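    A minimal sketch of the hierarchical cluster analysis step used to find homogeneous regions and their representative spectra, assuming a hypothetical matrix of preprocessed FTIR spectra (one row per measurement point, one column per wavenumber); the linkage method and the number of clusters are illustrative choices, not values from the paper.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Hypothetical preprocessed FTIR map: rows are measurement points, columns wavenumbers
      spectra = np.load("pituitary_ftir_map.npy")

      Z = linkage(spectra, method="ward")                 # hierarchical clustering
      labels = fcluster(Z, t=4, criterion="maxclust")     # cut the tree into e.g. 4 regions

      # Representative spectrum of each homogeneous region: the cluster mean
      representatives = {k: spectra[labels == k].mean(axis=0) for k in np.unique(labels)}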

  9. Statistics of Land-Grant Colleges, Year Ended June 30, 1922. Bulletin, 1924, No. 6

    ERIC Educational Resources Information Center

    Blauch, L. E.

    1924-01-01

    This report is made in accordance with the provisions of the land-grant act of 1862 and the Morrill-Nelson Acts of 1890 and 1907. To assure the proper usage of these funds, the specialist in charge of land-grant college statistics makes, from reports submitted by the treasurers of the land-grant colleges, an audit of disbursements from the funds.…

  10. Statistical Modelling of the Soil Dielectric Constant

    NASA Astrophysics Data System (ADS)

    Usowicz, Boguslaw; Marczewski, Wojciech; Bogdan Usowicz, Jerzy; Lipiec, Jerzy

    2010-05-01

    The dielectric constant of soil is a physical property that is very sensitive to water content. It underpins several electrical measurement techniques for determining water content by means of direct methods (TDR, FDR, and others related to effects of electrical conductance and/or capacitance) and indirect RS (Remote Sensing) methods. This work presents a statistical approach to modelling the dielectric constant as a property accounting for a wide range of specific soil compositions, porosities, and mass densities within the unsaturated water content range. Usually, similar models are determined for a few particular soil types, and changing the soil type requires switching to another model or adjusting it by parametrization of the soil compounds; this makes it difficult to compare and relate results between models. The presented model was developed for a generic representation of soil as a hypothetical mixture of spheres, each representing a soil fraction in its proper phase state. The model generates a serial-parallel mesh of conductive and capacitive paths, which is analysed for a total conductive or capacitive property. The model was first developed to determine the thermal conductivity and is now extended to the dielectric constant by analysing the capacitive mesh. The analysis is carried out by statistical means obeying physical laws related to the serial-parallel branching of the representative electrical mesh. The physical relevance of the analysis is established electrically, but the definition of the electrical mesh is controlled statistically by parametrization of the compound fractions, by determining the number of representative spheres per unitary volume per fraction, and by determining the number of fractions. In this way the model is capable of covering the properties of nearly all possible soil types and all phase states, within recognition of the Lorenz and Knudsen conditions. In effect the model allows generating a hypothetical representative of the soil type, which enables clear comparison with results from other soil-type-dependent models. The paper focuses on properly representing the possible range of porosity in commonly existing soils. This work is done with the aim of implementing the statistical-physical model of the dielectric constant in the model CMEM (Community Microwave Emission Model), applicable to SMOS (Soil Moisture and Ocean Salinity ESA Mission) data. The input to the model accepts the definition of soil fractions in common physical measures and, in contrast to other empirical models, does not need calibrating. It does not depend on recognition of the soil by type; instead it offers control of accuracy through proper determination of the soil compound fractions. SMOS employs CMEM driven only by the sand-clay-silt composition. Common soil data are split into tens or even hundreds of soil types depending on the region. We hope that determining three-element sand-clay-silt compounds in a few fractions may help resolve the question of the relevance of soil data to the input of CMEM for SMOS. Traditionally employed soil types are now converted to sand-clay-silt compounds, but these hardly cover the effects of other specific properties such as porosity. This should bring advantageous effects in validating SMOS observation data, and it is the aim of the Cal/Val project 3275, in the campaigns for SVRT (SMOS Validation and Retrieval Team). Acknowledgements: This work was funded in part by the PECS - Programme for European Cooperating States, No. 98084 "SWEX/R - Soil Water and Energy Exchange/Research".

  11. Cooperativity in plastic crystals

    NASA Astrophysics Data System (ADS)

    Pieruccini, Marco; Tombari, Elpidio

    2018-03-01

    A statistical mechanical model previously adopted for the analysis of the α -relaxation in structural glass formers is rederived within a general theoretical framework originally developed for systems approaching the ideal glassy state. The interplay between nonexponentiality and cooperativity is reconsidered in the light of energy landscape concepts. The method is used to estimate the cooperativity in orientationally disordered crystals, either from the analysis of literature data on linear dielectric response or from the enthalpy relaxation function obtained by temperature-modulated calorimetry. Knowledge of the specific heat step due to the freezing of the configurational or conformational modes at the glass transition is needed in order to properly account for the extent to which the relaxing system deviates from equilibrium during the rearrangement processes. A number of plastic crystals have been analyzed, and relatively higher cooperativities are found in the presence of hydrogen bonding interaction.

  12. HENDRICS: High ENergy Data Reduction Interface from the Command Shell

    NASA Astrophysics Data System (ADS)

    Bachetti, Matteo

    2018-05-01

    HENDRICS, a rewrite and update of MaLTPyNT (ascl:1502.021), contains command-line scripts based on Stingray (ascl:1608.001) to perform a quick-look (spectral-)timing analysis of X-ray data, properly treating the gaps in the data due, e.g., to occultation by the Earth or passages through the SAA. Despite its original focus on NuSTAR, HENDRICS can perform standard aperiodic timing analysis on X-ray data from, in principle, any other satellite; its features include power density and cross spectra, time lags, pulsar searches with epoch folding and the Z_n^2 statistic, and color-color and color-intensity diagrams. The periodograms produced by HENDRICS (such as a power density spectrum or a cospectrum) can be saved in a format compatible with XSPEC (ascl:9910.005) or ISIS (ascl:1302.002).

  13. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  14. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  15. Semi-supervised vibration-based classification and condition monitoring of compressors

    NASA Astrophysics Data System (ADS)

    Potočnik, Primož; Govekar, Edvard

    2017-09-01

    Semi-supervised vibration-based classification and condition monitoring of the reciprocating compressors installed in refrigeration appliances is proposed in this paper. The method addresses the problem of industrial condition monitoring where prior class definitions are often not available or difficult to obtain from local experts. The proposed method combines feature extraction, principal component analysis, and statistical analysis for the extraction of initial class representatives, and compares the capability of various classification methods, including discriminant analysis (DA), neural networks (NN), support vector machines (SVM), and extreme learning machines (ELM). The use of the method is demonstrated on a case study which was based on industrially acquired vibration measurements of reciprocating compressors during the production of refrigeration appliances. The paper presents a comparative qualitative analysis of the applied classifiers, confirming the good performance of several nonlinear classifiers. If the model parameters are properly selected, then very good classification performance can be obtained from NN trained by Bayesian regularization, SVM and ELM classifiers. The method can be effectively applied for the industrial condition monitoring of compressors.
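    A minimal sketch of one of the compared classification pipelines (feature scaling, PCA, then an SVM), assuming hypothetical arrays of vibration features and the class labels obtained from the initial class representatives; the other classifiers (DA, NN, ELM) follow the same pattern, and the component count and SVM parameters are illustrative choices.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Hypothetical inputs: one row of vibration features per compressor test
      X = np.load("vibration_features.npy")
      y = np.load("class_labels.npy")

      clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf", C=10.0))
      clf.fit(X, y)
      print("training accuracy:", clf.score(X, y))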

  16. Statistical auditing of toxicology reports.

    PubMed

    Deaton, R R; Obenchain, R L

    1994-06-01

    Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs while maintaining or increasing the quality of our regulatory obligations.

  17. ENVIRONMENTAL SAMPLING: A BRIEF REVIEW

    EPA Science Inventory

    Proper application of statistical principles at the outset of an environmental study can make the difference between an effective, efficient study and wasted resources. This review distills some of the thoughts current among environmental scientists from a variety of backgrounds ...

  18. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  19. [Prevalence of thyroid function in pregnant and lactating women in areas with different iodine levels of Shanxi province].

    PubMed

    Ren, Y T; Jia, Q Z; Zhang, X D; Guo, B S; Zhang, F F; Cheng, X T; Wang, Y P

    2018-05-10

    Objective: To investigate the effects of high iodine intake on thyroid function in pregnant and lactating women. Methods: A cross-sectional epidemiological study was conducted among 130 pregnant women and 220 lactating women aged 19-40 years in areas with a high environmental iodine level (>300 μg/L) or a proper environmental iodine level (50-100 μg/L) in Shanxi in 2014. The general information, urine samples and blood samples of the women surveyed, as well as water samples, were collected. Water and urine iodine levels were detected with the arsenic-cerium catalysis spectrophotometric method, the serum TSH level was detected with electrochemiluminescence immunoassay, and free thyroxine (FT(4)), anti-thyroid peroxidase autoantibody (TPOAb) and anti-thyroglobulin antibody (TGAb) were detected with chemiluminescence immunoassay. Results: The median urine iodine levels of the four groups were 221.9, 282.5, 814.1 and 818.6 μg/L, respectively. The median serum FT(4) of lactating women in the high iodine area and the proper iodine area was 12.96 and 13.22 pmol/L, and the median serum TSH was 2.45 and 2.17 mIU/L, respectively. The median serum FT(4) of pregnant women in the high iodine area and the proper iodine area was 14.66 and 16.16 pmol/L, and the median serum TSH was 2.13 and 1.82 mIU/L, respectively. The serum FT(4) levels were lower and the abnormal rates of serum TSH were higher in lactating women than in pregnant women in both the high iodine area and the proper iodine area, and the differences were statistically significant (FT(4): Z=-6.677, -4.041, P<0.01; TSH: Z=8.797, 8.910, P<0.01). In the high iodine area, the abnormal rate of serum FT(4) in lactating women was higher than that in pregnant women, and the difference was statistically significant (Z=7.338, P=0.007). The serum FT(4) level of lactating women in the high iodine area was lower than that in the proper iodine area, and the difference was statistically significant (Z=-4.687, P<0.001). In the high iodine area, the median serum FT(4) in early, mid- and late pregnancy was 16.26, 14.22 and 14.80 pmol/L, respectively, and the median serum TSH was 1.74, 1.91 and 2.38 mIU/L, respectively. In the high iodine area, the serum FT(4) level in early pregnancy was higher than that in mid- and late pregnancy, and the serum TSH level was lower than that in mid- and late pregnancy; the differences were statistically significant (FT(4): Z=-2.174, -2.238, P<0.05; TSH: Z=-2.985, -1.978, P<0.05). There were no significant differences in the positive rates of serum thyroid autoantibodies among the four groups of women or among women in different periods of pregnancy (P>0.05). The rates of subclinical hypothyroidism in pregnant women and lactating women in the high iodine area were markedly higher than those in the proper iodine area, and the differences were statistically significant (χ²=5.363, 5.007, P<0.05). Conclusions: Excessive iodine intake might increase the risk of subclinical hypothyroidism in pregnant and lactating women. It is suggested to strengthen iodine nutrition and thyroid function monitoring in women, pregnant women and lactating women in areas with high environmental iodine.

  20. Unwanted pregnancy and induced abortion among young women 16-22 years old in Greece: a retrospective study of the risk factors.

    PubMed

    Salakos, N; Koumousidis, A; Bakalianou, K; Paltoglou, G; Kalampokas, T; Iavazzo, C

    2010-01-01

    Unwanted pregnancies and the subsequent induced abortions are common problems among young people in modern Greece. The aim of this study was to identify the risk factors of the problem in an effort to find the best possible way out of this social dead end. We interviewed 1,320 young women and analyzed their answers statistically. Several useful conclusions were reached concerning the factors involved in unwanted pregnancy and induced abortion, and we outline a strategy to combat the problem. Sexual education and the proper use of contraception remain the essential tools in this effort.

  1. Comparison of de novo assembly statistics of Cucumis sativus L.

    NASA Astrophysics Data System (ADS)

    Wojcieszek, Michał; Kuśmirek, Wiktor; Pawełkowicz, Magdalena; Pląder, Wojciech; Nowak, Robert M.

    2017-08-01

    Genome sequencing is the core of genomic research. With the development of NGS and the falling cost of sequencing, another bottleneck has emerged: genome assembly. Developing a proper tool for this task is essential, as the quality of the genome has an important impact on further research. Here we present a comparison of several de Bruijn graph assemblers tested on C. sativus genomic reads. The assessment shows that the newly developed software, dnaasm, provides better results in terms of quantity and quality: the number of generated sequences is lower by 5-33%, with up to two-fold higher N50, and quality checks showed that dnaasm generated reliable results. This provides a very strong base for future genomic analysis.
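
    For readers unfamiliar with the assembly statistics being compared, here is a small self-contained sketch of how N50 is computed from contig lengths; the contig lengths below are invented, not the C. sativus assemblies.

```python
# Sketch only: N50 and contig count for two hypothetical assemblies.
def n50(lengths):
    """Length L such that contigs of length >= L cover half the assembly."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

assembly_a = [12000, 9500, 7000, 4000, 1500, 800]
assembly_b = [20000, 11000, 6000, 900]
for name, contigs in [("assembler A", assembly_a), ("assembler B", assembly_b)]:
    print(name, "contigs:", len(contigs), "N50:", n50(contigs))
```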

  2. "Big Data" in Rheumatology: Intelligent Data Modeling Improves the Quality of Imaging Data.

    PubMed

    Landewé, Robert B M; van der Heijde, Désirée

    2018-05-01

    Analysis of imaging data in rheumatology is a challenge. Reliability of scores is an issue for several reasons. Signal-to-noise ratio of most imaging techniques is rather unfavorable (too little signal in relation to too much noise). Optimal use of all available data may help to increase credibility of imaging data, but knowledge of complicated statistical methodology and the help of skilled statisticians are required. Clinicians should appreciate the merits of sophisticated data modeling and liaise with statisticians to increase the quality of imaging results, as proper imaging studies in rheumatology imply more than a supersensitive imaging technique alone. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Impact of parametric uncertainty on estimation of the energy deposition into an irradiated brain tumor

    NASA Astrophysics Data System (ADS)

    Taverniers, Søren; Tartakovsky, Daniel M.

    2017-11-01

    Predictions of the total energy deposited into a brain tumor through X-ray irradiation are notoriously error-prone. We investigate how this predictive uncertainty is affected by uncertainty in both the location of the region occupied by a dose-enhancing iodinated contrast agent and the agent's concentration. This is done within the probabilistic framework in which these uncertain parameters are modeled as random variables. We employ the stochastic collocation (SC) method to estimate statistical moments of the deposited energy in terms of statistical moments of the random inputs, and the global sensitivity analysis (GSA) to quantify the relative importance of uncertainty in these parameters on the overall predictive uncertainty. A nonlinear radiation-diffusion equation dramatically magnifies the coefficient of variation of the uncertain parameters, yielding a large coefficient of variation for the predicted energy deposition. This demonstrates that accurate prediction of the energy deposition requires a proper treatment of even small parametric uncertainty. Our analysis also reveals that SC outperforms standard Monte Carlo, but its relative efficiency decreases as the number of uncertain parameters increases from one to three. A robust GSA ameliorates this problem by reducing this number.
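
    A minimal sketch of the general idea that a nonlinear response can magnify the coefficient of variation of uncertain inputs, using plain Monte Carlo sampling. The response function and input distributions below are invented stand-ins, not the radiation-diffusion model or the stochastic collocation method used in the paper.

```python
# Sketch only: CV magnification through a placeholder nonlinear response.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
concentration = rng.normal(1.0, 0.05, n)   # input with 5% CV
location = rng.normal(1.0, 0.05, n)        # input with 5% CV

def g(c, x):
    # placeholder nonlinear response that amplifies input spread
    return np.exp(3.0 * c) / (0.1 + x**2)

energy = g(concentration, location)
cv = lambda v: v.std() / v.mean()
print("input CVs :", round(cv(concentration), 3), round(cv(location), 3))
print("output CV :", round(cv(energy), 3))
```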

  4. Historical Data Analysis of Hospital Discharges Related to the Amerithrax Attack in Florida

    PubMed Central

    Burke, Lauralyn K.; Brown, C. Perry; Johnson, Tammie M.

    2016-01-01

    Interrupted time-series analysis (ITSA) can be used to identify, quantify, and evaluate the magnitude and direction of an event on the basis of time-series data. This study evaluates the impact of the bioterrorist anthrax attacks (“Amerithrax”) on hospital inpatient discharges in the metropolitan statistical area of Palm Beach, Broward, and Miami-Dade counties in the fourth quarter of 2001. Three statistical methods—standardized incidence ratio (SIR), segmented regression, and an autoregressive integrated moving average (ARIMA)—were used to determine whether Amerithrax influenced inpatient utilization. The SIR found a non–statistically significant 2 percent decrease in hospital discharges. Although the segmented regression test found a slight increase in the discharge rate during the fourth quarter, it was also not statistically significant; therefore, it could not be attributed to Amerithrax. Diagnostics performed in preparation for ARIMA modeling indicated that the quarterly data were not serially correlated, which violated one of the assumptions of the ARIMA method, so that approach could not properly evaluate the impact on the time-series data. The lack of granularity of the time frames hindered the successful evaluation of the impact by the three analytic methods. This study demonstrates that the granularity of the data points is as important as the number of data points in a time series. ITSA is important for the ability to evaluate the impact that any hazard may have on inpatient utilization. Knowledge of hospital utilization patterns during disasters offers healthcare and civic professionals valuable information to plan, respond, mitigate, and evaluate any outcomes stemming from biothreats. PMID:27843420
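
    A minimal sketch of a segmented (interrupted time-series) regression of the kind described, using statsmodels on a simulated quarterly series with an intervention at quarter 12. The data and model terms are illustrative, not the Amerithrax discharge data.

```python
# Sketch only: segmented regression with level and slope change terms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
quarters = np.arange(24)
post = (quarters >= 12).astype(int)
discharges = 1000 + 5 * quarters - 30 * post + rng.normal(0, 20, 24)

df = pd.DataFrame({
    "t": quarters,
    "post": post,                       # level change after the event
    "t_post": post * (quarters - 12),   # slope change after the event
    "y": discharges,
})
model = smf.ols("y ~ t + post + t_post", data=df).fit()
print(model.summary().tables[1])
```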

  5. Optical properties of mice skin for optical therapy relevant wavelengths: influence of gender and pigmentation

    NASA Astrophysics Data System (ADS)

    Sabino, C. P.; Deana, A. M.; Silva, D. F. T.; França, C. M.; Yoshimura, T. M.; Ribeiro, M. S.

    2015-03-01

    Red and near-infrared light have been widely employed in optical therapies. Skin is the most common optical barrier in non-invasive techniques and in many cases is the target tissue itself. Consequently, to optimize the outcomes of light-based therapies, the optical properties of skin tissue must be well elucidated. In the present study, we evaluated the dorsal skin optical properties of albino (BALB/c) and pigmented (C57BL/6) mice using the Kubelka-Munk photon transport model. We evaluated samples from male and female young mice of both strains. Analysis was performed for wavelengths of 630, 660, 780, 810 and 905 nm due to their prevalent use in optical therapies, such as low-level light (or laser) and photodynamic therapies. Spectrophotometric measurements of diffuse transmittance and reflectance were performed using a single integrating sphere coupled to a proper spectrophotometer. Statistical analysis was performed by two-way ANOVA, with Tukey as the post hoc test and Levene and Shapiro-Wilk as pre-tests. Statistical significance was set at p<0.05. Our results show only a slight transmittance increment (<10%) as wavelengths increase from 630 to 905 nm, with no statistical significance. Albino male mice presented reduced transmittance levels at all wavelengths. The organization and abundance of the tissues composing the skin significantly influence its scattering properties, although absorption remains constant. We conclude that factors such as subcutaneous adiposity and connective tissue structure can have a statistically significant influence on mice skin optical properties, and these factors vary among genders and strains.

  6. In vivo Comet assay--statistical analysis and power calculations of mice testicular cells.

    PubMed

    Hansen, Merete Kjær; Sharma, Anoop Kumar; Dybdahl, Marianne; Boberg, Julie; Kulahci, Murat

    2014-11-01

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic suitably summarizing the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2-, and 2.5-fold changes of the highest dose group compared to the control group when 50 and 100 cells were scored per gel are provided to aid in the design of future Comet assay studies on testicular cells. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. CellTree: an R/bioconductor package to infer the hierarchical structure of cell populations from single-cell RNA-seq data.

    PubMed

    duVerle, David A; Yotsukura, Sohiya; Nomura, Seitaro; Aburatani, Hiroyuki; Tsuda, Koji

    2016-09-13

    Single-cell RNA sequencing is fast becoming one of the standard methods for gene expression measurement, providing unique insights into cellular processes. A number of methods, based on general dimensionality reduction techniques, have been suggested to help infer and visualise the underlying structure of cell populations from single-cell expression levels, yet their models generally lack proper biological grounding and struggle at identifying complex differentiation paths. Here we introduce cellTree: an R/Bioconductor package that uses a novel statistical approach, based on document analysis techniques, to produce tree structures outlining the hierarchical relationship between single-cell samples, while identifying latent groups of genes that can provide biological insights. With cellTree, we provide experimentalists with an easy-to-use tool, based on statistically and biologically sound algorithms, to efficiently explore and visualise single-cell RNA data. The cellTree package is publicly available in the online Bioconductor repository at http://bioconductor.org/packages/cellTree/.

  8. Physiological time-series analysis: what does regularity quantify?

    NASA Technical Reports Server (NTRS)

    Pincus, S. M.; Goldberger, A. L.

    1994-01-01

    Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
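
    A minimal sketch of the ApEn(m, r) computation as it is commonly defined (template length m, tolerance r as a fraction of the series standard deviation, Chebyshev distance, self-matches included). Parameter choices and test signals are illustrative only.

```python
# Sketch only: ApEn = phi(m) - phi(m+1) on a 1-D series.
import numpy as np

def approx_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distances between all pairs of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)   # fraction of templates within tolerance
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(3)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.normal(size=500)
print("ApEn regular:", approx_entropy(regular))
print("ApEn noise:  ", approx_entropy(noisy))
```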

  9. Reconstructing the intermittent dynamics of the torque in wind turbines

    NASA Astrophysics Data System (ADS)

    Lind, Pedro G.; Wächter, Matthias; Peinke, Joachim

    2014-06-01

    We apply a framework introduced in the late nineties to analyze load measurements in offshore wind energy converters (WEC). The framework is borrowed from statistical physics and properly adapted to the analysis of multivariate data comprising wind velocity, power production and torque measurements taken at a single WEC. In particular, we assume that wind statistics drive the fluctuations of the torque produced in the wind turbine and show how to extract an evolution equation of the Langevin type for the torque driven by the wind velocity. It is known that the intermittent nature of the atmosphere, i.e. of the wind field, is transferred to the power production of a wind energy converter and consequently to the shaft torque. We show that the derived stochastic differential equation quantifies the dynamical coupling of the measured fluctuating properties and reproduces the intermittency observed in the data. Finally, we discuss our approach in the light of turbine monitoring, a particularly important issue in offshore wind farms.
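
    A minimal sketch of the general drift/diffusion reconstruction idea behind Langevin-type modeling, estimated from binned conditional increments of a simulated Ornstein-Uhlenbeck series. The data are synthetic, not measured torque, and the code is not the authors' implementation.

```python
# Sketch only: estimate drift and squared-increment rate from binned increments.
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):  # simulate dx = -x dt + 0.5 dW
    x[i] = x[i - 1] - x[i - 1] * dt + 0.5 * np.sqrt(dt) * rng.normal()

dx = np.diff(x)
bins = np.linspace(x.min(), x.max(), 30)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1
drift = np.array([dx[idx == k].mean() / dt if np.any(idx == k) else np.nan
                  for k in range(len(centers))])
diff2 = np.array([(dx[idx == k] ** 2).mean() / dt if np.any(idx == k) else np.nan
                  for k in range(len(centers))])
# For this process, drift(x) should be close to -x and diff2 close to 0.25.
print(np.round(np.c_[centers, drift, diff2][10:20], 3))
```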

  10. Accounting for response misclassification and covariate measurement error improves power and reduces bias in epidemiologic studies.

    PubMed

    Cheng, Dunlei; Branscum, Adam J; Stamey, James D

    2010-07-01

    To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error to those that do not in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias by up to a ten-fold margin compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. 2010 Elsevier Inc. All rights reserved.
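
    A minimal sketch of a simulation-based power calculation in which a binary response is recorded with imperfect sensitivity and specificity, contrasted with error-free data. The effect size, error rates and logistic model are invented placeholders, not the authors' procedure.

```python
# Sketch only: Monte Carlo power for logistic regression with response misclassification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

def power(n=300, beta=0.6, sens=0.85, spec=0.90, misclassify=True, reps=500):
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        p = 1 / (1 + np.exp(-(-0.5 + beta * x)))
        y = rng.binomial(1, p)
        if misclassify:
            flip1 = rng.random(n) > sens   # true cases recorded as non-cases
            flip0 = rng.random(n) > spec   # true non-cases recorded as cases
            y = np.where(y == 1, np.where(flip1, 0, 1), np.where(flip0, 1, 0))
        fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        hits += fit.pvalues[1] < 0.05
    return hits / reps

print("power, error-free response:", power(misclassify=False))
print("power, misclassified      :", power(misclassify=True))
```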

  11. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline

    PubMed Central

    2013-01-01

    Background As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. Results We performed 12 microarray meta-analysis methods for combining multiple simulated expression profiles, and such methods can be categorized for different hypothesis setting purposes: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE gene with non-zero effect in "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated hypothesis settings behind the methods and further apply multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. Conclusions The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. All source files for simulation and real data are available on the author’s publication website. PMID:24359104

  12. Meta-analysis methods for combining multiple expression profiles: comparisons, statistical characterization and an application guideline.

    PubMed

    Chang, Lun-Ching; Lin, Hui-Min; Sibille, Etienne; Tseng, George C

    2013-12-21

    As high-throughput genomic technologies become accurate and affordable, an increasing number of data sets have been accumulated in the public domain and genomic information integration and meta-analysis have become routine in biomedical research. In this paper, we focus on microarray meta-analysis, where multiple microarray studies with relevant biological hypotheses are combined in order to improve candidate marker detection. Many methods have been developed and applied in the literature, but their performance and properties have only been minimally investigated. There is currently no clear conclusion or guideline as to the proper choice of a meta-analysis method given an application; the decision essentially requires both statistical and biological considerations. We performed 12 microarray meta-analysis methods for combining multiple simulated expression profiles, and such methods can be categorized for different hypothesis setting purposes: (1) HS(A): DE genes with non-zero effect sizes in all studies, (2) HS(B): DE genes with non-zero effect sizes in one or more studies and (3) HS(r): DE gene with non-zero effect in "majority" of studies. We then performed a comprehensive comparative analysis through six large-scale real applications using four quantitative statistical evaluation criteria: detection capability, biological association, stability and robustness. We elucidated hypothesis settings behind the methods and further apply multi-dimensional scaling (MDS) and an entropy measure to characterize the meta-analysis methods and data structure, respectively. The aggregated results from the simulation study categorized the 12 methods into three hypothesis settings (HS(A), HS(B), and HS(r)). Evaluation in real data and results from MDS and entropy analyses provided an insightful and practical guideline to the choice of the most suitable method in a given application. All source files for simulation and real data are available on the author's publication website.

  13. Proper Image Subtraction—Optimal Transient Detection, Photometry, and Hypothesis Testing

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.; Gal-Yam, Avishay

    2016-10-01

    Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement, and any image-difference hypothesis testing. We derive a closed-form statistic that: (1) is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise, (2) is numerically stable, (3) for accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts, (4) allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance, (5) has uncorrelated white noise, (6) is a sufficient statistic for any further statistical test on the difference image, and, in particular, allows us to distinguish particle hits and other image artifacts from real transients, (7) is symmetric to the exchange of the new and reference images, (8) is at least an order of magnitude faster to compute than some popular methods, and (9) is straightforward to implement. Furthermore, we present extensions of this method that make it resilient to registration errors, color-refraction errors, and any noise source that can be modeled. In addition, we show that the optimal way to prepare a reference image is the proper image coaddition presented in Zackay & Ofek. We demonstrate this method on simulated data and real observations from the PTF data release 2. We provide an implementation of this algorithm in MATLAB and Python.

  14. Multi-region statistical shape model for cochlear implantation

    NASA Astrophysics Data System (ADS)

    Romera, Jordi; Kjer, H. Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel A.

    2016-03-01

    Statistical shape models are commonly used to analyze the variability between similar anatomical structures and their use is established as a tool for analysis and segmentation of medical images. However, using a global model to capture the variability of complex structures is not enough to achieve the best results. The complexity of a proper global model increases even more when the amount of data available is limited to a small number of datasets. Typically, the anatomical variability between structures is associated to the variability of their physiological regions. In this paper, a complete pipeline is proposed for building a multi-region statistical shape model to study the entire variability from locally identified physiological regions of the inner ear. The proposed model, which is based on an extension of the Point Distribution Model (PDM), is built for a training set of 17 high-resolution images (24.5 μm voxels) of the inner ear. The model is evaluated according to its generalization ability and specificity. The results are compared with the ones of a global model built directly using the standard PDM approach. The evaluation results suggest that better accuracy can be achieved using a regional modeling of the inner ear.

  15. GIS-based spatial statistical analysis of risk areas for liver flukes in Surin Province of Thailand.

    PubMed

    Rujirakul, Ratana; Ueng-arporn, Naporn; Kaewpitoon, Soraya; Loyd, Ryan J; Kaewthani, Sarochinee; Kaewpitoon, Natthawut

    2015-01-01

    It is urgently necessary to be aware of the distribution and risk areas of the liver fluke, Opisthorchis viverrini, for proper allocation of prevention and control measures. This study aimed to investigate the human behavior and environmental factors influencing the distribution in Surin Province of Thailand, and to build a model using stepwise multiple regression analysis with a geographic information system (GIS) on environment and climate data. Human attitude scores (<50%; X111), population density (148-169 pop/km2; X73), and wetland land use (X64) were correlated with the liver fluke disease distribution at the 0.000, 0.034, and 0.006 significance levels, respectively. The fitted multiple regression equation, OV = -0.599 + 0.005(population density, 148-169 pop/km2; X73) + 0.040(human attitude, <50%; X111) + 0.022(land use, wetland; X64), was used to predict the distribution of liver fluke, where OV is the number of liver fluke infection patients; R-squared = 0.878 and adjusted R-squared = 0.849. By GIS analysis, we found Si Narong, Sangkha, Phanom Dong Rak, Mueang Surin, Non Narai, Samrong Thap, Chumphon Buri, and Rattanaburi to have the highest distributions in Surin Province. In conclusion, the combination of GIS and statistical analysis can help simulate the spatial distribution and risk areas of liver fluke, and thus may be an important tool for future planning of prevention and control measures.
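
    A small sketch that simply evaluates the reported regression equation for hypothetical predictor values. The abstract does not specify how the predictors are coded, so the inputs below are placeholders for illustration only.

```python
# Sketch only: evaluating the reported model
# OV = -0.599 + 0.005*X73 + 0.040*X111 + 0.022*X64 with invented inputs.
def predicted_ov(x73_pop_density, x111_attitude, x64_wetland):
    return -0.599 + 0.005 * x73_pop_density + 0.040 * x111_attitude + 0.022 * x64_wetland

# hypothetical district: density value 150, attitude score 40, wetland indicator 1
print(round(predicted_ov(150, 40, 1), 3))
```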

  16. The use and misuse of aircraft and missile RCS statistics

    NASA Astrophysics Data System (ADS)

    Bishop, Lee R.

    1991-07-01

    Both static and dynamic radar cross-section (RCS) measurements are used for RCS predictions, but the static data are less complete than the dynamic data. Integrated dynamic RCS data also have limitations for predicting radar detection performance. When raw static data are properly used, good first-order detection estimates are possible. Research to develop more usable RCS statistics is reviewed, and windowing techniques for creating probability density functions from static RCS data are discussed.

  17. Formulating appropriate statistical hypotheses for treatment comparison in clinical trial design and analysis.

    PubMed

    Huang, Peng; Ou, Ai-hua; Piantadosi, Steven; Tan, Ming

    2014-11-01

    We discuss the problem of properly defining treatment superiority through the specification of hypotheses in clinical trials. The need to precisely define the notion of superiority in a one-sided hypothesis test problem has been well recognized by many authors. Ideally, the null and alternative hypotheses should correspond to a partition of all possible scenarios of underlying true probability models P={P(ω):ω∈Ω}, such that the alternative hypothesis Ha={P(ω):ω∈Ω(a)} can be inferred upon rejection of the null hypothesis Ho={P(ω):ω∈Ω(o)}. However, in many cases, tests are carried out and recommendations are made without a precise definition of superiority or a specification of the alternative hypothesis. Moreover, in some applications, the union of probability models specified by the chosen null and alternative hypotheses does not constitute the complete model collection P (i.e., H(o)∪H(a) is smaller than P). This not only imposes a strong non-validated assumption about the underlying true models, but also leads to different superiority claims depending on which test is used, instead of on scientific plausibility. Different ways to partition P for testing treatment superiority often have different implications for sample size, power, and significance in both efficacy and comparative effectiveness trial design. Such differences are often overlooked. We provide a theoretical framework for evaluating the statistical properties of different specifications of superiority in typical hypothesis testing. This can help investigators select proper hypotheses for treatment comparison in clinical trial design. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Proper orthogonal decomposition analysis for cycle-to-cycle variations of engine flow. Effect of a control device in an inlet pipe

    NASA Astrophysics Data System (ADS)

    Vu, Trung-Thanh; Guibert, Philippe

    2012-06-01

    This paper aims to investigate cycle-to-cycle variations of the non-reacting flow inside a motored single-cylinder transparent engine in order to assess the effect of the insertion depth of a control device that can move linearly inside the inlet pipe. Three positions, corresponding to three insertion depths, are implemented to modify the main aerodynamic properties from one cycle to the next. Numerous two-dimensional particle image velocimetry (PIV) velocity fields, acquired cycle by cycle, are post-processed to discriminate specific contributions of the fluctuating flow. We performed a multiple-snapshot proper orthogonal decomposition (POD) in the tumble plane of a pent-roof SI engine. The analytical process consists of a triple decomposition of each instantaneous velocity field into three distinct parts: a mean part, a coherent part and a turbulent part. The third- and fourth-order centered statistical moments of the POD-filtered velocity field, as well as the probability density function of the PIV realizations, show that the POD extracts different behaviors of the flow. In particular, the cyclic variability is assumed to be contained essentially in the coherent part, so the cycle-to-cycle variations of the engine flow can be characterized from the corresponding POD temporal coefficients. It is shown that the in-cylinder aerodynamic dispersions can be adapted and monitored by controlling the insertion depth of the control device inside the inlet pipe.
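
    A minimal sketch of snapshot POD via the singular value decomposition, together with the triple decomposition into mean, coherent and turbulent parts (here the first few modes are simply treated as "coherent"). The snapshot matrix is random placeholder data, not PIV velocity fields.

```python
# Sketch only: snapshot POD and triple decomposition u = mean + coherent + turbulent.
import numpy as np

rng = np.random.default_rng(6)
n_points, n_snapshots = 500, 100       # grid points x cycles
U = rng.normal(size=(n_points, n_snapshots))

u_mean = U.mean(axis=1, keepdims=True)
fluct = U - u_mean

# POD modes = left singular vectors; temporal coefficients = S * Vt
phi, s, vt = np.linalg.svd(fluct, full_matrices=False)
coeffs = np.diag(s) @ vt

k = 5                                   # number of modes kept as "coherent"
coherent = phi[:, :k] @ coeffs[:k, :]
turbulent = fluct - coherent

energy_fraction = (s[:k] ** 2).sum() / (s ** 2).sum()
print("energy captured by first", k, "modes:", round(float(energy_fraction), 3))
```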

  19. An Anatomic Study on Whether the Immature Patella is Centered on an Anteroposterior Radiograph.

    PubMed

    Kyriakedes, James C; Liu, Raymond W

    2017-03-01

    In the operating room, after first obtaining a proper lateral radiograph with the condyles superimposed, a 90-degree rotation of the intraoperative fluoroscopy unit does not always produce an anteroposterior (AP) image with the patella centered. The orthogonality of these 2 views has not been well determined in children. This study was comprised of a radiographic group (35 knees) and a cadaveric group (59 knees). Both cadaveric and clinical images were obtained by resting or positioning the femur with the posterior condyles overlapped, and then taking an orthogonal AP image. Centering of the patella was calculated and multiple regression analysis was performed to determine the relationship between patellar centering and age, sex, ethnicity, mechanical lateral distal femoral angle (mLDFA), medial proximal tibial angle (MPTA), and contralateral centering. Mean patellar centering, expressed as the lateral position of the patella with respect to the total condylar width, was 0.08±0.10 in the radiographic group and 0.06±0.03 in the cadaveric group. Positive (lateral) patellar centering in 1 knee had a statistically significant correlation with positive patellar centering in the contralateral knee in both the radiographs and the cadavers. In the radiographic group, there was a statistically significant correlation between femoral varus and valgus deformities and positive patellar centering. In the cadaveric group, there was a statistically significant correlation between tibial valgus and negative (medial) patellar centering. The patella in an immature knee is rarely perfectly centered on a true AP image, and is usually seated slightly laterally within the femoral condyles. Obtaining a true AP intraoperative radiograph is critical to analyzing and correcting valgus and varus deformities, and in the proper placement of implants. When addressing knee deformity one should consider obtaining an AP view orthogonal either to a perfect lateral of the knee or orthogonal to the flexion axis of the knee, particularly when evaluating distal femoral deformity.

  20. Civil Rightsspeak.

    ERIC Educational Resources Information Center

    Williams, Walter E.

    1986-01-01

    Today's civil rights debate is clouded by ambiguities of language. The following frequently misused words are clarified in the text so the issues can be properly addressed: 1) segregation; 2) desegregation; 3) minority group; 4) civil rights; 5) compensatory; 6) statistical disparities; and 7) racist. (PS)

  1. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of... respect to implementation of risk adjustment software or as a result of data validation conducted pursuant... implementation of risk adjustment software or data validation. ...

  2. Epidemiologic methods in clinical trials.

    PubMed

    Rothman, K J

    1977-04-01

    Epidemiologic methods developed to control confounding in non-experimental studies are equally applicable for experiments. In experiments, most confounding is usually controlled by random allocation of subjects to treatment groups, but randomization does not preclude confounding except for extremely large studies, the degree of confounding expected being inversely related to the size of the treatment groups. In experiments, as in non-experimental studies, the extent of confounding for each risk indicator should be assessed, and if sufficiently large, controlled. Confounding is properly assessed by comparing the unconfounded effect estimate to the crude effect estimate; a common error is to assess confounding by statistical tests of significance. Assessment of confounding involves its control as a prerequisite. Control is most readily and cogently achieved by stratification of the data, though with many factors to control simultaneously, multivariate analysis or a combination of multivariate analysis and stratification might be necessary.
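
    A minimal numerical sketch of the point made above: confounding is assessed by comparing the crude estimate with a stratum-adjusted (Mantel-Haenszel) estimate, not by significance testing. The 2x2 tables are invented so that the stratum-specific odds ratios equal 1 while the crude odds ratio does not.

```python
# Sketch only: crude vs. Mantel-Haenszel adjusted odds ratio on invented strata.
import numpy as np

# each stratum: [[exposed cases, exposed non-cases], [unexposed cases, unexposed non-cases]]
strata = [
    np.array([[5, 95], [10, 190]]),    # confounder stratum 1
    np.array([[50, 50], [25, 25]]),    # confounder stratum 2
]

pooled = sum(strata)
crude_or = (pooled[0, 0] * pooled[1, 1]) / (pooled[0, 1] * pooled[1, 0])

num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata)
den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata)
mh_or = num / den

print("crude OR:   ", round(crude_or, 2))
print("adjusted OR:", round(mh_or, 2))
# A large discrepancy between the two estimates indicates confounding.
```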

  3. Characterizing Tityus discrepans scorpion venom from a fractal perspective: Venom complexity, effects of captivity, sexual dimorphism, differences among species.

    PubMed

    D'Suze, Gina; Sandoval, Moisés; Sevcik, Carlos

    2015-12-15

    A characteristic of venom elution patterns, shared with many other complex systems, is that many of their features cannot be properly described with statistical or Euclidean concepts. The understanding of such systems became possible with Mandelbrot's fractal analysis. Venom elution patterns were produced using reversed-phase high-performance liquid chromatography (HPLC) with 1 mg of venom. One reason for the lack of quantitative analyses of the sources of venom variability is the difficulty of parametrizing the complexity of venom chromatograms. We quantify this complexity by means of an algorithm that estimates the contortedness (Q) of a waveform. Fractal analysis was used to compare venoms and to measure inter- and intra-specific venom variability. We studied variations in venom complexity derived from gender, seasonal and environmental factors, duration of captivity in the laboratory, and the technique used to milk venom. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Testing Hypotheses about Sun-Climate Complexity Linking

    NASA Astrophysics Data System (ADS)

    Rypdal, M.; Rypdal, K.

    2010-03-01

    We reexamine observational evidence presented in support of the hypothesis of a sun-climate complexity linking by N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), which contended that the integrated solar flare index (SFI) and the global temperature anomaly (GTA) both follow Lévy walk statistics with the same waiting-time exponent μ≈2.1. However, their analysis does not account for trends in the signal, cannot deal correctly with infinite variance processes (Lévy flights), and suffers from considering only the second moment. Our analysis shows that, properly detrended, the integrated SFI is well described as a Lévy flight, and the integrated GTA as a persistent fractional Brownian motion. These very different stochastic properties of the solar and climate records do not support the hypothesis of a sun-climate complexity linking.

  5. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use by providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order not to produce erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology used to achieve factor derivation, global fit evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
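
    A minimal sketch of an exploratory factor analysis workflow (standardize, fit, inspect loadings) using scikit-learn's FactorAnalysis on simulated two-factor data. This illustrates the general steps only, not the review's methodology; the varimax rotation option requires a recent scikit-learn version.

```python
# Sketch only: EFA on simulated data generated from two latent factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 500
f1, f2 = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
])

Xz = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(Xz)
print(np.round(fa.components_.T, 2))   # items x factors loading matrix
```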

  6. The polar cusp from a particle point of view: A statistical study based on Viking data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aparicio, B.; Thelin, B.; Lundin, R.

    1991-08-01

    The authors present results from the particle measurements made on board the Viking satellite. For the period of interest the Viking orbits covered, at high latitudes, the whole dayside sector. Data from the Viking V-3 particle experiment acquired during the Polar Region Outer Magnetospheric International Study period have been used to study the extension of the cusp and cleft in magnetic local time and invariant latitude and, furthermore, their dependence on solar wind and interplanetary magnetic field parameters. The study is limited to the MLT range from 0900 to 1500 and to invariant latitudes (ILAT) from 74° to 82°. This region is divided into bins. The authors concentrated on the region where magnetosheath solar wind plasma penetrates more directly into the magnetosphere and is measured at Viking altitudes. This region is called the cusp proper, to be distinguished from a broader region denoted the cleft, where more energetic particles are observed. Statistically, they find the cusp proper to extend from invariant latitudes of 75° to 82° and magnetic local times from 0930 to 1400 MLT. The width in ILAT is found to be on average ≈2° and in MLT ≈2 hours. It is shown that a clear correlation exists between the densities in the cusp proper calculated from the Viking V-3 experiment and those in the solar wind calculated from IMP 8 measurements. It is also shown that the position of the cusp proper in MLT depends on the sense of the By component of the interplanetary magnetic field (IMF By), giving a well-defined displacement of the region of maximum occurrence toward earlier MLTs for IMF By < 0 and a less defined displacement toward later MLTs for IMF By > 0.

  7. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    PubMed

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
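
    A small numerical sketch of the multiple-comparisons arithmetic at issue: the expected number of false positives among k nominal-level tests, the probability of at least one, and a Bonferroni-style adjusted threshold. It is purely illustrative and does not reproduce the NTP's actual decision rules.

```python
# Sketch only: false-positive arithmetic for k independent tests at level alpha.
k, alpha = 4800, 0.05
expected_false_pos = k * alpha
prob_at_least_one = 1 - (1 - alpha) ** k
bonferroni_alpha = alpha / k

print("expected false positives:", expected_false_pos)           # 240.0
print("P(at least one false positive):", round(prob_at_least_one, 6))
print("Bonferroni per-test threshold:", bonferroni_alpha)         # ~1.04e-05
# A p-value on the order of 1e-13 remains far below even this adjusted threshold.
```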

  8. Proper interpretation of chronic toxicity studies and their statistics: A critique of “Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example”

    PubMed Central

    Kissling, Grace E.; Haseman, Joseph K.; Zeiger, Errol

    2014-01-01

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP’s statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800 × 0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP’s decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus’ conclusion that such obvious responses merely “generate a hypothesis” rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. PMID:25261588

  9. A quantitative analysis of statistical power identifies obesity end points for improved in vivo preclinical study design.

    PubMed

    Selimkhanov, J; Thompson, W C; Guo, J; Hall, K D; Musante, C J

    2017-08-01

    The design of well-powered in vivo preclinical studies is a key element in building the knowledge of disease physiology for the purpose of identifying and effectively testing potential antiobesity drug targets. However, as a result of the complexity of the obese phenotype, there is limited understanding of the variability within and between study animals of macroscopic end points such as food intake and body composition. This, combined with limitations inherent in the measurement of certain end points, presents challenges to study design that can have significant consequences for an antiobesity program. Here, we analyze a large, longitudinal study of mouse food intake and body composition during diet perturbation to quantify the variability and interaction of the key metabolic end points. To demonstrate how conclusions can change as a function of study size, we show that a simulated preclinical study properly powered for one end point may lead to false conclusions based on secondary end points. We then propose the guidelines for end point selection and study size estimation under different conditions to facilitate proper power calculation for a more successful in vivo study design.
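
    A minimal sketch of a power-versus-group-size calculation for a single end point using a two-sample t-test in statsmodels. The effect size and standard deviation are assumed placeholders, not estimates from the study.

```python
# Sketch only: power as a function of animals per group for one end point.
from statsmodels.stats.power import TTestIndPower

effect = 1.5      # assumed mean group difference in the end point's units
sd = 2.0          # assumed between-animal standard deviation
d = effect / sd   # standardized effect size

solver = TTestIndPower()
for n in (5, 8, 12, 16, 24):
    p = solver.power(effect_size=d, nobs1=n, alpha=0.05, ratio=1.0)
    print(f"n per group = {n:2d}  power = {p:.2f}")
```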

  10. Statistical Analysis of the Fractal Gating Motions of the Enzyme Acetylcholinesterase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, T Y.; Tai, Kaihsu; Mccammon, Andy

    The enzyme acetylcholinesterase has an active site that is accessible only by a gorge or main channel from the surface, and perhaps by secondary channels such as the back door. Molecular-dynamics simulations show that these channels are too narrow most of the time to admit substrate or other small molecules. Binding of substrates is therefore gated by structural fluctuations of the enzyme. Here, we analyze the fluctuations of these possible channels, as observed in the 10.8-ns trajectory of the simulation. The probability density function of the gorge proper radius (defined in the text) was calculated. A double-peak feature of the function was discovered and therefore two states with a threshold were identified. The relaxation (transition probability) functions of these two states were also calculated. The results revealed a power-law decay trend and an oscillation around it, which show properties of fractal dynamics with a complex exponent. The cross correlation of potential energy versus proper radius was also investigated. We discuss possible physical models behind the fractal protein dynamics; the dynamic hierarchical model for glassy systems is evaluated in detail.

  11. Network model of bilateral power markets based on complex networks

    NASA Astrophysics Data System (ADS)

    Wu, Yang; Liu, Junyong; Li, Furong; Yan, Zhanxin; Zhang, Li

    2014-06-01

    The bilateral power transaction (BPT) mode has become a typical market organization with the restructuring of the electric power industry, and a proper model that captures its characteristics is urgently needed. However, such a model has been lacking because of this market organization's complexity. As a promising approach to modeling complex systems, complex networks can provide a sound theoretical framework for developing a proper simulation model. In this paper, a complex network model of the BPT market is proposed. In this model, a price advantage mechanism is a precondition. Unlike general commodity transactions, both the financial layer and the physical layer are considered in the model. Through simulation analysis, the feasibility and validity of the model are verified. At the same time, some typical statistical features of the BPT network are identified: the degree distribution follows a power law, the clustering coefficient is low, and the average path length is relatively long. Moreover, the topological stability of the BPT network is tested. The results show that the network displays topological robustness to random market member failures while it is fragile against deliberate attacks, and the network can resist cascading failure to some extent. These features are helpful for decision making and risk management in BPT markets.
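
    A minimal sketch of computing the kinds of network statistics reported above (degree distribution, clustering coefficient, average path length) with networkx, on a stand-in scale-free graph rather than the actual BPT market model.

```python
# Sketch only: basic complex-network statistics on a placeholder scale-free graph.
import collections
import networkx as nx

G = nx.barabasi_albert_graph(n=500, m=2, seed=0)   # heavy-tailed degree graph

degree_counts = collections.Counter(d for _, d in G.degree())
print("clustering coefficient:", round(nx.average_clustering(G), 3))
print("average path length:  ", round(nx.average_shortest_path_length(G), 3))
print("degree -> count (lowest degrees):", sorted(degree_counts.items())[:5])
```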

  12. Universal Algorithm for Identification of Fractional Brownian Motion. A Case of Telomere Subdiffusion

    PubMed Central

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-01-01

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic—mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912

  13. Brain MRI analysis for Alzheimer's disease diagnosis using an ensemble system of deep convolutional neural networks.

    PubMed

    Islam, Jyoti; Zhang, Yanqing

    2018-05-31

    Alzheimer's disease is an incurable, progressive neurological brain disorder. Earlier detection of Alzheimer's disease can help with proper treatment and prevent brain tissue damage. Several statistical and machine learning models have been exploited by researchers for Alzheimer's disease diagnosis. Analyzing magnetic resonance imaging (MRI) is a common practice for Alzheimer's disease diagnosis in clinical research. Detection of Alzheimer's disease is exacting due to the similarity in Alzheimer's disease MRI data and standard healthy MRI data of older people. Recently, advanced deep learning techniques have successfully demonstrated human-level performance in numerous fields including medical image analysis. We propose a deep convolutional neural network for Alzheimer's disease diagnosis using brain MRI data analysis. While most of the existing approaches perform binary classification, our model can identify different stages of Alzheimer's disease and obtains superior performance for early-stage diagnosis. We conducted ample experiments to demonstrate that our proposed model outperformed comparative baselines on the Open Access Series of Imaging Studies dataset.

  14. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  15. Determining the accuracy of maximum likelihood parameter estimates with colored residuals

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Klein, Vladislav

    1994-01-01

    An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.

  16. AstroML: Python-powered Machine Learning for Astronomy

    NASA Astrophysics Data System (ADS)

    Vander Plas, Jake; Connolly, A. J.; Ivezic, Z.

    2014-01-01

    As astronomical data sets grow in size and complexity, automated machine learning and data mining methods are becoming an increasingly fundamental component of research in the field. The astroML project (http://astroML.org) provides a common repository for practical examples of the data mining and machine learning tools used and developed by astronomical researchers, written in Python. The astroML module contains a host of general-purpose data analysis and machine learning routines, loaders for openly-available astronomical datasets, and fast implementations of specific computational methods often used in astronomy and astrophysics. The associated website features hundreds of examples of these routines being used for analysis of real astronomical datasets, while the associated textbook provides a curriculum resource for graduate-level courses focusing on practical statistics, machine learning, and data mining approaches within Astronomical research. This poster will highlight several of the more powerful and unique examples of analysis performed with astroML, all of which can be reproduced in their entirety on any computer with the proper packages installed.

  17. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
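
    As an illustration of the "rolling-window" analytic scope mentioned above, the sketch below maintains a streaming mean and standard deviation over the most recent N items; the window size and the choice of statistics are assumptions for illustration, not part of the reference architecture.

```python
# A minimal sketch of rolling-window statistics over a stream of numeric values.
from collections import deque
import statistics

class RollingStats:
    def __init__(self, window: int):
        self.buf = deque(maxlen=window)          # keeps only the most recent `window` values

    def update(self, value: float):
        """Add one incoming value and return (mean, stdev) over the current window."""
        self.buf.append(value)
        mean = statistics.fmean(self.buf)
        std = statistics.pstdev(self.buf) if len(self.buf) > 1 else 0.0
        return mean, std

rs = RollingStats(window=100)
for i, v in enumerate([3.0, 4.5, 2.2, 5.1]):     # stand-in for an incoming feed
    print(i, rs.update(v))
```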

  18. Recommendations for research design of telehealth studies.

    PubMed

    Chumbler, Neale R; Kobb, Rita; Brennan, David M; Rabinowitz, Terry

    2008-11-01

    Properly designed randomized controlled trials (RCTs) are the gold standard to use when examining the effectiveness of telehealth interventions on clinical outcomes. Some published telehealth studies have employed well-designed RCTs. However, such methods are not always feasible and practical in particular settings. This white paper addresses not only the need for properly designed RCTs, but also offers alternative research designs, such as quasi-experimental designs, and statistical techniques that can be employed to rigorously assess the effectiveness of telehealth studies. This paper further offers design and measurement recommendations aimed at and relevant to administrative decision-makers, policymakers, and practicing clinicians.

  19. Selecting statistical model and optimum maintenance policy: a case study of hydraulic pump.

    PubMed

    Ruhi, S; Karim, M R

    2016-01-01

    A proper maintenance policy can play a vital role in the effective investigation of product reliability. Every engineered object such as a product, plant or infrastructure needs preventive and corrective maintenance. In this paper we look at a real case study. It deals with the maintenance of hydraulic pumps used in excavators by a mining company. We obtain the data that the owner had collected and carry out analysis and model building for pump failures. The data consist of both failure and censored lifetimes of the hydraulic pump. Different competitive mixture models are applied to analyze a set of maintenance data of a hydraulic pump. Various characteristics of the mixture models, such as the cumulative distribution function, reliability function, mean time to failure, etc. are estimated to assess the reliability of the pump. The Akaike Information Criterion, adjusted Anderson-Darling test statistic, Kolmogorov-Smirnov test statistic and root mean square error are considered to select the suitable models among a set of competitive models. The maximum likelihood estimation method via the EM algorithm is applied mainly for estimating the parameters of the models and reliability related quantities. In this study, it is found that a threefold mixture model (Weibull-Normal-Exponential) fits the hydraulic pump failure data set well. This paper also illustrates how a suitable statistical model can be applied to estimate the optimum maintenance period at a minimum cost of a hydraulic pump.
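
    The sketch below illustrates the general idea of AIC-based model selection among candidate lifetime distributions; it assumes complete (uncensored) data and single-component fits, so it does not reproduce the paper's threefold mixture, censoring handling, or EM estimation.

```python
# A minimal sketch of AIC-based selection among candidate lifetime distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lifetimes = stats.weibull_min(c=1.8, scale=1200).rvs(200, random_state=rng)   # synthetic data

candidates = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
}
for name, dist in candidates.items():
    params = dist.fit(lifetimes, floc=0)          # keep the location fixed at 0
    loglik = np.sum(dist.logpdf(lifetimes, *params))
    k = len(params) - 1                           # the fixed location is not an estimated parameter
    aic = 2 * k - 2 * loglik                      # lower AIC = preferred model
    print(f"{name:12s} AIC = {aic:.1f}")
```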

  20. The Influence of 16-year-old Students' Gender, Mental Abilities, and Motivation on their Reading and Drawing Submicrorepresentations Achievements

    NASA Astrophysics Data System (ADS)

    Devetak, Iztok; Aleksij Glažar, Saša

    2010-08-01

    Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing of SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: a test of Chemical Knowledge, the Test of Logical Thinking, two tests of visualization abilities (Patterns and Rotations), and a questionnaire on Intrinsic Motivation for Learning Science. The results show moderate but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities, and chemical knowledge at the submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can also be concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and content analysis of the sample problems, several educational strategies can be implemented for students to develop adequate mental models of chemical concepts on all three levels of representation.

  1. The Need for Speed in Rodent Locomotion Analyses

    PubMed Central

    Batka, Richard J.; Brown, Todd J.; Mcmillan, Kathryn P.; Meadows, Rena M.; Jones, Kathryn J.; Haulcomb, Melissa M.

    2016-01-01

    Locomotion analysis is now widely used across many animal species to understand the motor defects in disease, functional recovery following neural injury, and the effectiveness of various treatments. More recently, rodent locomotion analysis has become an increasingly popular method in a diverse range of research. Speed is an inseparable aspect of locomotion that is still not fully understood, and its effects are often not properly incorporated while analyzing data. In this hybrid manuscript, we accomplish three things: (1) review the interaction between speed and locomotion variables in rodent studies, (2) comprehensively analyze the relationship between speed and 162 locomotion variables in a group of 16 wild-type mice using the CatWalk gait analysis system, and (3) develop and test a statistical method in which locomotion variables are analyzed and reported in the context of speed. Notable results include the following: (1) over 90% of variables, reported by CatWalk, were dependent on speed with an average R2 value of 0.624, (2) most variables were related to speed in a nonlinear manner, (3) current methods of controlling for speed are insufficient, and (4) the linear mixed model is an appropriate and effective statistical method for locomotion analyses that is inclusive of speed-dependent relationships. Given the pervasive dependency of locomotion variables on speed, we maintain that valid conclusions from locomotion analyses cannot be made unless they are analyzed and reported within the context of speed. PMID:24890845
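
    A minimal sketch of the speed-inclusive analysis advocated above: a linear mixed model with speed as a fixed effect and animal as a random intercept, using statsmodels; the column names and simulated data are hypothetical stand-ins for a CatWalk export.

```python
# A minimal sketch of a linear mixed model with speed as a covariate and mouse as a grouping factor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_mice, n_runs = 16, 20
df = pd.DataFrame({
    "mouse": np.repeat(np.arange(n_mice), n_runs),
    "speed": rng.uniform(10, 40, n_mice * n_runs),          # cm/s, synthetic
})
df["stride_length"] = 3.0 + 0.05 * df["speed"] + rng.normal(0, 0.3, len(df))

# Fixed effect of speed, random intercept per mouse.
model = smf.mixedlm("stride_length ~ speed", df, groups=df["mouse"]).fit()
print(model.summary())
```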

  2. Approximate Model of Zone Sedimentation

    NASA Astrophysics Data System (ADS)

    Dzianik, František

    2011-12-01

    The process of zone sedimentation is affected by many factors that cannot be expressed analytically. For this reason, zone settling is evaluated in practice experimentally or by applying an empirical mathematical description of the process. The paper presents the development of an approximate model of zone settling, i.e. a general function which should properly approximate the behaviour of the settling process within its entire range and at various conditions. Furthermore, the specification of the model parameters by regression analysis of settling test results is shown. The suitability of the model is reviewed by graphical dependencies and by statistical coefficients of correlation. The approximate model could also be useful for simplifying the process design of continual settling tanks and thickeners.

  3. The Zombie Plot: A Simple Graphic Method for Visualizing the Efficacy of a Diagnostic Test.

    PubMed

    Richardson, Michael L

    2016-08-09

    One of the most important jobs of a radiologist is to pick the most appropriate imaging test for a particular clinical situation. Making a proper selection sometimes requires statistical analysis. The objective of this article is to introduce a simple graphic technique, an ROC plot that has been divided into zones of mostly bad imaging efficacy (ZOMBIE, hereafter referred to as the "zombie plot"), that transforms information about imaging efficacy from the numeric domain into the visual domain. The numeric rationale for the use of zombie plots is given, as are several examples of the clinical use of these plots. Two online calculators are described that simplify the process of producing a zombie plot.
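
    The sketch below shows the general mechanics of drawing an ROC curve and shading a low-efficacy region; the zone boundaries of the actual zombie plot are the author's and are not reproduced here, so the shaded below-chance area is purely illustrative.

```python
# A minimal sketch of an ROC plot with an illustrative shaded low-efficacy zone.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
truth = rng.integers(0, 2, 500)                       # synthetic disease status
score = truth * 1.0 + rng.normal(0, 1.2, 500)         # a weak synthetic diagnostic score

fpr, tpr, _ = roc_curve(truth, score)
plt.plot(fpr, tpr, label="diagnostic test")
plt.plot([0, 1], [0, 1], "k--", lw=0.8)               # chance diagonal
plt.fill_between([0, 1], [0, 1], 0, alpha=0.2, label="below-chance region (illustrative)")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()
```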

  4. GIS based solid waste management information system for Nagpur, India.

    PubMed

    Vijay, Ritesh; Jain, Preeti; Sharma, N; Bhattacharyya, J K; Vaidya, A N; Sohony, R A

    2013-01-01

    Solid waste management is one of the major problems of today's world and needs to be addressed by proper utilization of technologies and the design of an effective, flexible and structured information system. Therefore, the objective of this paper was to design and develop a GIS based solid waste management information system as a decision making and planning tool for regulatory and municipal authorities. The system integrates geo-spatial features of the city and a database of the existing solid waste management. The GIS based information system provides modules for visualization, query interface, statistical analysis, report generation and database modification. It also provides modules covering solid waste estimation, collection, transportation and disposal details. The information system is user-friendly, standalone and platform independent.

  5. The “Task B problem” and other considerations in developmental functional neuroimaging

    PubMed Central

    Church, Jessica A.; Petersen, Steven E.; Schlaggar, Bradley L.

    2012-01-01

    Functional neuroimaging provides a remarkable tool to allow us to study cognition across the lifespan and in special populations in a safe way. However, experimenters face a number of methodological issues, and these issues are particularly pertinent when imaging children. This brief article discusses assessing task performance, strategies for dealing with group performance differences, controlling for movement, statistical power, proper atlas registration, and data analysis strategies. In addition, there will be discussion of two other topics that have important implications for interpreting fMRI data: the question of whether functional neuroanatomical differences between adults and children are the consequence of putative developmental neurovascular differences, and the issue of interpreting negative blood oxygenation-level dependent (BOLD) signal change. PMID:20496376

  6. Cellular imaging using temporally flickering nanoparticles.

    PubMed

    Ilovitsh, Tali; Danan, Yossef; Meir, Rinat; Meiri, Amihai; Zalevsky, Zeev

    2015-02-04

    Utilizing the surface plasmon resonance effect in gold nanoparticles enables their use as contrast agents in a variety of applications for compound cellular imaging. However, most techniques suffer from poor signal to noise ratio (SNR) statistics due to the high shot noise associated with low photon counts, in addition to high background noise. We demonstrate an effective way to improve the SNR, in particular when the inspected signal is indistinguishable in the given noisy environment. We excite temporal flickering of the light scattered from gold nanoparticles that label a biological sample. By performing temporal spectral analysis of the received spatial image and by inspecting the proper spectral component corresponding to the modulation frequency, we separate the signal from the broadband spectral noise (lock-in amplification).
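
    The sketch below illustrates the spectral side of this idea: a weak signal modulated at a known frequency is recovered by inspecting the corresponding bin of the Fourier spectrum; the sampling rate, modulation frequency, and noise level are illustrative assumptions, not the experimental values.

```python
# A minimal sketch of recovering a weak, periodically modulated signal buried in noise
# by reading out the spectral component at the known modulation frequency.
import numpy as np

fs, f_mod, n = 1024.0, 32.0, 4096                      # sample rate [Hz], modulation [Hz], samples
t = np.arange(n) / fs
rng = np.random.default_rng(4)
signal = 0.1 * np.sin(2 * np.pi * f_mod * t)           # weak flickering component
noisy = signal + rng.normal(0, 1.0, n)                 # buried in broadband noise

spectrum = np.abs(np.fft.rfft(noisy)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)
k = np.argmin(np.abs(freqs - f_mod))                   # bin at the modulation frequency
print(f"amplitude at {freqs[k]:.1f} Hz: {2 * spectrum[k]:.3f} (injected 0.1)")
```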

  7. Communicating natural hazards. The case of marine extreme events and the importance of the forecast's errors.

    NASA Astrophysics Data System (ADS)

    Marone, Eduardo; Camargo, Ricardo

    2013-04-01

    Scientific knowledge has to fulfill some necessary conditions. Among them, it has to be properly communicated. Usually, scientists (mis)understand that the communication requirement is satisfied by publishing their results in peer reviewed journals. Society demands information in other formats or languages, and other tools and approaches have to be used; otherwise the scientific discoveries will not fulfill their social purpose. However, scientists are not so well trained to do so. These facts are particularly relevant when the scientific work has to deal with natural hazards, which do not affect just a lab or a computer experiment, but the life and fate of human beings. We are currently working with marine extreme events related to sea level changes, waves and other coastal hazards. Primarily, the work is developed in the classic scientific format, focusing not only on the stochastic prediction of such extreme events but also on estimating the potential errors that the forecasting methodologies intrinsically have. The scientific results are translated into a friendlier format required by stakeholders (who are financing part of the work). Finally, we hope to produce a document prepared for the general public. Each of the targets has its own characteristics, and we have to use the proper communication tools and languages. Also, when communicating such knowledge, we have to consider that stakeholders and the general public have no obligation to understand the scientific language, but scientists have the responsibility of translating their discoveries and predictions in a proper way. The information on coastal hazards is analyzed in statistical and numerical ways, starting from long term observations of, for instance, sea level. From the analysis it is possible to recognize different natural regimes and to present the return times of extreme events, while from the numerical models, properly tuned to reproduce the same past ocean behavior using hindcast approaches, it is possible to produce short and long term forecasts. While the statistics of extremes are useful for many stakeholders, short term forecasts could be of importance for the whole society. Whatever the case, the prediction errors have to be emphasized even more than the forecasts. The most common forecast in terms of general public understanding is the weather prediction. Nowadays, the general public knows it well enough to deal properly with the uncertainties, because after so many years of imperfect forecasts, society knows the limits. Other coastal hazards deserve to be presented more carefully, and some successful examples of the use of the precautionary principle can be observed, for instance, in the Pacific Tsunami alert system. Nowadays, the preparedness of the coastal population is good enough (even in such a big and diverse area) that people do not resent running up the hill, most of the time unnecessarily, because they know the uncertainty and accept it. The key issue we scientists have to work on better, at every level, is the need to properly estimate and communicate the uncertainties of our results, because they are neither obvious nor irrelevant.

  8. Va-Room: Motorcycle Safety.

    ERIC Educational Resources Information Center

    Keller, Rosanne

    One of a series of instructional materials produced by the Literacy Council of Alaska, this booklet provides information about motorcycle safety. Using a simplified vocabulary and shorter sentences, it offers statistics concerning motorcycle accidents; information on how to choose the proper machine; basic information about the operation of the…

  9. A Proper Perspective on the Twin Deficits

    DTIC Science & Technology

    1989-05-01

    deficit twins, the relation between them, and their consanguine parentage. The trade deficit or, to be more accurate, the current account deficit, is...In general, there is a small negative, but statistically significant, relationship between the size of the federal deficit in one year and the

  10. 77 FR 65358 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-26

    ... National Agricultural Statistics Service (NASS) primary function of collecting, processing, and...) Whether the collection of information is necessary for the proper performance of the functions of the...'' program (non-immigrants who enter the United States for temporary or seasonal agricultural labor) and for...

  11. INTEGRATION OF STATISTICS, REMOTE SENSING AND EXISTING DATA TO LOCATE CHANGES IN LAND RESOURCES

    EPA Science Inventory

    Stability of a nation is dependent on the availability of natural resources. When land is degraded and natural resources become limited, socioeconomic status declines and emigration increases in developing countries. Natural resource utilization without proper management may re...

  12. Civil construction work: The unseen contributor to the occupational and global disease burden

    PubMed Central

    Sitalakshmi, R.; Saikumar, P.; Jeyachandran, P.; Manoharan; Thangavel; Thomas, Jayakar

    2016-01-01

    Background: The construction industry is the second largest employment-providing industry in India, with many semi-skilled or unskilled workers taking up the occupation for livelihood without any training or proper guidance. Aim: To evaluate the pathogenic association of cement exposure with occupational contact dermatoses as evidenced by immune markers and to correlate workers' pulmonary functions with years of exposure to cement. Setting and Design: This was a cross-sectional study conducted among randomly selected cement workers. Methods and Material: Evaluation of socioeconomic status (SES) and years of exposure of cement workers was done using a questionnaire. Clinical examination of skin lesions and a strip patch test with application of potassium dichromate on unexposed skin were performed. Results were interpreted after 48 hours. Absolute eosinophil count (AEC) and IgE levels were measured, and spirometric evaluation was performed. Statistical Analysis: Analysis of variance and Pearson's correlation test were used for data analysis. P < 0.05 was considered statistically significant. Results: Clinically, skin lesions were noticed in 51%, elevated AEC in 47%, and raised anti-IgE in 73%. Two participants developed positive reactions to the skin strip patch test. Duration of exposure to cement and SES were compared with clinical skin lesions. Spirometry results were normal in 81%, showed obstruction in 8%, restriction in 10%, and a mixed pattern in 1%. Forced expiratory volume in 1 second, forced expiratory flow (25–75%), and peak expiratory flow rate (PEFR) were markedly reduced with years of exposure. Workers with more extensive skin lesions and longer exposure had increased AEC and IgE levels, although the differences were not statistically significant. Conclusions: Exposure to cement and poor SES are strongly correlated with increased prevalence of skin lesions and reduced pulmonary functions. PMID:28194084

  13. [Oral hygiene habits among tobacco-smoking and non-smoking students of the Medical University of Lublin--chosen aspects].

    PubMed

    Nakonieczna-Rudnicka, Marta; Bachanek, Teresa; Strycharz-Dudziak, Małgorzata; Kobyłecka, Elzbieta

    2010-01-01

    Among the several etiologic factors for dental caries and periodontal diseases is dental plaque, which forms on tooth surfaces and prosthetic appliances. Elimination of dental plaque by proper oral hygiene procedures is crucial in caries and periodontal disease prevention. The aim of the study was evaluation of the prevalence of tobacco smoking among dental students of the Medical University of Lublin and a comparative analysis of oral hygiene habits among smoking and non-smoking students. A questionnaire survey was carried out among 112 students of the Medical University of Lublin during the second, third, fourth and fifth year of their studies. The students were 20-28 years of age. The questions concerned the cigarette smoking habit and the ways of maintaining oral hygiene. Respondents were divided into smoking and non-smoking groups. The obtained results were subjected to statistical analysis. Cigarette smoking was reported by 16.67% of the surveyed students. No significant differences between smoking and non-smoking students were found in the frequency of brushing, changing the toothbrush, density of toothbrush filaments, use of manual and power toothbrushes, use of whitening toothpastes, or frequency of using dental floss and toothpicks. A statistically significant difference was noted in the gum-chewing habit: smoking students chewed gum more frequently (83.33%) than non-smoking students (40%). Significant differences also occurred in the frequency of professional removal of dental deposits. Calculus removal performed twice a year was reported by 50% of smoking students, compared with 17.8% of non-smoking students. 37.78% of non-smoking students declared professional teeth cleaning performed more often than twice a year, compared with 11.11% of respondents from the smokers group (p < 0.05).

  14. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
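
    A minimal sketch of the approach described above: approximating colored clock noise by a sum of first-order Gauss-Markov processes; the correlation times and strengths are illustrative and are not derived from any particular oscillator's Allan variance.

```python
# A minimal sketch: sum of five first-order Gauss-Markov processes as a colored-noise model.
import numpy as np

def gauss_markov(tau, sigma, dt, n, rng):
    """Discrete first-order Gauss-Markov process with correlation time tau and stationary SD sigma."""
    phi = np.exp(-dt / tau)
    q = sigma * np.sqrt(1 - phi**2)          # keeps the stationary variance at sigma**2
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + q * rng.normal()
    return x

rng = np.random.default_rng(5)
dt, n = 1.0, 10_000
taus = [10.0, 100.0, 1000.0, 5000.0, 20000.0]   # five processes, mirroring the paper's count
sigmas = [1.0, 0.5, 0.3, 0.2, 0.1]              # illustrative strengths
clock_error = sum(gauss_markov(tau, s, dt, n, rng) for tau, s in zip(taus, sigmas))
print(clock_error[:5])
```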

  15. [Cotinine concentration in the saliva in relation to oral hygiene procedures].

    PubMed

    Bachanek, Teresa; Nakonieczna-Rudnicka, Marta; Piekarczyk, Wanda

    2014-01-01

    Cotinine is a biomarker of exposure to tobacco smoke, a nicotine metabolite with a half-life in saliva of 17 hours. Assessment of cotinine concentration enables, among other things, verification of questionnaire data as well as evaluation of both smokers' and non-smokers' exposure to tobacco smoke. Practicing proper oral hygiene procedures is an essential element of the prophylaxis of dental caries and periodontal diseases, which influence the general health state. The removal of dental calculus is achieved by proper tooth brushing and the use of additional oral aids. The aim of the study was evaluation of cotinine concentration in non-stimulated saliva in order to verify questionnaire data (smoker/non-smoker) and analysis of oral hygiene practices in relation to cigarette smoking status. Questionnaire and biochemical studies were conducted in a group of 116 people aged 20-54. In the questionnaire survey 53 people (45.69%) confirmed cigarette smoking, while 63 (54.31%) declared they had never smoked and never tried to smoke. Non-stimulated saliva was collected between 9:30 and 11:30, 1.5-2 hours after a meal. Cotinine concentration was assayed with the use of Cotinine ELISA (Calbiotech, USA). The obtained results were subjected to statistical analysis with the chi-squared test; values with p<0.05 were considered statistically significant. In the study group the mean cotinine concentration was 155.76 ng/ml. Brushing teeth once a day or less frequently was reported by 26.92% of smokers and 4.76% of non-smokers; brushing at least twice a day was reported by 73.08% and 95.24%, respectively. Non-smokers, in comparison with smokers, considerably more frequently brushed their teeth at least twice a day (X2=11.11, p<0.001). Smokers more often used a toothbrush with medium-hardness bristles (X2=6.05, p<0.05) as well as toothpicks to maintain hygiene of interdental spaces and tooth contact surfaces (X2=21.34, p<0.001), whereas they used dental floss less frequently (X2=10.64, p<0.01). Smokers more frequently brushed their teeth improperly (X2=13.41, p<0.001). Smokers, in comparison with non-smokers, did not practice proper oral hygiene, which is an essential risk factor for oral health. It is crucial for dental surgeons to provide oral hygiene instruction to smokers and to raise awareness of the health threats resulting from cigarette smoking.

  16. Characterization of interfade duration for satellite communication systems design and optimization in a temperate climate

    NASA Astrophysics Data System (ADS)

    Jorge, Flávio; Riva, Carlo; Rocha, Armando

    2016-03-01

    The characterization of fade dynamics on Earth-satellite links is an important subject when designing the so-called fade mitigation techniques that contribute to the proper reliability of satellite communication systems and the customers' quality of service (QoS). The interfade duration, defined as the period between two consecutive fade events, has so far been analyzed only poorly, using limited data sets; its complete characterization, however, would enable the design and optimization of satellite communication systems by estimating the requirements for the system to recover in time before the next propagation impairment. Depending on this analysis, several actions can be taken to ensure service maintenance. In this paper we present for the first time a detailed and comprehensive analysis of the statistical properties of interfade events, based on 9 years of in-excess attenuation measurements at Ka band (19.7 GHz) with the very high availability that is required to build a reliable data set, mainly for the longer interfade duration events. The number of years necessary to reach statistical stability of the interfade duration is also evaluated for the first time, providing a reference when assessing the relevance of results published in the past. The study is carried out in Aveiro, Portugal, which has a temperate Mediterranean climate with oceanic influences.

  17. Spatial and temporal variation of water quality of a segment of Marikina River using multivariate statistical methods.

    PubMed

    Chounlamany, Vanseng; Tanchuling, Maria Antonia; Inoue, Takanobu

    2017-09-01

    Payatas landfill in Quezon City, Philippines, releases leachate to the Marikina River through a creek. Multivariate statistical techniques were applied to study temporal and spatial variations in water quality of a segment of the Marikina River. The data set included 12 physico-chemical parameters for five monitoring stations over a year. Cluster analysis grouped the monitoring stations into four clusters and identified January-May as dry season and June-September as wet season. Principal components analysis showed that three latent factors are responsible for the data set, explaining 83% of its total variance. The chemical oxygen demand, biochemical oxygen demand, total dissolved solids, Cl⁻ and PO₄³⁻ are influenced by anthropogenic impact/eutrophication pollution from point sources. Total suspended solids, turbidity and SO₄²⁻ are influenced by rain and soil erosion. The highest state of pollution is at the Payatas creek outfall from March to May, whereas at downstream stations it is in May. The current study indicates that the river monitoring requires only four stations, nine water quality parameters and testing over three specific months of the year. The findings of this study imply that Payatas landfill requires a proper leachate collection and treatment system to reduce its impact on the Marikina River.
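
    The sketch below reproduces the generic multivariate workflow (standardize, extract principal components, cluster); the random matrix merely stands in for the 12-parameter, five-station monitoring data, and the choices of three components and four clusters simply mirror the numbers quoted above.

```python
# A minimal sketch of a standardize -> PCA -> hierarchical clustering workflow.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
X = pd.DataFrame(rng.normal(size=(60, 12)),
                 columns=[f"param_{i}" for i in range(12)])   # stand-in for station-month samples

Z = StandardScaler().fit_transform(X)                         # z-score each parameter
pca = PCA(n_components=3).fit(Z)
print("variance explained by 3 PCs:", pca.explained_variance_ratio_.sum().round(2))

clusters = fcluster(linkage(Z, method="ward"), t=4, criterion="maxclust")
print("cluster labels:", clusters[:10])
```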

  18. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models, uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs, or from rapidly changing forcing that can best be described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular, in this paper Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
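
    A minimal sketch of generating random variations of a nominal positive-definite parameter matrix with a Wishart distribution, in the spirit of the random-matrix approach described above; the 2x2 nominal matrix and the degrees of freedom are illustrative assumptions.

```python
# A minimal sketch of Wishart-distributed perturbations of a nominal parameter matrix.
import numpy as np
from scipy.stats import wishart

nominal = np.array([[4.0, 0.5],
                    [0.5, 1.0]])        # an illustrative 2x2 positive-definite parameter matrix
dof = 50                                # larger dof -> samples closer to the nominal matrix

# Scale chosen so that the Wishart mean (dof * scale) equals the nominal matrix.
samples = wishart(df=dof, scale=nominal / dof).rvs(size=3, random_state=0)
for s in samples:
    print(np.round(s, 2))
```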

  19. Standardized residual as response function for order identification of multi input intervention analysis

    NASA Astrophysics Data System (ADS)

    Suhartono; Lee, Muhammad Hisyam; Rezeki, Sri

    2017-05-01

    Intervention analysis is a statistical model in the group of time series analysis methods which is widely used to describe the effect of an intervention caused by external or internal factors. An example of an external factor that often occurs in Indonesia is a disaster, whether natural or man-made. The main purpose of this paper is to provide the results of theoretical studies on the identification step for determining the order of a multi-input intervention analysis for evaluating the magnitude and duration of the impact of interventions on time series data. The theoretical results showed that the standardized residuals can be used properly as the response function for determining the order of a multi-input intervention model. These results are then applied to evaluate the impact of a disaster in a real case in Indonesia, i.e. the magnitude and duration of the impact of the Lapindo mud on the volume of vehicles on the highway. Moreover, the empirical results showed that the multi-input intervention model can accurately describe and explain the magnitude and duration of the impact of disasters on time series data.
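
    The sketch below shows a single-input intervention model in its usual form: an ARMA noise model plus a step regressor marking the event onset, fitted with statsmodels; the order (1, 0, 0) and the synthetic level shift are illustrative choices, not the paper's multi-input specification.

```python
# A minimal sketch of a single-input intervention (step) model fitted with SARIMAX.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n, t0 = 200, 120
step = (np.arange(n) >= t0).astype(float)                     # 0 before the event, 1 after
noise = sm.tsa.arma_generate_sample([1, -0.5], [1], n, scale=1.0)   # AR(1) noise, phi = 0.5
y = 10 + 3.0 * step + noise                                   # level shift of +3 at t0

fit = sm.tsa.SARIMAX(y, exog=step[:, None], order=(1, 0, 0), trend="c").fit(disp=False)
print(fit.params)      # the exogenous coefficient estimates the magnitude of the impact
print(fit.resid[:5])   # residuals would then be inspected when choosing the model order
```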

  20. CalFitter: a web server for analysis of protein thermal denaturation data.

    PubMed

    Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri

    2018-05-14

    Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.

  1. THE IMPORTANCE OF PROPER INTENSITY CALIBRATION FOR RAMAN ANALYSIS OF LOW-LEVEL ANALYTES IN WATER

    EPA Science Inventory

    Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...

  2. Antitumor Efficacy Testing in Rodents

    PubMed Central

    2008-01-01

    The preclinical research and human clinical trials necessary for developing anticancer therapeutics are costly. One contributor to these costs is preclinical rodent efficacy studies, which, in addition to the costs associated with conducting them, often guide the selection of agents for clinical development. If inappropriate or inaccurate recommendations are made on the basis of these preclinical studies, then additional costs are incurred. In this commentary, I discuss the issues associated with preclinical rodent efficacy studies. These include the identification of proper preclinical efficacy models, the selection of appropriate experimental endpoints, and the correct statistical evaluation of the resulting data. I also describe important experimental design considerations, such as selecting the drug vehicle, optimizing the therapeutic treatment plan, properly powering the experiment by defining appropriate numbers of replicates in each treatment arm, and proper randomization. Improved preclinical selection criteria can aid in reducing unnecessary human studies, thus reducing the overall costs of anticancer drug development. PMID:18957675

  3. Proper Motion of Components in 4C 39.25

    NASA Technical Reports Server (NTRS)

    Guirado, J. C.; Marcaide, J. M.; Alberdi, A.; Elosegui, P.; Ratner, M. I.; Shapiro, I. I.; Kilger, R.; Mantovani, F.; Venturi, T.; Rius, A.

    1995-01-01

    From a series of simultaneous 8.4 and 2.3 GHz VLBI observations of the quasar 4C 39.25 phase referenced to the radio source 0920+390, carried out in 1990-1992, we have measured the proper motion of component b in 4C 39.25: μ_α = 90 ± 43 μas/yr, μ_β = 7 ± 68 μas/yr, where the quoted uncertainties account for the contribution of the statistical standard deviation and the errors assumed for the parameters related to the geometry of the interferometric array, the atmosphere, and the source structure. This proper motion is consistent with earlier interpretations of VLBI hybrid mapping results, which showed an internal motion of this component with respect to other structural components. Our differential astrometry analyses show component b to be the one in motion. Our results thus further constrain models of this quasar.

  4. 75 FR 69128 - Proposed Collection, Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-10

    ... ensure that requested data can be provided in the desired format, reporting burden (time and financial... requirements on respondents can be properly assessed. The Bureau of Labor Statistics (BLS) is soliciting... Goods and Services Survey.'' A copy of the proposed information collection request (ICR) can be obtained...

  5. 78 FR 35849 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-14

    ...) Whether the collection of information is necessary for the proper performance of the functions of the...: Fruits, Nut, and Specialty Crops. OMB Control Number: 0535-0039. Summary of Collection: The primary function of the National Agricultural Statistics Service (NASS) is to prepare and issue current official...

  6. 76 FR 55345 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-07

    ... primary functions of the National Agricultural Statistics Service (NASS) are to prepare and issue current..., filing of petitions and applications and agency #0;statements of organization and functions are examples... the proper performance of the functions of the agency, including whether the information will have...

  7. The National Health Insurance Scheme (NHIS): a survey of knowledge and opinions of Nigerian dentists' in Lagos.

    PubMed

    Adeniyi, A A; Onajole, A T

    2010-03-01

    This study was designed to assess the knowledge and perceptions of Nigerian dentists regarding the National Health Insurance Scheme (NHIS). A cross-sectional descriptive survey was conducted amongst 250 dentists employed in private and public dental clinics in Lagos State, Nigeria. The survey instrument was a self-administered questionnaire designed to assess their knowledge of and attitudes towards the scheme. Data analysis was done using the Epi-Info statistical software (version 6.04). Statistical tools used included measures of central tendency, frequency distribution and the chi-square test. A total of 216 dentists (response rate of 82.4%) participated in this study. Most of the respondents, 132 (61.1%), had a fair knowledge of the NHIS, while 22 (10.2%) and 62 (28.7%) had poor and good knowledge, respectively. The majority (70.4%) viewed the NHIS as a good idea that will succeed if properly implemented. Most (76.6%) respondents also believed that the scheme would improve access to oral health services, affordability of services (71.4%), availability of services (68.3%) and recognition of dentistry as a profession (62.4%). Most of the respondents (66.2%) considered oral health care as not properly positioned in the NHIS, and 154 respondents (74.4%) found the current position of oral health in the NHIS unacceptable. A good number of the respondents (77.3%) would like dentistry to operate at the primary care level in the NHIS. The majority of the dentists involved in this study had some knowledge of the NHIS, were generally positively disposed towards the scheme, and viewed it as a good idea.

  8. A New Sample of Cool Subdwarfs from SDSS: Properties and Kinematics

    NASA Astrophysics Data System (ADS)

    Savcheva, Antonia; West, Andrew A.; Bochanski, John J.

    2014-06-01

    We present a new sample of M subdwarfs compiled from the 7th data release of the Sloan Digital Sky Survey. With 3517 new subdwarfs, this new sample significantly enlarges the existing sample of low-mass subdwarfs. This catalog includes unprecedentedly large numbers of extreme and ultra subdwarfs. Here, we present the catalog and the statistical analysis we performed. Subdwarf template spectra are derived. We show color-color and reduced proper motion diagrams of the three metallicity classes, which are shown to separate from the disk dwarf population. The extreme and ultra subdwarfs are seen at larger values of reduced proper motion, as expected for more dynamically heated populations. We determine 3D kinematics for all of the stars with proper motions. The color-magnitude diagrams show a clear separation of the three metallicity classes, with the ultra and extreme subdwarfs being significantly closer to the main sequence than the ordinary subdwarfs. All subdwarfs lie below and to the blue of the main sequence. Based on the average (U, V, W) velocities and their dispersions, the extreme and ultra subdwarfs likely belong to the Galactic halo, while the ordinary subdwarfs are likely part of the old Galactic (or thick) disk. An extensive activity analysis of the subdwarfs is performed using chromospheric Hα emission, and 208 active subdwarfs are found. We show that while the activity fraction of subdwarfs rises with spectral class and levels off at the latest spectral classes, consistent with the behavior of M dwarfs, the activity fraction of the extreme and ultra subdwarfs is basically flat.
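
    For reference, the reduced proper motion used in such diagrams combines apparent magnitude and proper motion, H = m + 5 log10(μ) + 5 with μ in arcsec/yr; the sketch below is a minimal implementation with an invented example value.

```python
# A minimal sketch of the reduced proper motion H = m + 5*log10(mu) + 5 (mu in arcsec/yr).
import numpy as np

def reduced_proper_motion(mag, mu_arcsec_per_yr):
    """Reduced proper motion for apparent magnitude `mag` and proper motion in arcsec/yr."""
    return mag + 5.0 * np.log10(mu_arcsec_per_yr) + 5.0

print(reduced_proper_motion(15.2, 0.35))   # an invented faint, fast-moving example star
```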

  9. [Mobile phones and head tumours: it is time to read and highlight data in a proper way].

    PubMed

    Levis, Angelo G; Minicucci, Nadia; Ricci, Paolo; Gennaro, Valerio; Garbisa, Spiridione

    2011-01-01

    The uncertainty about the relationship between the use of mobile phones (MPs: analogue and digital cellular phones, and cordless phones) and the increase in head tumour risk can be resolved by a critical analysis of the methodological elements of both the positive and the negative studies. Results by Hardell indicate a cause/effect relationship: exposures or latencies of ≥ 10 years to MPs increase by up to 100% the risk of tumour on the same side of the head preferred for phone use (ipsilateral tumours), which is the only side significantly irradiated, with statistical significance for brain gliomas, meningiomas and acoustic neuromas. On the contrary, studies published under the Interphone project and others produced negative results and are characterised by a substantial underestimation of the risk of tumour. However, also in the Interphone studies a clear and statistically significant increase of ipsilateral head tumours (gliomas, neuromas and parotid gland tumours) is quite common in people having used MPs since or for ≥ 10 years. The meta-analyses by Hardell and other authors, which include only the literature data on ipsilateral tumours in people having used MPs since or for ≥ 10 years, and therefore also part of the Interphone data, still show statistically significant increases of head tumours.

  10. Sensitivity of the Hydrogen Epoch of Reionization Array and its build-out stages to one-point statistics from redshifted 21 cm observations

    NASA Astrophysics Data System (ADS)

    Kittiwisit, Piyanat; Bowman, Judd D.; Jacobs, Daniel C.; Beardsley, Adam P.; Thyagarajan, Nithyanandan

    2018-03-01

    We present a baseline sensitivity analysis of the Hydrogen Epoch of Reionization Array (HERA) and its build-out stages to one-point statistics (variance, skewness, and kurtosis) of redshifted 21 cm intensity fluctuation from the Epoch of Reionization (EoR) based on realistic mock observations. By developing a full-sky 21 cm light-cone model, taking into account the proper field of view and frequency bandwidth, utilizing a realistic measurement scheme, and assuming perfect foreground removal, we show that HERA will be able to recover statistics of the sky model with high sensitivity by averaging over measurements from multiple fields. All build-out stages will be able to detect variance, while skewness and kurtosis should be detectable for HERA128 and larger. We identify sample variance as the limiting constraint of the measurements at the end of reionization. The sensitivity can also be further improved by performing frequency windowing. In addition, we find that strong sample variance fluctuation in the kurtosis measured from an individual field of observation indicates the presence of outlying cold or hot regions in the underlying fluctuations, a feature that can potentially be used as an EoR bubble indicator.
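
    The one-point statistics named above are straightforward to compute once a (foreground-cleaned) brightness-temperature field is in hand; the sketch below evaluates them on a random skewed field that merely stands in for a 21 cm image cube slice.

```python
# A minimal sketch of one-point statistics (variance, skewness, kurtosis) of a 2-D field.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
delta_T = rng.gamma(shape=2.0, scale=5.0, size=(128, 128)) - 10.0   # skewed mock field

pixels = delta_T.ravel()
print("variance :", np.var(pixels))
print("skewness :", stats.skew(pixels))
print("kurtosis :", stats.kurtosis(pixels))   # excess kurtosis (0 for a Gaussian field)
```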

  11. Computational Functional Analysis of Lipid Metabolic Enzymes.

    PubMed

    Bagnato, Carolina; Have, Arjen Ten; Prados, María B; Beligni, María V

    2017-01-01

    The computational analysis of enzymes that participate in lipid metabolism has both common and unique challenges when compared to the whole protein universe. Some of the hurdles that interfere with the functional annotation of lipid metabolic enzymes that are common to other pathways include the definition of proper starting datasets, the construction of reliable multiple sequence alignments, the definition of appropriate evolutionary models, and the reconstruction of phylogenetic trees with high statistical support, particularly for large datasets. Most enzymes that take part in lipid metabolism belong to complex superfamilies with many members that are not involved in lipid metabolism. In addition, some enzymes that do not have sequence similarity catalyze similar or even identical reactions. Some of the challenges that, albeit not unique, are more specific to lipid metabolism refer to the high compartmentalization of the routes, the catalysis in hydrophobic environments and, related to this, the function near or in biological membranes. In this work, we provide guidelines intended to assist in the proper functional annotation of lipid metabolic enzymes, based on previous experiences related to the phospholipase D superfamily and the annotation of the triglyceride synthesis pathway in algae. We describe a pipeline that starts with the definition of an initial set of sequences to be used in similarity-based searches and ends in the reconstruction of phylogenies. We also mention the main issues that have to be taken into consideration when using tools to analyze subcellular localization, hydrophobicity patterns, or presence of transmembrane domains in lipid metabolic enzymes.

  12. Analysis of Geometric Shifts and Proper Setup-Margin in Prostate Cancer Patients Treated With Pelvic Intensity-Modulated Radiotherapy Using Endorectal Ballooning and Daily Enema for Prostate Immobilization.

    PubMed

    Jeong, Songmi; Lee, Jong Hoon; Chung, Mi Joo; Lee, Sea Won; Lee, Jeong Won; Kang, Dae Gyu; Kim, Sung Hwan

    2016-01-01

    We evaluated geometric shifts of the daily setup to assess the appropriateness of treatment and to determine proper margins for the planning target volume (PTV) in prostate cancer patients. We analyzed 1200 sets of pretreatment megavoltage-CT scans that were acquired from 40 patients with intermediate- to high-risk prostate cancer. They received whole pelvic intensity-modulated radiotherapy (IMRT). They underwent daily endorectal ballooning and enema to limit intrapelvic organ movement. The mean and standard deviation (SD) of daily translational shifts in the right-to-left (X), anterior-to-posterior (Y), and superior-to-inferior (Z) directions were evaluated for systematic and random error. The mean ± SD of systematic error (Σ) in X, Y, Z, and roll was 2.21 ± 3.42 mm, -0.67 ± 2.27 mm, 1.05 ± 2.87 mm, and -0.43 ± 0.89°, respectively. The mean ± SD of random error (δ) was 1.95 ± 1.60 mm in X, 1.02 ± 0.50 mm in Y, 1.01 ± 0.48 mm in Z, and 0.37 ± 0.15° in roll. The calculated proper PTV margins that cover >95% of the target on average were 8.20 (X), 5.25 (Y), and 6.45 (Z) mm. Mean systematic geometric shifts of IMRT were not statistically different in any translational or three-dimensional shift from early to late weeks. There was no grade 3 or higher gastrointestinal or genitourinary toxicity. The whole pelvic IMRT technique is a feasible and effective modality that limits intrapelvic organ motion and reduces setup uncertainties. Proper margins for the PTV can be determined by using geometric shift data.

  13. Analysis of Geometric Shifts and Proper Setup-Margin in Prostate Cancer Patients Treated With Pelvic Intensity-Modulated Radiotherapy Using Endorectal Ballooning and Daily Enema for Prostate Immobilization

    PubMed Central

    Jeong, Songmi; Lee, Jong Hoon; Chung, Mi Joo; Lee, Sea Won; Lee, Jeong Won; Kang, Dae Gyu; Kim, Sung Hwan

    2016-01-01

    We evaluated geometric shifts of the daily setup to assess the appropriateness of treatment and to determine proper margins for the planning target volume (PTV) in prostate cancer patients. We analyzed 1200 sets of pretreatment megavoltage-CT scans that were acquired from 40 patients with intermediate- to high-risk prostate cancer. They received whole pelvic intensity-modulated radiotherapy (IMRT). They underwent daily endorectal ballooning and enema to limit intrapelvic organ movement. The mean and standard deviation (SD) of daily translational shifts in the right-to-left (X), anterior-to-posterior (Y), and superior-to-inferior (Z) directions were evaluated for systematic and random error. The mean ± SD of systematic error (Σ) in X, Y, Z, and roll was 2.21 ± 3.42 mm, −0.67 ± 2.27 mm, 1.05 ± 2.87 mm, and −0.43 ± 0.89°, respectively. The mean ± SD of random error (δ) was 1.95 ± 1.60 mm in X, 1.02 ± 0.50 mm in Y, 1.01 ± 0.48 mm in Z, and 0.37 ± 0.15° in roll. The calculated proper PTV margins that cover >95% of the target on average were 8.20 (X), 5.25 (Y), and 6.45 (Z) mm. Mean systematic geometric shifts of IMRT were not statistically different in any translational or three-dimensional shift from early to late weeks. There was no grade 3 or higher gastrointestinal or genitourinary toxicity. The whole pelvic IMRT technique is a feasible and effective modality that limits intrapelvic organ motion and reduces setup uncertainties. Proper margins for the PTV can be determined by using geometric shift data. PMID:26765418
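
    A minimal sketch of how systematic and random setup errors are typically separated from daily shift logs: the SD of the per-patient mean shifts estimates the systematic component (Σ), and the root mean square of the per-patient SDs estimates the random component (δ). The column names and simulated shifts are hypothetical, and no particular margin recipe is applied.

```python
# A minimal sketch of estimating systematic (Sigma) and random (delta) setup errors.
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
shifts = pd.DataFrame({
    "patient": np.repeat(np.arange(40), 30),          # 40 patients, 30 fractions each
    "dx_mm": rng.normal(2.0, 3.0, 40 * 30),           # synthetic daily right-left shifts
})

per_patient_mean = shifts.groupby("patient")["dx_mm"].mean()
per_patient_sd = shifts.groupby("patient")["dx_mm"].std()

sigma = per_patient_mean.std()                        # systematic error Sigma
delta = np.sqrt(np.mean(per_patient_sd ** 2))         # random error delta
print(f"Sigma = {sigma:.2f} mm, delta = {delta:.2f} mm")
```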

  14. Toward improving hurricane forecasts using the JPL Tropical Cyclone Information System (TCIS): A framework to address the issues of Big Data

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Boothe, M.; Gopalakrishnan, S.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; montgomery, M. T.; Niamsuwan, N.; Tallapragada, V. S.; Tanelli, S.; Turk, J.; Vukicevic, T.

    2013-12-01

    Accurate forecasting of extreme weather requires the use of both regional models and global General Circulation Models (GCMs). The regional models have higher resolution and more accurate physics - two critical components needed for properly representing the key convective processes. GCMs, on the other hand, have a better depiction of the large-scale environment and, thus, are necessary for properly capturing the important scale interactions. But how to evaluate the models, understand their shortcomings and improve them? Satellite observations can provide invaluable information. And this is where the issues of Big Data come in: satellite observations are very complex and highly varied, while model forecasts are very voluminous. We are developing a system - TCIS - that addresses the issues of model evaluation and process understanding with the goal of improving the accuracy of hurricane forecasts. This NASA/ESTO/AIST-funded project aims at bringing satellite/airborne observations and model forecasts into a common system and developing on-line tools for joint analysis. To properly evaluate the models we go beyond the comparison of the geophysical fields. We input the model fields into instrument simulators (NEOS3, CRTM, etc.) and compute synthetic observations for a more direct comparison to the observed parameters. In this presentation we will start by describing the scientific questions. We will then outline our current framework to provide fusion of models and observations. Next, we will illustrate how the system can be used to evaluate several models (HWRF, GFS, ECMWF) by applying a couple of our analysis tools to several hurricanes observed during the 2013 season. Finally, we will outline our future plans. Our goal is to go beyond image comparison and point-by-point statistics, by focusing instead on understanding multi-parameter correlations and providing robust statistics. By developing on-line analysis tools, our framework will allow for consistent model evaluation, providing results that are much more robust than those produced by case studies - the current paradigm imposed by the Big Data issues (voluminous data and incompatible analysis tools). We believe that this collaborative approach, with contributions of models, observations and analysis approaches used by the research and operational communities, will help untangle the complex interactions that lead to hurricane genesis and rapid intensity changes - two processes that still pose many unanswered questions. The developed framework for evaluation of the global models will also have implications for the improvement of climate models, which output only a limited amount of information, making it difficult to evaluate them. Our TCIS will help by investigating the GCMs under current weather scenarios and with much more detailed model output, making it possible to compare the models to multiple observed parameters and help narrow down the uncertainty in their performance. This knowledge could then be transferred to the climate models to lower the uncertainty in their predictions. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  15. Analysis of dual tree M-band wavelet transform based features for brain image classification.

    PubMed

    Ayalapogu, Ratna Raju; Pabboju, Suresh; Ramisetty, Rajeswara Rao

    2018-04-29

    The most complex organ in the human body is the brain. The unrestrained growth of cells in the brain is called a brain tumor. The cause of brain tumors is still unknown and the survival rate is lower than for other types of cancer. Hence, early detection is very important for proper treatment. In this study, an efficient computer-aided diagnosis (CAD) system is presented for brain image classification by analyzing MRI of the brain. At first, the MRI brain images of normal and abnormal categories are modeled using the statistical features of the dual tree M-band wavelet transform (DTMBWT). A maximum margin classifier, the support vector machine (SVM), is then used for the classification and validated with a k-fold approach. Results show that the system provides promising results on a repository of molecular brain neoplasia data (REMBRANDT), with 97.5% accuracy using 4th-level statistical features of DTMBWT. In view of the experimental results, we conclude that the system gives a satisfactory performance for brain image classification. © 2018 International Society for Magnetic Resonance in Medicine.
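
    The sketch below mirrors the feature-plus-classifier pipeline at a generic level, with an ordinary 2-D discrete wavelet transform (PyWavelets) standing in for the dual tree M-band transform, subband mean/standard deviation as features, and an SVM validated by k-fold cross-validation on random stand-in images.

```python
# A minimal sketch of wavelet-statistics features fed to an SVM with k-fold validation.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def wavelet_features(image, level=4):
    """Mean and std of each subband of a 2-D DWT up to the given level."""
    coeffs = pywt.wavedec2(image, "db4", level=level)
    feats = []
    for c in coeffs:
        for band in (c if isinstance(c, tuple) else (c,)):
            feats += [band.mean(), band.std()]
    return np.array(feats)

rng = np.random.default_rng(10)
images = rng.normal(size=(60, 128, 128))          # stand-ins for MRI slices
labels = rng.integers(0, 2, 60)                   # synthetic normal/abnormal labels

X = np.array([wavelet_features(img) for img in images])
scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
print("5-fold accuracy:", scores.round(2))
```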

  16. The Schrödinger–Langevin equation with and without thermal fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, R., E-mail: roland.katz@subatech.in2p3.fr; Gossiaux, P.B., E-mail: Pol-Bernard.Gossiaux@subatech.in2p3.fr

    2016-05-15

    The Schrödinger–Langevin equation (SLE) is considered as an effective open quantum system formalism suitable for phenomenological applications involving a quantum subsystem interacting with a thermal bath. We focus on two open issues relative to its solutions: the stationarity of the excited states of the non-interacting subsystem when one considers the dissipation only, and the thermal relaxation toward asymptotic distributions with the additional stochastic term. We first show that a proper application of the Madelung/polar transformation of the wave function leads to a non-zero damping of the excited states of the quantum subsystem. We then study analytically and numerically the ability of the SLE to bring a quantum subsystem to the thermal equilibrium of statistical mechanics. To do so, concepts about statistical mixed states and quantum noises are discussed, and a detailed analysis is carried out with two kinds of noise and potential. We show that within our assumptions the use of the SLE as an effective open quantum system formalism is possible, and we discuss some of its limitations.
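
    For reference, the Kostin form of the Schrödinger–Langevin equation that this line of work builds on can be written as follows; the notation is the generic textbook one and is not taken from the paper itself.

      i\hbar\,\frac{\partial\psi}{\partial t}
        = \left[ -\frac{\hbar^{2}}{2m}\nabla^{2} + V(x)
          + \frac{A\hbar}{2i}\left( \ln\frac{\psi}{\psi^{*}}
            - \left\langle \ln\frac{\psi}{\psi^{*}} \right\rangle \right)
          - x\,F_{R}(t) \right] \psi

    Here A is the friction coefficient and F_R(t) the stochastic thermal force; writing the wave function in Madelung/polar form, \psi = \sqrt{\rho}\, e^{iS/\hbar}, reduces the logarithmic friction term to A\,(S - \langle S \rangle), which is the transformation discussed in the abstract.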

  17. Analysis of sequencing and scheduling methods for arrival traffic

    NASA Technical Reports Server (NTRS)

    Neuman, Frank; Erzberger, Heinz

    1990-01-01

    The air traffic control subsystem that performs scheduling is discussed. The function of the scheduling algorithms is to plan automatically the most efficient landing order and to assign optimally spaced landing times to all arrivals. Several important scheduling algorithms are described and their statistical performance is examined. Scheduling brings order to an arrival sequence of aircraft. First-come-first-served (FCFS) scheduling establishes a fair order, based on estimated times of arrival, and determines proper separations. Because of the randomness of the traffic, gaps remain in the scheduled sequence of aircraft. These gaps are filled, or partially filled, by time-advancing the leading aircraft after a gap while still preserving the FCFS order. Tightly scheduled groups of aircraft remain, containing a mix of heavy and large aircraft. Separation requirements differ for different types of aircraft trailing each other; advantage is taken of this fact through mild reordering of the traffic, thus shortening the groups and reducing average delays. Actual delays for different samples with the same statistical parameters vary widely, especially for heavy traffic.
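
    A toy sketch of the FCFS step described above is given below (Python). The separation values and aircraft classes are illustrative assumptions, not figures from the report.

      # Toy first-come-first-served (FCFS) scheduler: order aircraft by estimated
      # time of arrival (ETA) and delay each scheduled landing time just enough to
      # respect the required separation behind the preceding aircraft.
      SEPARATION = {                      # seconds for (leader, trailer); illustrative only
          ("heavy", "heavy"): 90, ("heavy", "large"): 120,
          ("large", "heavy"): 70, ("large", "large"): 80,
      }

      def fcfs_schedule(arrivals):
          """arrivals: list of (callsign, eta_seconds, weight_class) tuples."""
          ordered = sorted(arrivals, key=lambda a: a[1])          # FCFS order by ETA
          schedule = []
          for callsign, eta, wclass in ordered:
              if schedule:
                  _, prev_time, prev_class = schedule[-1]
                  slot = max(eta, prev_time + SEPARATION[(prev_class, wclass)])
              else:
                  slot = eta
              schedule.append((callsign, slot, wclass))
          return schedule                                          # delay = slot - eta

      print(fcfs_schedule([("AC1", 0, "heavy"), ("AC2", 30, "large"), ("AC3", 400, "large")]))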

  18. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least-squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, whereas holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
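
    The two quantitative elements mentioned above can be summarized in generic form (standard textbook expressions, not formulas quoted from the article): potency decay is modeled by least-squares regression of potency on time, and the temperature dependence of the degradation rate follows Arrhenius behavior,

      \hat{P}(t) = \beta_{0} + \beta_{1}\,t ,
      \qquad
      k(T) = A\,e^{-E_{a}/(R\,T)} ,

    where \beta_{1} is the estimated degradation slope, T is absolute temperature, E_a the activation energy, R the gas constant and A the pre-exponential factor. Placing measurements at the beginning and end of the study widens the spread of t and therefore tightens the confidence interval on \beta_{1}, which is the design point made in the abstract.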

  19. Development of statistical linear regression model for metals from transportation land uses.

    PubMed

    Maniquiz, Marla C; Lee, Soyoung; Lee, Eunju; Kim, Lee-Hyung

    2009-01-01

    Transportation land uses with impervious surfaces, such as highways, parking lots, roads, and bridges, are recognized as highly polluted non-point sources (NPSs) in urban areas. Pollutants from urban transportation accumulate on the paved surfaces during dry periods and are washed off during storms. In Korea, the identification and monitoring of NPSs still represent a great challenge. Since 2004, the Ministry of Environment (MOE) has been engaged in several research and monitoring efforts to develop stormwater management policies and treatment systems for future implementation. Data from 131 storm events monitored at eleven sites from May 2004 to September 2008 were analyzed to identify correlations between particulates and metals and to develop a simple linear regression (SLR) model to estimate event mean concentration (EMC). Results indicate that there was no significant relationship between metal and TSS EMCs. However, although the SLR estimation models did not provide useful predictions, they are valuable indicators of the high uncertainty inherent in NPS pollution. Therefore, long-term monitoring employing proper methods and precise statistical analysis of the data should be undertaken to eliminate these uncertainties.

  20. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distribution of seven of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate normal-distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
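
    A minimal sketch of the model-comparison idea (Python, statsmodels assumed; the counts are simulated, not the insect data from the study): fit competing count models and compare their AIC values.

      # Compare Poisson and negative binomial fits to overdispersed count data by
      # AIC; the model with the lower AIC is preferred. Simulated counts with many
      # zeros stand in for the insect data.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      counts = rng.negative_binomial(n=1, p=0.2, size=200)   # overdispersed, zero-heavy
      X = np.ones((counts.size, 1))                          # intercept-only design

      poisson_fit = sm.Poisson(counts, X).fit(disp=False)
      negbin_fit = sm.NegativeBinomial(counts, X).fit(disp=False)

      print("Poisson AIC: %.1f" % poisson_fit.aic)
      print("NegBin  AIC: %.1f" % negbin_fit.aic)            # lower AIC wins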

  1. Computers and Cognitive Development at Work

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael; Lee, Yew-Jin

    2006-01-01

    Data-logging exercises in science classrooms assume that with the proper scaffolding and provision of contexts by instructors, pupils are able to meaningfully comprehend the experimental variables under investigation. From a case study of knowing and learning in a fish hatchery using real-time computer statistical software, we show that…

  2. Investigating the determining factors for transit travel demand by bus mode in US metropolitan statistical areas.

    DOT National Transportation Integrated Search

    2015-05-01

    Proper understanding of the nature of the transit travel demand is at the heart of transportation policy making and the success of : transit systems. Unfortunately, most of the existing studies have focused on a single or few transit systems or metro...

  3. Roots and Rogues in German Child Language

    ERIC Educational Resources Information Center

    Duffield, Nigel

    2008-01-01

    This article is concerned with the proper characterization of subject omission at a particular stage in German child language. It focuses on post-verbal null subjects in finite clauses, here termed Rogues. It is argued that the statistically significant presence of Rogues, in conjunction with their distinct developmental profile, speaks against a…

  4. STATISTICAL EVALUATION OF CONFOCAL MICROSCOPY IMAGES

    EPA Science Inventory

    Abstract

    In this study the CV is defined as the Mean/SD of the population of beads or pixels. Flow cytometry uses the CV of beads to determine if the machine is aligned correctly and performing properly. This CV concept to determine machine performance has been adapted to...

  5. Protect Your Back: Guidelines for Safer Lifting.

    ERIC Educational Resources Information Center

    Cantu, Carolyn O.

    2002-01-01

    Examines back injury in teachers and child care providers; includes statistics, common causes of back pain (improper alignment, improper posture, improper lifting, and carrying), and types of back pain (acute and chronic). Focuses on preventing back injury, body mechanics for lifting and carrying, and proper lifting and carrying of children. (SD)

  6. Predicting Contextual Informativeness for Vocabulary Learning

    ERIC Educational Resources Information Center

    Kapelner, Adam; Soterwood, Jeanine; Nessaiver, Shalev; Adlof, Suzanne

    2018-01-01

    Vocabulary knowledge is essential to educational progress. High quality vocabulary instruction requires supportive contextual examples to teach word meaning and proper usage. Identifying such contexts by hand for a large number of words can be difficult. In this work, we take a statistical learning approach to engineer a system that predicts…

  7. Principles and Practice of Scaled Difference Chi-Square Testing

    ERIC Educational Resources Information Center

    Bryant, Fred B.; Satorra, Albert

    2012-01-01

    We highlight critical conceptual and statistical issues and how to resolve them in conducting Satorra-Bentler (SB) scaled difference chi-square tests. Concerning the original (Satorra & Bentler, 2001) and new (Satorra & Bentler, 2010) scaled difference tests, a fundamental difference exists in how to compute properly a model's scaling correction…

  8. A Constrained Linear Estimator for Multiple Regression

    ERIC Educational Resources Information Center

    Davis-Stober, Clintin P.; Dana, Jason; Budescu, David V.

    2010-01-01

    "Improper linear models" (see Dawes, Am. Psychol. 34:571-582, "1979"), such as equal weighting, have garnered interest as alternatives to standard regression models. We analyze the general circumstances under which these models perform well by recasting a class of "improper" linear models as "proper" statistical models with a single predictor. We…

  9. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementing multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions; moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study explores a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, evaluated using multiple comparative measures calculated for both calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model called HYMOD.
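
    A toy sketch of the GLUE procedure named in item (iii) is shown below (Python). The one-parameter model, the Nash-Sutcliffe threshold of 0.5 and all numbers are illustrative assumptions; the study itself uses HYMOD.

      # Toy GLUE-style uncertainty analysis: sample parameters, keep "behavioral"
      # sets whose Nash-Sutcliffe efficiency (NSE) exceeds a threshold, and take
      # ensemble percentiles as prediction limits (GLUE normally likelihood-weights
      # these quantiles; they are unweighted here for brevity).
      import numpy as np

      rng = np.random.default_rng(2)
      rain = rng.gamma(2.0, 2.0, size=100)

      def model(k):                                   # stand-in "hydrologic" model
          return k * rain

      obs = model(0.6) + rng.normal(0, 0.5, size=100) # synthetic observations

      def nse(sim, obs):
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      samples = rng.uniform(0.0, 1.5, size=5000)                  # Monte Carlo sampling
      scores = np.array([nse(model(k), obs) for k in samples])
      behavioral = samples[scores > 0.5]                          # behavioral threshold
      sims = np.array([model(k) for k in behavioral])
      lower, upper = np.percentile(sims, [5, 95], axis=0)         # 90% prediction limits
      print(len(behavioral), "behavioral sets; limits at t=0:", lower[0], upper[0])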

  10. Robust Magnetotelluric Impedance Estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2010-12-01

    Robust magnetotelluric (MT) response function estimators are now in standard use by the induction community. Properly devised and applied, they have the ability to reduce the influence of unusual data (outliers). These estimators yield impedance estimates that are better than conventional least squares (LS) estimation because 'real' MT data almost never satisfy the statistical assumptions of Gaussian distribution and stationarity upon which normal spectral analysis is based. This paper discusses the development and application of robust estimation procedures, which can be classified as M-estimators, to MT data. Starting with a description of the estimators, special attention is given to the recent development of bounded-influence robust estimation, including utilization of the Hilbert transform (HT) operation on causal MT impedance functions. The resulting robust performance is illustrated using synthetic as well as real MT data.
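
    The core robust idea can be sketched as a generic Huber M-estimator solved by iteratively reweighted least squares (Python below). This is only an illustration of M-estimation; the bounded-influence estimator and the Hilbert-transform constraint discussed in the paper are not reproduced here.

      # Generic M-estimation via iteratively reweighted least squares (IRLS) with
      # Huber weights for a linear model y ~ X beta. Outliers are down-weighted
      # instead of dominating the fit as in ordinary least squares.
      import numpy as np

      def huber_weights(residuals, c=1.345):
          scale = np.median(np.abs(residuals)) / 0.6745 + 1e-12   # robust (MAD) scale
          u = np.abs(residuals) / scale
          return np.where(u <= c, 1.0, c / u)

      def irls(X, y, n_iter=20):
          beta = np.linalg.lstsq(X, y, rcond=None)[0]             # ordinary LS start
          for _ in range(n_iter):
              w = huber_weights(y - X @ beta)
              sw = np.sqrt(w)
              beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
          return beta

      rng = np.random.default_rng(3)
      X = np.column_stack([np.ones(100), rng.normal(size=100)])
      y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.2, size=100)
      y[:5] += 10.0                                               # inject outliers
      print(irls(X, y))                                           # close to [1, 2]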

  11. Bayesian Network Meta-Analysis for Unordered Categorical Outcomes with Incomplete Data

    ERIC Educational Resources Information Center

    Schmid, Christopher H.; Trikalinos, Thomas A.; Olkin, Ingram

    2014-01-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of…

  12. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  13. The Correlation between Insertion Depth of Prodisc-C Artificial Disc and Postoperative Kyphotic Deformity: Clinical Importance of Insertion Depth of Artificial Disc.

    PubMed

    Lee, Do-Youl; Kim, Se-Hoon; Suh, Jung-Keun; Cho, Tai-Hyoung; Chung, Yong-Gu

    2012-09-01

    This study was designed to investigate the correlation between the insertion depth of the artificial disc and postoperative kyphotic deformity after Prodisc-C total disc replacement surgery, and to determine the range of artificial disc insertion depth that is effective in preventing postoperative whole-cervical or segmental kyphotic deformity. A retrospective radiological analysis was performed in 50 patients who had undergone single-level total disc replacement surgery. Records were reviewed to obtain demographic data. Preoperative and postoperative radiographs were assessed to determine the C2-7 Cobb's angle and the segmental angle and to investigate postoperative kyphotic deformity. A formula was introduced to calculate the insertion depth of the Prodisc-C artificial disc. Statistical analysis was performed to examine the correlation between insertion depth and postoperative kyphotic deformity, and to estimate the insertion depth of the Prodisc-C artificial disc that would prevent postoperative kyphotic deformity. No significant statistical correlation was observed between insertion depth and postoperative kyphotic deformity with regard to the C2-7 Cobb's angle. A statistical correlation between insertion depth and postoperative kyphotic deformity was observed with regard to the segmental angle (p<0.05). The analysis failed to establish a proper insertion depth of the Prodisc-C artificial disc effective in preventing postoperative kyphotic deformity. Postoperative segmental kyphotic deformity is associated with the insertion depth of the Prodisc-C artificial disc: an anteriorly located artificial disc leads to a lordotic segmental angle and a posteriorly located artificial disc leads to a kyphotic segmental angle postoperatively. The C2-7 Cobb's angle, however, is not affected by artificial disc location after surgery.

  14. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
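
    Of the three techniques, the frequency ratio is the simplest to illustrate; a toy calculation is sketched below (Python). The class and inventory arrays are invented stand-ins for flattened rasters and are not data from the study.

      # Toy frequency-ratio (FR) calculation for one conditioning factor:
      # FR(class) = (% of hazard cells in the class) / (% of all cells in the class).
      # Values above 1 mark classes over-represented among hazard locations.
      import numpy as np

      factor_class = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 3])   # e.g. slope classes
      hazard = np.array([0, 1, 1, 1, 0, 0, 0, 1, 0, 0])         # inventory (1 = event cell)

      for c in np.unique(factor_class):
          in_class = factor_class == c
          pct_hazard = hazard[in_class].sum() / hazard.sum()
          pct_area = in_class.sum() / in_class.size
          print("class %d: FR = %.2f" % (c, pct_hazard / pct_area))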

  15. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  16. Universal algorithm for identification of fractional Brownian motion. A case of telomere subdiffusion.

    PubMed

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-11-07

    We present a systematic statistical analysis of recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even mixing, a stronger, more chaotic property. Moreover, the obtained memory parameter estimates, as well as the ensemble-average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically prove fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, without the need to invoke other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily applied to other data sets to enable quick and accurate analysis of their statistical characteristics. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
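
    The mean square displacement analysis mentioned above can be illustrated with a short sketch (Python); the trajectory here is ordinary Brownian motion used purely as a placeholder, not telomere data.

      # Time-averaged mean square displacement (TA-MSD) of a 1-D trajectory and a
      # log-log fit of the anomalous exponent alpha, where MSD ~ t^alpha. For
      # fractional Brownian motion alpha = 2H, so alpha < 1 indicates subdiffusion.
      import numpy as np

      rng = np.random.default_rng(4)
      x = np.cumsum(rng.normal(size=5000))            # placeholder trajectory

      def ta_msd(x, max_lag):
          lags = np.arange(1, max_lag)
          return lags, np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

      lags, msd = ta_msd(x, 200)
      alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
      print("estimated anomalous exponent alpha = %.2f" % alpha)  # about 1 for Brownian motion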

  17. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

    PubMed

    González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

    2007-06-07

    In this paper, we used statistical experimental design to investigate the effect of several factors on the coating of lidocaine hydrochloride (LID) liposomes with a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution onto the liposome colloidal dispersion, the stirring rate, the time from liposome production to liposome coating, and the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial (2^(5-1)) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%) owing to its property of forming molecular complexes with CH. Graphic analysis of the effects allowed identification of the main formulation and technological factors from the selected responses and determination of the proper level of these factors for response improvement. Moreover, the fractional design allowed the interactions between factors to be quantified, which will be considered in subsequent experiments. The results indicate that the LID amount was the predominant factor increasing drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all interactions between the main factors were statistically significant.
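
    The 2^(5-1) screening matrix can be generated in a few lines (Python sketch below). The factor letters are generic placeholders for the five formulation/process variables; the generator E = ABCD is the conventional choice for a resolution-V half fraction and is an assumption, not a detail reported in the abstract.

      # Build a 2^(5-1) fractional factorial design in coded levels (-1/+1):
      # take the full 2^4 factorial in A..D and set E = A*B*C*D (defining relation
      # I = ABCDE), giving 16 runs instead of 32.
      from itertools import product

      runs = []
      for a, b, c, d in product([-1, 1], repeat=4):
          e = a * b * c * d
          runs.append((a, b, c, d, e))

      print("A  B  C  D  E")
      for run in runs:
          print(run)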

  18. [Prevalence of dental caries, gingivitis and periodontal disease in pregnant diabetic women].

    PubMed

    López-Pérez, R; Díaz-Romero, R M; Barranco-Jaubert, A; Borges-Yáñez, A; Avila-Rosas, H

    1996-01-01

    To determine the difference in the prevalence of dental caries, gingivitis, and periodontal disease among non-diabetic, type-II diabetic, and gestational diabetic pregnant women. In the period from June 1993 to January 1994, a cross-sectional study was carried out at the Instituto Nacional de Perinatología among 160 pregnant women; eighty non-diabetic women were included in the control group, while 40 type-II diabetic and 40 gestational diabetic women formed the study group. In each patient the following variables were recorded: age, week of pregnancy, the Simplified Oral Hygiene Index, the Decayed, Missing and Filled Teeth Index, the Gingival Index, and the Extent and Severity Index. Statistical analysis was carried out using analysis of variance and the multiple range test, with a 95% confidence interval. All of the groups had a similar prevalence of dental caries (100%). Type-II diabetic women showed a higher prevalence of gingivitis (42.5%) than non-diabetic (36.25%) and gestational diabetic (10%) women, but the difference between the non-diabetic and type-II diabetic women was not statistically significant. Type-II diabetic women had a statistically significantly higher prevalence of periodontal disease (12.5%) than the women in the other groups. It is very important to establish proper metabolic control and adequate oral hygiene in pregnant diabetic women, since type-II diabetes was shown to be associated with a higher prevalence of periodontal disease. Moreover, gestational diabetes is likely to pose a high risk of periodontal disease in the absence of preventive measures.

  19. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
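
    For context, the classical Menzerath-Altmann relation between construct size x and mean constituent size y is commonly written as

      y(x) = a\,x^{b}\,e^{-c\,x} ,

    with empirically fitted parameters a, b and c (b is typically negative, so constituent size shrinks as construct size grows). This is the baseline form that the statistical mechanical derivation described above transforms and generalizes; the notation is the common textbook one, not necessarily the paper's.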

  20. Anomaly detection of turbopump vibration in Space Shuttle Main Engine using statistics and neural networks

    NASA Technical Reports Server (NTRS)

    Lo, C. F.; Wu, K.; Whitehead, B. A.

    1993-01-01

    Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. The anomalies are detected based on the amplitude of peaks of fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method likewise needs a sufficient data basis to train the networks; the testing procedure can then be utilized at any time, so long as the characteristics of the components remain unchanged.

  1. Statistical analysis of hail characteristics in the hail-protected western part of Croatia using data from hail suppression stations

    NASA Astrophysics Data System (ADS)

    Počakal, Damir; Štalec, Janez

    In the continental part of Croatia, operational hail suppression has been conducted for more than 30 years. The currently protected area is 25,177 km² and has about 492 hail suppression stations, which are managed from eight weather radar centres. This paper presents a statistical analysis of parameters connected with hail occurrence at hail suppression stations in the western part of the protected area in the 1981-2000 period. The analysis compares data from two periods with different intensities of hail suppression activity and was carried out as part of a project for assessing hail suppression efficiency in Croatia. Because of disruption of the hail suppression system during the independence war in Croatia (1991-1995), a lack of rockets, and other objective circumstances, it is considered that in the 1991-2000 period the hail suppression system could not operate properly. The first period (1981-1990), characterised by full application of hail suppression technology, is therefore compared with the second period (1991-2000). The protected area is divided into quadrants (9×9 km) such that every quadrant has at least one hail suppression station, making the intercomparison more precise. Discriminant analysis was performed on the yearly values of each quadrant. These values included the number of cases with solid precipitation, hail damage, heavy hail damage, the number of active hail suppression stations, the number of days with solid precipitation, solid precipitation damage, heavy solid precipitation damage, and the number and duration of air traffic control bans. The discriminant analysis shows a significant difference between the two periods. Average values on the isolated discriminant function 1 are -0.36 standard deviations of all observations for the first period (1981-1990) and +0.23 for the second period (1991-2000). The analysis of all eight variables shows statistically substantial differences in the number of hail suppression stations (which have a positive correlation) and in the number of cases with air traffic control bans, which, like all other variables, have a negative correlation. The results of the statistical analysis for the two periods indicate a positive influence of the hail suppression system. A discriminant analysis made for three periods shows that these periods cannot be compared because of the short time spans, differences in hail suppression technology and working conditions, and possible differences in meteorological conditions. Therefore, neither the effectiveness nor the ineffectiveness of hail suppression operations, nor their efficiency, can be statistically proven. For an exact assessment of hail suppression effectiveness, it is necessary to develop a project that takes into consideration all the parameters used in previous such projects around the world, such as a hailpad polygon.

  2. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    PubMed

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

    The study evaluated whether the renal function decline rate per year with age in adults varies between two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16,628 records (3,946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected over up to 2,364 days (mean: 793 days). A simple linear regression and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the renal function decline rate with aging because its estimate is highly dependent on the statistical analysis. From our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
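
    The two analyses can be sketched side by side (Python below, statsmodels assumed). The column names, the simulated creatinine-clearance values and the model settings are illustrative assumptions, not the study's data or code.

      # Cross-sectional (one observation per subject, simple linear regression)
      # versus longitudinal (all observations, random-coefficient mixed model with
      # a random intercept and slope per subject).
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      rows = []
      for subj in range(200):
          base, slope = rng.normal(110, 15), rng.normal(-1.0, 0.3)
          for age in rng.choice(np.arange(30, 90), size=4, replace=False):
              rows.append({"subject": subj, "age": age,
                           "crcl": base + slope * (age - 30) + rng.normal(0, 5)})
      data = pd.DataFrame(rows)

      cs_data = data.groupby("subject", as_index=False).first()   # one record per subject
      cs = smf.ols("crcl ~ age", data=cs_data).fit()
      lt = smf.mixedlm("crcl ~ age", data=data, groups=data["subject"],
                       re_formula="~age").fit()
      print("CS slope: %.2f, LT slope: %.2f ml/min/year"
            % (cs.params["age"], lt.params["age"]))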

  3. The imprint of f(R) gravity on weak gravitational lensing - II. Information content in cosmic shear statistics

    NASA Astrophysics Data System (ADS)

    Shirasaki, Masato; Nishimichi, Takahiro; Li, Baojiu; Higuchi, Yuichi

    2017-04-01

    We investigate the information content of various cosmic shear statistics on the theory of gravity. Focusing on the Hu-Sawicki-type f(R) model, we perform a set of ray-tracing simulations and measure the convergence bispectrum, peak counts and Minkowski functionals. We first show that, while the convergence power spectrum does have sensitivity to the current value of the extra scalar degree of freedom |fR0|, this is largely compensated by a change in the present density amplitude parameter σ8 and the matter density parameter Ωm0. With accurate covariance matrices obtained from 1000 lensing simulations, we then examine the constraining power of the three additional statistics. We find that these probes are indeed helpful in breaking the parameter degeneracy, which cannot be resolved from the power spectrum alone. We show that especially the peak counts and Minkowski functionals have the potential to rigorously (marginally) detect the signature of modified gravity with the parameter |fR0| as small as 10^-5 (10^-6) if we can properly model them on small (~1 arcmin) scales in a future survey with a sky coverage of 1500 deg^2. We also show that the signal level is similar among the three additional statistics and all of them provide complementary information to the power spectrum. These findings indicate the importance of combining multiple probes beyond the standard power spectrum analysis to detect possible modifications to general relativity.

  4. Vibration-based structural health monitoring using adaptive statistical method under varying environmental condition

    NASA Astrophysics Data System (ADS)

    Jin, Seung-Seop; Jung, Hyung-Jo

    2014-03-01

    It is well known that the dynamic properties of a structure, such as its natural frequencies, depend not only on damage but also on environmental conditions (e.g., temperature). The variation in the dynamic characteristics of a structure due to environmental conditions may mask damage. Without taking changes in environmental conditions into account, false-positive or false-negative damage diagnoses may occur, making structural health monitoring unreliable. In order to address this problem, many researchers have constructed regression models of the structural responses that include environmental factors. The key to the success of this approach is formulating the relationship between the input and output variables of the regression model so as to take the environmental variations into account. However, it is quite challenging to determine the proper environmental variables and measurement locations in advance to fully represent the relationship between the structural responses and the environmental variations. One alternative (i.e., novelty detection) is to remove the variations caused by environmental factors from the structural responses by using multivariate statistical analysis (e.g., principal component analysis (PCA), factor analysis, etc.). The success of this method depends strongly on the accuracy of the description of the normal condition. Generally, there is no prior information on the normal condition during data acquisition, so the normal condition is determined subjectively, with human intervention. The proposed method is a novel adaptive multivariate statistical analysis for structural damage detection under environmental change. One advantage of this method is the ability of generative learning to capture the intrinsic characteristics of the normal condition. The proposed method is tested on numerically simulated data over a range of measurement noise levels under environmental variation. A comparative study with conventional methods (i.e., the fixed reference scheme) demonstrates the superior performance of the proposed method for structural damage detection.
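
    The conventional fixed-reference PCA scheme that the proposed adaptive method improves upon can be sketched as follows (Python); the feature model, noise levels and threshold are invented for illustration, and the adaptive part of the paper is not reproduced.

      # Baseline PCA novelty detection for damage-sensitive features (e.g. two
      # natural frequencies) measured under varying temperature. The leading
      # principal component learned from the healthy baseline absorbs the
      # environmental trend; the residual norm (novelty index) flags damage.
      import numpy as np

      rng = np.random.default_rng(6)
      temp = rng.uniform(-10, 30, size=300)
      baseline = np.column_stack([2.0 - 0.01 * temp, 5.0 - 0.02 * temp]) \
                 + rng.normal(0, 0.01, size=(300, 2))

      mean = baseline.mean(axis=0)
      _, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
      P = Vt[:1].T                                    # keep 1 component (environment)

      def novelty(x):
          r = (x - mean) - P @ (P.T @ (x - mean))     # residual outside the PC subspace
          return np.linalg.norm(r)

      threshold = np.percentile([novelty(x) for x in baseline], 99)
      test = np.array([1.90, 4.95])                   # pattern inconsistent with the trend
      print(novelty(test) > threshold)                # likely flags damage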

  5. Impact of quasar proper motions on the alignment between the International Celestial Reference Frame and the Gaia reference frame

    NASA Astrophysics Data System (ADS)

    Liu, J.-C.; Malkin, Z.; Zhu, Z.

    2018-03-01

    The International Celestial Reference Frame (ICRF) is currently realized by the very long baseline interferometry (VLBI) observations of extragalactic sources with the zero proper motion assumption, while Gaia will observe proper motions of these distant and faint objects to an accuracy of tens of microarcseconds per year. This paper investigates the difference between VLBI and Gaia quasar proper motions and it aims to understand the impact of quasar proper motions on the alignment of the ICRF and Gaia reference frame. We use the latest time series data of source coordinates from the International VLBI Service analysis centres operated at Goddard Space Flight Center (GSF2017) and Paris observatory (OPA2017), as well as the Gaia auxiliary quasar solution containing 2191 high-probability optical counterparts of the ICRF2 sources. The linear proper motions in right ascension and declination of VLBI sources are derived by least-squares fits while the proper motions for Gaia sources are simulated taking into account the acceleration of the Solar system barycentre and realistic uncertainties depending on the source brightness. The individual and global features of source proper motions in GSF2017 and OPA2017 VLBI data are found to be inconsistent, which may result from differences in VLBI observations, data reduction and analysis. A comparison of the VLBI and Gaia proper motions shows that the accuracies of the components of rotation and glide between the two systems are 2-4 μas yr^-1 based on about 600 common sources. For the future alignment of the ICRF and Gaia reference frames at different wavelengths, the proper motions of quasars must necessarily be considered.

  6. [Lake eutrophication modeling in considering climatic factors change: a review].

    PubMed

    Su, Jie-Qiong; Wang, Xuan; Yang, Zhi-Feng

    2012-11-01

    Climatic factors are considered key factors affecting the trophic status and its evolution in most lakes. Against the background of global climate change, incorporating variations of climatic factors into lake eutrophication models could provide solid technical support for analyzing the trophic evolution trend of a lake and for decision-making in lake environment management. This paper analyzed the effects of climatic factors such as air temperature, precipitation, sunlight, and atmosphere on lake eutrophication, and summarized the research results on lake eutrophication modeling that considers climatic factor change, including modeling based on statistical analysis, ecological dynamic analysis, system analysis, and intelligent algorithms. Prospective approaches to improve the accuracy of lake eutrophication modeling in consideration of climatic factor change were put forward, including 1) strengthening the analysis of the mechanisms by which climatic factor change affects lake trophic status, 2) identifying appropriate simulation models to generate scenarios under proper temporal and spatial scales and resolutions, and 3) integrating climatic factor change simulation, hydrodynamic models, ecological simulation, and intelligent algorithms into a general modeling system to achieve an accurate prediction of lake eutrophication under climatic change.

  7. Transparency of Outcome Reporting and Trial Registration of Randomized Controlled Trials Published in the Journal of Consulting and Clinical Psychology

    PubMed Central

    Azar, Marleine; Riehm, Kira E.; McKay, Dean; Thombs, Brett D.

    2015-01-01

    Background Confidence that randomized controlled trial (RCT) results accurately reflect intervention effectiveness depends on proper trial conduct and the accuracy and completeness of published trial reports. The Journal of Consulting and Clinical Psychology (JCCP) is the primary trials journal amongst American Psychological Association (APA) journals. The objectives of this study were to review RCTs recently published in JCCP to evaluate (1) adequacy of primary outcome analysis definitions; (2) registration status; and, (3) among registered trials, adequacy of outcome registrations. Additionally, we compared results from JCCP to findings from a recent study of top psychosomatic and behavioral medicine journals. Methods Eligible RCTs were published in JCCP in 2013–2014. For each RCT, two investigators independently extracted data on (1) adequacy of outcome analysis definitions in the published report, (2) whether the RCT was registered prior to enrolling patients, and (3) adequacy of outcome registration. Results Of 70 RCTs reviewed, 12 (17.1%) adequately defined primary or secondary outcome analyses, whereas 58 (82.9%) had multiple primary outcome analyses without statistical adjustment or undefined outcome analyses. There were 39 (55.7%) registered trials. Only two trials registered prior to patient enrollment with a single primary outcome variable and time point of assessment. However, in one of the two trials, registered and published outcomes were discrepant. No studies were adequately registered as per Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) guidelines. Compared to psychosomatic and behavioral medicine journals, the proportion of published trials with adequate outcome analysis declarations was significantly lower in JCCP (17.1% versus 32.9%; p = 0.029). The proportion of registered trials in JCCP (55.7%) was comparable to behavioral medicine journals (52.6%; p = 0.709). Conclusions The quality of published outcome analysis definitions and trial registrations in JCCP is suboptimal. Greater attention to proper trial registration and outcome analysis definition in published reports is needed. PMID:26581079

  8. Transparency of Outcome Reporting and Trial Registration of Randomized Controlled Trials Published in the Journal of Consulting and Clinical Psychology.

    PubMed

    Azar, Marleine; Riehm, Kira E; McKay, Dean; Thombs, Brett D

    2015-01-01

    Confidence that randomized controlled trial (RCT) results accurately reflect intervention effectiveness depends on proper trial conduct and the accuracy and completeness of published trial reports. The Journal of Consulting and Clinical Psychology (JCCP) is the primary trials journal amongst American Psychological Association (APA) journals. The objectives of this study were to review RCTs recently published in JCCP to evaluate (1) adequacy of primary outcome analysis definitions; (2) registration status; and, (3) among registered trials, adequacy of outcome registrations. Additionally, we compared results from JCCP to findings from a recent study of top psychosomatic and behavioral medicine journals. Eligible RCTs were published in JCCP in 2013-2014. For each RCT, two investigators independently extracted data on (1) adequacy of outcome analysis definitions in the published report, (2) whether the RCT was registered prior to enrolling patients, and (3) adequacy of outcome registration. Of 70 RCTs reviewed, 12 (17.1%) adequately defined primary or secondary outcome analyses, whereas 58 (82.9%) had multiple primary outcome analyses without statistical adjustment or undefined outcome analyses. There were 39 (55.7%) registered trials. Only two trials registered prior to patient enrollment with a single primary outcome variable and time point of assessment. However, in one of the two trials, registered and published outcomes were discrepant. No studies were adequately registered as per Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) guidelines. Compared to psychosomatic and behavioral medicine journals, the proportion of published trials with adequate outcome analysis declarations was significantly lower in JCCP (17.1% versus 32.9%; p = 0.029). The proportion of registered trials in JCCP (55.7%) was comparable to behavioral medicine journals (52.6%; p = 0.709). The quality of published outcome analysis definitions and trial registrations in JCCP is suboptimal. Greater attention to proper trial registration and outcome analysis definition in published reports is needed.

  9. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of the n-channel silicon junctionless nanowire transistor (JNT) has been studied using three-dimensional quantum simulations based on the non-equilibrium Green’s function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10^19, 6 × 10^19 and 1 × 10^20 cm^-3 have been considered employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, a near ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.

  10. Statistical inference in comparing DInSAR and GPS data in fault areas

    NASA Astrophysics Data System (ADS)

    Barzaghi, R.; Borghi, A.; Kunzle, A.

    2012-04-01

    DInSAR and GPS data are nowadays routinely used in geophysical investigations, e.g. for estimating the slip rate over the fault plane in seismogenic areas. This analysis is usually done by mapping the surface deformation rates estimated by GPS and DInSAR onto the fault plane using suitable geophysical models (e.g. the Okada model). Usually, DInSAR vertical velocities and GPS horizontal velocities are used to obtain an integrated slip estimate. However, it is sometimes critical to merge the two kinds of information, since they may reflect a common underlying geophysical signal plus different disturbing signals that are not related to the fault dynamics. In GPS and DInSAR data analysis, these artifacts are mainly connected to signal propagation in the atmosphere and to hydrological phenomena (e.g. variation in the water table). Thus, some coherence test between the two data sets must be carried out in order to properly merge the GPS and DInSAR velocities in the inversion procedure. To this aim, statistical tests have been studied to check the compatibility of the two deformation rate estimates coming from GPS and DInSAR data analysis. This has been done according to both standard and Bayesian testing methodology. The effectiveness of the proposed inference methods has been checked with numerical simulations in the case of a normal fault. The fault structure is defined following the Pollino fault model, and both GPS and DInSAR data are simulated according to real data acquired in this area.

  11. Stability of tapered and parallel-walled dental implants: A systematic review and meta-analysis.

    PubMed

    Atieh, Momen A; Alsabeeha, Nabeel; Duncan, Warwick J

    2018-05-15

    Clinical trials have suggested that dental implants with a tapered configuration have improved stability at placement, allowing immediate placement and/or loading. The aim of this systematic review and meta-analysis was to evaluate the implant stability of tapered dental implants compared with standard parallel-walled dental implants. Applying the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, randomized controlled trials (RCTs) were searched for in electronic databases, complemented by hand searching. The risk of bias was assessed using the Cochrane Collaboration's Risk of Bias tool, and data were analyzed using statistical software. A total of 1199 studies were identified, of which five trials, with 336 dental implants in 303 participants, were included. Overall meta-analysis showed that tapered dental implants had higher implant stability values than parallel-walled dental implants at insertion and at 8 weeks, but the difference was not statistically significant. Tapered dental implants had significantly less marginal bone loss compared with parallel-walled dental implants. No significant differences in implant failure rate were found between tapered and parallel-walled dental implants. There is limited evidence to demonstrate the effectiveness of tapered dental implants in achieving greater implant stability compared with parallel-walled dental implants. Superior short-term results in maintaining peri-implant marginal bone with tapered dental implants are possible. Further properly designed RCTs are required to endorse the supposed advantages of tapered dental implants in immediate loading protocols and other complex clinical scenarios. © 2018 Wiley Periodicals, Inc.

  12. A robust and efficient statistical method for genetic association studies using case and control samples from multiple cohorts

    PubMed Central

    2013-01-01

    Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors and deliver poor statistical power for detecting genuine associations as well as a high false positive rate. Here, we present a likelihood-based statistical approach that properly accounts for the non-random nature of case–control samples with regard to the genotypic distribution at the loci in the populations under study and confers flexibility to test for genetic association in the presence of different confounding factors such as population structure and non-randomness of samples. Results We implemented this novel method, together with several popular methods in the GWAS literature, to re-analyze recently published Parkinson’s disease (PD) case–control samples. The real data analysis and computer simulations show that, compared with its rivals, the new method confers not only significantly improved statistical power for detecting associations but also robustness to the difficulties stemming from non-random sampling and genetic structure. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, whereas only 6 SNPs in two of these regions were previously detected by trend test based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidate genes FGF20 and PARK8, respectively, without invoking false positive risk. Conclusions We developed a novel likelihood-based method that provides adequate estimation of LD and other population model parameters from case and control samples, eases the integration of such samples from multiple genetically divergent populations, and thus confers statistically robust and powerful GWAS analyses. On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, which is the most widely implemented in the GWAS literature. PMID:23394771

  13. Procedures for determination of detection limits: application to high-performance liquid chromatography analysis of fat-soluble vitamins in human serum.

    PubMed

    Browne, Richard W; Whitcomb, Brian W

    2010-07-01

    Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist, including the limit of detection (LOD), the limit of quantification (LOQ), and the limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on a proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate the experimental and statistical procedures generally used for estimating the different detection limits, in the context of the analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of the LOB, both parametrically, using the observed blank distribution, and nonparametrically, using ranks. The LOD was determined by combining information on the LOB with data from repeated analysis of standard reference materials (SRMs) diluted to low levels, from the LOB to 2-3 times the LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates against concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of the LOB, LOD, and LOQ, and these quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
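
    The commonly used parametric forms behind these procedures (CLSI-style conventions; the exact multipliers used in the paper may differ) are

      \mathrm{LoB} = \mu_{\mathrm{blank}} + 1.645\,\sigma_{\mathrm{blank}},
      \qquad
      \mathrm{LoD} = \mathrm{LoB} + 1.645\,\sigma_{\mathrm{low}},

    where \mu_{\mathrm{blank}} and \sigma_{\mathrm{blank}} are the mean and standard deviation of the blank measurements and \sigma_{\mathrm{low}} is the standard deviation of replicates of a low-concentration sample; the LOQ is then read off empirically as the concentration at which the relative standard deviation falls to 20%.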

  14. Analysis strategies for high-resolution UHF-fMRI data.

    PubMed

    Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce

    2018-03-01

    Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Analysis of shipboard aerosol optical thickness measurements from multiple sunphotometers aboard the R/V Ronald H. Brown during the Aerosol Characterization Experiment - Asia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mark A.; Knobelspiesse, Kirk; Frouin, Robert

    2005-06-20

    Marine sunphotometer measurements collected aboard the R/V Ronald H. Brown during the Aerosol Characterization Experiment - Asia (ACE-Asia) are used to evaluate the ability of complementary instrumentation to obtain the best possible estimates of aerosol optical thickness and Angstrom exponent from ships at sea. A wide range of aerosol conditions, including clean maritime conditions and highly polluted coastal environments, were encountered during the ACE-Asia cruise. The results of this study suggest that shipboard hand-held sunphotometers and fast-rotating shadow-band radiometers (FRSRs) yield similar measurements and uncertainties if proper measurement protocols are used and if the instruments are properly calibrated. The automated FRSR has significantly better temporal resolution (2 min) than the hand-held sunphotometers when standard measurement protocols are used, so it more faithfully represents the variability of the local aerosol structure in polluted regions. Conversely, results suggest that the hand-held sunphotometers may perform better in clean, maritime air masses for unknown reasons. Results also show that the statistical distribution of the Angstrom exponent measurements is different when the distributions from hand-held sunphotometers are compared with those from the FRSR and that the differences may arise from a combination of factors.
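
    Both instrument types ultimately report aerosol optical thickness at several wavelengths, from which the Angstrom exponent follows; the short sketch below shows the standard two-wavelength calculation under the usual power-law assumption tau(lambda) ~ lambda**(-alpha), with purely hypothetical optical thickness values rather than ACE-Asia data.

      import numpy as np

      def angstrom_exponent(tau1, tau2, lam1_nm, lam2_nm):
          """Angstrom exponent from aerosol optical thickness at two wavelengths,
          assuming tau(lambda) ~ lambda**(-alpha)."""
          return -np.log(tau1 / tau2) / np.log(lam1_nm / lam2_nm)

      # Hypothetical AOT retrievals at 440 nm and 870 nm
      alpha = angstrom_exponent(tau1=0.42, tau2=0.18, lam1_nm=440.0, lam2_nm=870.0)
      print(f"Angstrom exponent: {alpha:.2f}")  # larger values indicate smaller (e.g., polluted) particles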

  16. Assessment of dietary habits, nutritional status and blood biochemical parameters in patients prepared for bariatric surgery: a preliminary study.

    PubMed

    Jastrzębska-Mierzyńska, Marta; Ostrowska, Lucyna; Hady, Hady Razak; Dadan, Jacek

    2012-08-01

    Morbid obesity needs to be treated by bariatric procedures. Proper dietary preparation of patients before surgery influences their postoperative status. The aim was to assess dietary habits, nutritional status and blood biochemical parameters in patients being prepared for different bariatric procedures. The study involved a group of 27 obese adults: 19 women (mean age: 40.4 ±13.9 years) and 8 men (mean age: 39.6 ±12.7 years) qualified for bariatric procedures. Body composition, dietary habits and selected biochemical parameters of blood were assessed. Statistical analysis of the results was conducted using Statistica 9.0. Daily food rations consumed by women provided 1910.6 ±915.9 kcal/day, and by men 2631 ±1463.2 kcal/day on average. In both groups, the consumption of major nutrients was found to be inadequate. In both groups, deficiency was observed in the dietary intake of folic acid and potassium. Additionally, insufficient intake of vitamin D(3), calcium and iron was observed in women, and of magnesium in men. In the two groups, disturbances were noted in lipid and carbohydrate metabolism. Our study indicates the necessity of dietary instruction for bariatric patients with regard to proper dietary habits, in order to reduce the risk of malnutrition before and after surgery.

  17. Bond strength of repaired amalgam restorations.

    PubMed

    Rey, Rosalia; Mondragon, Eduardo; Shen, Chiayi

    2015-01-01

    This in vitro study investigated the interfacial flexural strength (FS) of amalgam repairs and the optimal combination of repair materials and mechanical retention required for a consistent and durable repair bond. Amalgam bricks were created, each with 1 end roughened to expose a fresh surface before repair. Four groups followed separate repair protocols: group 1, bonding agent with amalgam; group 2, bonding agent with composite resin; group 3, mechanical retention (slot) with amalgam; and group 4, slot with bonding agent and amalgam. Repaired specimens were stored in artificial saliva for 1, 10, 30, 120, or 360 days before being loaded to failure in a 3-point bending test. Statistical analysis showed significant changes in median FS over time in groups 2 and 4. The effect of the repair method on the FS values after each storage period was significant for most groups except the 30-day storage groups. Amalgam-amalgam repair with adequate condensation yielded the most consistent and durable bond. An amalgam bonding agent could be beneficial when firm condensation on the repair surface cannot be achieved or when tooth structure is involved. Composite resin can be a viable option for amalgam repair in an esthetically demanding region, but proper mechanical modification of the amalgam surface and selection of the proper bonding system are essential.

  18. Multimedia Presentations in Educational Measurement and Statistics: Design Considerations and Instructional Approaches

    ERIC Educational Resources Information Center

    Sklar, Jeffrey C.; Zwick, Rebecca

    2009-01-01

    Proper interpretation of standardized test scores is a crucial skill for K-12 teachers and school personnel; however, many do not have sufficient knowledge of measurement concepts to appropriately interpret and communicate test results. In a recent four-year project funded by the National Science Foundation, three web-based instructional…

  19. Estimation of the Prevalence of Autism Spectrum Disorder in South Korea, Revisited

    ERIC Educational Resources Information Center

    Pantelis, Peter C.; Kennedy, Daniel P.

    2016-01-01

    Two-phase designs in epidemiological studies of autism prevalence introduce methodological complications that can severely limit the precision of resulting estimates. If the assumptions used to derive the prevalence estimate are invalid or if the uncertainty surrounding these assumptions is not properly accounted for in the statistical inference…

  20. The Consequences of Model Misidentification in the Interrupted Time-Series Experiment.

    ERIC Educational Resources Information Center

    Padia, William L.

    Campbell (l969) argued for the interrupted time-series experiment as a useful methodology for testing intervention effects in the social sciences. The validity of the statistical hypothesis testing of time-series, is, however, dependent upon the proper identification of the underlying stochastic nature of the data. Several types of model…

  1. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...

  2. Attention-Deficit/Hyperactivity Disorder Symptoms in Preschool Children: Examining Psychometric Properties Using Item Response Theory

    ERIC Educational Resources Information Center

    Purpura, David J.; Wilson, Shauna B.; Lonigan, Christopher J.

    2010-01-01

    Clear and empirically supported diagnostic symptoms are important for proper diagnosis and treatment of psychological disorders. Unfortunately, the symptoms of many disorders presented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) lack sufficient psychometric…

  3. Coherent vorticity extraction in resistive drift-wave turbulence: Comparison of orthogonal wavelets versus proper orthogonal decomposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Futatani, S.; Bos, W.J.T.; Del-Castillo-Negrete, Diego B

    2011-01-01

    We assess two techniques for extracting coherent vortices out of turbulent flows: the wavelet-based Coherent Vorticity Extraction (CVE) and the Proper Orthogonal Decomposition (POD). The former decomposes the flow field into an orthogonal wavelet representation and subsequent thresholding of the coefficients allows one to split the flow into organized coherent vortices with non-Gaussian statistics and an incoherent random part which is structureless. POD is based on the singular value decomposition and decomposes the flow into basis functions which are optimal with respect to the retained energy for the ensemble average. Both techniques are applied to direct numerical simulation data of two-dimensional drift-wave turbulence governed by the Hasegawa-Wakatani equation, considering two limit cases: the quasi-hydrodynamic and the quasi-adiabatic regimes. The results are compared in terms of compression rate, retained energy, retained enstrophy and retained radial flux, together with the enstrophy spectrum and higher order statistics. (c) 2010 Published by Elsevier Masson SAS on behalf of Academie des sciences.
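
    For the POD side of the comparison, the sketch below shows the standard snapshot construction via the singular value decomposition, with the retained-energy fraction used as the truncation criterion; the snapshot matrix here is random placeholder data rather than drift-wave simulation output.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical snapshot matrix: each column is one flattened vorticity field, each row a grid point
      n_points, n_snapshots = 1024, 200
      snapshots = rng.standard_normal((n_points, n_snapshots))

      # Subtract the ensemble mean, then compute POD modes via the singular value decomposition
      mean_field = snapshots.mean(axis=1, keepdims=True)
      fluct = snapshots - mean_field
      U, S, Vt = np.linalg.svd(fluct, full_matrices=False)

      # Energy retained by the first k modes (fraction of total variance captured)
      energy = S**2 / np.sum(S**2)
      k = 10
      print(f"Energy retained by first {k} POD modes: {energy[:k].sum():.3f}")

      # Rank-k reconstruction of the field from the leading modes
      reconstruction = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :] + mean_field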

  4. Hybrid pairwise likelihood analysis of animal behavior experiments.

    PubMed

    Cattelan, Manuela; Varin, Cristiano

    2013-12-01

    The study of the determinants of fights between animals is an important issue in understanding animal behavior. For this purpose, tournament experiments among a set of animals are often used by zoologists. The results of these tournament experiments are naturally analyzed by paired comparison models. Proper statistical analysis of these models is complicated by the presence of dependence between the outcomes of fights because the same animal is involved in different contests. This paper discusses two different model specifications to account for between-fights dependence. Models are fitted through the hybrid pairwise likelihood method that iterates between optimal estimating equations for the regression parameters and pairwise likelihood inference for the association parameters. This approach requires the specification of means and covariances only. For this reason, the method can be applied also when the computation of the joint distribution is difficult or inconvenient. The proposed methodology is investigated by simulation studies and applied to real data about adult male Cape Dwarf Chameleons. © 2013, The International Biometric Society.
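
    As a point of reference for the paired-comparison setting, the sketch below fits a basic Bradley-Terry model to an invented tournament by gradient ascent on the ordinary log-likelihood; it deliberately ignores the between-fights dependence that the hybrid pairwise likelihood method is designed to handle, and the contest data are hypothetical.

      import numpy as np

      # Hypothetical tournament results: (winner_index, loser_index) for 6 animals
      contests = [(0, 1), (0, 2), (1, 2), (3, 0), (3, 4), (2, 4),
                  (5, 1), (3, 5), (0, 4), (5, 2), (4, 3)]
      n_animals = 6

      # Simple Bradley-Terry fit by gradient ascent on the log-likelihood:
      # P(i beats j) = exp(a_i) / (exp(a_i) + exp(a_j))
      ability = np.zeros(n_animals)
      for _ in range(2000):
          grad = np.zeros(n_animals)
          for w, l in contests:
              p_w = 1.0 / (1.0 + np.exp(ability[l] - ability[w]))  # current win probability
              grad[w] += 1.0 - p_w
              grad[l] -= 1.0 - p_w
          ability += 0.1 * grad
          ability -= ability.mean()   # identifiability constraint: abilities sum to zero

      print("Estimated fighting abilities:", np.round(ability, 2))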

  5. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    PubMed

    Augustyniak, Piotr

    2007-01-01

    This paper discusses principal aspects of object conspicuity investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertisement, web design, man-machine interfacing and ergonomics. Although some common rules of human perception have been applied in art for centuries, interest in the human perception process is motivated today by the need to capture and maintain the recipient's attention by putting selected messages in front of the others. Our research uses the visual task methodology and a series of progressively modified natural images. The modified details were characterized by their size, color and position, while the scanpath-derived gaze points confirmed or did not confirm the act of perception. The statistical analysis yielded the probability of detail perception and its correlations with these attributes. This probability conforms to the knowledge of retinal anatomy and perception physiology, even though only noninvasive methods were used.

  6. Return to Play in Elite Contact Athletes After Anterior Cervical Discectomy and Fusion: A Meta-Analysis

    PubMed Central

    McAnany, Steven J.; Overley, Samuel; Andelman, Steven; Patterson, Diana C.; Cho, Samuel K.; Qureshi, Sheeraz; Hsu, Wellington K.

    2017-01-01

    Study Design: Systematic literature review and meta-analysis of studies published in English language. Objective: Return to play after anterior cervical discectomy and fusion (ACDF) in contact athletes remains a controversial topic with no consensus opinion in the literature. Additional information is needed to properly advise and treat this population of patients. This study is a meta-analysis assessing return to competitive contact sports after undergoing an ACDF. Methods: A literature search of Medline, Embase, and Cochrane Reviews was performed to identify investigations reporting return to play following ACDF in professional contact athletes. The pooled results were performed by calculating the effect size based on the logic event rate. Studies were weighted by the inverse of the variance, which included both within and between-study error. Confidence intervals (CIs) were reported at 95%. Heterogeneity was assessed using the Q statistic and I 2. Sensitivity analysis and publication bias calculations were performed. Results: The initial literature search resulted in 166 articles, of which 5 were determined relevant. Overall, return to play data was provided for 48 patients. The pooled clinical success rate for return to play was 73.5% (CI = 56.7%, 85.8%). The logit event rate was calculated to be 1.036 (CI = 0.270, 1.802), which was statistically significant (P = .008). The studies included in this meta-analysis demonstrated minimal heterogeneity with Q value of 4.038 and I 2 value of 0.956. Conclusions: Elite contact athletes return to competition 73.5% of the time after undergoing ACDF. As this is the first study to pool results from existing studies, it provides strong evidence to guide decision making and expectations in this patient population. PMID:28894685
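
    The pooling step described here amounts to fixed-effect, inverse-variance arithmetic on logit-transformed event rates; the sketch below uses invented per-study counts (only the total of 48 athletes matches the abstract, the split across studies is an assumption) purely to illustrate the back-transformation of the pooled logit and its confidence limits to a return-to-play rate.

      import numpy as np

      # Hypothetical per-study counts: (athletes returning to play, total athletes)
      studies = [(9, 12), (11, 15), (5, 7), (6, 8), (4, 6)]

      logits, weights = [], []
      for events, n in studies:
          p = events / n
          logit = np.log(p / (1 - p))
          var = 1.0 / (n * p * (1 - p))      # approximate variance of the logit event rate
          logits.append(logit)
          weights.append(1.0 / var)          # inverse-variance weighting

      logits, weights = np.array(logits), np.array(weights)
      pooled_logit = np.sum(weights * logits) / np.sum(weights)
      se = np.sqrt(1.0 / np.sum(weights))

      # Back-transform the pooled logit and its 95% confidence limits to an event rate
      ci_logit = (pooled_logit - 1.96 * se, pooled_logit + 1.96 * se)
      to_rate = lambda x: 1.0 / (1.0 + np.exp(-x))
      print(f"Pooled return-to-play rate: {to_rate(pooled_logit):.3f}")
      print(f"95% CI: ({to_rate(ci_logit[0]):.3f}, {to_rate(ci_logit[1]):.3f})")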

  7. Mutation Screening of the Krüppel-like Factor 1 Gene in Individuals With Increased Fetal Hemoglobin Referred for Hemoglobinopathy Investigation in South of Iran.

    PubMed

    Hamid, Mohammad; Ershadi Oskouei, Sanaz; Shariati, Gholamreza; Babaei, Esmaeil; Galehdari, Hamid; Saberi, Alihossein; Sedaghat, Alireza

    2018-04-01

    Mutations in the Krüppel-like factor 1 (KLF1) gene may interfere with its proper function in the erythropoiesis process and alter the activation of its downstream targets involved in globin switching, resulting in an increase in fetal hemoglobin (HbF). This study aimed to investigate whether KLF1 mutations are associated with a high level of HbF in individuals with increased fetal hemoglobin referred for screening of hemoglobinopathies in the south of Iran. The human KLF1 gene was amplified by polymerase chain reaction, and sequencing was used to detect mutations in these patients. Moreover, the XmnI polymorphism at position -158 of the γ-globin gene promoter was analyzed in all patients by polymerase chain reaction restriction fragment length polymorphism. Sequence analysis revealed a missense mutation in the KLF1 gene, p.Ser102Pro (c.304T>C), which was detectable in 10 of 23 cases with an elevated HbF level. This mutation was only detected in individuals who had an HbF level between 3.1% and 25.6%. Statistical analysis showed that the frequency of the C allele is significantly correlated with a high level of HbF (P<0.05). The frequency of the positive XmnI allele in individuals with an increased HbF level was also significant, showing an association with increased HbF (P<0.05). To the best of our knowledge, this is the first report of p.Ser102Pro (c.304T>C) in the KLF1 gene in β-thalassemia patients with an increased level of fetal hemoglobin. The statistical results for the p.Ser102Pro mutation and the XmnI polymorphism strongly suggest that both are associated with increased HbF. These nucleotide changes alone may not be the only elements raising the level of HbF, and other regulatory and modifying factors also play a role in HbF production.

  8. Performance evaluation of dispersion parameterization schemes in the plume simulation of FFT-07 diffusion experiment

    NASA Astrophysics Data System (ADS)

    Pandey, Gavendra; Sharan, Maithili

    2018-01-01

    Application of atmospheric dispersion models in air quality analysis requires a proper representation of the vertical and horizontal growth of the plume. For this purpose, various schemes for the parameterization of the dispersion parameters (σ's) are described for both stable and unstable conditions. These schemes differ in (i) the extent to which on-site measurements are used, (ii) formulations developed for other sites, and (iii) empirical relations. The performance of these schemes is evaluated in an earlier developed IIT (Indian Institute of Technology) dispersion model with data from single and multiple releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah, 2007. Qualitative and quantitative evaluation of the relative performance of all the schemes is carried out in both stable and unstable conditions in light of (i) peak/maximum concentrations and (ii) the overall concentration distribution. The blocked bootstrap resampling technique is adopted to investigate the statistical significance of the differences in performance of the schemes by computing 95% confidence limits on the parameters FB and NMSE. The various analyses based on the selected statistical measures indicated consistency between the qualitative and quantitative performances of the σ schemes. The scheme based on the standard deviation of wind velocity fluctuations and Lagrangian time scales exhibits relatively better performance in predicting the peak as well as the lateral spread.
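
    The evaluation statistics and the resampling step lend themselves to a compact illustration; the sketch below computes the fractional bias (FB) and normalized mean square error (NMSE) for a hypothetical set of paired observed and predicted concentrations and attaches 95% confidence limits with a simple block bootstrap (the block length and the data are assumptions, not values from the study).

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical paired observed and predicted concentrations along a sampling arc
      obs = rng.lognormal(mean=1.0, sigma=0.5, size=120)
      pred = obs * rng.lognormal(mean=0.05, sigma=0.3, size=120)

      def fb(o, p):
          return 2.0 * (o.mean() - p.mean()) / (o.mean() + p.mean())

      def nmse(o, p):
          return np.mean((o - p) ** 2) / (o.mean() * p.mean())

      def block_bootstrap_ci(o, p, stat, block=10, n_boot=2000, alpha=0.05):
          """Resample contiguous blocks of paired data to respect serial correlation."""
          n = len(o)
          starts = np.arange(0, n - block + 1)
          reps = []
          for _ in range(n_boot):
              idx = np.concatenate([np.arange(s, s + block)
                                    for s in rng.choice(starts, size=n // block)])
              reps.append(stat(o[idx], p[idx]))
          return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

      print("FB   =", round(fb(obs, pred), 3), "95% CI:", np.round(block_bootstrap_ci(obs, pred, fb), 3))
      print("NMSE =", round(nmse(obs, pred), 3), "95% CI:", np.round(block_bootstrap_ci(obs, pred, nmse), 3))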

  9. On the cause of the non-Gaussian distribution of residuals in geomagnetism

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Khokhlov, A.

    2017-12-01

    To describe errors in the data, Gaussian distributions naturally come to mind. In many practical instances, indeed, Gaussian distributions are appropriate. In the broad field of geomagnetism, however, it has repeatedly been noted that residuals between data and models often display much sharper distributions, sometimes better described by a Laplace distribution. In the present study, we make the case that such non-Gaussian behaviors are very likely the result of what is known as a mixture of distributions in the statistical literature. Mixtures arise as soon as the data do not follow a common distribution or are not properly normalized, the resulting global distribution being a mix of the various distributions followed by subsets of the data, or even by individual data points. We provide examples of the way such mixtures can lead to distributions that are much sharper than Gaussian distributions and discuss the reasons why such mixtures are likely the cause of the non-Gaussian distributions observed in geomagnetism. We also show that when properly selecting sub-datasets based on geophysical criteria, statistical mixture can sometimes be avoided and much more Gaussian behaviors recovered. We conclude with some general recommendations and point out that although statistical mixture always tends to sharpen the resulting distribution, it does not necessarily lead to a Laplacian distribution. This needs to be taken into account when dealing with such non-Gaussian distributions.
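
    The sharpening effect of a mixture is easy to reproduce numerically; the toy simulation below (with assumed variances, unrelated to any geomagnetic dataset) mixes two zero-mean Gaussians with different spreads and compares the excess kurtosis with that of a single Gaussian of the same total variance.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 200_000

      # Mixture: half of the "data" carry a small error, half a large error (both zero-mean Gaussian)
      mixture = np.concatenate([rng.normal(0.0, 1.0, n // 2),
                                rng.normal(0.0, 4.0, n // 2)])

      # Single Gaussian with the same overall variance, for comparison
      single = rng.normal(0.0, mixture.std(), n)

      def excess_kurtosis(x):
          z = (x - x.mean()) / x.std()
          return np.mean(z ** 4) - 3.0   # 0 for a Gaussian, 3 for a Laplace distribution

      print("Excess kurtosis, mixture :", round(excess_kurtosis(mixture), 2))  # clearly > 0
      print("Excess kurtosis, Gaussian:", round(excess_kurtosis(single), 2))   # close to 0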

  10. Factor Analysis with EM Algorithm Never Gives Improper Solutions when Sample Covariance and Initial Parameter Matrices Are Proper

    ERIC Educational Resources Information Center

    Adachi, Kohei

    2013-01-01

    Rubin and Thayer ("Psychometrika," 47:69-76, 1982) proposed the EM algorithm for exploratory and confirmatory maximum likelihood factor analysis. In this paper, we prove the following fact: the EM algorithm always gives a proper solution with positive unique variances and factor correlations with absolute values that do not exceed one,…

  11. On the systematics in apparent proper motions of radio sources observed by VLBI

    NASA Astrophysics Data System (ADS)

    Raposo-Pulido, V.; Lambert, S.; Capitaine, N.; Nilsson, T.; Heinkelmann, R.; Schuh, H.

    2015-08-01

    For about twenty years, several authors have been investigating the systematics in the apparent proper motions of radio source positions. In some cases, the theoretical work developed (Pyne et al., 1996) could not be assessed because of the small number of VLBI observations. In other cases, the effects attributed to apparent proper motion could not be established successfully because there was no significant evidence from a statistical point of view (MacMillan, 2005). In this work we provide considerations on the estimation of the coefficients of spherical harmonics, based on a three-step procedure used by Titov et al. (2011) and Titov and Lambert (2013). The early stage of this work has been a step-by-step comparison of the computation and estimation processes of the Calc/Solve (http://gemini.gsfc.nasa.gov/solve/) and VieVS (Böhm et al., 2012) software packages. To achieve this, the results were analyzed and compared with the previous study by Titov and Lambert (2013).

  12. Nonlinear time-periodic models of the longitudinal flight dynamics of desert locusts Schistocerca gregaria

    PubMed Central

    Taylor, Graham K; Żbikowski, Rafał

    2005-01-01

    Previous studies of insect flight control have been statistical in approach, simply correlating wing kinematics with body kinematics or force production. Kinematics and forces are linked by Newtonian mechanics, so adopting a dynamics-based approach is necessary if we are to place the study of insect flight on its proper physical footing. Here we develop semi-empirical models of the longitudinal flight dynamics of desert locusts Schistocerca gregaria. We use instantaneous force–moment measurements from individual locusts to parametrize the nonlinear rigid body equations of motion. Since the instantaneous forces are approximately periodic, we represent them using Fourier series, which are embedded in the equations of motion to give a nonlinear time-periodic (NLTP) model. This is a proper mathematical generalization of an earlier linear time-invariant (LTI) model of locust flight dynamics, developed using previously published time-averaged versions of the instantaneous force recordings. We perform various numerical simulations, within the fitted range of the model, and across the range of body angles used by free-flying locusts, to explore the likely behaviour of the locusts upon release from the tether. Solutions of the NLTP models are compared with solutions of the nonlinear time-invariant (NLTI) models to which they reduce when the periodic terms are dropped. Both sets of models are unstable and therefore fail to explain locust flight stability fully. Nevertheless, whereas the measured forces include statistically significant harmonic content up to about the eighth harmonic, the simulated flight trajectories display no harmonic content above the fundamental forcing frequency. Hence, manoeuvre control in locusts will not directly reflect subtle changes in the higher harmonics of the wing beat, but must operate on a coarser time-scale. A state-space analysis of the NLTP models reveals orbital trajectories that are impossible to capture in the LTI and NLTI models, and inspires the hypothesis that asymptotic orbital stability is the proper definition of stability in flapping flight. Manoeuvre control on the scale of more than one wing beat would then consist in exciting transients from one asymptotically stable orbit to another. We summarize these hypotheses by proposing a limit-cycle analogy for flapping flight control and suggest experiments for verification of the limit-cycle control analogy hypothesis. PMID:16849180
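
    To make the modelling idea concrete, the sketch below integrates a generic nonlinear time-periodic longitudinal rigid-body model in which the force and moment are represented by truncated Fourier series at the wingbeat frequency; the mass, inertia and Fourier coefficients are illustrative assumptions, not the fitted locust values, and the state ordering (u, w, q, theta) follows the standard longitudinal convention.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative parameters (not the fitted locust values from the paper)
      m, I_yy, g = 2.0e-3, 2.0e-8, 9.81      # mass (kg), pitch inertia (kg m^2), gravity (m/s^2)
      f_wing = 20.0                          # assumed wingbeat frequency (Hz)
      omega = 2.0 * np.pi * f_wing

      def fourier_force(t, mean, coeffs):
          """Truncated Fourier series: mean + sum_k (a_k cos(k w t) + b_k sin(k w t))."""
          out = mean
          for k, (a, b) in enumerate(coeffs, start=1):
              out += a * np.cos(k * omega * t) + b * np.sin(k * omega * t)
          return out

      # Hypothetical Fourier coefficients for axial force X, normal force Z and pitching moment M
      X_coef = (0.002, [(0.001, 0.0005), (0.0003, 0.0)])
      Z_coef = (-m * g * 1.02, [(0.004, 0.001), (0.001, 0.0005)])
      M_coef = (0.0, [(2e-7, 1e-7)])

      def rhs(t, y):
          u, w, q, theta = y                 # body-axis velocities, pitch rate, pitch attitude
          X = fourier_force(t, *X_coef)
          Z = fourier_force(t, *Z_coef)
          M = fourier_force(t, *M_coef)
          du = X / m - q * w - g * np.sin(theta)
          dw = Z / m + q * u + g * np.cos(theta)
          dq = M / I_yy
          dtheta = q
          return [du, dw, dq, dtheta]

      # Integrate a few wingbeats from an assumed near-trimmed initial condition
      sol = solve_ivp(rhs, (0.0, 0.5), [3.0, 0.0, 0.0, 0.12], max_step=1e-3)
      print("Final state (u, w, q, theta):", np.round(sol.y[:, -1], 3))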

  13. Photometric detection of high proper motions in dense stellar fields using difference image analysis

    NASA Astrophysics Data System (ADS)

    Eyer, L.; Woźniak, P. R.

    2001-10-01

    The difference image analysis (DIA) of the images obtained by the Optical Gravitational Lensing Experiment (OGLE-II) revealed a peculiar artefact in the sample of stars proposed as variable by Woźniak in one of the Galactic bulge fields: the occurrence of pairs of candidate variables showing anti-correlated light curves monotonic over a period of 3 yr. This effect can be understood, quantified and related to the stellar proper motions. DIA photometry supplemented with a simple model offers an effective and easy way to detect high proper motion stars in very dense stellar fields, where conventional astrometric searches are extremely inefficient.

  14. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case-only designs (e.g., case-crossover, case-time control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods comprises those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.

  15. An Alternative Approach to Analyze Ipsative Data. Revisiting Experiential Learning Theory.

    PubMed

    Batista-Foguet, Joan M; Ferrer-Rosell, Berta; Serlavós, Ricard; Coenders, Germà; Boyatzis, Richard E

    2015-01-01

    The ritualistic use of statistical models regardless of the type of data actually available is a common practice across disciplines which we dare to call type zero error. Statistical models involve a series of assumptions whose existence is often neglected altogether; this is especially the case with ipsative data. This paper illustrates the consequences of this ritualistic practice within Kolb's Experiential Learning Theory (ELT) operationalized through its Learning Style Inventory (KLSI). We show how using a well-known methodology in other disciplines-compositional data analysis (CODA) and log ratio transformations-KLSI data can be properly analyzed. In addition, the method has theoretical implications: a third dimension of the KLSI is unveiled providing room for future research. This third dimension describes an individual's relative preference for learning by prehension rather than by transformation. Using a sample of international MBA students, we relate this dimension with another self-assessment instrument, the Philosophical Orientation Questionnaire (POQ), and with an observer-assessed instrument, the Emotional and Social Competency Inventory (ESCI-U). Both show plausible statistical relationships. An intellectual operating philosophy (IOP) is linked to a preference for prehension, whereas a pragmatic operating philosophy (POP) is linked to transformation. Self-management and social awareness competencies are linked to a learning preference for transforming knowledge, whereas relationship management and cognitive competencies are more related to approaching learning by prehension.
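
    As a minimal sketch of the log-ratio idea (using made-up scores rather than KLSI data), the snippet below applies a centered log-ratio (clr) transformation so that the constant-sum constraint of ipsative responses is removed before any standard correlation or regression analysis.

      import numpy as np

      # Hypothetical ipsative scores for 4 learning modes (rows = respondents); each row sums to a constant
      scores = np.array([[12.0, 30.0, 28.0, 50.0],
                         [20.0, 25.0, 35.0, 40.0],
                         [35.0, 15.0, 45.0, 25.0]])

      def clr(x):
          """Centered log-ratio transform: log of each part divided by the row geometric mean."""
          log_x = np.log(x)
          return log_x - log_x.mean(axis=1, keepdims=True)

      clr_scores = clr(scores)
      print(np.round(clr_scores, 3))
      # Each transformed row sums to zero; the values now live in an unconstrained space where
      # ordinary correlation- and regression-based methods can be applied more safely.
      print("Row sums:", np.round(clr_scores.sum(axis=1), 10))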

  16. An Alternative Approach to Analyze Ipsative Data. Revisiting Experiential Learning Theory

    PubMed Central

    Batista-Foguet, Joan M.; Ferrer-Rosell, Berta; Serlavós, Ricard; Coenders, Germà; Boyatzis, Richard E.

    2015-01-01

    The ritualistic use of statistical models regardless of the type of data actually available is a common practice across disciplines which we dare to call type zero error. Statistical models involve a series of assumptions whose existence is often neglected altogether; this is especially the case with ipsative data. This paper illustrates the consequences of this ritualistic practice within Kolb's Experiential Learning Theory (ELT) operationalized through its Learning Style Inventory (KLSI). We show how using a well-known methodology in other disciplines—compositional data analysis (CODA) and log ratio transformations—KLSI data can be properly analyzed. In addition, the method has theoretical implications: a third dimension of the KLSI is unveiled providing room for future research. This third dimension describes an individual's relative preference for learning by prehension rather than by transformation. Using a sample of international MBA students, we relate this dimension with another self-assessment instrument, the Philosophical Orientation Questionnaire (POQ), and with an observer-assessed instrument, the Emotional and Social Competency Inventory (ESCI-U). Both show plausible statistical relationships. An intellectual operating philosophy (IOP) is linked to a preference for prehension, whereas a pragmatic operating philosophy (POP) is linked to transformation. Self-management and social awareness competencies are linked to a learning preference for transforming knowledge, whereas relationship management and cognitive competencies are more related to approaching learning by prehension. PMID:26617561

  17. Regularly arranged indium islands on glass/molybdenum substrates upon femtosecond laser and physical vapor deposition processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringleb, F.; Eylers, K.; Teubner, Th.

    2016-03-14

    A bottom-up approach is presented for the production of arrays of indium islands on a molybdenum layer on glass, which can serve as micro-sized precursors for indium compounds such as copper-indium-gallium-diselenide used in photovoltaics. Femtosecond laser ablation of glass and a subsequent deposition of a molybdenum film or direct laser processing of the molybdenum film both allow the preferential nucleation and growth of indium islands at the predefined locations in a following indium-based physical vapor deposition (PVD) process. A proper choice of laser and deposition parameters ensures the controlled growth of indium islands exclusively at the laser ablated spots. Based on a statistical analysis, these results are compared to the non-structured molybdenum surface, leading to randomly grown indium islands after PVD.

  18. REACT: Resettable Hold Down and Release Actuator for Space Applications

    NASA Astrophysics Data System (ADS)

    Nava, Nestor; Collado, Marcelo; Cabás, Ramiro

    2014-07-01

    A new HDRA based on SMA technology, called REACT, has been designed for the hold-down and release of loads and appendages in space applications. This design involves a rod supported by spheres that block its axial movement during preload application. The rod shape allows misalignment and blocks rotation around the axial axis for proper installation of the device. Because of the high preload requirements for this type of actuator, finite element analysis (FEA) has been performed in order to check the resistance of the structure. The results of the FEA have constrained the REACT design in terms of dimensions, materials, and shape of the mechanical parts. A complete test campaign for qualification of REACT is proposed. Several qualification models are intended to be built and tested in parallel, which provides a way to demonstrate margins and to gather some statistics.

  19. Fault Detection of Bearing Systems through EEMD and Optimization Algorithm

    PubMed Central

    Lee, Dong-Han; Ahn, Jong-Hyo; Koh, Bong-Hwan

    2017-01-01

    This study proposes a fault detection and diagnosis method for bearing systems using ensemble empirical mode decomposition (EEMD)-based feature extraction, in conjunction with particle swarm optimization (PSO), principal component analysis (PCA), and Isomap. First, a mathematical model is assumed to generate vibration signals from damaged bearing components, such as the inner-race, outer-race, and rolling elements. The process of decomposing vibration signals into intrinsic mode functions (IMFs) and extracting statistical features is introduced to develop a damage-sensitive parameter vector. Finally, the PCA and Isomap algorithms are used to classify and visualize this parameter vector, to separate damage characteristics from healthy bearing components. Moreover, the PSO-based optimization algorithm improves the classification performance by selecting proper weightings for the parameter vector, to maximize the visualization effect of separating and grouping of parameter vectors in three-dimensional space. PMID:29143772
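
    The feature-extraction and dimensionality-reduction steps can be sketched compactly; the snippet below computes a few damage-sensitive statistics (RMS, kurtosis, crest factor) from stand-in IMFs generated synthetically, then projects the standardized feature vectors onto their leading principal components. The EEMD decomposition itself, the Isomap embedding and the PSO weighting are omitted, and all signals and parameters are assumptions rather than bearing data.

      import numpy as np

      rng = np.random.default_rng(3)

      # Stand-in "IMFs": in the real pipeline these come from EEMD of a bearing vibration signal
      t = np.linspace(0.0, 1.0, 2048)
      imfs = [np.sin(2 * np.pi * f * t) * rng.uniform(0.5, 1.5) + 0.05 * rng.standard_normal(t.size)
              for f in (35.0, 120.0, 480.0)]

      def statistical_features(x):
          """Damage-sensitive statistics commonly extracted from each IMF."""
          rms = np.sqrt(np.mean(x ** 2))
          kurtosis = np.mean(((x - x.mean()) / x.std()) ** 4)
          crest = np.max(np.abs(x)) / rms
          return [rms, kurtosis, crest]

      # One feature vector per (hypothetical) measurement record
      records = np.array([np.concatenate([statistical_features(imf + 0.1 * rng.standard_normal(t.size))
                                          for imf in imfs]) for _ in range(40)])

      # PCA via eigendecomposition of the covariance of the standardized features
      z = (records - records.mean(axis=0)) / records.std(axis=0)
      eigval, eigvec = np.linalg.eigh(np.cov(z, rowvar=False))
      order = np.argsort(eigval)[::-1]
      projected = z @ eigvec[:, order[:3]]    # first three principal components for visualization
      print("Explained variance ratio:", np.round(eigval[order][:3] / eigval.sum(), 3))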

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yale, S H

    A survey was conducted of x-ray facilities in 2000 dental offices under actual operating conditions. Each of 10 dental schools in the United States collected data on 200 local dental offices to implement geographic analysis of the status of radiation hygiene in the offices. The data provided records of roentgen (r) output of each machine, relative r dose to patient, and dose to operator. In addition, specific information relating to both operator and machine was collected and evaluated. Some dentists were found to be operating under unsafe conditions, but the average dentist covered in the survey was statistically safe. On the basis of the survey, it was concluded that the problem of radiation hazards in dentistry will be resolved when all dental x-ray machines are properly filtered and collimated and high-speed dental x-ray film is used. (P.C.H.)

  1. Analysis of dental supportive structures in orthodontic therapy.

    PubMed

    Pavicin, Ivana Savić; Ivosević-Magdalenić, Natasa; Badel, Tomislav; Basić, Kresimir; Keros, Jadranka

    2012-09-01

    The purpose was to determine the impact of orthodontic appliances on the density of the underlying dental bone tissue. Radiographic images of teeth were made in 27 study subjects before placement of fixed orthodontic appliances and after twelve months of wear. The radiographs were digitized, and the gray levels at sites where the greatest bone resorption was expected were transformed into optical density. A copper calibration wedge (stepwedge) was used to standardize and compare the values from the first and second measurements. Optical densities at the observed sites were compared with the optical densities of the calibration wedge and expressed as their thickness equivalent. The study results showed no statistically significant difference in bone densities, indicating that the orthodontic therapy was properly planned and carried out and that excessive forces were not used in the applied correctional procedures.

  2. Hiding in plain sight

    NASA Astrophysics Data System (ADS)

    Riedel, Adric Richard

    2012-05-01

    Since the first successful measurements of stellar trigonometric parallax in the 1830s, the study of nearby stars has focused on the highest proper motion stars (μ > 0.18″ yr⁻¹). Those high proper motion stars have formed the backbone of the last 150 years of study of the Solar Neighborhood and the composition of the Galaxy. Statistically speaking, though, there is a population of stars that will have low proper motions when their space motions have been projected onto the sky. At the same time, over the last twenty years, populations of relatively young stars (less than ˜ 100 Myr), most of them with low proper motions, have been revealed near (< 100 pc) the Sun. This dissertation is the result of two related projects: A photometric search for nearby (< 25 pc) southern-hemisphere M dwarf stars with low proper motions (μ < 0.18″ yr⁻¹), and a search for nearby (< 100 pc) pre-main-sequence (< 125 Myr old) M dwarf systems. The projects rely on a variety of photometric, spectroscopic, and astrometric analyses (including parallaxes from our program) using data from telescopes at CTIO via the SMARTS Consortium and at Lowell Observatory. Within this dissertation, I describe the identification and confirmation of 23 new nearby low proper motion M dwarf systems within 25 pc, 8 of which are within 15 pc (50% of the anticipated low-proper-motion 15 pc sample). I also report photometric, spectroscopic, and astrometric parameters and identifications for a selection of 25 known and new candidate nearby young M dwarfs, including new low-mass members of the TW Hydra, beta Pictoris, Tucana-Horologium, Argus, and AB Doradus associations, following the methods of my Riedel et al. (2011) paper and its discovery of AP Col, the closest pre-main-sequence star to the Solar System. These low proper motion and nearby star discoveries are put into the context of the Solar Neighborhood as a whole by means of the new RECONS 25 pc Database, to which I have now added (including my Riedel et al. (2010) paper) 81 star systems (4% of the total). INDEX WORDS: Astronomy, Astrometry, Photometry, Spectroscopy, Kinematics, Proper motion, Parallax, Nearby stars, Low-mass stars, Young stars, Pre-main-sequence stars.

  3. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
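
    A toy version of the data-augmentation idea can be written in a few lines: the observer-elicited probabilities serve as per-death prior weights on the unknown cause, and a Gibbs-style loop alternates between imputing a cause for each death and drawing the cause-specific proportions from a conjugate Dirichlet update. The cause labels, elicited probabilities and prior below are invented, and the sketch leaves out covariates, censoring and the survival-model structure of the actual framework.

      import numpy as np

      rng = np.random.default_rng(7)
      causes = ["predation", "harvest", "other"]

      # Observer-elicited probabilities that each mortality was due to each cause (rows sum to 1)
      elicited = np.array([[0.7, 0.2, 0.1],
                           [0.1, 0.8, 0.1],
                           [0.4, 0.4, 0.2],
                           [0.9, 0.05, 0.05],
                           [0.3, 0.3, 0.4]])

      n_iter, alpha = 5000, np.ones(3)       # Dirichlet(1,1,1) prior on the cause proportions
      theta = np.full(3, 1.0 / 3.0)
      samples = []
      for it in range(n_iter):
          # Impute a cause for each death: combine elicited weights with the current proportions
          post = elicited * theta
          post /= post.sum(axis=1, keepdims=True)
          imputed = np.array([rng.choice(3, p=row) for row in post])
          # Update the cause-specific proportions from the imputed counts (conjugate Dirichlet draw)
          counts = np.bincount(imputed, minlength=3)
          theta = rng.dirichlet(alpha + counts)
          if it >= 1000:                      # discard burn-in iterations
              samples.append(theta)

      print(dict(zip(causes, np.round(np.mean(samples, axis=0), 3))))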

  4. Using Bloom's Taxonomy to Evaluate the Cognitive Levels of Master Class Textbook's Questions

    ERIC Educational Resources Information Center

    Assaly, Ibtihal R.; Smadi, Oqlah M.

    2015-01-01

    This study aimed at evaluating the cognitive levels of the questions following the reading texts of Master Class textbook. A checklist based on Bloom's Taxonomy was the instrument used to categorize the cognitive levels of these questions. The researchers used proper statistics to rank the cognitive levels of the comprehension questions. The…

  5. Predictor sort sampling and one-sided confidence bounds on quantiles

    Treesearch

    Steve Verrill; Victoria L. Herian; David W. Green

    2002-01-01

    Predictor sort experiments attempt to make use of the correlation between a predictor that can be measured prior to the start of an experiment and the response variable that we are investigating. Properly designed and analyzed, they can reduce necessary sample sizes, increase statistical power, and reduce the lengths of confidence intervals. However, if the non-random...
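
    As a baseline for the quantity being bounded, the sketch below computes the standard distribution-free one-sided lower confidence bound on a population quantile from order statistics and the binomial distribution, under ordinary random sampling; it does not include the predictor-sort adjustment discussed in the paper, and the strength data are simulated.

      import numpy as np
      from scipy.stats import binom

      def lower_quantile_bound(sample, p=0.05, conf=0.75):
          """Distribution-free one-sided lower confidence bound for the p-quantile.

          Returns the largest order statistic X_(r) such that
          P(X_(r) <= true p-quantile) >= conf, i.e. binom.cdf(r - 1, n, p) <= 1 - conf.
          """
          x = np.sort(np.asarray(sample))
          n = len(x)
          ranks = np.arange(1, n + 1)
          valid = ranks[binom.cdf(ranks - 1, n, p) <= 1.0 - conf]
          if valid.size == 0:
              raise ValueError("Sample too small for the requested quantile and confidence level.")
          r = valid.max()
          return x[r - 1], r

      # Hypothetical modulus-of-rupture data (MPa) for a lumber sample
      rng = np.random.default_rng(11)
      mor = rng.normal(45.0, 8.0, size=53)
      bound, rank = lower_quantile_bound(mor, p=0.05, conf=0.75)
      print(f"75% lower confidence bound on the 5th percentile: {bound:.1f} MPa (order statistic {rank})")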

  6. Brady, Our Firstborn Son, Has Autism

    ERIC Educational Resources Information Center

    Yeh-Kennedy, Mei

    2008-01-01

    Autism awareness is spreading like wildfire. Diagnoses have increased at an astounding rate. The statistic most often quoted is that 1 child in 150 has autism. As if the high rate of autism diagnoses were not worrisome enough, many doctors are not properly trained, or kept up to date, on how to detect autism at the earliest possible age. In many…

  7. Computer Access and Computer Use for Science Performance of Racial and Linguistic Minority Students

    ERIC Educational Resources Information Center

    Chang, Mido; Kim, Sunha

    2009-01-01

    This study examined the effects of computer access and computer use on the science achievement of elementary school students, with focused attention on the effects for racial and linguistic minority students. The study used the Early Childhood Longitudinal Study (ECLS-K) database and conducted statistical analyses with proper weights and…

  8. FORTRAN IV Program to Determine the Proper Sequence of Records in a Datafile

    ERIC Educational Resources Information Center

    Jones, Michael P.; Yoshida, Roland K.

    1975-01-01

    This FORTRAN IV program executes an essential editing procedure which determines whether a datafile contains an equal number of records (cards) per case which are also in the intended sequential order. The program which requires very little background in computer programming is designed primarily for the user of packaged statistical procedures.…

  9. 77 FR 3477 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-24

    ... collection for the proper performance of the agency's functions; (2) the accuracy of the estimated burden; (3... submitted to CMS through the 372 web-based form. The report is used by CMS to compare actual data in the... provided is compared to that in the Medicaid Statistical Information System (CMS-R-284, OCN 0938-0345...

  10. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...

  11. 76 FR 66875 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... to properly assess duties on the merchandise and collect accurate statistics with respect to the.... In Sec. 10.1: a. Introductory paragraph (a) is amended by removing the word ``shall'' and adding in... removing the word ``shall'' and adding in its place the word ``must''; m. Introductory paragraph (h)(4) is...

  12. The effect of center-of-mass motion on photon statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yang; Zhang, Jun; Wu, Shao-xiong

    2015-10-15

    We analyze the photon statistics of a weakly driven cavity quantum electrodynamics system and discuss the effects of photon blockade and photon-induced tunneling by effectively utilizing, instead of avoiding, the center-of-mass motion of a two-level atom trapped in the cavity. With the resonant interaction between atom, photon and phonon, it is shown that bunching and anti-bunching of photons can occur at properly chosen driving frequencies. Our study shows the influence of imperfect cooling of the atom on the blockade and represents an attempt to take advantage of the center-of-mass motion.
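
    Bunching and antibunching are usually diagnosed through the normalized equal-time second-order correlation g2(0) = <n(n-1)>/<n>^2; the sketch below estimates it from simulated photon-count records with assumed statistics (coherent, thermal-like, and blockaded), which is independent of the specific cavity model discussed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)

      def g2_zero(counts):
          """Normalized equal-time second-order correlation g2(0) = <n(n-1)> / <n>^2."""
          n = np.asarray(counts, dtype=float)
          return np.mean(n * (n - 1.0)) / np.mean(n) ** 2

      # Simulated detection records per time bin under three assumed photon statistics
      coherent = rng.poisson(0.5, 100_000)                        # Poissonian light: g2(0) ~ 1
      thermal = rng.poisson(rng.exponential(0.5, 100_000))        # bunched (thermal-like) light: g2(0) ~ 2
      blockaded = rng.binomial(1, 0.3, 100_000)                   # at most one photon per bin: g2(0) = 0

      for name, rec in [("coherent", coherent), ("thermal", thermal), ("photon blockade", blockaded)]:
          print(f"{name:16s} g2(0) = {g2_zero(rec):.2f}")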

  13. Quantum statistical mechanics of dense partially ionized hydrogen

    NASA Technical Reports Server (NTRS)

    Dewitt, H. E.; Rogers, F. J.

    1972-01-01

    The theory of dense hydrogen plasmas beginning with the two component quantum grand partition function is reviewed. It is shown that ionization equilibrium and molecular dissociation equilibrium can be treated in the same manner with proper consideration of all two-body states. A quantum perturbation expansion is used to give an accurate calculation of the equation of state of the gas for any degree of dissociation and ionization. The statistical mechanical calculation of the plasma equation of state is intended for stellar interiors. The general approach is extended to the calculation of the equation of state of the outer layers of large planets.

  14. Comparative analysis of a nontraditional general chemistry textbook and selected traditional textbooks used in Texas community colleges

    NASA Astrophysics Data System (ADS)

    Salvato, Steven Walter

    The purpose of this study was to analyze questions within the chapters of a nontraditional general chemistry textbook and the four general chemistry textbooks most widely used by Texas community colleges in order to determine if the questions require higher- or lower-order thinking according to Bloom's taxonomy. The study employed quantitative methods. Bloom's taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) was utilized as the main instrument in the study. Additional tools were used to help classify the questions into the proper category of the taxonomy (McBeath, 1992; Metfessel, Michael, & Kirsner, 1969). The top four general chemistry textbooks used in Texas community colleges and Chemistry: A Project of the American Chemical Society (Bell et al., 2005) were analyzed during the fall semester of 2010 in order to categorize the questions within the chapters into one of the six levels of Bloom's taxonomy. Two coders were used to assess reliability. The data were analyzed using descriptive and inferential methods. The descriptive method involved calculation of the frequencies and percentages of coded questions from the books as belonging to the six categories of the taxonomy. Questions were dichotomized into higher- and lower-order thinking questions. The inferential methods involved chi-square tests of association to determine if there were statistically significant differences among the four traditional college general chemistry textbooks in the proportions of higher- and lower-order questions and if there were statistically significant differences between the nontraditional chemistry textbook and the four traditional general chemistry textbooks. Findings indicated statistically significant differences among the four textbooks frequently used in Texas community colleges in the number of higher- and lower-level questions. Statistically significant differences were also found among the four textbooks and the nontraditional textbook. After the analysis of the data, conclusions were drawn, implications for practice were delineated, and recommendations for future research were given.

  15. [Analysis of women nutritional status during pregnancy--a survey].

    PubMed

    Selwet, Monika; Machura, Mariola; Sipiński, Adam; Kuna, Anna; Kazimierczak, Małgorzata

    2004-01-01

    A proper diet is one of the most important factors during pregnancy. General knowledge about proper nourishment during pregnancy allows women to avoid quantitative and qualitative dietary mistakes; health education in this area is therefore very important. The aim of the study is to analyze nourishment during pregnancy, particularly comparing professionally active women with women who do not work during pregnancy.

  16. Type 1 diabetes mellitus effects on dental enamel formation revealed by microscopy and microanalysis.

    PubMed

    Silva, Bruna Larissa Lago; Medeiros, Danila Lima; Soares, Ana Prates; Line, Sérgio Roberto Peres; Pinto, Maria das Graças Farias; Soares, Telma de Jesus; do Espírito Santo, Alexandre Ribeiro

    2018-03-01

    Type 1 diabetes mellitus (T1DM) largely affects children, therefore occurring during the period of deciduous and permanent tooth development. The aim of this work was to investigate birefringence and morphology of the secretory stage enamel organic extracellular matrix (EOECM), and structural and mechanical features of mature enamel from T1DM rats. Adult Wistar rats were maintained alive for a period of 56 days after the induction of experimental T1DM with a single dose of streptozotocin (60 mg/kg). After proper euthanasia of the animals, fixed upper incisors were accurately processed, and secretory stage EOECM and mature enamel were analyzed by transmitted polarizing and bright field light microscopies (TPLM and BFLM), energy-dispersive x-ray (EDX) analysis, scanning electron microscopy (SEM), and microhardness testing. BFLM and TPLM showed slight morphological changes in the secretory stage EOECM from diabetic rats, which also did not exhibit statistically significant alterations in birefringence brightness when compared to control animals (P > .05). EDX analysis showed that T1DM induced small but statistically significant increases in the amount of calcium and phosphorus in outer mature enamel (P < .01) with preservation of the calcium/phosphorus ratio in that structure (P > .05). T1DM also caused important ultrastructural alterations in mature enamel as revealed by SEM and induced a statistically significant reduction of about 13.67% in its microhardness at 80 μm from the dentin-enamel junction (P < .01). This study shows that T1DM may disturb enamel development, leading to alterations in mature enamel ultrastructure and in its mechanical features. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Validation of Geno-Sen's Scrub Typhus Real Time Polymerase Chain Reaction Kit by its Comparison with a Serological ELISA Test

    PubMed Central

    Anitharaj, Velmurugan; Stephen, Selvaraj; Pradeep, Jothimani; Pooja, Pratheesh; Preethi, Sridharan

    2017-01-01

    Background: In the recent past, scrub typhus (ST) has been reported from different parts of India, based on Weil-Felix/enzyme-linked immunosorbent assay (ELISA)/indirect immunofluorescence assay (IFA). Molecular tests are applied only by a few researchers. Aims: To evaluate a new commercial real time polymerase chain reaction (PCR) kit for molecular diagnosis of ST by comparing it with the commonly used IgM ELISA. Settings and Design: ST has been reported all over India including Puducherry and surrounding Tamil Nadu, a region identified as endemic for ST. This study was designed to correlate antibody detection by IgM ELISA and Orientia tsutsugamushi DNA detection by real time PCR. Materials and Methods: ST IgM ELISA (InBios Inc., USA) was carried out for 170 consecutive patients who presented with the symptoms of acute ST during 11 months (November 2015–September 2016). All 77 of these patients with IgM ELISA positivity and 49 of 93 IgM ELISA negative patients were subjected to real time PCR (Geno-Sen's ST real time PCR, Himachal Pradesh, India). Statistical Analysis: Statistical analysis for clinical and laboratory results was performed using IBM SPSS Statistics 17 for Windows (SPSS Inc., Chicago, USA). Chi-square test with Yates correction (Fisher's test) was employed for a small number of samples. Results and Conclusion: Among 77 suspected cases of acute ST with IgM ELISA positivity and 49 IgM negative patients, 42 and 7, respectively, were positive for the O. tsutsugamushi 56-kDa type-specific gene in the real time PCR kit. Until ST IFA, the gold standard diagnostic test, is properly validated in India, diagnosis of acute ST will depend on both ELISA and quantitative PCR. PMID:28878522
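
    Using the counts reported above (42 of 77 ELISA-positive and 7 of 49 ELISA-negative patients were positive by real time PCR), the agreement between the two assays and a Fisher's exact test can be reproduced in a few lines; the 2x2 layout below is an assumption about how the comparison was tabulated, not the paper's own analysis script.

      import numpy as np
      from scipy.stats import fisher_exact

      # 2x2 table from the reported counts: rows = IgM ELISA result, columns = real time PCR result
      #                    PCR+   PCR-
      table = np.array([[  42,    35],    # ELISA positive (n = 77)
                        [   7,    42]])   # ELISA negative (n = 49)

      odds_ratio, p_value = fisher_exact(table)
      agreement = (table[0, 0] + table[1, 1]) / table.sum()

      print(f"Odds ratio = {odds_ratio:.2f}, Fisher exact p = {p_value:.4f}")
      print(f"Overall agreement between the two assays = {agreement:.2%}")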

  18. Differences in psychopathology and behavioral characteristics of patients affected by conversion motor disorder and organic dystonia.

    PubMed

    Pastore, Adriana; Pierri, Grazia; Fabio, Giada; Ferramosca, Silvia; Gigante, Angelo; Superbo, Maria; Pellicciari, Roberta; Margari, Francesco

    2018-01-01

    Typically, the diagnosis of conversion motor disorder (CMD) is achieved by the exclusion of a wide range of organic illnesses rather than by applying positive criteria. New diagnostic criteria are highly needed in this scenario. The main aim of this study was to explore the use of behavioral features as an inclusion criterion for CMD, taking into account the relationship of the patients with physicians, and comparing the results with those from patients affected by organic dystonia (OD). Patients from the outpatient Movement Disorder Service were assigned to either the CMD or the OD group based on Fahn and Williams criteria. Differences in sociodemographics, disease history, psychopathology, and degree of satisfaction about care received were assessed. Patient-neurologist agreement about the etiological nature of the disorder was also assessed using the k-statistic. A logistic regression analysis estimated the discordance status as a predictor of case/control status. In this study, 31 CMD and 31 OD patients were included. CMD patients showed a longer illness life span, involvement of more body regions, higher comorbidity with anxiety, depression, and borderline personality disorder, as well as more negative opinions about physicians' delivery of proper care. Contrary to our expectations, CMD disagreement with neurologists about the etiological nature of the disorder was not statistically significant. Additional analysis showed that having at least one personality disorder was statistically associated with the discordance status. This study suggests that CMD patients show more conflicting behavior toward physicians. Contrary to our expectations, they show awareness of their psychological needs, suggesting a possible lack of recognition of psychological distress in the neurological setting.

  19. Differences in psychopathology and behavioral characteristics of patients affected by conversion motor disorder and organic dystonia

    PubMed Central

    Pastore, Adriana; Pierri, Grazia; Fabio, Giada; Ferramosca, Silvia; Gigante, Angelo; Superbo, Maria; Pellicciari, Roberta; Margari, Francesco

    2018-01-01

    Purpose Typically, the diagnosis of conversion motor disorder (CMD) is achieved by the exclusion of a wide range of organic illnesses rather than by applying positive criteria. New diagnostic criteria are highly needed in this scenario. The main aim of this study was to explore the use of behavioral features as an inclusion criterion for CMD, taking into account the relationship of the patients with physicians, and comparing the results with those from patients affected by organic dystonia (OD). Patients and methods Patients from the outpatient Movement Disorder Service were assigned to either the CMD or the OD group based on Fahn and Williams criteria. Differences in sociodemographics, disease history, psychopathology, and degree of satisfaction about care received were assessed. Patient–neurologist agreement about the etiological nature of the disorder was also assessed using the k-statistic. A logistic regression analysis estimated the discordance status as a predictor of case/control status. Results In this study, 31 CMD and 31 OD patients were included. CMD patients showed a longer illness life span, involvement of more body regions, higher comorbidity with anxiety, depression, and borderline personality disorder, as well as more negative opinions about physicians’ delivery of proper care. Contrary to our expectations, CMD disagreement with neurologists about the etiological nature of the disorder was not statistically significant. Additional analysis showed that having at least one personality disorder was statistically associated with the discordance status. Conclusion This study suggests that CMD patients show more conflicting behavior toward physicians. Contrary to our expectations, they show awareness of their psychological needs, suggesting a possible lack of recognition of psychological distress in the neurological setting. PMID:29849460

  20. Statistical properties of the polarized emission of Planck Galactic cold clumps

    NASA Astrophysics Data System (ADS)

    Ristorcelli, Isabelle; Planck Collaboration

    2015-08-01

    The Galactic magnetic fields are considered one of the key components regulating star formation, but their actual role in the formation and evolution of dense cores remains an open question today. Dust polarized continuum emission is particularly well suited to probing the dense and cold medium and studying the magnetic field structure. Such observations also provide tight constraints for better understanding the efficiency of dust alignment along the magnetic field lines, which in turn bears on our ability to properly interpret the B-field properties. With the Planck all-sky survey of dust submillimeter emission in intensity and polarization, we can investigate the intermediate scales, between those of molecular clouds and of prestellar cores, and perform a statistical analysis of the polarization properties of cold clumps. Combined with the IRAS map at 100 microns, the Planck survey has allowed us to build the first all-sky catalogue of Galactic Cold Clumps (PGCC, Planck 2015 results XXVIII 2015). The corresponding 13188 sources cover a broad range of physical properties and correspond to different evolutionary stages, from cold and starless clumps, to nearby cores, to young protostellar objects still embedded in their cold surrounding cloud. I will present the main results of our polarization analysis obtained on different samples of sources from the PGCC catalogue, based on the 353 GHz polarized emission measured with Planck. The statistical properties are derived from a stacking method, using optimized estimators for the polarization fraction and angle parameters. These properties are determined and compared according to the nature of the sources (starless or YSOs) and their size or density range. Finally, I will present a comparison of our results with predictions from MHD simulations of clumps including radiative transfer and the dust radiative torque alignment mechanism.

  1. ASSESSMENT OF GOOD PRACTICES IN HOSPITAL FOOD SERVICE BY COMPARING EVALUATION TOOLS.

    PubMed

    Macedo Gonçalves, Juliana; Lameiro Rodrigues, Kelly; Santiago Almeida, Ângela Teresinha; Pereira, Giselda Maria; Duarte Buchweitz, Márcia Rúbia

    2015-10-01

    Since food service in hospitals complements medical treatment, it should be produced in proper hygienic and sanitary conditions. It is a well-known fact that food-transmitted illnesses affect hospitalized and immunosuppressed patients with greater severity. Good practices in hospital food service are evaluated here by comparing assessment instruments. Good practices were evaluated by a verification list following Resolution of Collegiate Directory n. 216 of the Brazilian Agency for Sanitary Vigilance. Interpretation of listed items followed parameters of RCD 216 and the Brazilian Association of Collective Meals Enterprises (BACME). Fisher's exact test was applied to detect whether there were statistically significant differences. Analysis of data grouping was undertaken with the Unweighted Pair-Group method using Arithmetic Averages, coupled with a correlation study between dissimilarity matrices to verify disagreement between the two methods. Good Practice was classified with mean total rates above 75% by the two methods. There were statistically significant differences between services and food evaluated by the BACME instrument. The hospital food services proved to show acceptable good-practice conditions. The comparison of interpretation tools based on RCD n. 216 and BACME provided similar results for the two classifications. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  2. Accounting for interim safety monitoring of an adverse event upon termination of a clinical trial.

    PubMed

    Dallas, Michael J

    2008-01-01

    Upon termination of a clinical trial that uses interim evaluations to determine whether the trial can be stopped, a proper statistical analysis must account for the interim evaluations. For example, in a group-sequential design where the efficacy of a treatment regimen is evaluated at interim stages, and the opportunity to stop the trial based on positive efficacy findings exists, the terminal p-value, point estimate, and confidence limits of the outcome of interest must be adjusted to eliminate bias. While it is standard practice to adjust terminal statistical analyses due to opportunities to stop for "positive" findings, adjusting due to opportunities to stop for "negative" findings is also important. Stopping rules for negative findings are particularly useful when monitoring a specific rare serious adverse event in trials designed to show safety with respect to the event. In these settings, establishing conservative stopping rules is appropriate, and therefore accounting for the interim monitoring can have a substantial effect on the final results. Here I present a method to account for interim safety monitoring and illustrate its usefulness. The method is demonstrated to have advantages over methodology that does not account for interim monitoring.
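
    The abstract does not give the author's formulas, so the following is only a minimal Monte Carlo sketch of the general idea: with one interim look at an adverse-event count and a conservative stopping boundary, the terminal p-value should be computed against a null reference distribution that respects the sampling plan. All design parameters (p0, n1, n_total, b1) and the observed count are invented for illustration.

    """Monte Carlo sketch: adjusting a terminal safety analysis for an interim look.

    Assumption (not from the paper): one interim analysis after n1 subjects,
    stopping for harm if adverse events >= b1; otherwise the trial continues to
    n_total subjects.  The adjusted p-value is the null probability of an outcome
    at least as unfavourable under this sampling plan, estimated by simulation.
    """
    import numpy as np
    from scipy.stats import binom

    rng = np.random.default_rng(42)
    p0, n1, n_total, b1 = 0.02, 150, 300, 8     # hypothetical design parameters

    def run_trial(p, rng):
        """Return (stopped_early, total_events) for one simulated trial."""
        e1 = rng.binomial(n1, p)
        if e1 >= b1:                             # stopped at the interim look
            return True, e1
        e2 = rng.binomial(n_total - n1, p)
        return False, e1 + e2

    # Null reference distribution that respects the interim monitoring rule.
    sims = [run_trial(p0, rng) for _ in range(200_000)]

    def adjusted_p(observed_events, stopped_early):
        """P(outcome at least as extreme | H0), early stops ranked as most extreme."""
        worse = sum(
            1 for stop, ev in sims
            if (stop and not stopped_early)                        # any early stop outranks a completed trial
            or (stop == stopped_early and ev >= observed_events)   # same stage, at least as many events
        )
        return worse / len(sims)

    obs = 12                                     # hypothetical final adverse-event count
    print("naive binomial p :", binom.sf(obs - 1, n_total, p0))
    print("design-adjusted p:", adjusted_p(obs, stopped_early=False))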

  3. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses.

  4. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652
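
    MSblender's actual probability model is more elaborate than anything shown here; the sketch below only illustrates the last step the abstract describes, turning per-PSM correct-match probabilities into an estimated false discovery rate so that a reporting threshold can be chosen. The input probabilities are invented toy values, not MSblender output.

    """Sketch: choosing a PSM probability cutoff by estimated FDR.

    Hypothetical input: one combined correct-match probability per PSM.  The
    expected number of false discoveries among accepted PSMs is the sum of
    (1 - probability) over everything above the cutoff.
    """
    import numpy as np

    def fdr_curve(probs):
        """Return sorted probabilities and the estimated FDR at each cutoff."""
        p = np.sort(np.asarray(probs, float))[::-1]     # best PSMs first
        exp_false = np.cumsum(1.0 - p)                  # expected false positives
        fdr = exp_false / np.arange(1, len(p) + 1)      # divided by number accepted
        return p, fdr

    def threshold_at(probs, target_fdr=0.01):
        p, fdr = fdr_curve(probs)
        ok = np.where(fdr <= target_fdr)[0]
        return p[ok[-1]] if ok.size else None           # lowest probability still accepted

    # toy example with made-up probabilities
    rng = np.random.default_rng(0)
    probs = np.concatenate([rng.uniform(0.9, 1.0, 800), rng.uniform(0.0, 0.6, 200)])
    print("probability cutoff for 1% FDR:", threshold_at(probs, 0.01))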

  5. Utilization of Dental Services in Public Health Center: Dental Attendance, Awareness and Felt Needs.

    PubMed

    Pewa, Preksha; Garla, Bharath K; Dagli, Rushabh; Bhateja, Geetika Arora; Solanki, Jitendra

    2015-10-01

    In rural India, dental diseases occur due to many factors, which include inadequate or improper use of fluoride and a lack of knowledge regarding oral health and oral hygiene; these prevent proper screening and dental care for oral diseases. The objective of the study was to evaluate dental attendance, awareness and utilization of dental services in a public health center. A cross-sectional study was conducted among 251 study subjects who were visiting the dental outpatient department (OPD) of the public health centre (PHC), Guda Bishnoi, Jodhpur, using a pretested proforma from July 2014 to October 2014. A pretested questionnaire was used to collect the data regarding socioeconomic status and demographic factors affecting the utilization of dental services. Pearson's Chi-square test and step-wise logistic regression were applied for the analysis, with a p-value <0.05 taken as statistically significant. Statistically significant associations were found between age, educational status, socioeconomic status and gender and dental attendance, dental awareness and felt needs. The services provided in the public health center should be based on the felt needs of the population to increase attendance as well as utilization of dental services, thereby improving the oral health status of the population.

  6. Design of Neural Networks for Fast Convergence and Accuracy: Dynamics and Control

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Sparks, Dean W., Jr.

    1997-01-01

    A procedure for the design and training of artificial neural networks, used for rapid and efficient controls and dynamics design and analysis for flexible space systems, has been developed. Artificial neural networks are employed, such that once properly trained, they provide a means of evaluating the impact of design changes rapidly. Specifically, two-layer feedforward neural networks are designed to approximate the functional relationship between the component/spacecraft design changes and measures of its performance or nonlinear dynamics of the system/components. A training algorithm, based on statistical sampling theory, is presented, which guarantees that the trained networks provide a designer-specified degree of accuracy in mapping the functional relationship. Within each iteration of this statistical-based algorithm, a sequential design algorithm is used for the design and training of the feedforward network to provide rapid convergence to the network goals. Here, at each sequence a new network is trained to minimize the error of previous network. The proposed method should work for applications wherein an arbitrary large source of training data can be generated. Two numerical examples are performed on a spacecraft application in order to demonstrate the feasibility of the proposed approach.

  7. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.

  8. A subregion-based burden test for simultaneous identification of susceptibility loci and subregions within.

    PubMed

    Zhu, Bin; Mirabello, Lisa; Chatterjee, Nilanjan

    2018-06-22

    In rare variant association studies, aggregating rare and/or low frequency variants, may increase statistical power for detection of the underlying susceptibility gene or region. However, it is unclear which variants, or class of them, in a gene contribute most to the association. We proposed a subregion-based burden test (REBET) to simultaneously select susceptibility genes and identify important underlying subregions. The subregions are predefined by shared common biologic characteristics, such as the protein domain or functional impact. Based on a subset-based approach considering local correlations between combinations of test statistics of subregions, REBET is able to properly control the type I error rate while adjusting for multiple comparisons in a computationally efficient manner. Simulation studies show that REBET can achieve power competitive to alternative methods when rare variants cluster within subregions. In two case studies, REBET is able to identify known disease susceptibility genes, and more importantly pinpoint the unreported most susceptible subregions, which represent protein domains essential for gene function. R package REBET is available at https://dceg.cancer.gov/tools/analysis/rebet. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
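
    REBET itself uses a subset-based statistic that accounts for correlation between subregion test statistics; the sketch below shows only the simpler ingredient the abstract describes, a burden test per predefined subregion (carrier status vs case/control status), with a crude Bonferroni-style gene-level summary. The genotype matrix, labels and domain assignments are invented.

    """Sketch: per-subregion rare-variant burden tests (not the REBET statistic).

    Hypothetical inputs: a 0/1 carrier matrix (samples x variants), case/control
    labels, and each variant's subregion (e.g. protein domain).
    """
    import numpy as np
    from scipy.stats import fisher_exact

    def subregion_burden(geno, status, subregion_of):
        status = np.asarray(status, bool)
        results = {}
        for sub in sorted(set(subregion_of)):
            cols = [j for j, s in enumerate(subregion_of) if s == sub]
            carrier = geno[:, cols].sum(axis=1) > 0            # carries any rare variant in sub
            table = [[np.sum(carrier & status), np.sum(~carrier & status)],
                     [np.sum(carrier & ~status), np.sum(~carrier & ~status)]]
            results[sub] = fisher_exact(table)[1]              # p-value for this subregion
        gene_p = min(min(results.values()) * len(results), 1.0)  # Bonferroni over subregions
        return results, gene_p

    # toy data: 200 samples, 6 rare variants split over two hypothetical domains
    rng = np.random.default_rng(1)
    geno = rng.binomial(1, 0.03, size=(200, 6))
    status = rng.binomial(1, 0.5, size=200)
    print(subregion_burden(geno, status, ["domainA"] * 3 + ["domainB"] * 3))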

  9. Design of neural networks for fast convergence and accuracy: dynamics and control.

    PubMed

    Maghami, P G; Sparks, D R

    2000-01-01

    A procedure for the design and training of artificial neural networks, used for rapid and efficient controls and dynamics design and analysis for flexible space systems, has been developed. Artificial neural networks are employed, such that once properly trained, they provide a means of evaluating the impact of design changes rapidly. Specifically, two-layer feedforward neural networks are designed to approximate the functional relationship between the component/spacecraft design changes and measures of its performance or nonlinear dynamics of the system/components. A training algorithm, based on statistical sampling theory, is presented, which guarantees that the trained networks provide a designer-specified degree of accuracy in mapping the functional relationship. Within each iteration of this statistical-based algorithm, a sequential design algorithm is used for the design and training of the feedforward network to provide rapid convergence to the network goals. Here, at each sequence a new network is trained to minimize the error of previous network. The proposed method should work for applications wherein an arbitrary large source of training data can be generated. Two numerical examples are performed on a spacecraft application in order to demonstrate the feasibility of the proposed approach.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John R.; Brooks, Dusty Marie

    In pressurized water reactors, the prevention, detection, and repair of cracks within dissimilar metal welds is essential to ensure proper plant functionality and safety. Weld residual stresses, which are difficult to model and cannot be directly measured, contribute to the formation and growth of cracks due to primary water stress corrosion cracking. Additionally, the uncertainty in weld residual stress measurements and modeling predictions is not well understood, further complicating the prediction of crack evolution. The purpose of this document is to develop methodology to quantify the uncertainty associated with weld residual stress that can be applied to modeling predictions and experimental measurements. Ultimately, the results can be used to assess the current state of uncertainty and to build confidence in both modeling and experimental procedures. The methodology consists of statistically modeling the variation in the weld residual stress profiles using functional data analysis techniques. Uncertainty is quantified using statistical bounds (e.g. confidence and tolerance bounds) constructed with a semi-parametric bootstrap procedure. Such bounds describe the range in which quantities of interest, such as means, are expected to lie as evidenced by the data. The methodology is extended to provide direct comparisons between experimental measurements and modeling predictions by constructing statistical confidence bounds for the average difference between the two quantities. The statistical bounds on the average difference can be used to assess the level of agreement between measurements and predictions. The methodology is applied to experimental measurements of residual stress obtained using two strain relief measurement methods and predictions from seven finite element models developed by different organizations during a round robin study.
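
    The report's semi-parametric bootstrap and tolerance bounds are more involved than this; the sketch below only shows the basic idea of bootstrapping whole stress profiles to get a percentile confidence band for the mean profile. The depth grid and stress values are synthetic stand-ins, not measured data.

    """Sketch: bootstrap confidence band for a mean weld residual stress profile.

    Assumption: each profile is a stress value on a common through-wall depth grid.
    """
    import numpy as np

    def mean_band(profiles, n_boot=5000, level=0.95, seed=0):
        """profiles: array (n_profiles, n_depths). Returns (mean, lower, upper)."""
        rng = np.random.default_rng(seed)
        profiles = np.asarray(profiles, float)
        n = profiles.shape[0]
        boot_means = np.empty((n_boot, profiles.shape[1]))
        for b in range(n_boot):
            idx = rng.integers(0, n, n)                 # resample whole profiles with replacement
            boot_means[b] = profiles[idx].mean(axis=0)
        alpha = (1.0 - level) / 2.0
        lo, hi = np.quantile(boot_means, [alpha, 1 - alpha], axis=0)
        return profiles.mean(axis=0), lo, hi

    # toy example: 7 model predictions of stress (MPa) at 20 depth points
    rng = np.random.default_rng(3)
    depth = np.linspace(0, 1, 20)
    profiles = 300 * np.cos(2 * np.pi * depth) + rng.normal(0, 40, size=(7, 20))
    mean, lo, hi = mean_band(profiles)
    print("band width at mid-wall:", hi[10] - lo[10])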

  11. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    PubMed

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
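
    As commonly defined in the literature, the probability of inclusion at a locus is the square of the summed frequencies of the alleles observed in the mixture, and the CPI is the product over loci. The sketch below computes only that arithmetic; the loci and allele frequencies are invented, and real casework requires the interpretation safeguards the article describes.

    """Sketch: Combined Probability of Inclusion (CPI) for a DNA mixture."""
    from math import prod

    def cpi(mixture_alleles, freqs):
        """mixture_alleles: {locus: set of alleles seen}; freqs: {locus: {allele: f}}."""
        per_locus = {}
        for locus, alleles in mixture_alleles.items():
            p = sum(freqs[locus][a] for a in alleles)    # summed frequency of observed alleles
            per_locus[locus] = p ** 2                    # probability of inclusion at this locus
        return per_locus, prod(per_locus.values())

    # invented example data
    mixture = {"D8S1179": {"12", "13", "14"}, "TH01": {"6", "9.3"}}
    freqs = {"D8S1179": {"12": 0.14, "13": 0.30, "14": 0.22},
             "TH01":    {"6": 0.23, "9.3": 0.31}}
    per_locus, combined = cpi(mixture, freqs)
    print(per_locus)
    print("CPI:", combined, " CPE:", 1 - combined)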

  12. Geochemical prospecting for Cu mineralization in an arid terrain-central Iran

    NASA Astrophysics Data System (ADS)

    Mokhtari, Ahmad Reza; Roshani Rodsari, Parisa; Fatehi, Moslem; Shahrestani, Shahed; Pournik, Peyman

    2014-12-01

    Geochemical sampling and data processing were implemented for prospecting Cu mineralization through a catchment basin approach in central Iran, Yazd province, over drainage systems in order to determine areas of interest for the detailed exploration program. The target zone, inside an area called Kalout-e-Ashrafa in Yazd province, Iran, was characterized by the collection of 107 stream sediment samples. Catchment basin modeling was conducted based on a digital elevation model (DEM) and the geological map of the study area. Samples were studied by univariate and multivariate statistical techniques of exploratory data analysis, classical statistical analysis and cluster analysis. The results showed that only Cu had anomalous behavior, and it did not exhibit a considerable correlation with other elements. Geochemical maps were prepared for Cu, and anomalous zones were separated for potential copper mineralization. It was concluded that, due to special geomorphological and geographical characteristics (smooth topography, negligible annual precipitation and insufficient thickness of the silicified Cu-bearing outcrops of the area), low concentrations of Cu would be expected for the delineation of promising zones in similar terrains. Cluster analysis showed that there was a strong correlation between Ag, Sr and S. Calcium and Pb present a moderate correlation with Cu. Additionally, there was a strong correlation between Zn and Li, together with a meaningful correlation with Fe, P, Ti and Mg. Aluminum, Sc and V had a correlation with Be and K. Applying a threshold value based on the MAD (median absolute deviation) helped us to distinguish anomalous catchments more properly. Finally, there was significant conformity between anomalous catchment basins and silicified veins and veinlets (as a validating index) in the central part of the area.
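
    A minimal sketch of the MAD-based thresholding step mentioned above, assuming an anomaly cutoff of the form median + k * MAD; the multiplier k = 2 and the Cu values are arbitrary illustrative choices, not the paper's data or settings.

    """Sketch: MAD-based anomaly threshold for catchment-basin Cu values."""
    import numpy as np

    def mad_threshold(values, k=2.0):
        values = np.asarray(values, float)
        med = np.median(values)
        mad = 1.4826 * np.median(np.abs(values - med))   # scaled so MAD ~ sigma for normal data
        return med + k * mad

    cu_ppm = np.array([18, 22, 25, 19, 21, 27, 24, 95, 20, 23, 150, 26])   # toy values
    thr = mad_threshold(cu_ppm)
    print("threshold:", round(thr, 1), "anomalous catchments:", np.where(cu_ppm > thr)[0])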

  13. Validating the WRF-Chem model for wind energy applications using High Resolution Doppler Lidar data from a Utah 2012 field campaign

    NASA Astrophysics Data System (ADS)

    Mitchell, M. J.; Pichugina, Y. L.; Banta, R. M.

    2015-12-01

    Models are important tools for assessing the potential of wind energy sites, but the accuracy of these projections has not been properly validated. In this study, High Resolution Doppler Lidar (HRDL) data obtained with high temporal and spatial resolution at heights of modern turbine rotors were compared to output from the WRF-Chem model in order to help improve the performance of the model in producing accurate wind forecasts for the industry. HRDL data were collected from January 23-March 1, 2012 during the Uintah Basin Winter Ozone Study (UBWOS) field campaign. The model validation method was based on qualitative comparison of the wind field images, time-series analysis and statistical analysis of the observed and modeled wind speed and direction, both for case studies and for the whole experiment. To compare the WRF-Chem model output to the HRDL observations, the model heights and forecast times were interpolated to match the observed times and heights. Then, time-height cross-sections of the HRDL and WRF-Chem wind speed and directions were plotted to select case studies. Cross-sections of the differences between the observed and forecasted wind speed and directions were also plotted to visually analyze the model performance in different wind flow conditions. The statistical analysis includes the calculation of vertical profiles and time series of bias, correlation coefficient, root mean squared error, and coefficient of determination between the two datasets. The results from this analysis reveal where and when the model typically struggles in forecasting winds at heights of modern turbine rotors, so that in the future the model can be improved for the industry.
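
    A minimal sketch of the validation step described above, assuming both datasets reduce to wind speed as a function of (time, height): the model field is interpolated to the observation times and heights, then bias, RMSE and the correlation coefficient are computed. Grids, variable names and values are invented, not the UBWOS data.

    """Sketch: model-vs-lidar wind speed validation statistics."""
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    def validation_stats(model_t, model_z, model_ws, obs_t, obs_z, obs_ws):
        interp = RegularGridInterpolator((model_t, model_z), model_ws,
                                         bounds_error=False, fill_value=np.nan)
        m = interp(np.column_stack([obs_t, obs_z]))      # model at obs times/heights
        ok = ~np.isnan(m) & ~np.isnan(obs_ws)
        bias = np.mean(m[ok] - obs_ws[ok])
        rmse = np.sqrt(np.mean((m[ok] - obs_ws[ok]) ** 2))
        r = np.corrcoef(m[ok], obs_ws[ok])[0, 1]
        return bias, rmse, r

    # toy model grid (hours, metres) and synthetic observations
    model_t, model_z = np.arange(0, 24, 1.0), np.arange(40, 240, 20.0)
    model_ws = 5 + 2 * np.sin(model_t[:, None] / 4) + 0.01 * model_z[None, :]
    rng = np.random.default_rng(0)
    obs_t, obs_z = rng.uniform(0, 23, 300), rng.uniform(40, 220, 300)
    obs_ws = 5 + 2 * np.sin(obs_t / 4) + 0.01 * obs_z + rng.normal(0, 0.5, 300)
    print(validation_stats(model_t, model_z, model_ws, obs_t, obs_z, obs_ws))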

  14. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Adaptive Kalman filtering for real-time mapping of the visual field

    PubMed Central

    Ward, B. Douglas; Janik, John; Mazaheri, Yousef; Ma, Yan; DeYoe, Edgar A.

    2013-01-01

    This paper demonstrates the feasibility of real-time mapping of the visual field for clinical applications. Specifically, three aspects of this problem were considered: (1) experimental design, (2) statistical analysis, and (3) display of results. Proper experimental design is essential to achieving a successful outcome, particularly for real-time applications. A random-block experimental design was shown to have less sensitivity to measurement noise, as well as greater robustness to error in modeling of the hemodynamic impulse response function (IRF) and greater flexibility than common alternatives. In addition, random encoding of the visual field allows for the detection of voxels that are responsive to multiple, not necessarily contiguous, regions of the visual field. Due to its recursive nature, the Kalman filter is ideally suited for real-time statistical analysis of visual field mapping data. An important feature of the Kalman filter is that it can be used for nonstationary time series analysis. The capability of the Kalman filter to adapt, in real time, to abrupt changes in the baseline arising from subject motion inside the scanner and other external system disturbances is important for the success of clinical applications. The clinician needs real-time information to evaluate the success or failure of the imaging run and to decide whether to extend, modify, or terminate the run. Accordingly, the analytical software provides real-time displays of (1) brain activation maps for each stimulus segment, (2) voxel-wise spatial tuning profiles, (3) time plots of the variability of response parameters, and (4) time plots of activated volume. PMID:22100663
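
    The paper's adaptive Kalman filter works on full visual-field mapping models; the sketch below is only a scalar illustration of the recursive idea, assuming a single activation coefficient that follows a slow random walk and an observation model y_t = x_t * beta_t + noise, where x_t is a stimulus regressor. All signals and noise levels are synthetic.

    """Sketch: recursive Kalman-filter update of one activation coefficient."""
    import numpy as np

    def kalman_regression(y, x, q=1e-4, r=1.0):
        """Return the filtered estimate of beta at every time point."""
        beta, p = 0.0, 1.0                      # state estimate and its variance
        out = np.empty(len(y))
        for t, (yt, xt) in enumerate(zip(y, x)):
            p = p + q                           # predict: random-walk state noise
            k = p * xt / (xt * p * xt + r)      # Kalman gain for a scalar observation
            beta = beta + k * (yt - xt * beta)  # update with the new sample
            p = (1.0 - k * xt) * p
            out[t] = beta
        return out

    rng = np.random.default_rng(7)
    x = (np.arange(200) % 20 < 10).astype(float)     # toy block stimulus regressor
    true_beta = np.linspace(1.0, 2.0, 200)           # slowly drifting response amplitude
    y = x * true_beta + rng.normal(0, 0.5, 200)
    print("final estimate:", kalman_regression(y, x)[-1])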

  16. Detecting seasonal and cyclical trends in agricultural runoff water quality-hypothesis tests and block bootstrap power analysis.

    PubMed

    Uddameri, Venkatesh; Singaraju, Sreeram; Hernandez, E Annette

    2018-02-21

    Seasonal and cyclic trends in nutrient concentrations at four agricultural drainage ditches were assessed using a dataset generated from a multivariate, multiscale, multiyear water quality monitoring effort in the agriculturally dominant Lower Rio Grande Valley (LRGV) River Watershed in South Texas. An innovative bootstrap sampling-based power analysis procedure was developed to evaluate the ability of Mann-Whitney and Noether tests to discern trends and to guide future monitoring efforts. The Mann-Whitney U test was able to detect significant changes between summer and winter nutrient concentrations at sites with lower depths and unimpeded flows. Pollutant dilution, non-agricultural loadings, and in-channel flow structures (weirs) masked the effects of seasonality. The detection of cyclical trends using the Noether test was highest in the presence of vegetation mainly for total phosphorus and oxidized nitrogen (nitrite + nitrate) compared to dissolved phosphorus and reduced nitrogen (total Kjeldahl nitrogen-TKN). Prospective power analysis indicated that while increased monitoring can lead to higher statistical power, the effect size (i.e., the total number of trend sequences within a time-series) had a greater influence on the Noether test. Both Mann-Whitney and Noether tests provide complementary information on seasonal and cyclic behavior of pollutant concentrations and are affected by different processes. The results from these statistical tests when evaluated in the context of flow, vegetation, and in-channel hydraulic alterations can help guide future data collection and monitoring efforts. The study highlights the need for long-term monitoring of agricultural drainage ditches to properly discern seasonal and cyclical trends.
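
    The study's procedure is more elaborate (block resampling, explicit effect sizes), so the following is only a simplified sketch of a resampling power analysis for the Mann-Whitney test: pilot "summer" and "winter" concentrations are resampled, the test is run on each resample, and power is the rejection fraction at alpha = 0.05. The pilot data here are invented lognormal values.

    """Sketch: bootstrap power of the Mann-Whitney U test for a seasonal shift."""
    import numpy as np
    from scipy.stats import mannwhitneyu

    def bootstrap_power(summer, winter, n_per_season, n_boot=2000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_boot):
            s = rng.choice(summer, size=n_per_season, replace=True)
            w = rng.choice(winter, size=n_per_season, replace=True)
            if mannwhitneyu(s, w, alternative="two-sided").pvalue < alpha:
                rejections += 1
        return rejections / n_boot

    rng = np.random.default_rng(42)
    summer = rng.lognormal(mean=0.0, sigma=0.6, size=30)   # toy pilot concentrations
    winter = rng.lognormal(mean=0.4, sigma=0.6, size=30)
    for n in (10, 20, 40):
        print(n, "samples/season -> power", bootstrap_power(summer, winter, n))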

  17. A framework for building hypercubes using MapReduce

    NASA Astrophysics Data System (ADS)

    Tapiador, D.; O'Mullane, W.; Brown, A. G. A.; Luri, X.; Huedo, E.; Osuna, P.

    2014-05-01

    The European Space Agency's Gaia mission will create the largest and most precise three dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalog will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of the deployment of the framework on a public cloud provider, benchmark against other popular solutions available (that are not always the best for such ad-hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis techniques workshops.

  18. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national-highway runoff and urban-stormwater data including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoffplanning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect-their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decisionmaking process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. 
To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need be documented and communicated in an accessible format within project publications.

  19. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work is commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.

  20. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing, but they require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm, the NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified as the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
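
    Following the idea in the abstract (not the author's exact procedure), the sketch below divides each observed concentration by its predicted value and reports the fraction of ratios falling inside specified ranges such as 0.8-1.2. The observed and predicted values are toy numbers.

    """Sketch: quantifying predictive performance by observed/predicted ratios."""
    import numpy as np

    def ratio_summary(observed, predicted, ranges=((0.8, 1.2), (0.5, 2.0))):
        ratios = np.asarray(observed, float) / np.asarray(predicted, float)
        summary = {f"{lo}-{hi}": float(np.mean((ratios >= lo) & (ratios <= hi)))
                   for lo, hi in ranges}
        return ratios, summary

    obs = [12.1, 8.4, 15.0, 6.2, 9.9, 11.3]          # hypothetical observed concentrations
    pred = [10.5, 9.0, 13.8, 8.1, 10.2, 12.0]        # hypothetical model predictions
    ratios, summary = ratio_summary(obs, pred)
    print("ratios:", np.round(ratios, 2))
    print("fraction within range:", summary)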

  1. Searching cause of death through different autopsy methods: A new initiative

    PubMed Central

    Das, Abhishek; Chowdhury, Ranadip

    2017-01-01

    Lawful disposal of a human dead body is only possible after establishment of a proper and valid cause of death. If the cause is obscure, autopsy is the only means of search. Inadequacy and unavailability of health care facilities often make this situation more complicated in developing countries, where many deaths remain unexplained and proper mortality statistics are missing, especially for infants and children. Tissue sampling by needle autopsy and the use of various imaging techniques in virtopsy have been tried globally to find an easier alternative. An exclusive and unique initiative, using limited autopsy through tissue biopsy and body fluid analysis, has been taken to meet this dire need in African and some Asian developing countries, as worldwide accepted institutional data are missing or conflicting at times. Traditional autopsy has changed little in the last century, consisting of external examination and evisceration, dissection of organs with identification of macroscopic pathologies and injuries, followed by histopathology. As some population groups have religious objections to autopsy, demand for minimally invasive alternatives has increased of late. But assessment of the cause of death is most important for medico-legal, epidemiological and research purposes. Thus minimally invasive techniques are of high importance in primary care settings too. In this article, we have made a journey through different autopsy methods, their relevance and their applicability from a modern-day perspective, considering scientific research articles, textbooks and interviews. PMID:29302514

  2. Validity of the SAT® for Predicting First-Year Grades: 2011 SAT Validity Sample. Statistical Report 2013-3

    ERIC Educational Resources Information Center

    Patterson, Brian F.; Mattern, Krista D.

    2013-01-01

    The continued accumulation of validity evidence for the intended uses of educational assessments is critical to ensure that proper inferences will be made for those purposes. To that end, the College Board has continued to collect college outcome data to evaluate the relationship between SAT® scores and college success. This report provides…

  3. Transitioning Florida NPs to opioid prescribing.

    PubMed

    Craig-Rodriguez, Alicia; Gordon, Glenna; Kaplan, Louise; Grubbs, Laurie

    2017-09-21

    Prior to statutory changes in prescriptive authority for controlled substances, this study examined the knowledge gaps and prescribing limitations of Florida advanced registered nurse practitioners regarding opioids. Study results revealed statistically significant knowledge gaps in the areas of federal and state guidelines; opioid classes and proper doses; risk assessment skills; monitoring of treatment; and confidence in dealing with challenges of opioid prescribing.

  4. Cross Contamination: Are Hospital Gloves Reservoirs for Nosocomial Infections?

    PubMed

    Moran, Vicki; Heuertz, Rita

    2017-01-01

    Use of disposable nonsterile gloves in the hospital setting is second only to proper hand washing in reducing contamination during patient contact. Because proper handwashing is not consistently practiced, added emphasis on glove use is warranted. There is a growing body of evidence that glove boxes and dispensers available to healthcare workers are contaminated by daily exposure to environmental organisms. This finding, in conjunction with new and emerging antibiotic-resistant bacteria, poses a threat to patients and healthcare workers alike. A newly designed glove dispenser may reduce contamination of disposable gloves. The authors investigated contamination of nonsterile examination gloves in an Emergency Department setting according to the type of dispenser used to access gloves. A statistically significant difference existed between the number of bacterial colonies and the type of dispenser: the downward-facing glove dispenser had a lower number of bacteria on the gloves. There was no statistically significant difference in the number of gloves contaminated between the two types of glove dispensers. The study demonstrated that contamination of disposable gloves existed. Additional research using a larger sample size would validate a difference in the contamination of disposable gloves using outward or downward glove dispensers.

  5. Removing an intersubject variance component in a general linear model improves multiway factoring of event-related spectral perturbations in group EEG studies.

    PubMed

    Spence, Jeffrey S; Brier, Matthew R; Hart, John; Ferree, Thomas C

    2013-03-01

    Linear statistical models are used very effectively to assess task-related differences in EEG power spectral analyses. Mixed models, in particular, accommodate more than one variance component in a multisubject study, where many trials of each condition of interest are measured on each subject. Generally, intra- and intersubject variances are both important to determine correct standard errors for inference on functions of model parameters, but it is often assumed that intersubject variance is the most important consideration in a group study. In this article, we show that, under common assumptions, estimates of some functions of model parameters, including estimates of task-related differences, are properly tested relative to the intrasubject variance component only. A substantial gain in statistical power can arise from the proper separation of variance components when there is more than one source of variability. We first develop this result analytically, then show how it benefits a multiway factoring of spectral, spatial, and temporal components from EEG data acquired in a group of healthy subjects performing a well-studied response inhibition task. Copyright © 2011 Wiley Periodicals, Inc.
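
    A minimal sketch of the kind of model the abstract argues for: a mixed model with a random intercept per subject, so that intersubject variance is captured by the grouping structure and the task effect is tested against intrasubject (residual) variability. The long-format data, column names and effect sizes are all invented.

    """Sketch: separating intra- and intersubject variance with a mixed model."""
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    subjects = np.repeat(np.arange(20), 60)                   # 20 subjects x 60 trials each
    condition = np.tile(np.r_[np.zeros(30), np.ones(30)], 20) # task vs control trials
    subject_offset = rng.normal(0, 2.0, 20)[subjects]         # intersubject variance component
    power = 10 + 0.5 * condition + subject_offset + rng.normal(0, 1.0, subjects.size)
    data = pd.DataFrame({"power": power, "condition": condition, "subject": subjects})

    # random intercept per subject; fixed effect of condition
    model = smf.mixedlm("power ~ condition", data, groups=data["subject"])
    fit = model.fit()
    print(fit.summary())          # condition effect tested against intrasubject variability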

  6. Does RAIM with Correct Exclusion Produce Unbiased Positions?

    PubMed Central

    Teunissen, Peter J. G.; Imparato, Davide; Tiberius, Christian C. J. M.

    2017-01-01

    As the navigation solution of exclusion-based RAIM follows from a combination of least-squares estimation and a statistically based exclusion-process, the computation of the integrity of the navigation solution has to take the propagated uncertainty of the combined estimation-testing procedure into account. In this contribution, we analyse, theoretically as well as empirically, the effect that this combination has on the first statistical moment, i.e., the mean, of the computed navigation solution. It will be shown, although statistical testing is intended to remove biases from the data, that biases will always remain under the alternative hypothesis, even when the correct alternative hypothesis is properly identified. The a posteriori exclusion of a biased satellite range from the position solution will therefore never remove the bias in the position solution completely. PMID:28672862

  7. Managing Large Datasets for Atmospheric Research

    NASA Technical Reports Server (NTRS)

    Chen, Gao

    2015-01-01

    Since the mid-1980s, airborne and ground measurements have been widely used to provide comprehensive characterization of atmospheric composition and processes. Field campaigns have generated a wealth of insitu data and have grown considerably over the years in terms of both the number of measured parameters and the data volume. This can largely be attributed to the rapid advances in instrument development and computing power. The users of field data may face a number of challenges spanning data access, understanding, and proper use in scientific analysis. This tutorial is designed to provide an introduction to using data sets, with a focus on airborne measurements, for atmospheric research. The first part of the tutorial provides an overview of airborne measurements and data discovery. This will be followed by a discussion on the understanding of airborne data files. An actual data file will be used to illustrate how data are reported, including the use of data flags to indicate missing data and limits of detection. Retrieving information from the file header will be discussed, which is essential to properly interpreting the data. Field measurements are typically reported as a function of sampling time, but different instruments often have different sampling intervals. To create a combined data set, the data merge process (interpolation of all data to a common time base) will be discussed in terms of the algorithm, data merge products available from airborne studies, and their application in research. Statistical treatment of missing data and data flagged for limit of detection will also be covered in this section. These basic data processing techniques are applicable to both airborne and ground-based observational data sets. Finally, the recently developed Toolsets for Airborne Data (TAD) will be introduced. TAD (tad.larc.nasa.gov) is an airborne data portal offering tools to create user defined merged data products with the capability to provide descriptive statistics and the option to treat measurement uncertainty.
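
    A minimal sketch of the data-merge step described above, under two assumptions that are conventions rather than facts about any particular file: each instrument reports (time, value) at its own cadence, and -9999 marks missing data (the real flag is documented in each file header). Flagged points are dropped before interpolating to a chosen merge time base, and merge times far from any real sample are blanked.

    """Sketch: merging two airborne instrument records onto a common time base."""
    import numpy as np

    MISSING = -9999.0

    def to_common_base(t_merge, t_inst, values, max_gap=30.0):
        """Interpolate one instrument to t_merge; leave NaN across large gaps."""
        t_inst, values = np.asarray(t_inst, float), np.asarray(values, float)
        ok = values != MISSING
        merged = np.interp(t_merge, t_inst[ok], values[ok])
        # blank out merge times farther than max_gap seconds from any real sample
        nearest = np.min(np.abs(t_merge[:, None] - t_inst[ok][None, :]), axis=1)
        merged[nearest > max_gap] = np.nan
        return merged

    t_merge = np.arange(0, 600, 10.0)                          # 10 s merge time base
    t_fast = np.arange(0, 600, 1.0)                            # 1 Hz instrument
    co = np.random.default_rng(0).normal(100, 5, t_fast.size)
    t_slow = np.arange(0, 600, 60.0)                           # 1-minute instrument with gaps
    o3 = np.where(np.random.default_rng(1).random(t_slow.size) < 0.2, MISSING,
                  np.random.default_rng(2).normal(40, 3, t_slow.size))
    print(to_common_base(t_merge, t_fast, co)[:5])
    print(to_common_base(t_merge, t_slow, o3)[:5])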

  8. Dietary diversity and associated factors among children of Orthodox Christian mothers/caregivers during the fasting season in Dejen District, North West Ethiopia.

    PubMed

    Kumera, Gemechu; Tsedal, Endalkachew; Ayana, Mulatu

    2018-01-01

    Proper feeding practices during early childhood are fundamental for optimal child growth and development. However, scientific evidence on the determinants of dietary diversity is scanty. In particular, the impact of fasting on children's dietary diversity has not been explored in Ethiopia. The aim of this study was to assess dietary diversity and associated factors among children aged 6-23 months, whose mothers/caregivers were Orthodox Christians, during the fasting season (Lent) in Dejen District, North West Ethiopia, 2016. A community based cross-sectional study was conducted during the fasting season from March to April, 2016. The study sample comprised children aged 6-23 months whose mothers/caregivers were Orthodox Christians. A systematic random sampling technique was used to select a sample of 967 children proportionally from all selected kebeles. Data were entered using Epi Data, and statistical analysis was done using logistic regression. A p-value < 0.05 at the 95% confidence interval was taken as statistically significant. Only 13.6% of children surveyed met the minimum requirement for dietary diversity. Unsatisfactory exposure to media [AOR = 5.22] and low household monthly income [AOR = 2.20] were negatively associated with dietary diversity. Compared with economic reasons, mothers/caregivers who did not feed diet of animal origin to their children due to fear of utensil contamination for family food preparation were 1.5 times [AOR = 1.5; 95% CI (1.05-2.53)] less likely to feed the recommended dietary diversity. The findings of this study revealed that the diet of children in the study area lacked diversity. Promoting mass media and the socioeconomic empowerment of women contribute positively to optimal child feeding practice. Sustained nutrition education to mothers regarding proper infant and young child feeding practice, in collaboration with the respective religious leaders, is highly recommended.

  9. Ages of the Bulge Globular Clusters NGC 6522 and NGC 6626 (M28) from HST Proper-motion-cleaned Color–Magnitude Diagrams

    NASA Astrophysics Data System (ADS)

    Kerber, L. O.; Nardiello, D.; Ortolani, S.; Barbuy, B.; Bica, E.; Cassisi, S.; Libralato, M.; Vieira, R. G.

    2018-01-01

    Bulge globular clusters (GCs) with metallicities [Fe/H] ≲ ‑1.0 and blue horizontal branches are candidates to harbor the oldest populations in the Galaxy. Based on the analysis of HST proper-motion-cleaned color–magnitude diagrams in filters F435W and F625W, we determine physical parameters for the old bulge GCs NGC 6522 and NGC 6626 (M28), both with well-defined blue horizontal branches. We compare these results with similar data for the inner halo cluster NGC 6362. These clusters have similar metallicities (‑1.3 ≤ [Fe/H] ≤ ‑1.0) obtained from high-resolution spectroscopy. We derive ages, distance moduli, and reddening values by means of statistical comparisons between observed and synthetic fiducial lines employing likelihood statistics and the Markov chain Monte Carlo method. The synthetic fiducial lines were generated using α-enhanced BaSTI and Dartmouth stellar evolutionary models, adopting both canonical (Y ∼ 0.25) and enhanced (Y ∼ 0.30–0.33) helium abundances. RR Lyrae stars were employed to determine the HB magnitude level, providing an independent indicator to constrain the apparent distance modulus and the helium enhancement. The shape of the observed fiducial line could be compatible with some helium enhancement for NGC 6522 and NGC 6626, but the average magnitudes of RR Lyrae stars tend to rule out this hypothesis. Assuming canonical helium abundances, BaSTI and Dartmouth models indicate that all three clusters are coeval, with ages between ∼12.5 and 13.0 Gyr. The present study also reveals that NGC 6522 has at least two stellar populations, since its CMD shows a significantly wide subgiant branch compatible with 14% ± 2% and 86% ± 5% for first and second generations, respectively. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute.

  10. Evaluation of the level of depression among medical students from Poland, Portugal and Germany.

    PubMed

    Seweryn, Mariusz; Tyrała, Kinga; Kolarczyk-Haczyk, Aleksandra; Bonk, Magdalena; Bulska, Weronika; Krysta, Krzysztof

    2015-09-01

    Depression is a serious illness affecting the health, family and professional life of many people in all sectors of society. It also concerns students, regardless of their geographical location. The Beck Depression Inventory (BDI) is a proper tool for a brief check of the level of depression because it correlates highly with depression. The aim of this study was to assess and compare the level of depression among medical students from Poland, Portugal and Germany. Students from the different countries were asked to fill in an electronic form containing the BDI. The form was created separately for each country, using the official translation of the BDI approved by the competent psychiatric association. Google Drive software was used for the electronic form, and StatSoft Statistica v10 software for statistical analysis. There were statistically significant differences (p<0.05) in the average BDI score and in the proportion of scores above 10 points between medical and technology students and among countries. The average BDI score of medical students was: Poland: 13.76±9.99 points; Germany: 8.49±7.64 points; Portugal: 7.37±7.67 points. The average BDI score of technology students was: Poland: 12.42±9.66 points; Germany: 10.51±8.49 points; Portugal: 9.25±8.97 points. The proportion of scores above 10 points among medical students was: Poland 56.32% (285/506), Germany 34.92% (154/441), Portugal 26.03% (82/315). The proportion of scores above 10 points among technology students was: Poland 55.01% (368/669), Germany 43.82% (156/356), Portugal 37.57% (136/362). The highest depression score among medical and technology students according to the BDI was found in Poland. Proper monitoring of depression is required, as well as rapid and appropriate help for those who suffer from it.

  11. Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2009-09-01

    In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may help explain the statistical features of the temporal variability of the phenomenon. Due to the evident periodicity of the physical process, these models should be applied only over short temporal intervals in which the occurrences and intensities of rainfall can be considered reliably homogeneous. To this aim, occurrences of daily rainfalls can be considered as a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models able to capture the intermittent nature of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfalls can be interpreted by using a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the models, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values in extracting the peak storm database from recorded daily rainfall heights. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. The procedure is applied to a dataset of rain gauges located in different geographical zones of the Mediterranean area. Time series have been selected on the basis of the availability of at least 50 years of data in the period 1921-1985, chosen as the calibration period, and of all the years of observation in the subsequent validation period 1986-2005, over which the variability of the daily rainfall occurrence process is tested. Firstly, for each time series and for each fixed threshold value, parameter estimation of the non-homogeneous Poisson model is carried out over the calibration period. As a second step, in order to test the hypothesis that the daily rainfall occurrence process preserves the same behaviour in more recent periods, the intensity distribution evaluated for the calibration period is also adopted for the validation period. Starting from this and using a Monte Carlo approach, 1000 synthetic generations of daily rainfall occurrences, of length equal to the validation period, have been carried out, and for each simulation the sample λ(t) has been evaluated. This procedure is adopted because of the complexity of determining analytical statistical confidence limits for the sample intensity λ(t). Finally, the sample intensity, the theoretical function of the calibration period and the 95% statistical band evaluated by the Monte Carlo approach are compared; in addition, for each threshold value, the mean square error (MSE) between the theoretical λ(t) and the sample one of the recorded data is compared with its corresponding 95% one-tailed statistical band, estimated from the MSE values between the sample λ(t) of each synthetic series and the theoretical one. The results obtained may be very useful in the context of the identification and calibration of stochastic rainfall models based on historical precipitation data. Further applications of the non-homogeneous Poisson model will concern the joint analysis of the storm occurrence process and the rainfall height marks, the latter interpreted by using a temporally homogeneous model in proper sub-year intervals.
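
    As a reading aid, the sketch below (Python) shows how a non-homogeneous Poisson occurrence process with a Fourier-series intensity λ(t) can be simulated by Lewis-Shedler thinning. The single-harmonic intensity and its coefficients are illustrative assumptions, not the parameters estimated in the study, and thinning is only one of several equivalent simulation routes (the paper's time-domain transformation is another).

      # Illustrative sketch: simulate occurrences from a non-homogeneous Poisson
      # process with a one-harmonic Fourier intensity, via Lewis-Shedler thinning.
      import numpy as np

      rng = np.random.default_rng(0)
      T = 365.0                                  # one year, in days

      def intensity(t, a0=0.25, a1=0.15, phi=40.0):
          # assumed one-harmonic intensity (events/day); coefficients are made up
          return a0 + a1 * np.cos(2 * np.pi * (t - phi) / T)

      lam_max = 0.40                             # upper bound on intensity(t)
      t, events = 0.0, []
      while True:
          t += rng.exponential(1.0 / lam_max)    # candidate from a homogeneous process
          if t > T:
              break
          if rng.uniform() < intensity(t) / lam_max:   # keep with prob lambda(t)/lam_max
              events.append(t)
      print(len(events), "occurrences; expected about", round(0.25 * T))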

  12. Statistical inference for extended or shortened phase II studies based on Simon's two-stage designs.

    PubMed

    Zhao, Junjun; Yu, Menggang; Feng, Xi-Ping

    2015-06-07

    Simon's two-stage designs are popular choices for conducting phase II clinical trials, especially in oncology trials, to reduce the number of patients placed on ineffective experimental therapies. Recently Koyama and Chen (2008) discussed how to conduct proper inference for such studies because they found that inference procedures used with Simon's designs almost always ignore the actual sampling plan used. In particular, they proposed an inference method for studies in which the actual second-stage sample sizes differ from the planned ones. We consider an alternative inference method based on the likelihood ratio. In particular, we order permissible sample paths under Simon's two-stage designs using their corresponding conditional likelihood. In this way, we can calculate p-values using the common definition: the probability of obtaining a test statistic value at least as extreme as that observed under the null hypothesis. In addition to providing inference for a couple of scenarios where Koyama and Chen's method can be difficult to apply, the resulting estimate based on our method appears to have certain advantages in terms of inference properties in many numerical simulations. It generally led to smaller biases and narrower confidence intervals while maintaining similar coverage. We also illustrated the two methods in a real data setting. Inference procedures used with Simon's designs almost always ignore the actual sampling plan. Reported p-values, point estimates and confidence intervals for the response rate are not usually adjusted for the design's adaptiveness. Proper statistical inference procedures should be used.
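
    For orientation, the sketch below (Python) computes the conventional p-value that respects Simon's two-stage sampling plan, summing over the stage-one outcomes that allow continuation; this is the kind of design-aware calculation discussed around Koyama and Chen, not the likelihood-ratio ordering proposed in this paper, and the design parameters and observed count are illustrative.

      # Sketch: design-aware p-value after completing stage 2 of a Simon two-stage trial.
      # Illustrative design: n1 patients in stage 1, stop if at most r1 responses,
      # otherwise enrol n2 more; p0 is the null response rate.
      from scipy.stats import binom

      def simon_p_value(x_total, n1, r1, n2, p0):
          """P(X1 + X2 >= x_total and trial continues | p0)."""
          p = 0.0
          for x1 in range(r1 + 1, n1 + 1):                 # stage-1 outcomes that continue
              tail2 = binom.sf(x_total - x1 - 1, n2, p0)   # P(X2 >= x_total - x1)
              p += binom.pmf(x1, n1, p0) * tail2
          return p

      print(simon_p_value(7, n1=13, r1=1, n2=30, p0=0.10))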

  13. INTERNAL PROPER MOTIONS IN THE ESKIMO NEBULA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Díaz, Ma. T.; Gutiérrez, L.; Steffen, W.

    We present measurements of internal proper motions at more than 500 positions of NGC 2392, the Eskimo Nebula, based on images acquired with WFPC2 on board the Hubble Space Telescope at two epochs separated by 7.695 yr. Comparisons of the two observations clearly show the expansion of the nebula. We measured the amplitude and direction of the motion of local structures in the nebula by determining their relative shift during that interval. In order to assess the potential uncertainties in the determination of proper motions in this object, in general, the measurements were performed using two different methods, used previously in the literature. We compare the results from the two methods, and to perform the scientific analysis of the results we choose one, the cross-correlation method, because it is more reliable. We go on to perform a "criss-cross" mapping analysis on the proper motion vectors, which helps in the interpretation of the velocity pattern. By combining our results of the proper motions with radial velocity measurements obtained from high resolution spectroscopic observations, and employing an existing 3D model, we estimate the distance to the nebula to be 1.3 kpc.

  14. Food waste prevention in Athens, Greece: The effect of family characteristics.

    PubMed

    Abeliotis, Konstadinos; Lasaridi, Katia; Chroni, Christina

    2016-12-01

    Food waste is a stream that is becoming increasingly important in terms of its prevention potential. There is a large number of behaviours that can be associated with food waste generation and with efforts towards food waste prevention. A questionnaire study was carried out in order to study consumer behaviour related to food provision and wastage in Greece. Proper practices of the respondents that can prevent the generation of food waste were investigated using nine behavioural scales, which were defined on the basis of similar studies in other countries. A structured questionnaire was utilised in order to test those behaviours against the socio-demographic characteristics of respondents. The results of the study indicate that, in terms of inferential statistical analysis, among the numerous variables examined, those that enhance food waste prevention are the involvement of the respondent in cooking, the annoyance towards food waste generation and the education level. © The Author(s) 2016.

  15. Review of the results of the in vivo dosimetry during total skin electron beam therapy

    PubMed Central

    Guidi, Gabriele; Gottardi, Giovanni; Ceroni, Paola; Costi, Tiziana

    2013-01-01

    This work reviews results of in vivo dosimetry (IVD) for total skin electron beam (TSEB) therapy, focusing on new methods and data that emerged through 2012. All quoted data are based on a careful review of the literature reporting IVD results for patients treated by means of TSEB therapy. Many of the reviewed papers refer mainly to older studies and/or older guidelines and recommendations (by IAEA, AAPM and EORTC), because, due to the intrinsic rareness of TSEB-treated pathologies, only a limited number of works and reports with a large set of numerical data and proper statistical analysis is available to date in the scientific literature. Nonetheless, a general summary of the results obtained by the now numerous IVD techniques available is reported; innovative devices and methods, together with areas of possible further and possibly multicenter investigation for TSEB therapies, are highlighted. PMID:24936333

  16. Hidden scaling patterns and universality in written communication

    NASA Astrophysics Data System (ADS)

    Formentin, M.; Lovison, A.; Maritan, A.; Zanzotto, G.

    2014-07-01

    The temporal statistics exhibited by written correspondence appear to be media dependent, with features which have so far proven difficult to characterize. We explain the origin of these difficulties by disentangling the role of spontaneous activity from decision-based prioritizing processes in human dynamics, clocking all waiting times through each agent's "proper time" measured by activity. This unveils the same fundamental patterns in written communication across all media (letters, email, sms), with response times displaying truncated power-law behavior and average exponents near -3/2. When standard time is used, the response time probabilities are theoretically predicted to exhibit a bimodal character, which is empirically borne out by our newly collected years-long data on email. These perspectives on the temporal dynamics of human correspondence should aid in the analysis of interaction phenomena in general, including resource management, optimal pricing and routing, information sharing, and emergency handling.

  17. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.

    PubMed

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-01-01

    This paper discusses methods for the assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phased study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures the validity of the assessment, as it separates the influences of developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer to properly reveal the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.

  18. Ratio-based lengths of intervals to improve fuzzy time series forecasting.

    PubMed

    Huarng, Kunhuang; Yu, Tiffany Hui-Kuang

    2006-04-01

    The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.
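
    One plausible reading of "ratio-based lengths" is that interval widths grow geometrically with the magnitude of the observations rather than staying equal. The sketch below (Python) implements that assumed interpretation with an arbitrary 10% ratio and an enrolment-like range; it is an illustration, not the authors' algorithm.

      # Assumed interpretation: interval boundaries whose widths grow by a fixed ratio.
      def ratio_based_intervals(lower, upper, ratio=0.10):
          """Boundaries where each interval width is `ratio` times its lower bound (lower > 0)."""
          bounds = [lower]
          while bounds[-1] < upper:
              bounds.append(bounds[-1] * (1.0 + ratio))
          return bounds

      # Example: enrolment-like data spanning 13000-20000.
      print([round(b) for b in ratio_based_intervals(13000, 20000, ratio=0.10)])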

  19. Fluorescent characteristics of estrogenic compounds in landfill leachate.

    PubMed

    Zhang, Hua; Chang, Cheng-Hsuan; Lü, Fan; Su, Ay; Lee, Duu-Jong; He, Pin-Jing; Shao, Li-Ming

    2009-08-01

    Estrogens in landfill leachate could contaminate receiving water sources if the leachate is not properly polished before discharge. This work measured, using an estrogen receptor-alpha competitor screening assay, the estrogenic potentials of leachate samples collected at a local sanitary landfill in Shanghai, China and of their compounds fractionated by molecular weight. The chemical structures of the constituent compounds were characterized using fluorescence excitation and emission matrices (EEM). The organic matter of molecular weight <600 Da and of 3000-14,000 Da contributed most of the estrogenic potential of the raw leachates. The former was considered to comprise typical endocrine disrupting compounds in the dissolved state, while the latter comprised fulvic acids with high aromaticity onto which estrogens were readily adsorbed (bound state). Statistical analysis of the EEM peaks revealed that the chemical structures of the noted estrogens in the dissolved state and in the bound state were not identical. Aerobic treatment effectively removed dissolved estrogens, but barely removed the bound estrogens.

  20. Study on the impulsive pressure of tank oscillating by force towards multiple degrees of freedom

    NASA Astrophysics Data System (ADS)

    Hibi, Shigeyuki

    2018-06-01

    Impulsive loads are excited by nonlinear phenomena with severely fluctuating free surfaces, such as sloshing and slamming. Estimating impulsive loads properly is important for recent numerical simulations, but it is still difficult to rely completely on simulation results because of the nonlinearity of the phenomena. In order to develop numerical simulation algorithms, experimental results on nonlinear phenomena are needed. In this study an apparatus which can oscillate a tank by force was introduced in order to investigate impulsive pressure on the wall of the tank. The apparatus can oscillate the tank in 3 degrees of freedom simultaneously, with arbitrary phase differences. The impulsive pressure under various combinations of oscillation directions was examined, and the specific phase differences at which the largest peak pressures appear were identified. Experimental results were verified through FFT analysis and statistical methods.

  1. Rayleigh-Taylor mixing in supernova experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swisher, N. C.; Abarzhi, S. I., E-mail: snezhana.abarzhi@gmail.com; Kuranz, C. C.

    We report a scrupulous analysis of data in supernova experiments that are conducted at high power laser facilities in order to study core-collapse supernova SN1987A. Parameters of the experimental system are properly scaled to investigate the interaction of a blast-wave with helium-hydrogen interface, and the induced Rayleigh-Taylor instability and Rayleigh-Taylor mixing of the denser and lighter fluids with time-dependent acceleration. We analyze all available experimental images of the Rayleigh-Taylor flow in supernova experiments and measure delicate features of the interfacial dynamics. A new scaling is identified for calibration of experimental data to enable their accurate analysis and comparisons. By properly accounting for the imprint of the experimental conditions, the data set size and statistics are substantially increased. New theoretical solutions are reported to describe asymptotic dynamics of Rayleigh-Taylor flow with time-dependent acceleration by applying theoretical analysis that considers symmetries and momentum transport. Good qualitative and quantitative agreement is achieved of the experimental data with the theory and simulations. Our study indicates that in supernova experiments Rayleigh-Taylor flow is in the mixing regime, the interface amplitude contributes substantially to the characteristic length scale for energy dissipation; Rayleigh-Taylor mixing keeps order.

  2. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel from NATO Science and Technology Organization, with the Task Group number AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
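
    The basic sampling step described above can be pictured with a generic Monte Carlo propagation sketch (Python). The flutter-speed surrogate and the input distributions below are hypothetical placeholders, standing in for an expensive aeroelastic solve; they are not the S4T analysis or its actual parameter statistics.

      # Generic sketch of sampling-based uncertainty propagation; the response
      # function is a hypothetical placeholder for an aeroelastic analysis.
      import numpy as np

      rng = np.random.default_rng(1)
      n_samples = 2000

      # assumed input uncertainties: stiffness and dynamic-pressure scale factors
      stiffness = rng.normal(1.00, 0.05, n_samples)
      pressure = rng.normal(1.00, 0.03, n_samples)

      def flutter_speed(k, q):
          # placeholder surrogate, not a real aeroelastic model
          return 250.0 * np.sqrt(k) / q

      samples = flutter_speed(stiffness, pressure)
      print(f"mean = {samples.mean():.1f}, std = {samples.std(ddof=1):.1f}, "
            f"5th-95th pct = [{np.percentile(samples, 5):.1f}, {np.percentile(samples, 95):.1f}]")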

  3. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single-particle level in bulk due to the lack of proper analytic tools. Here we present a global spectral phasor analysis method which allows investigation of NRs' spectra at the single-particle level, including their statistical behavior and spatial information during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in the corresponding spatial images with high spectral resolution. Furthermore, the spectral behavior of NRs under different imaging conditions, e.g. different excitation powers and wavelengths, was carefully examined by our laser-scanning multiphoton microscope with spectral imaging capability. Our results prove that the spectral phasor method is an easy and efficient tool in hyper-spectral imaging analysis to unravel subtle changes of the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
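
    The spectral phasor transform itself is compact: each spectrum is mapped to a point (G, S) given by its first cosine and sine Fourier coefficients normalized by total intensity. The sketch below (Python) shows this generic transform on two synthetic spectra; it is not the authors' implementation or data.

      # Generic spectral-phasor transform: spectrum -> (G, S) coordinates.
      import numpy as np

      def spectral_phasor(spectra, harmonic=1):
          """spectra: (n_pixels, n_channels) array of non-negative intensities."""
          n = spectra.shape[1]
          k = np.arange(n)
          cos = np.cos(2 * np.pi * harmonic * k / n)
          sin = np.sin(2 * np.pi * harmonic * k / n)
          total = spectra.sum(axis=1)
          return spectra @ cos / total, spectra @ sin / total

      # Example: two synthetic Gaussian emission spectra peaking in different channels.
      chan = np.arange(32)
      spec = np.vstack([np.exp(-0.5 * ((chan - 10) / 3) ** 2),
                        np.exp(-0.5 * ((chan - 22) / 3) ** 2)])
      print(spectral_phasor(spec))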

  4. Quantitative image analysis for evaluating the coating thickness and pore distribution in coated small particles.

    PubMed

    Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K

    2009-04-01

    This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with Confocal Laser Scanning Microscopy (CLSM). The coating thicknesses have been determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of porosity, thickness and pore size distribution of a coating. These parameters are considered the important coating properties that are critical to coating functionality. Additionally, the effect of coating process variations on coating quality can straightforwardly be assessed. Enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which ultimately supports process tailoring.

  5. Truly random number generation: an example

    NASA Astrophysics Data System (ADS)

    Frauchiger, Daniela; Renner, Renato

    2013-10-01

    Randomness is crucial for a variety of applications, ranging from gambling to computer simulations, and from cryptography to statistics. However, many of the currently used methods for generating randomness do not meet the criteria that are necessary for these applications to work properly and safely. A common problem is that a sequence of numbers may look random but nevertheless not be truly random. In fact, the sequence may pass all standard statistical tests and yet be perfectly predictable. This renders it useless for many applications. For example, in cryptography, the predictability of a "randomly" chosen password is obviously undesirable. Here, we review a recently developed approach to generating true, and hence unpredictable, randomness.

  6. Variability aware compact model characterization for statistical circuit design optimization

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Qian, Kun; Spanos, Costas J.

    2012-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose an efficient variability-aware compact model characterization methodology based on the linear propagation of variance. Hierarchical spatial variability patterns of selected compact model parameters are directly calculated from transistor array test structures. This methodology has been implemented and tested using transistor I-V measurements and the EKV-EPFL compact model. Calculation results compare well to full-wafer direct model parameter extractions. Further studies are done on the proper selection of both compact model parameters and electrical measurement metrics used in the method.
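
    Linear propagation of variance approximates the output variance of a model y = f(p) as J C J^T, where J is the row of sensitivities of y to the parameters p and C is the parameter covariance. The sketch below (Python) illustrates this with a simple square-law drain-current placeholder and assumed parameter variances; it is not the EKV-EPFL model used in the study.

      # First-order propagation of variance: var(y) ~ J * Cov(p) * J^T.
      # The square-law drain-current model is a placeholder, not EKV.
      import numpy as np

      def drain_current(p, vgs=1.0):
          vth, beta = p
          return 0.5 * beta * (vgs - vth) ** 2

      p0 = np.array([0.4, 2e-4])                        # nominal Vth [V], beta [A/V^2]
      cov_p = np.diag([0.02 ** 2, (0.05 * 2e-4) ** 2])  # assumed parameter variances

      # numerical sensitivities (central differences)
      eps = 1e-6 * np.maximum(np.abs(p0), 1e-12)
      J = np.array([(drain_current(p0 + dp) - drain_current(p0 - dp)) / (2 * e)
                    for dp, e in zip(np.diag(eps), eps)])

      var_id = J @ cov_p @ J
      print(f"Id = {drain_current(p0):.3e} A, sigma(Id) = {np.sqrt(var_id):.3e} A")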

  7. A deep proper motion catalog within the Sloan digital sky survey footprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munn, Jeffrey A.; Harris, Hugh C.; Tilleman, Trudy M.

    2014-12-01

    A new proper motion catalog is presented, combining the Sloan Digital Sky Survey (SDSS) with second epoch observations in the r band within a portion of the SDSS imaging footprint. The new observations were obtained with the 90prime camera on the Steward Observatory Bok 90 inch telescope, and the Array Camera on the U.S. Naval Observatory, Flagstaff Station, 1.3 m telescope. The catalog covers 1098 square degrees to r = 22.0, an additional 1521 square degrees to r = 20.9, plus a further 488 square degrees of lesser quality data. Statistical errors in the proper motions range from 5 mas year⁻¹ at the bright end to 15 mas year⁻¹ at the faint end, for a typical epoch difference of six years. Systematic errors are estimated to be roughly 1 mas year⁻¹ for the Array Camera data, and as much as 2–4 mas year⁻¹ for the 90prime data (though typically less). The catalog also includes a second epoch of r band photometry.

  8. Determination of priority among air pollution factors in preventing COPD in residents of Shanghai City proper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xuguang, Tao; Chuan-jie, Hong; Shun-zhang, Yu

    The purpose of our study is to determine the priority among ambient sulphur dioxide (SO₂), inhalable particulates (IP), and indoor use of coal for cooking or heating in preventing COPD in residents of the Shanghai city proper. We describe the spatial and temporal distribution of the concentrations of ambient SO₂ and IP, and of the proportion of families who use coal (1980-1985), by the trend surface simulating method and other statistics. Stratified by two sets of extreme levels of ambient SO₂, IP, and the proportion of coal-using families, we selected eight groups with different combinations of exposure levels. We analyzed the relationship between air pollution factors and their health effects at the levels of mortality (1978-1987, 232,459 person years), prevalence (1987, 12,037 persons), and lung function and local immunologic function (1987, 514 women) with logistic and stepwise regression, and ridit analysis. After controlling for possible confounders, e.g., tobacco smoking and occupational exposure, we found that indoor use of coal is a more important risk factor than ambient SO₂ and IP. We then used canonical correlation analysis to evaluate the overall exposure-effect relationship between one set of air pollution and confounding factors and the other set of health effect indices. High correlation is found between the two. The indoor use of coal is more important for the overall health effects than ambient SO₂ and IP; a change from coal to gas could reduce the environmental exposure canonical variable more readily, with an effect equivalent to a reduction of 0.1839 mg/m³ in ambient SO₂ concentration, or 0.2806 mg/m³ in ambient IP concentration.

  9. Stratified randomization controls better for batch effects in 450K methylation analysis: a cautionary tale.

    PubMed

    Buhule, Olive D; Minster, Ryan L; Hawley, Nicola L; Medvedovic, Mario; Sun, Guangyun; Viali, Satupaitea; Deka, Ranjan; McGarvey, Stephen T; Weeks, Daniel E

    2014-01-01

    Batch effects in DNA methylation microarray experiments can lead to spurious results if not properly handled during the plating of samples. Two pilot studies examining the association of DNA methylation patterns across the genome with obesity in Samoan men were investigated for chip- and row-specific batch effects. For each study, the DNA of 46 obese men and 46 lean men were assayed using Illumina's Infinium HumanMethylation450 BeadChip. In the first study (Sample One), samples from obese and lean subjects were examined on separate chips. In the second study (Sample Two), the samples were balanced on the chips by lean/obese status, age group, and census region. We used methylumi, watermelon, and limma R packages, as well as ComBat, to analyze the data. Principal component analysis and linear regression were, respectively, employed to identify the top principal components and to test for their association with the batches and lean/obese status. To identify differentially methylated positions (DMPs) between obese and lean males at each locus, we used a moderated t-test. Chip effects were effectively removed from Sample Two but not Sample One. In addition, dramatic differences were observed between the two sets of DMP results. After "removing" batch effects with ComBat, Sample One had 94,191 probes differentially methylated at a q-value threshold of 0.05 while Sample Two had zero differentially methylated probes. The disparate results from Sample One and Sample Two likely arise due to the confounding of lean/obese status with chip and row batch effects. Even the best possible statistical adjustments for batch effects may not completely remove them. Proper study design is vital for guarding against spurious findings due to such effects.
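
    The diagnostic step described above (checking whether leading principal components associate with batch labels) is easy to sketch. The study itself used R packages (methylumi, watermelon, limma, ComBat); the Python sketch below works on a simulated matrix with an injected chip effect and is only a schematic of the idea.

      # Schematic check: do top principal components associate with chip (batch) labels?
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      n_samples, n_probes, n_chips = 92, 500, 8
      chip = np.repeat(np.arange(n_chips), n_samples // n_chips + 1)[:n_samples]

      # simulated beta-value-like matrix with a small injected chip effect
      data = rng.normal(0.5, 0.1, (n_samples, n_probes)) + chip[:, None] * 0.01

      centered = data - data.mean(axis=0)
      u, s, vt = np.linalg.svd(centered, full_matrices=False)
      pcs = u[:, :3] * s[:3]                        # top 3 principal components

      for i in range(3):
          groups = [pcs[chip == c, i] for c in range(n_chips)]
          f, p = stats.f_oneway(*groups)            # one-way ANOVA: PC vs. chip
          print(f"PC{i + 1}: F = {f:.1f}, p = {p:.2g}")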

  10. Stratified randomization controls better for batch effects in 450K methylation analysis: a cautionary tale

    PubMed Central

    Buhule, Olive D.; Minster, Ryan L.; Hawley, Nicola L.; Medvedovic, Mario; Sun, Guangyun; Viali, Satupaitea; Deka, Ranjan; McGarvey, Stephen T.; Weeks, Daniel E.

    2014-01-01

    Background: Batch effects in DNA methylation microarray experiments can lead to spurious results if not properly handled during the plating of samples. Methods: Two pilot studies examining the association of DNA methylation patterns across the genome with obesity in Samoan men were investigated for chip- and row-specific batch effects. For each study, the DNA of 46 obese men and 46 lean men were assayed using Illumina's Infinium HumanMethylation450 BeadChip. In the first study (Sample One), samples from obese and lean subjects were examined on separate chips. In the second study (Sample Two), the samples were balanced on the chips by lean/obese status, age group, and census region. We used methylumi, watermelon, and limma R packages, as well as ComBat, to analyze the data. Principal component analysis and linear regression were, respectively, employed to identify the top principal components and to test for their association with the batches and lean/obese status. To identify differentially methylated positions (DMPs) between obese and lean males at each locus, we used a moderated t-test. Results: Chip effects were effectively removed from Sample Two but not Sample One. In addition, dramatic differences were observed between the two sets of DMP results. After “removing” batch effects with ComBat, Sample One had 94,191 probes differentially methylated at a q-value threshold of 0.05 while Sample Two had zero differentially methylated probes. The disparate results from Sample One and Sample Two likely arise due to the confounding of lean/obese status with chip and row batch effects. Conclusion: Even the best possible statistical adjustments for batch effects may not completely remove them. Proper study design is vital for guarding against spurious findings due to such effects. PMID:25352862

  11. Randomized Trials Built on Sand: Examples from COPD, Hormone Therapy, and Cancer

    PubMed Central

    Suissa, Samy

    2012-01-01

    The randomized controlled trial is the fundamental study design to evaluate the effectiveness of medications and receive regulatory approval. Observational studies, on the other hand, are essential to address post-marketing drug safety issues but have also been used to uncover new indications or new benefits for already marketed drugs. Hormone replacement therapy (HRT) for instance, effective for menopausal symptoms, was reported in several observational studies during the 1980s and 1990s to also significantly reduce the incidence of coronary heart disease. This claim was refuted in 2002 by the large-scale Women’s Health Initiative randomized trial. An example of a new indication for an old drug is that of metformin, an anti-diabetic medication, which is being hailed as a potential anti-cancer agent, primarily on the basis of several recent observational studies that reported impressive reductions in cancer incidence and mortality with its use. These observational studies have now sparked the conduct of large-scale randomized controlled trials currently ongoing in cancer. We show in this paper that the spectacular effects on new indications or new outcomes reported in many observational studies in chronic obstructive pulmonary disease (COPD), HRT, and cancer are the result of time-related biases, such as immortal time bias, that tend to seriously exaggerate the benefits of a drug and that eventually disappear with the proper statistical analysis. In all, while observational studies are central to assess the effects of drugs, their proper design and analysis are essential to avoid bias. The scientific evidence on the potential beneficial effects in new indications of existing drugs will need to be more carefully assessed before embarking on long and expensive unsubstantiated trials. PMID:23908838

  12. Pulse amplitude of intracranial pressure waveform in hydrocephalus.

    PubMed

    Czosnyka, Z; Keong, N; Kim, D J; Radolovich, D; Smielewski, P; Lavinio, A; Schmidt, E A; Momjian, S; Owler, B; Pickard, J D; Czosnyka, M

    2008-01-01

    There is increasing interest in evaluation of the pulse amplitude of intracranial pressure (AMP) in explaining dynamic aspects of hydrocephalus. We reviewed a large number of ICP recordings in a group of hydrocephalic patients to assess the utility of AMP. Pressure recordings were evaluated from a database including approximately 2,100 cases of infusion studies (either lumbar or intraventricular) and overnight ICP monitoring in patients suffering from hydrocephalus of various types (both communicating and non-communicating), etiologies and stages of management (non-shunted or shunted). For subgroup analysis we selected 60 patients with idiopathic NPH with full follow-up after shunting. In 29 patients we compared pulse amplitude during an infusion study performed before and after shunting with a properly functioning shunt. Amplitude was calculated from ICP waveforms using spectral analysis methodology. A large amplitude was associated with good outcome after shunting (positive predictive value of clinical improvement for AMP above 2.5 mmHg was 95%). However, low amplitude did not predict poor outcome (for AMP below 2.5 mmHg, 52% of patients improved). Correlations of AMP with ICP and Rcsf were positive and statistically significant (N = 131 with idiopathic NPH; R = 0.21 for correlation with mean ICP and 0.22 with Rcsf; p < 0.01). Correlation with the brain elastance coefficient (or PVI) was not significant. There was also no significant correlation between pulse amplitude and width of the ventricles. The pulse amplitude decreased (p < 0.005) after shunting. Interpretation of the ICP pulse waveform may be clinically useful in patients suffering from hydrocephalus. Elevated amplitude seems to be a positive predictor for clinical improvement after shunting. A properly functioning shunt reduces the pulse amplitude.

  13. The questioned p value: clinical, practical and statistical significance.

    PubMed

    Jiménez-Paneque, Rosa

    2016-09-09

    The use of the p-value and statistical significance has been questioned since the early 1980s and continues to be questioned today. Much has been discussed about it in the field of statistics and its applications, especially in Epidemiology and Public Health. As a matter of fact, the p-value and its equivalent, statistical significance, are difficult concepts to grasp for the many health professionals in some way involved in research applied to their work areas. However, their meaning should be clear in intuitive terms even though they are based on theoretical concepts from the field of Statistics. This paper attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of inherent complexity. The reasons behind the criticism received by the p-value and its isolated use are intuitively explained, mainly the need to demarcate statistical significance from clinical significance, and some of the recommended remedies for these problems are approached as well. It finally refers to the current trend to vindicate the p-value, appealing to the convenience of its use in certain situations, and to the recent statement of the American Statistical Association in this regard.

  14. Spatial variation of statistical properties of extreme water levels along the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Pindsoo, Katri; Soomere, Tarmo; Rocha, Eugénio

    2016-04-01

    Most existing projections of future extreme water levels rely on the use of classic generalised extreme value distributions. The choice to use a particular distribution is often made based on the absolute value of the shape parameter of the Generalised Extreme Value distribution. If this parameter is small, the Gumbel distribution is most appropriate, while in the opposite case the Weibull or Fréchet distribution could be used. We demonstrate that the alongshore variation in the statistical properties of numerically simulated high water levels along the eastern coast of the Baltic Sea is so large that the use of a single distribution for projections of extreme water levels is highly questionable. The analysis is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. The output of the Rossby Centre Ocean model is sampled with a resolution of 6 h and the output of the circulation model NEMO with a resolution of 1 h. As the maxima of water levels in subsequent years may be correlated in the Baltic Sea, we also employ maxima for stormy seasons. We provide a detailed analysis of the spatial variation of the parameters of the family of extreme value distributions along an approximately 600 km long coastal section from the north-western shore of Latvia in the Baltic Proper to the eastern Gulf of Finland. The parameters are evaluated using the maximum likelihood method and the method of moments. The analysis also covers the entire Gulf of Riga. The core parameter of this family of distributions, the shape parameter of the Generalised Extreme Value distribution, exhibits extensive variation in the study area. Its values, evaluated using the Hydrognomon software and the maximum likelihood method, vary from about -0.1 near the north-western coast of Latvia in the Baltic Proper up to about 0.05 in the eastern Gulf of Finland. This parameter is very close to zero near Tallinn in the western Gulf of Finland. Thus, it is natural that the Gumbel distribution gives adequate projections of extreme water levels for the vicinity of Tallinn. More importantly, this feature indicates that the use of a single distribution for the projections of extreme water levels and their return periods for the entire Baltic Sea coast is inappropriate. The physical reason is the interplay of the complex shape of large subbasins of the sea (such as the Gulf of Riga and Gulf of Finland) and a highly anisotropic wind regime. The 'impact' of this anisotropy on the statistics of water level is amplified by the overall anisotropy of the distributions of the frequency of occurrence of high and low water levels. The most important conjecture is that the long-term behaviour of water level extremes in different coastal sections of the Baltic Sea may be fundamentally different.
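
    The fitting step behind such shape-parameter estimates can be sketched with standard tools. The example below (Python) fits a Generalised Extreme Value distribution to simulated annual maxima and reports a 100-year return level; the data are synthetic, and note that scipy's shape parameter c equals minus the conventional xi, so a small |c| again points towards the Gumbel limit.

      # Sketch: GEV fit to simulated annual maxima and a 100-year return level.
      import numpy as np
      from scipy.stats import genextreme, gumbel_r

      rng = np.random.default_rng(3)
      annual_max = gumbel_r.rvs(loc=80.0, scale=25.0, size=60, random_state=rng)  # cm, synthetic

      c, loc, scale = genextreme.fit(annual_max)
      print(f"shape c = {c:.3f}, location = {loc:.1f}, scale = {scale:.1f}")

      # 100-year return level: quantile with non-exceedance probability 1 - 1/100
      print(f"100-year level ~ {genextreme.ppf(1 - 1 / 100, c, loc, scale):.1f} cm")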

  15. Pesticides and public health: an analysis of the regulatory approach to assessing the carcinogenicity of glyphosate in the European Union.

    PubMed

    Clausing, Peter; Robinson, Claire; Burtscher-Schaden, Helmut

    2018-03-13

    The present paper scrutinises the European authorities' assessment of the carcinogenic hazard posed by glyphosate based on Regulation (EC) 1272/2008. We use the authorities' own criteria as a benchmark to analyse their weight of evidence (WoE) approach. Therefore, our analysis goes beyond the comparison of the assessments made by the European Food Safety Authority and the International Agency for Research on Cancer published by others. We show that the decision not to classify glyphosate as a carcinogen by the European authorities, including the European Chemicals Agency, appears not to be consistent with, and in some instances directly violates, the applicable guidance and guideline documents. In particular, we criticise an arbitrary attenuation by the authorities of the power of statistical analyses; their disregard of existing dose-response relationships; their unjustified claim that the doses used in the mouse carcinogenicity studies were too high; and their contention that the carcinogenic effects were not reproducible, which focuses on quantitative and neglects qualitative reproducibility. Further aspects incorrectly used were historical control data, multisite responses and progression of lesions to malignancy. Contrary to the authorities' evaluations, proper application of statistical methods and WoE criteria inevitably leads to the conclusion that glyphosate is 'probably carcinogenic' (corresponding to category 1B in the European Union). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Technical tips and advancements in pediatric minimally invasive surgical training on porcine based simulations.

    PubMed

    Narayanan, Sarath Kumar; Cohen, Ralph Clinton; Shun, Albert

    2014-06-01

    Minimal access techniques have transformed the way pediatric surgery is practiced. Due to various constraints, surgical residency programs have not been able to tutor adequate training skills in the routine setting. The advent of new technology and methods in minimally invasive surgery (MIS), has similarly contributed to the need for systematic skills' training in a safe, simulated environment. To enable the training of the proper technique among pediatric surgery trainees, we have advanced a porcine non-survival model for endoscopic surgery. The technical advancements over the past 3 years and a subjective validation of the porcine model from 114 participating trainees using a standard questionnaire and a 5-point Likert scale have been described here. Mean attitude scores and analysis of variance (ANOVA) were used for statistical analysis of the data. Almost all trainees agreed or strongly agreed that the animal-based model was appropriate (98.35%) and also acknowledged that such workshops provided adequate practical experience before attempting on human subjects (96.6%). Mean attitude score for respondents was 19.08 (SD 3.4, range 4-20). Attitude scores showed no statistical association with years of experience or the level of seniority, indicating a positive attitude among all groups of respondents. Structured porcine-based MIS training should be an integral part of skill acquisition for pediatric surgery trainees and the experience gained can be transferred into clinical practice. We advocate that laparoscopic training should begin in a controlled workshop setting before procedures are attempted on human patients.

  17. Forensic Hair Differentiation Using Attenuated Total Reflection Fourier Transform Infrared (ATR FT-IR) Spectroscopy.

    PubMed

    Manheim, Jeremy; Doty, Kyle C; McLaughlin, Gregory; Lednev, Igor K

    2016-07-01

    Hair and fibers are common forms of trace evidence found at crime scenes. The current methodology of microscopic examination of potential hair evidence is absent of statistical measures of performance, and examiner results for identification can be subjective. Here, attenuated total reflection (ATR) Fourier transform-infrared (FT-IR) spectroscopy was used to analyze synthetic fibers and natural hairs of human, cat, and dog origin. Chemometric analysis was used to differentiate hair spectra from the three different species, and to predict unknown hairs to their proper species class, with a high degree of certainty. A species-specific partial least squares discriminant analysis (PLSDA) model was constructed to discriminate human hair from cat and dog hairs. This model was successful in distinguishing between the three classes and, more importantly, all human samples were correctly predicted as human. An external validation resulted in zero false positive and false negative assignments for the human class. From a forensic perspective, this technique would be complementary to microscopic hair examination, and in no way replace it. As such, this methodology is able to provide a statistical measure of confidence to the identification of a sample of human, cat, and dog hair, which was called for in the 2009 National Academy of Sciences report. More importantly, this approach is non-destructive, rapid, can provide reliable results, and requires no sample preparation, making it of ample importance to the field of forensic science. © The Author(s) 2016.
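
    A common way to build the species-specific PLSDA model mentioned above is to regress one-hot class labels on the spectra and assign each sample to the highest-scoring class. The sketch below (Python, scikit-learn) does this on simulated "spectra"; it is a generic illustration, not the authors' ATR FT-IR model or validation scheme.

      # Generic PLS-DA sketch: PLSRegression on one-hot labels, simulated spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      classes = np.repeat([0, 1, 2], 30)                          # human, cat, dog
      X = rng.normal(0, 1, (90, 200)) + classes[:, None] * 0.5    # class-shifted "spectra"
      Y = np.eye(3)[classes]                                      # one-hot targets

      pls = PLSRegression(n_components=5).fit(X, Y)
      pred = pls.predict(X).argmax(axis=1)                        # highest-scoring class wins
      print("training accuracy:", (pred == classes).mean())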

  18. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    NASA Astrophysics Data System (ADS)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.

  19. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations on cosmological parameters in the transition region is negligible in terms of cosmological parameters for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
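
    In generic notation, the identity underlying this construction (a sequence in which each variable depends conditionally only on its immediate predecessor) can be written as

      p(x_1, \dots, x_n) \;=\; \frac{\prod_{i=1}^{n-1} p(x_i, x_{i+1})}{\prod_{i=2}^{n-1} p(x_i)},

    so the full joint distribution is indeed built from uni- and bivariate marginals alone; the symbols here are generic and not the paper's notation.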

  20. Characterization of Eyeball Loss in Four Cities of Colombia.

    PubMed

    Moreno-Caviedes, F Hernán; Velez Cuellar, Nórida; Caicedo Zapata, Margarita; Triana Reina, Gabriel; Sánchez, Azucena

    2017-09-11

    To describe the socio-demographic characteristics of anophthalmic patients examined at specialized centers of four cities in Colombia and to identify the different causes of eyeball loss. A retrospective cross-sectional study was done of 511 medical records from the specialized practices of four cities in Colombia. Socio-demographic data of patients who were seen between January 2011 and December 2013 were compiled. SOFA Statistics software v1.4.6 was used for this analysis. An analysis through measures of central tendency was developed for numerical variables, and descriptive statistics were used for the categorical variables. Almost 63% of the data belonged to male patients. Eyeball loss was more frequent in patients over 40 years of age. Fifty-one percent of the patients suffered eyeball loss due to traumatic causes, 40.2% due to pathological causes, and 4.6% due to congenital anomalies. The most frequent specific causes were glaucoma (19%), ocular cancer (15.4%), and home accidents (11.2%). Around 60% of the anophthalmic patients belonged to a low socioeconomic level. It is important to highlight that more than half of the analyzed anophthalmia cases originated in some type of trauma; this means that they could be considered potentially avoidable losses. Complications deriving from glaucoma were the most frequent cause of anophthalmia in the pathological origin group, which prompts reflection regarding strategies for early detection of the disease and access to proper treatment. The study also showed the need to develop an efficient system to manage this information.

  1. FINDING THE BALANCE - QUALITY ASSURANCE REQUIREMENTS VS. RESEARCH NEEDS

    EPA Science Inventory

    Investigators often misapply quality assurance (QA) procedures and may consider QA as a hindrance to developing test plans for sampling and analysis. If used properly, however, QA is the driving force for collecting the right kind and proper amount of data. Researchers must use Q...

  2. FINDING THE BALANCE - QUALITY ASSURANCE REQUIREMENTS VS. RESEARCH NEEDS

    EPA Science Inventory

    Investigators often misapply quality assurance (QA) procedures and may consider QA as a hindrance to developing test plans for
    sampling and analysis. If used properly, however, QA is the driving force for collecting the right kind and proper amount of data.
    Researchers must...

  3. Methods for processing microarray data.

    PubMed

    Ares, Manuel

    2014-02-01

    Quality control must be maintained at every step of a microarray experiment, from RNA isolation through statistical evaluation. Here we provide suggestions for analyzing microarray data. Because the utility of the results depends directly on the design of the experiment, the first critical step is to ensure that the experiment can be properly analyzed and interpreted. What is the biological question? What is the best way to perform the experiment? How many replicates will be required to obtain the desired statistical resolution? Next, the samples must be prepared, pass quality controls for integrity and representation, and be hybridized and scanned. Also, slides with defects, missing data, high background, or weak signal must be rejected. Data from individual slides must be normalized and combined so that the data are as free of systematic bias as possible. The third phase is to apply statistical filters and tests to the data to determine genes (1) expressed above background, (2) whose expression level changes in different samples, and (3) whose RNA-processing patterns or protein associations change. Next, a subset of the data should be validated by an alternative method, such as reverse transcription-polymerase chain reaction (RT-PCR). Provided that this endorses the general conclusions of the array analysis, gene sets whose expression, splicing, polyadenylation, protein binding, etc. change in different samples can be classified with respect to function, sequence motif properties, as well as other categories to extract hypotheses for their biological roles and regulatory logic.
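
    The third phase (statistical filtering) often amounts to a per-gene test followed by multiple-testing correction. The sketch below (Python) runs per-gene two-sample t-tests on a simulated expression matrix and applies a Benjamini-Hochberg adjustment; it is a minimal illustration of the filtering idea, not a substitute for dedicated microarray packages.

      # Minimal filtering sketch: per-gene t-tests plus Benjamini-Hochberg correction.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      n_genes, n_rep = 1000, 4
      ctrl = rng.normal(8.0, 1.0, (n_genes, n_rep))
      treat = rng.normal(8.0, 1.0, (n_genes, n_rep))
      treat[:50] += 2.0                                   # 50 genes truly up-regulated

      t, p = stats.ttest_ind(treat, ctrl, axis=1)

      # Benjamini-Hochberg adjusted p-values
      order = np.argsort(p)
      ranked = p[order] * n_genes / (np.arange(n_genes) + 1)
      q = np.minimum.accumulate(ranked[::-1])[::-1]
      q_full = np.empty(n_genes)
      q_full[order] = q
      print("genes called at q < 0.05:", int((q_full < 0.05).sum()))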

  4. Kerr Reservoir LANDSAT experiment analysis for November 1980

    NASA Technical Reports Server (NTRS)

    Lecroy, S. R.

    1982-01-01

    An experiment was conducted on the waters of Kerr Reservoir to determine if reliable algorithms could be developed that relate water quality parameters to remotely sensed data. LANDSAT radiance data was used in the analysis since it is readily available and covers the area of interest on a regular basis. By properly designing the experiment, many of the unwanted variations due to atmospheric, solar, and hydraulic changes were minimized. The algorithms developed were constrained to satisfy rigorous statistical criteria before they could be considered dependable in predicting water quality parameters. A complete mix of different types of algorithms using the LANDSAT bands was generated to provide a thorough understanding of the relationships among the data involved. The study demonstrated that for the ranges measured, the algorithms that satisfactorily represented the data are mostly linear and only require a maximum of one or two LANDSAT bands. Ratioing techniques did not improve the results since the initial design of the experiment minimized the errors that this procedure is effective against. Good correlations were established for inorganic suspended solids, iron, turbidity, and secchi depth.

  5. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of maxillary third molars is introduced in this study, the Joedds technique, which is compared with the conventional technique. One hundred people were included in the study; they were divided into two groups by means of simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients based on the study parameters showed that the novel Joedds technique produced minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min, compared to the other group of patients. This novel technique has proved to be better than the conventional third molar extraction technique, with minimal complications, provided proper case selection and the right technique are used.

  6. Monitoring early phases of orthodontic treatment by means of Raman spectroscopies

    NASA Astrophysics Data System (ADS)

    d'Apuzzo, Fabrizia; Perillo, Letizia; Delfino, Ines; Portaccio, Marianna; Lepore, Maria; Camerlingo, Carlo

    2017-11-01

    Gingival crevicular fluid (GCF) is a site-specific exudate in the gingival sulcus. GCF composition changes in response to diseases or mechanical stimuli, such as those occurring during orthodontic treatments. Raman microspectroscopy (μ-RS) and surface-enhanced Raman spectroscopy (SERS) were adopted for a GCF analysis during different initial phases of orthodontic force application. GCF samples were pooled from informed patients using paper cones. SERS spectra were obtained from GCF extracted from these cones, whereas μ-RS spectra were directly acquired on paper cones without any manipulation. The spectral characteristics of the main functional groups and the changes in cytochrome, amide III, and amide I contributions were highlighted in the different phases of orthodontic treatment with both SERS and μ-RS analysis. μ-RS directly performed on the paper cones together with proper statistical methods can offer an effective approach for the development of a tool for monitoring the processes occurring during orthodontic treatments, which may help the clinician in the choice of type of treatment individually for each patient and accelerate and improve the orthodontic therapy.

  7. Detecting Non-Gaussian and Lognormal Characteristics of Temperature and Water Vapor Mixing Ratio

    NASA Astrophysics Data System (ADS)

    Kliewer, A.; Fletcher, S. J.; Jones, A. S.; Forsythe, J. M.

    2017-12-01

    Many operational data assimilation and retrieval systems assume that the errors and variables come from a Gaussian distribution. This study builds upon previous results showing that positive definite variables, specifically water vapor mixing ratio and temperature, can follow a non-Gaussian distribution, and moreover a lognormal distribution. Previously, statistical testing procedures, which included the Jarque-Bera test, the Shapiro-Wilk test, the Chi-squared goodness-of-fit test, and a composite test which incorporated the results of the former tests, were employed to determine locations and time spans where atmospheric variables assume a non-Gaussian distribution. These tests are now investigated in a "sliding window" fashion in order to extend the testing procedure to near real-time. The analyzed 1-degree resolution data come from the National Oceanic and Atmospheric Administration (NOAA) Global Forecast System (GFS) six-hour forecast from the 0Z analysis. These results indicate the necessity for a Data Assimilation (DA) system to be able to properly use the lognormally-distributed variables in an appropriate Bayesian analysis that does not assume the variables are Gaussian.
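
    The listed tests are all available off the shelf, so the windowed procedure can be sketched directly. The example below (Python) applies the Jarque-Bera and Shapiro-Wilk tests to non-overlapping windows of a synthetic lognormal series standing in for a mixing-ratio field; the window length and data are illustrative assumptions.

      # Sketch: normality tests over windows of a synthetic lognormal series.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      series = rng.lognormal(mean=0.0, sigma=0.6, size=480)   # positive-definite stand-in

      window = 120
      for start in range(0, len(series) - window + 1, window):
          w = series[start:start + window]
          jb_stat, jb_p = stats.jarque_bera(w)
          sw_stat, sw_p = stats.shapiro(w)
          print(f"window {start:3d}-{start + window:3d}: "
                f"Jarque-Bera p = {jb_p:.3g}, Shapiro-Wilk p = {sw_p:.3g}")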

  8. Knowledge Discovery from Vibration Measurements

    PubMed Central

    Li, Jian; Wang, Daoyao

    2014-01-01

    The framework as well as the particular algorithms of the pattern recognition process are widely adopted in structural health monitoring (SHM). However, as a part of the overall process of knowledge discovery from databases (KDD), the results of pattern recognition are only changes and patterns of changes of data features. In this paper, based on the similarity between KDD and SHM and considering the particularity of SHM problems, a four-step framework for SHM is proposed which extends the final goal of SHM from detecting damage to extracting knowledge to facilitate decision making. The purposes and proper methods of each step of this framework are discussed. To demonstrate the proposed SHM framework, a specific SHM method composed of second-order structural parameter identification, statistical control chart analysis, and system reliability analysis is then presented. To examine the performance of this SHM method, real sensor data measured from a lab-size steel bridge model structure are used. The developed four-step framework of SHM has the potential to clarify the process of SHM and to facilitate the further development of SHM techniques. PMID:24574933

  9. Guidelines for collecting and maintaining archives for genetic monitoring

    USGS Publications Warehouse

    Jackson, Jennifer A.; Laikre, Linda; Baker, C. Scott; Kendall, Katherine C.; ,

    2012-01-01

    Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time, and to follow adaptive and non-adaptive genetic responses to changing external conditions. In recent years, genetic monitoring has become a valuable tool in conservation management of biological diversity and ecological analysis, helping to illuminate and define cryptic and poorly understood species and populations. Many of the detected biodiversity declines, changes in distribution and hybridization events have helped to drive changes in policy and management. Because a time series of samples is necessary to detect trends of change in genetic diversity and species composition, archiving is a critical component of genetic monitoring. Here we discuss the collection, development, maintenance, and use of archives for genetic monitoring. We provide an overview of the genetic markers that facilitate effective monitoring, describe how tissue and DNA can be stored, and offer guidelines for proper practice.

  10. Estimating errors in least-squares fitting

    NASA Technical Reports Server (NTRS)

    Richter, P. H.

    1995-01-01

    While least-squares fitting procedures are commonly used in data analysis and are extensively discussed in the literature devoted to this subject, the proper assessment of errors resulting from such fits has received relatively little attention. The present work considers statistical errors in the fitted parameters, as well as in the values of the fitted function itself, resulting from random errors in the data. Expressions are derived for the standard error of the fit, as a function of the independent variable, for the general nonlinear and linear fitting problems. Additionally, closed-form expressions are derived for some examples commonly encountered in the scientific and engineering fields, namely ordinary polynomial and Gaussian fitting functions. These results have direct application to the assessment of the antenna gain and system temperature characteristics, in addition to a broad range of problems in data analysis. The effects of the nature of the data and the choice of fitting function on the ability to accurately model the system under study are discussed, and some general rules are deduced to assist workers intent on maximizing the amount of information obtained from a given set of measurements.
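    For the common case handled by library fitters, the parameter and fitted-function errors discussed above can be obtained from the estimated covariance matrix. A brief Python sketch with scipy.optimize.curve_fit on a synthetic Gaussian-fitting example follows; the first-order (Jacobian-based) propagation used for the standard error of the fit is a generic approximation, not the paper's closed-form expressions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

rng = np.random.default_rng(2)
x = np.linspace(-5, 5, 200)
y = gaussian(x, 2.0, 0.5, 1.2) + rng.normal(scale=0.05, size=x.size)

popt, pcov = curve_fit(gaussian, x, y, p0=[1.0, 0.0, 1.0])
perr = np.sqrt(np.diag(pcov))          # standard errors of the fitted parameters
print("parameters:", popt, "standard errors:", perr)

# Standard error of the fitted function itself, via first-order propagation:
# var[f(x)] ~ J(x) C J(x)^T, with J the Jacobian of f with respect to the parameters.
eps = 1e-6
J = np.empty((x.size, popt.size))
for j in range(popt.size):
    dp = np.zeros_like(popt)
    dp[j] = eps * max(abs(popt[j]), 1.0)
    J[:, j] = (gaussian(x, *(popt + dp)) - gaussian(x, *(popt - dp))) / (2 * dp[j])
fit_se = np.sqrt(np.einsum("ij,jk,ik->i", J, pcov, J))
print("maximum standard error of the fitted curve:", fit_se.max())
```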

  11. The Use of Public Enlightenment Campaign Strategy and School Disciplinary Measures in the Management of Cultism in Tertiary Institutions in Nigeria

    ERIC Educational Resources Information Center

    Omemu, Felix

    2015-01-01

    The study investigated the perceptions of staff and students on the use of school disciplinary measures and public awareness campaign strategy in the management of cultism in tertiary institutions in Nigeria. The study is guided by two hypotheses tested using the t-test statistic. An instrument containing 10 properly validated items was used in…

  12. Validity of the SAT® for Predicting First-Year Grades: 2010 SAT Validity Sample. Statistical Report 2013-2

    ERIC Educational Resources Information Center

    Patterson, Brian F.; Mattern, Krista D.

    2013-01-01

    The continued accumulation of validity evidence for the core uses of educational assessments is critical to ensure that proper inferences will be made for those core purposes. To that end, the College Board has continued to follow previous cohorts of college students and this report provides updated validity evidence for using the SAT to predict…

  13. Probabilistic multi-catalogue positional cross-match

    NASA Astrophysics Data System (ADS)

    Pineau, F.-X.; Derriere, S.; Motch, C.; Carrera, F. J.; Genova, F.; Michel, L.; Mingo, B.; Mints, A.; Nebot Gómez-Morán, A.; Rosen, S. R.; Ruiz Camuñas, A.

    2017-01-01

    Context. Catalogue cross-correlation is essential to building large sets of multi-wavelength data, whether it be to study the properties of populations of astrophysical objects or to build reference catalogues (or time series) from survey observations. Nevertheless, resorting to automated processes with limited sets of information available on large numbers of sources detected at different epochs with various filters and instruments inevitably leads to spurious associations. We need both statistical criteria to select detections to be merged as unique sources, and statistical indicators helping in achieving compromises between completeness and reliability of selected associations. Aims: We lay the foundations of a statistical framework for multi-catalogue cross-correlation and cross-identification based on explicit simplified catalogue models. A proper identification process should rely on both astrometric and photometric data. Under some conditions, the astrometric part and the photometric part can be processed separately and merged a posteriori to provide a single global probability of identification. The present paper addresses almost exclusively the astrometric part and specifies the proper probabilities to be merged with photometric likelihoods. Methods: To select matching candidates in n catalogues, we used the Chi (or, indifferently, the Chi-square) test with 2(n-1) degrees of freedom. We thus call this cross-match a χ-match. In order to use Bayes' formula, we considered exhaustive sets of hypotheses based on combinatorial analysis. The volume of the χ-test domain of acceptance - a 2(n-1)-dimensional acceptance ellipsoid - is used to estimate the expected numbers of spurious associations. We derived priors for those numbers using a frequentist approach relying on simple geometrical considerations. Likelihoods are based on standard Rayleigh, χ and Poisson distributions that we normalized over the χ-test acceptance domain. We validated our theoretical results by generating and cross-matching synthetic catalogues. Results: The results we obtain do not depend on the order used to cross-correlate the catalogues. We applied the formalism described in the present paper to build the multi-wavelength catalogues used for the science cases of the Astronomical Resource Cross-matching for High Energy Studies (ARCHES) project. Our cross-matching engine is publicly available through a multi-purpose web interface. In the longer term, we plan to integrate this tool into the CDS XMatch Service.
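    A minimal illustration of the χ-match selection step is given below: for one candidate association of n detections with circular Gaussian positional errors, the weighted scatter about the inverse-variance mean position is compared against a χ² distribution with 2(n-1) degrees of freedom. The tangent-plane offsets, uncertainties, and acceptance threshold are hypothetical, and the photometric and prior/likelihood machinery of the paper is not reproduced.

```python
import numpy as np
from scipy.stats import chi2

def chi_match(x_arcsec, y_arcsec, sigma_arcsec, alpha=0.01):
    """
    Chi-square test for a candidate association of n detections from n catalogues.
    Positions are small-field tangent-plane offsets (arcsec) with circular Gaussian
    errors sigma_i. Under the null hypothesis that all detections come from a single
    source, the statistic follows chi2 with 2(n-1) degrees of freedom.
    """
    x = np.asarray(x_arcsec, float)
    y = np.asarray(y_arcsec, float)
    w = 1.0 / np.asarray(sigma_arcsec, float) ** 2
    # Inverse-variance weighted mean position (maximum-likelihood common position).
    x0, y0 = np.average(x, weights=w), np.average(y, weights=w)
    stat = np.sum(w * ((x - x0) ** 2 + (y - y0) ** 2))
    dof = 2 * (len(x) - 1)
    p = chi2.sf(stat, dof)
    return stat, dof, p, p > alpha   # keep the candidate if the test does not reject

# Hypothetical 3-catalogue candidate: offsets (arcsec) and per-catalogue positional errors.
stat, dof, p, keep = chi_match([0.00, 0.12, -0.08], [0.00, -0.05, 0.10], [0.10, 0.15, 0.20])
print(f"chi2 = {stat:.2f}, dof = {dof}, p = {p:.3f}, keep = {keep}")
```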

  14. Research on orbit prediction for solar-based calibration proper satellite

    NASA Astrophysics Data System (ADS)

    Chen, Xuan; Qi, Wenwen; Xu, Peng

    2018-03-01

    Utilizing a mathematical model of orbit mechanics, orbit prediction forecasts a space target's orbital state at a given time from its orbit at an initial epoch. The proper satellite radiometric calibration and calibration orbit prediction process are introduced briefly. On the basis of research on the calibration space position design method and the radiative transfer model, an orbit prediction method for proper satellite radiometric calibration is proposed to select the appropriate calibration arc for the remote sensor and to predict the orbit information of the proper satellite and the remote sensor. By analyzing the orbit constraints of proper satellite calibration, the GF-1 Sun-synchronous orbit is chosen as the proper satellite orbit in order to simulate the visible calibration duration for different satellites to be calibrated. The results of the simulation and analysis provide a basis for improving the radiometric calibration accuracy of satellite remote sensors, which lays the foundation for high-precision and high-frequency radiometric calibration.

  15. Validation of surrogate endpoints in advanced solid tumors: systematic review of statistical methods, results, and implications for policy makers.

    PubMed

    Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S

    2014-07-01

    Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.

  16. MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.

    PubMed

    Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk

    2018-05-29

    Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.

  17. Novel statistical tools for management of public databases facilitate community-wide replicability and control of false discovery.

    PubMed

    Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani

    2014-07-01

    Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.

  18. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    PubMed

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.

  19. QNB: differential RNA methylation analysis for count-based small-sample sequencing data with a quad-negative binomial model.

    PubMed

    Liu, Lian; Zhang, Shao-Wu; Huang, Yufei; Meng, Jia

    2017-08-31

    As a newly emerged research area, RNA epigenetics has drawn increasing attention recently for the participation of RNA methylation and other modifications in a number of crucial biological processes. Thanks to high-throughput sequencing techniques such as MeRIP-Seq, transcriptome-wide RNA methylation profiles are now available in the form of count-based data, with which it is often of interest to study the dynamics at the epitranscriptomic layer. However, the sample size of an RNA methylation experiment is usually very small due to its cost; additionally, there usually exist a large number of genes whose methylation level cannot be accurately estimated due to their low expression level, making differential RNA methylation analysis a difficult task. We present QNB, a statistical approach for differential RNA methylation analysis with count-based small-sample sequencing data. Compared with previous approaches such as the DRME model, which is based on a statistical test covering only the IP samples with two negative binomial distributions, QNB is based on four independent negative binomial distributions with their variances and means linked by local regressions, and in this way the input control samples are also properly taken care of. In addition, unlike the DRME approach, which relies on the input control samples alone to estimate the background, QNB uses a more robust estimator for gene expression by combining information from both input and IP samples, which could largely improve the testing performance for very lowly expressed genes. QNB showed improved performance on both simulated and real MeRIP-Seq datasets when compared with competing algorithms. The QNB model is also applicable to other datasets related to RNA modifications, including but not limited to RNA bisulfite sequencing, m1A-Seq, Par-CLIP, RIP-Seq, etc.

  20. Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models

    NASA Astrophysics Data System (ADS)

    Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.

    2013-12-01

    We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints performed on a synthetic reservoir model. Our aim is to invert directly for structure and rock bulk properties of the target reservoir zone. To infer the rock facies, porosity, and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must also be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it can handle non-linear, complex, and multi-step forward models and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of a very high computational demand. To face this challenge one strategy is to feed the algorithm with realistic models, hence relying on proper prior information. To address this problem, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties we are interested in by performing statistical analysis on the collection of solutions.
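    The acceptance step of the sampler described above can be sketched with a toy one-parameter example: candidate models are drawn from a stand-in "geostatistical" prior and accepted or rejected by a Metropolis rule based on the seismic data misfit (with proposals drawn from the prior, the acceptance ratio reduces to a likelihood ratio). The forward model, noise level, and porosity range below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-ins: a "reservoir model" is a single porosity value, the forward model maps
# porosity to a synthetic seismic amplitude, and the observation carries Gaussian noise.
def forward(porosity):
    return 1.0 - 2.5 * porosity            # hypothetical rock-physics + convolution proxy

obs, noise_sd = forward(0.22) + 0.01, 0.02

def log_likelihood(m):
    r = obs - forward(m)
    return -0.5 * (r / noise_sd) ** 2

def sample_prior():
    # Stand-in for the geostatistical prior sampler (training-image based in the paper):
    # here simply a plausible porosity range.
    return rng.uniform(0.05, 0.35)

# Metropolis sampler with independent proposals drawn from the prior, so the acceptance
# ratio reduces to the likelihood ratio.
current, chain = sample_prior(), []
for _ in range(20000):
    proposal = sample_prior()
    if np.log(rng.uniform()) < log_likelihood(proposal) - log_likelihood(current):
        current = proposal
    chain.append(current)

post = np.array(chain[5000:])              # discard burn-in
print(f"posterior porosity: {post.mean():.3f} +/- {post.std():.3f}")
```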

  1. Monitoring of small laboratory animal experiments by a designated web-based database.

    PubMed

    Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A

    2015-10-01

    Multiple-parametric small animal experiments require, by their very nature, a sufficient number of animals which may need to be large to obtain statistically significant results.(1) For this reason database-related systems are required to collect the experimental data as well as to support the later (re-) analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR-codes plays an important role for stress-free use of the database. We demonstrate a way to easily identify all animals and samples and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage. © The Author(s) 2015.

  2. Sizing Up the Milky Way: A Bayesian Mixture Model Meta-analysis of Photometric Scale Length Measurements

    NASA Astrophysics Data System (ADS)

    Licquia, Timothy C.; Newman, Jeffrey A.

    2016-11-01

    The exponential scale length (L_d) of the Milky Way’s (MW’s) disk is a critical parameter for describing the global physical size of our Galaxy, important both for interpreting other Galactic measurements and helping us to understand how our Galaxy fits into extragalactic contexts. Unfortunately, current estimates span a wide range of values and are often statistically incompatible with one another. Here, we perform a Bayesian meta-analysis to determine an improved, aggregate estimate for L_d, utilizing a mixture-model approach to account for the possibility that any one measurement has not properly accounted for all statistical or systematic errors. Within this machinery, we explore a variety of ways of modeling the nature of problematic measurements, and then employ a Bayesian model averaging technique to derive net posterior distributions that incorporate any model-selection uncertainty. Our meta-analysis combines 29 different (15 visible and 14 infrared) photometric measurements of L_d available in the literature; these involve a broad assortment of observational data sets, MW models and assumptions, and methodologies, all tabulated herein. Analyzing the visible and infrared measurements separately yields estimates for L_d of 2.71 (+0.22/-0.20) kpc and 2.51 (+0.15/-0.13) kpc, respectively, whereas considering them all combined yields 2.64 ± 0.13 kpc. The ratio between the visible and infrared scale lengths determined here is very similar to that measured in external spiral galaxies. We use these results to update the model of the Galactic disk from our previous work, constraining its stellar mass to be 4.8 (+1.5/-1.1) × 10^10 M_⊙, and the MW's total stellar mass to be 5.7 (+1.5/-1.1) × 10^10 M_⊙.

  3. Immortal time bias in observational studies of drug effects in pregnancy.

    PubMed

    Matok, Ilan; Azoulay, Laurent; Yin, Hui; Suissa, Samy

    2014-09-01

    The use of decongestants during the second or third trimesters of pregnancy has been associated with a decreased risk of preterm delivery in two observational studies. This effect may have been subject to immortal time bias, a bias arising from the improper classification of exposure during follow-up. We illustrate this bias by repeating the studies using a different data source. The United Kingdom Hospital Episodes Statistics and the Clinical Practice Research Datalink databases were linked to identify all live singleton pregnancies among women aged 15 to 45 years between 1997 and 2012. Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals of preterm delivery (before 37 weeks of gestation) by considering the use of decongestants during the third trimester as a time-fixed (biased analysis which misclassifies unexposed person-time as exposed person-time) and time-varying exposure (unbiased analysis with proper classification of unexposed person-time). All models were adjusted for maternal age, smoking status, maternal diabetes, maternal hypertension, preeclampsia, and parity. Of the 195,582 singleton deliveries, 10,248 (5.2%) were born preterm. In the time-fixed analysis, the HR of preterm delivery for the use of decongestants was below the null and suggestive of a 46% decreased risk (adjusted HR = 0.54; 95% confidence interval, 0.24-1.20). In contrast, the HR was closer to null (adjusted HR = 0.93; 95% confidence interval, 0.42-2.06) when the use of decongestants was treated as a time-varying variable. Studies of drug safety in pregnancy should use the appropriate statistical techniques to avoid immortal time bias, particularly when the exposure occurs at later stages of pregnancy. © 2014 Wiley Periodicals, Inc.

  4. The Love of Large Numbers: A Popularity Bias in Consumer Choice.

    PubMed

    Powell, Derek; Yu, Jingqi; DeWolf, Melissa; Holyoak, Keith J

    2017-10-01

    Social learning, the ability to learn from observing the decisions of other people and the outcomes of those decisions, is fundamental to human evolutionary and cultural success. The Internet now provides social evidence on an unprecedented scale. However, properly utilizing this evidence requires a capacity for statistical inference. We examined how people's interpretation of online review scores is influenced by the number of reviews, a potential indicator both of an item's popularity and of the precision of the average review score. Our task was designed to pit statistical information against social information. We modeled the behavior of an "intuitive statistician" using empirical prior information from millions of reviews posted on Amazon.com and then compared the model's predictions with the behavior of experimental participants. Under certain conditions, people preferred a product with more reviews to one with fewer reviews even though the statistical model indicated that the latter was likely to be of higher quality than the former. Overall, participants' judgments suggested that they failed to make meaningful statistical inferences.

  5. A critical look at prospective surveillance using a scan statistic.

    PubMed

    Correa, Thais R; Assunção, Renato M; Costa, Marcelo A

    2015-03-30

    The scan statistic is a very popular surveillance technique for purely spatial, purely temporal, and spatial-temporal disease data. It was extended to the prospective surveillance case, and it has been applied quite extensively in this situation. When the usual signal rules, such as those implemented in SaTScan(TM) (Boston, MA, USA) software, are used, we show that the scan statistic method is not appropriate for the prospective case. The reason is that it does not adjust properly for the sequential and repeated tests carried out during the surveillance. We demonstrate that the nominal significance level α is not meaningful and there is no relationship between α and the recurrence interval or the average run length (ARL). In some cases, the ARL may be equal to ∞, which makes the method ineffective. This lack of control of the type-I error probability and of the ARL leads us to strongly oppose the use of the scan statistic with the usual signal rules in the prospective context. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
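    The entropy half of such a curve can be sketched compactly: estimate the Bandt-Pompe ordinal-pattern distribution of a series and evaluate the normalized Rényi entropy over a range of orders q. The embedding dimension, test series, and q-grid below are illustrative, and the statistical-complexity part (which requires the generalized divergence used in the paper) is omitted for brevity.

```python
import numpy as np
from itertools import permutations
from math import factorial

def ordinal_distribution(x, d=4):
    """Bandt-Pompe ordinal-pattern probabilities for embedding dimension d (unit delay)."""
    x = np.asarray(x, float)
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(counts.values()), float)
    return p / p.sum()

def renyi_entropy(p, q, n_states):
    """Normalized Rényi entropy of order q (q != 1), in [0, 1]."""
    p = p[p > 0]
    return np.log(np.sum(p ** q)) / ((1.0 - q) * np.log(n_states))

rng = np.random.default_rng(4)
d = 4

# Two test series: Gaussian white noise (stochastic) and the fully chaotic logistic map.
noise = rng.normal(size=5000)
logistic = np.empty(5000)
logistic[0] = 0.4
for i in range(1, logistic.size):
    logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])

qs = np.linspace(0.2, 5.0, 25)
for name, series in [("white noise", noise), ("logistic map", logistic)]:
    p = ordinal_distribution(series, d)
    H = [renyi_entropy(p, q, factorial(d)) for q in qs]
    print(f"{name}: normalized Renyi entropy spans [{min(H):.3f}, {max(H):.3f}] over q")
```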

  7. Analysis of delay reducing and fuel saving sequencing and spacing algorithms for arrival traffic

    NASA Technical Reports Server (NTRS)

    Neuman, Frank; Erzberger, Heinz

    1991-01-01

    The air traffic control subsystem that performs sequencing and spacing is discussed. The function of the sequencing and spacing algorithms is to automatically plan the most efficient landing order and to assign optimally spaced landing times to all arrivals. Several algorithms are described and their statistical performance is examined. Sequencing brings order to an arrival sequence for aircraft. First-come-first-served sequencing (FCFS) establishes a fair order, based on estimated times of arrival, and determines proper separations. Because of the randomness of the arriving traffic, gaps will remain in the sequence of aircraft. Delays are reduced by time-advancing the leading aircraft of each group while still preserving the FCFS order. Tightly spaced groups of aircraft remain with a mix of heavy and large aircraft. Spacing requirements differ for different types of aircraft trailing each other. Traffic is reordered slightly to take advantage of this spacing criterion, thus shortening the groups and reducing average delays. For heavy traffic, delays for different traffic samples vary widely, even when the same set of statistical parameters is used to produce each sample. This report supersedes NASA TM-102795 on the same subject. It includes a new method of time-advance as well as an efficient method of sequencing and spacing for two dependent runways.
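    A toy version of the FCFS scheduling logic is sketched below: aircraft are ordered by estimated time of arrival, landing slots are pushed back to respect a leader/trailer separation matrix, and the leading aircraft of the sequence may be time-advanced by a bounded amount. The ETAs, weight classes, separation values, and advance limit are illustrative assumptions, not figures from the report.

```python
# Minimal sketch of first-come-first-served (FCFS) sequencing with type-dependent spacing.
SEPARATION = {  # required time separation in seconds: (leader class, trailer class)
    ("heavy", "heavy"): 90, ("heavy", "large"): 120,
    ("large", "heavy"): 60, ("large", "large"): 70,
}

def fcfs_schedule(arrivals, max_advance=30):
    """arrivals: list of (callsign, eta_seconds, weight_class). Returns (callsign, slot_time)."""
    ordered = sorted(arrivals, key=lambda a: a[1])          # FCFS order by ETA
    schedule = []
    for cs, eta, wclass in ordered:
        if not schedule:
            # Time-advance the leading aircraft by up to max_advance seconds (simplified).
            slot = max(eta - max_advance, 0)
        else:
            prev_cs, prev_slot = schedule[-1]
            prev_class = next(c for n, _, c in ordered if n == prev_cs)
            slot = max(eta, prev_slot + SEPARATION[(prev_class, wclass)])
        schedule.append((cs, slot))
    return schedule

arrivals = [("AC1", 0, "heavy"), ("AC2", 40, "large"), ("AC3", 55, "large"), ("AC4", 300, "heavy")]
for cs, t in fcfs_schedule(arrivals):
    print(f"{cs}: lands at t = {t:>4.0f} s")
```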

  8. Multi-scale radiomic analysis of sub-cortical regions in MRI related to autism, gender and age

    NASA Astrophysics Data System (ADS)

    Chaddad, Ahmad; Desrosiers, Christian; Toews, Matthew

    2017-03-01

    We propose using multi-scale image textures to investigate links between neuroanatomical regions and clinical variables in MRI. Texture features are derived at multiple scales of resolution based on the Laplacian-of-Gaussian (LoG) filter. Three quantifier functions (Average, Standard Deviation and Entropy) are used to summarize texture statistics within standard, automatically segmented neuroanatomical regions. Significance tests are performed to identify regional texture differences between ASD vs. TDC and male vs. female groups, as well as correlations with age (corrected p < 0.05). The open-access brain imaging data exchange (ABIDE) brain MRI dataset is used to evaluate texture features derived from 31 brain regions from 1112 subjects including 573 typically developing control (TDC, 99 females, 474 males) and 539 Autism spectrum disorder (ASD, 65 female and 474 male) subjects. Statistically significant texture differences between ASD vs. TDC groups are identified asymmetrically in the right hippocampus, left choroid-plexus and corpus callosum (CC), and symmetrically in the cerebellar white matter. Sex-related texture differences in TDC subjects are found primarily in the left amygdala, left cerebellar white matter, and brain stem. Correlations between age and texture in TDC subjects are found in the thalamus-proper, caudate and pallidum, most exhibiting bilateral symmetry.
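    The feature-extraction step can be sketched with SciPy: filter the image with a Laplacian-of-Gaussian at several scales and summarize the response inside a region mask with the three quantifiers (average, standard deviation, entropy). The synthetic volume, mask, scales, and histogram binning below are assumptions of the sketch, not the study's processing parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_texture_features(image, mask, sigmas=(1.0, 2.0, 4.0), n_bins=32):
    """Average, standard deviation, and entropy of LoG-filtered intensities inside a
    region mask, computed at several filter scales (sigma, in voxels)."""
    feats = {}
    for s in sigmas:
        response = gaussian_laplace(image.astype(float), sigma=s)
        vals = response[mask]
        hist, _ = np.histogram(vals, bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]
        feats[s] = {
            "average": vals.mean(),
            "std": vals.std(ddof=1),
            "entropy": -np.sum(p * np.log2(p)),
        }
    return feats

rng = np.random.default_rng(5)
image = rng.normal(size=(64, 64, 64))            # stand-in for a T1-weighted MRI volume
mask = np.zeros(image.shape, dtype=bool)
mask[20:40, 20:40, 20:40] = True                 # stand-in for a segmented sub-cortical region

for sigma, f in log_texture_features(image, mask).items():
    print(f"sigma={sigma}: avg={f['average']:.3f}, std={f['std']:.3f}, entropy={f['entropy']:.2f}")
```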

  9. Dynamical Classifications of the Kuiper Belt

    NASA Astrophysics Data System (ADS)

    Maggard, Steven; Ragozzine, Darin

    2018-04-01

    The Minor Planet Center (MPC) contains a plethora of observational data on thousands of Kuiper Belt Objects (KBOs). Understanding their orbital properties refines our understanding of the formation of the solar system. My analysis pipeline, BUNSHIN, uses Bayesian methods to take the MPC observations and generate 30 statistically weighted orbital clones for each KBO that are propagated backwards along their orbits until the beginning of the solar system. These orbital integrations are saved as REBOUND SimulationArchive files (Rein & Tamayo 2017) which we will make publicly available, allowing many others to perform statistically-robust dynamical classification or complex dynamical investigations of outer solar system small bodies.This database has been used to expand the known collisional family members of the dwarf planet Haumea. Detailed orbital integrations are required to determine the dynamical distances between family members, in the form of "Delta v" as measured from conserved proper orbital elements (Ragozzine & Brown 2007). Our preliminary results have already ~tripled the number of known Haumea family members, allowing us to show that the Haumea family can be identified purely through dynamical clustering.We will discuss the methods associated with BUNSHIN and the database it generates, the refinement of the updated Haumea family, a brief search for other possible clusterings in the outer solar system, and the potential of our research to aid other dynamicists.

  10. Perceptions of Nigerian Women about Human Papilloma Virus, Cervical Cancer, and HPV Vaccine

    PubMed Central

    Akanbi, Olusola Anuoluwapo; Iyanda, Abiodun; Osundare, Folakemi; Opaleye, Oluyinka Oladele

    2015-01-01

    Background. Cervical cancer, caused by human papilloma virus (HPV), though preventable, has claimed the lives of many women worldwide. This study was embarked upon to evaluate the general knowledge and perceptions of Nigerian women on HPV, cervical cancer, and the HPV vaccine. Methods. Structured questionnaires were administered to a cross section of 737 women randomly selected from the general population in two southwestern states of Nigeria. Statistical analysis was done using SPSS computer software version 16. A P value <0.05 was considered statistically significant. Results. One hundred and seventy-six (23.9%) of the respondents had knowledge of HPV; 474 (64.3%) are aware of cervical cancer but only 136 (18.5%) know that HPV causes cervical cancer. 200 (27.1%) are aware that there is an HPV vaccine while 300 (40.7%) had knowledge of the Pap smear test. Two hundred and sixty (35.3%) of the respondents know that early detection of HPV can prevent cervical cancer and, in spite of this, only 110 (14.9%) have taken the Pap smear test before while 151 (20.5%) are not willing to go for the test at all. Conclusions. There is therefore the need to create proper awareness of HPV and its possible consequence of cervical carcinoma. PMID:26550522

  11. Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall

    NASA Astrophysics Data System (ADS)

    Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall is one of the climatic elements with high diversity and has many negative impacts, especially extreme rainfall. Therefore, several methods are required to minimize the damage that may occur. So far, global circulation models (GCMs) are the best method to forecast global climate changes, including extreme rainfall. Statistical downscaling (SD) is a technique to develop the relationship between GCM output as global-scale independent variables and rainfall as a local-scale response variable. Using GCM output directly is difficult when assessed against observations because it has high dimension and multicollinearity between the variables. Common methods used to handle this problem are principal component analysis (PCA) and partial least squares regression. A newer method that can be used is the lasso. The lasso has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can be used to detect extreme rainfall at the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results showed that the estimation of extreme rainfall (wet extremes in January, February, and December) in Indramayu could be predicted properly by the model at the 90th quantile.
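    The core modeling step, quantile regression with an L1 (lasso) penalty, can be sketched with scikit-learn's QuantileRegressor. The synthetic predictors, penalty strength, and 90th-percentile target below are illustrative stand-ins for the GCM outputs and Indramayu rainfall used in the study.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(6)

# Synthetic stand-in for the downscaling setup: many correlated GCM grid-point predictors,
# only a few of which actually drive local rainfall; heavy-tailed noise mimics extremes.
n, p = 300, 40
X = rng.normal(size=(n, p)) + 0.5 * rng.normal(size=(n, 1))   # correlated predictors
beta = np.zeros(p)
beta[:3] = [4.0, -2.0, 3.0]                                    # only 3 relevant predictors
rainfall = 50 + X @ beta + rng.gumbel(loc=0.0, scale=8.0, size=n)

# L1-penalized quantile regression at the 90th percentile (wet extremes).
model = QuantileRegressor(quantile=0.90, alpha=0.05, fit_intercept=True)
model.fit(X, rainfall)

selected = np.flatnonzero(np.abs(model.coef_) > 1e-6)
print("predictors kept by the lasso penalty:", selected.tolist())
print("predicted 90th-percentile rainfall for the first case:", round(model.predict(X[:1])[0], 1))
```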

  12. Gas analysis system for the Eight Foot High Temperature Tunnel

    NASA Technical Reports Server (NTRS)

    Leighty, Bradley D.; Davis, Patricia P.; Upchurch, Billy T.; Puster, Richard L.

    1992-01-01

    This paper describes the development of a gas collection and analysis system that is to be installed in the Eight-Foot High Temperature Tunnel (8' HTT) at NASA's Langley Research Center. This system will be used to analyze the test gas medium that results after burning a methane-air mixture to achieve the proper tunnel test parameters. The system consists of a sampling rake, a gas sample storage array, and a gas chromatographic system. Gas samples will be analyzed after each run to assure that proper combustion takes place in the tunnel resulting in a correctly balanced composition of the test gas medium. The proper ratio of gas species is critically necessary in order for the proper operation and testing of scramjet engines in the tunnel. After a variety of methane-air burn conditions have been analyzed, additional oxygen will be introduced into the combusted gas and the enriched test gas medium analyzed. The pre/post enrichment sets of data will be compared to verify that the gas species of the test gas medium is correctly balanced for testing of air-breathing engines.

  13. Measures of Residential Energy Consumption and their Relationships to DOE Policy,

    DTIC Science & Technology

    1999-11-01

    on consumer behavior is inconclusive. Du Pont and Lord report that "a large percentage of consumers either ignore or misinterpret the labels (du...residential per capita energy consumption. 2. Implications Consumer behavior with respect to energy efficiency remains poorly understood and the proper...question by studies of consumer behavior . The results are also subject to numerical instability. Descriptive statistics can be helpful in interpreting the

  14. Prediction of the dollar to the ruble rate. A system-theoretic approach

    NASA Astrophysics Data System (ADS)

    Borodachev, Sergey M.

    2017-07-01

    We propose a simple state-space model of dollar rate formation based on changes in oil prices and some mechanisms of money transfer between the monetary and stock markets. A comparison of predictions from an input-output model and from the state-space model is made. We conclude that, with proper use of the statistical data (via a Kalman filter), the second approach provides more adequate predictions of the dollar rate.
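    A minimal Kalman-filter sketch of the state-space idea is given below: a latent "fair" rate follows a random walk driven by an exogenous oil-price input, and noisy quotes are filtered to recover it. The model structure, coefficients, and noise variances are illustrative assumptions, not the author's specification.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy state-space model (illustrative only):
#   state:        x_t = x_{t-1} + b * u_t + w_t,   w_t ~ N(0, Q)   (latent "fair" rate)
#   observation:  y_t = x_t + v_t,                 v_t ~ N(0, R)   (quoted exchange rate)
T, b, Q, R = 200, -0.3, 0.05, 0.5
u = rng.normal(size=T)                       # stand-in for oil-price changes (exogenous input)
x_true = np.cumsum(b * u + rng.normal(scale=np.sqrt(Q), size=T)) + 60.0
y = x_true + rng.normal(scale=np.sqrt(R), size=T)

# Kalman filter recursion.
x_hat, P = y[0], 1.0
estimates = []
for t in range(1, T):
    # Predict
    x_pred = x_hat + b * u[t]
    P_pred = P + Q
    # Update
    K = P_pred / (P_pred + R)                # Kalman gain
    x_hat = x_pred + K * (y[t] - x_pred)
    P = (1.0 - K) * P_pred
    estimates.append(x_hat)

rmse_filter = np.sqrt(np.mean((np.array(estimates) - x_true[1:]) ** 2))
rmse_naive = np.sqrt(np.mean((y[1:] - x_true[1:]) ** 2))
print(f"RMSE using raw quotes: {rmse_naive:.3f}, RMSE after Kalman filtering: {rmse_filter:.3f}")
```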

  15. Comparison of V50 Shot Placement on Final Outcome

    DTIC Science & Technology

    2014-11-01

    molecular-weight polyethylene (UHMWPE). In V50 testing of those types of materials, large delaminations may occur that influence the results. This...placement, a proper evaluation of materials may not be possible. 15. SUBJECT TERMS ballistics, V50 test, logistic regression, statistical inference...from an impact. While this may work with ceramics or metal armor, it is inappropriate for use on composite armors like ultra-high-molecular-weight

  16. Development of Strategic Air Command, 1946 - 1976

    DTIC Science & Technology

    1976-03-01

    flashes red. Azure, two clouds proper, one issuing from sinister chief and one issuing from dexter base, a cubit arm in armor in bend, issuing...security reasons, no statistics have been included for those types of reconnaissance aircraft currently assigned. The text was prepared by Mr. J. C...Hopkins who was aided by Mr. Sheldon A. Goldberg, Command Archivist, who critically reviewed the text and selected the photographs. Special

  17. An integrated one-step system to extract, analyze and annotate all relevant information from image-based cell screening of chemical libraries.

    PubMed

    Rabal, Obdulia; Link, Wolfgang; Serelde, Beatriz G; Bischoff, James R; Oyarzabal, Julen

    2010-04-01

    Here we report the development and validation of a complete solution to manage and analyze the data produced by image-based phenotypic screening campaigns of small-molecule libraries. In one step, initial crude images are analyzed for multiple cytological features, statistical analysis is performed, and molecules that produce the desired phenotypic profile are identified. A naïve Bayes classifier, integrating chemical and phenotypic spaces, is built and utilized during the process to assess those images initially classified as "fuzzy" (an automated iterative feedback tuning). Simultaneously, all this information is directly annotated in a relational database containing the chemical data. This novel fully automated method was validated by conducting a re-analysis of results from a high-content screening campaign involving 33 992 molecules used to identify inhibitors of the PI3K/Akt signaling pathway. Ninety-two percent of the confirmed hits identified by the conventional multistep analysis method were also identified using this integrated one-step system, along with 40 new hits (14.9% of the total) that were originally false negatives. Ninety-six percent of true negatives were properly recognized as well. Web-based access to the database, with customizable data retrieval and visualization tools, facilitates the posterior analysis of annotated cytological features, which allows identification of additional phenotypic profiles; thus, further analysis of the original crude images is not required.
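    The classifier component can be sketched with scikit-learn: a naive Bayes model trained on concatenated cytological features and chemical fingerprint bits, whose posterior probabilities could be used to re-score wells initially flagged as "fuzzy". All data, feature counts, and the hit definition below are synthetic stand-ins, not the screening campaign's data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)

# Synthetic stand-ins: 200 wells with 8 cytological features and a 32-bit chemical fingerprint.
n = 200
cytology = rng.normal(size=(n, 8))
fingerprint = rng.integers(0, 2, size=(n, 32)).astype(float)
X = np.hstack([cytology, fingerprint])          # joint chemical + phenotypic space
# Hypothetical labels: hit if the first cytological feature (e.g., a translocation score) is high.
y = (cytology[:, 0] + 0.3 * rng.normal(size=n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = GaussianNB().fit(X_tr, y_tr)

# "Fuzzy" (low-confidence) wells can be re-scored with the posterior probability of being a hit.
posterior = clf.predict_proba(X_te)[:, 1]
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
print("posterior hit probability for the first five held-out wells:", np.round(posterior[:5], 2))
```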

  18. Analyzing self-controlled case series data when case confirmation rates are estimated from an internal validation sample.

    PubMed

    Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M

    2018-05-16

    Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding effects and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, approaches such as observed cases, confirmed cases only, and known confirmation rate may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach considers the uncertainty of confirmation rates estimated from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
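    A simplified sketch of the multiple-imputation idea follows: the confirmation status of each unadjudicated event is imputed from the validation-sample rate, a crude relative incidence (risk window versus baseline person-time) is computed per imputation, and the results are combined with Rubin's rules. This is an illustrative approximation, not the SCCS model evaluated in the paper, and all counts are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)

# Simplified SCCS-style setup (illustrative assumptions): each presumptive adverse event
# falls either in the post-vaccination risk window or in baseline time; only a validation
# subsample was chart-reviewed, giving an estimated confirmation rate.
events_risk, events_base = 60, 140          # presumptive (EHR-identified) event counts
pt_risk, pt_base = 1.0, 6.0                 # person-time (arbitrary units) in each window
confirm_rate = 0.70                         # estimated from an internal validation sample
m = 50                                      # number of imputations

log_rr, var_within = [], []
for _ in range(m):
    # Impute true case status for every unadjudicated event.
    a = rng.binomial(events_risk, confirm_rate)
    b = rng.binomial(events_base, confirm_rate)
    if a == 0 or b == 0:
        continue
    log_rr.append(np.log((a / pt_risk) / (b / pt_base)))   # log relative incidence
    var_within.append(1.0 / a + 1.0 / b)                   # Poisson-based variance of the log ratio

# Rubin's rules for combining the imputed analyses.
log_rr, var_within = np.array(log_rr), np.array(var_within)
q_bar = log_rr.mean()
total_var = var_within.mean() + (1 + 1 / len(log_rr)) * log_rr.var(ddof=1)
ci = np.exp(q_bar + np.array([-1.96, 1.96]) * np.sqrt(total_var))
print(f"relative incidence: {np.exp(q_bar):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```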

  19. Get Ready for Gaia: Cool White Dwarfs in Common Proper Motion with Tycho Stars

    NASA Astrophysics Data System (ADS)

    Hambly, N.; Rowell, N.; Lam, M.

    2017-03-01

    We discuss the Gaia Data Release 1 (September 2016) and preliminary work on maximising the benefit for cool white dwarf (WD) science in advance of the full parallax catalogue which will appear around one year later in DR2. The Tycho catalogue is used in conjunction with the all-sky ground based astrometric/ photometric SuperCOSMOS Sky Survey in order to identify candidate faint common proper motion objects to the Tycho stars. Gaia DR1 is supplemented by the Tycho-Gaia Astrometric Solution catalogue containing some 2 million parallaxes with Hipparcos-like precision for Tycho stars. While hotter, brighter WDs are present in Tycho, cooler examples are much rarer (if present at all) and CPM offers one method to infer precision distances for a statistically useful sample of these very faint WDs.

  20. Consideration of vertical uncertainty in elevation-based sea-level rise assessments: Mobile Bay, Alabama case study

    USGS Publications Warehouse

    Gesch, Dean B.

    2013-01-01

    The accuracy with which coastal topography has been mapped directly affects the reliability and usefulness of elevation-based sea-level rise vulnerability assessments. Recent research has shown that the qualities of the elevation data must be well understood to properly model potential impacts. The cumulative vertical uncertainty has contributions from elevation data error, water level data uncertainties, and vertical datum and transformation uncertainties. The concepts of minimum sea-level rise increment and minimum planning timeline, important parameters for an elevation-based sea-level rise assessment, are used in recognition of the inherent vertical uncertainty of the underlying data. These concepts were applied to conduct a sea-level rise vulnerability assessment of the Mobile Bay, Alabama, region based on high-quality lidar-derived elevation data. The results that detail the area and associated resources (land cover, population, and infrastructure) vulnerable to a 1.18-m sea-level rise by the year 2100 are reported as a range of values (at the 95% confidence level) to account for the vertical uncertainty in the base data. Examination of the tabulated statistics about land cover, population, and infrastructure in the minimum and maximum vulnerable areas shows that these resources are not uniformly distributed throughout the overall vulnerable zone. The methods demonstrated in the Mobile Bay analysis provide an example of how to consider and properly account for vertical uncertainty in elevation-based sea-level rise vulnerability assessments, and the advantages of doing so.
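    The uncertainty bookkeeping behind such assessments can be sketched generically: independent vertical error components combine by root-sum-of-squares, and the value at the 95% confidence level bounds the smallest sea-level rise increment the data can resolve. The component values below are hypothetical, not those of the Mobile Bay analysis.

```python
import math

# Hypothetical vertical error components (RMSE, in meters) for an elevation-based assessment.
components = {
    "lidar_dem": 0.10,          # elevation data error
    "water_level": 0.05,        # tide gauge / water surface uncertainty
    "datum_transform": 0.07,    # vertical datum and transformation uncertainty
}

# Independent components combine by root-sum-of-squares.
cumulative_rmse = math.sqrt(sum(v ** 2 for v in components.values()))
# Linear error at the 95% confidence level for normally distributed errors.
lv95 = 1.96 * cumulative_rmse

print(f"cumulative RMSE: {cumulative_rmse:.3f} m")
print(f"vertical uncertainty at 95% confidence: {lv95:.3f} m")
print(f"a sea-level rise increment smaller than ~{lv95:.2f} m cannot be resolved by these data")
```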

  1. A concept analysis of nurses' grief.

    PubMed

    Wisekal, Ashley E

    2015-10-01

    The psychological and personal well-being of nurses can change the way they care for patients. If nurses' grief is not properly managed, the nursing shortage will continue to grow. Consequently, a need exists for the identification of nurses' grief and effective interventions to manage grief to ensure the successful development and growth of the nursing profession. This concept analysis sought to properly define nurses' grief and the role it plays in the day-to-day requirements of nurses. A review of the literature was conducted using CINAHL®, BioMed, EBSCOhost, and MEDLINE® and the following key words. Nurses' grief must be incorporated into the nursing curriculum and addressed by employers. In particular, facility leaders should help promote a healthy work environment and address the need for proper grief management. Educators, managers, and nurses can benefit from acknowledging the current gap in managing nurses' grief.

  2. On the correct implementation of Fermi-Dirac statistics and electron trapping in nonlinear electrostatic plane wave propagation in collisionless plasmas

    NASA Astrophysics Data System (ADS)

    Schamel, Hans; Eliasson, Bengt

    2016-05-01

    Quantum statistics and electron trapping have a decisive influence on the propagation characteristics of coherent stationary electrostatic waves. The description of these strictly nonlinear structures, which are of electron hole type and violate linear Vlasov theory due to the particle trapping at any excitation amplitude, is obtained by a correct reduction of the three-dimensional Fermi-Dirac distribution function to one dimension and by a proper incorporation of trapping. For small but finite amplitudes, the holes become of cnoidal wave type and the electron density is shown to be described by a ϕ(x)^(1/2) rather than a ϕ(x) expansion, where ϕ(x) is the electrostatic potential. The general coefficients are presented for a degenerate plasma as well as the quantum statistical analogue to these steady state coherent structures, including the shape of ϕ(x) and the nonlinear dispersion relation, which describes their phase velocity.

  3. The application of the statistical classifying models for signal evaluation of the gas sensors analyzing mold contamination of the building materials

    NASA Astrophysics Data System (ADS)

    Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk

    2017-07-01

    Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods, which are time-consuming and require a lot of manual labor. Fungi emit volatile organic compounds (VOCs) that can be detected in indoor air using several detection techniques, e.g., chromatography. VOCs can also be detected using gas sensor arrays. Each sensor in the array generates a particular voltage signal that must be analyzed using properly selected statistical methods of interpretation. This work focuses on applying statistical classification models to evaluate signals from a gas sensor array analyzing air sampled from the headspace of various types of building materials at different levels of contamination, as well as from clean reference materials.
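    The classification step can be sketched with a standard scikit-learn pipeline: standardize the array voltages and train a simple classifier, scored by cross-validation. The synthetic sensor responses, class structure, and choice of linear discriminant analysis are assumptions of this sketch, not the statistical models evaluated in the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)

# Synthetic stand-in for an 8-sensor array: each class (clean / low / high contamination)
# shifts the steady-state voltages of the VOC-sensitive sensors.
n_per_class, n_sensors = 40, 8
offsets = {"clean": 0.0, "low": 0.15, "high": 0.35}
X_parts, y = [], []
for label, shift in offsets.items():
    base = rng.normal(loc=1.0, scale=0.05, size=(n_per_class, n_sensors))
    base[:, :4] += shift + rng.normal(scale=0.03, size=(n_per_class, 4))
    X_parts.append(base)
    y += [label] * n_per_class
X, y = np.vstack(X_parts), np.array(y)

model = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```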

  4. Investigation on improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering

    NASA Astrophysics Data System (ADS)

    Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan

    2014-11-01

    Due to the low contrast, high noise, and unclear visual appearance of infrared images, targets are very difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Because the human eye is very sensitive to edges and lines, the authors propose extracting details and textures using gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window. With the minimum value as the cut-off point, histogram statistical stretching is carried out. After proper weights are given to the details and the background, the detail-enhanced result is finally obtained. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively as well.
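    A simplified sketch of the enhancement pipeline is shown below: a detail layer is separated from the background (a Gaussian high-pass stands in for the gradient-filtering step), the background histogram is smoothed over a fixed window and stretched from a minimum cut-off, and the two layers are recombined with weights. Parameter values and the synthetic frame are illustrative, not the AHSS-GF implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter1d

def enhance_ir(image, sigma=3.0, hist_window=9, detail_weight=2.0):
    """Simplified detail-enhancement sketch: split the image into base and detail layers,
    stretch the base layer using a window-smoothed histogram with a minimum cut-off,
    and recombine with a boosted detail layer. Parameters are illustrative."""
    img = image.astype(float)

    # Detail/texture layer (stand-in for the gradient-filtering step): high-pass residual.
    base = gaussian_filter(img, sigma)
    detail = img - base

    # Adaptive histogram statistical stretching of the base layer.
    hist, edges = np.histogram(base, bins=256)
    smoothed = uniform_filter1d(hist.astype(float), size=hist_window)   # fixed-window smoothing
    occupied = np.flatnonzero(smoothed > smoothed[smoothed > 0].min())  # minimum value as cut-off
    lo, hi = edges[occupied[0]], edges[occupied[-1] + 1]
    stretched = np.clip((base - lo) / (hi - lo), 0.0, 1.0)

    # Weighted recombination of background and details.
    out = stretched + detail_weight * detail / (np.abs(detail).max() + 1e-9)
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(11)
# Synthetic low-contrast IR frame with a faint small target and sensor noise.
frame = 8000 + 40 * rng.normal(size=(128, 128))
frame[60:66, 60:66] += 60
enhanced = enhance_ir(frame)
print("input dynamic range:", round(float(frame.max() - frame.min()), 1))
print("enhanced image range:", round(float(enhanced.max() - enhanced.min()), 3))
```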

  5. Electron Waiting Times in Mesoscopic Conductors

    NASA Astrophysics Data System (ADS)

    Albert, Mathias; Haack, Géraldine; Flindt, Christian; Büttiker, Markus

    2012-05-01

    Electron transport in mesoscopic conductors has traditionally involved investigations of the mean current and the fluctuations of the current. A complementary view on charge transport is provided by the distribution of waiting times between charge carriers, but a proper theoretical framework for coherent electronic systems has so far been lacking. Here we develop a quantum theory of electron waiting times in mesoscopic conductors expressed by a compact determinant formula. We illustrate our methodology by calculating the waiting time distribution for a quantum point contact and find a crossover from Wigner-Dyson statistics at full transmission to Poisson statistics close to pinch-off. Even when the low-frequency transport is noiseless, the electrons are not equally spaced in time due to their inherent wave nature. We discuss the implications for renewal theory in mesoscopic systems and point out several analogies with level spacing statistics and random matrix theory.

  6. Description and typology of intensive Chios dairy sheep farms in Greece.

    PubMed

    Gelasakis, A I; Valergakis, G E; Arsenos, G; Banos, G

    2012-06-01

    The aim was to assess the intensified dairy sheep farming systems of the Chios breed in Greece, establishing a typology that may properly describe and characterize them. The study included the total of the 66 farms of the Chios sheep breeders' cooperative Macedonia. Data were collected using a structured direct questionnaire for in-depth interviews, including questions properly selected to obtain a general description of farm characteristics and overall management practices. A multivariate statistical analysis was used on the data to obtain the most appropriate typology. Initially, principal component analysis was used to produce uncorrelated variables (principal components), which would be used for the consecutive cluster analysis. The number of clusters was decided using hierarchical cluster analysis, whereas, the farms were allocated in 4 clusters using k-means cluster analysis. The identified clusters were described and afterward compared using one-way ANOVA or a chi-squared test. The main differences were evident on land availability and use, facility and equipment availability and type, expansion rates, and application of preventive flock health programs. In general, cluster 1 included newly established, intensive, well-equipped, specialized farms and cluster 2 included well-established farms with balanced sheep and feed/crop production. In cluster 3 were assigned small flock farms focusing more on arable crops than on sheep farming with a tendency to evolve toward cluster 2, whereas cluster 4 included farms representing a rather conservative form of Chios sheep breeding with low/intermediate inputs and choosing not to focus on feed/crop production. In the studied set of farms, 4 different farmer attitudes were evident: 1) farming disrupts sheep breeding; feed should be purchased and economies of scale will decrease costs (mainly cluster 1), 2) only exercise/pasture land is necessary; at least part of the feed (pasture) must be home-grown to decrease costs (clusters 1 and 4), 3) providing pasture to sheep is essential; on-farm feed production decreases costs (mainly cluster 3), and 4) large-scale farming (feed production and cash crops) does not disrupt sheep breeding; all feed must be produced on-farm to decrease costs (mainly cluster 3). Conducting a profitability analysis among different clusters, exploring and discovering the most beneficial levels of intensified management and capital investment should now be considered. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
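    The analysis flow described above (principal components, hierarchical clustering to judge the number of groups, k-means for the final allocation) can be sketched with scikit-learn and SciPy. The synthetic farm-by-variable matrix and the retained-variance threshold below are assumptions of the sketch, not the survey data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(12)

# Synthetic stand-in for 66 farms described by 10 survey-derived variables
# (land use, flock size, equipment scores, etc.), with 4 loose underlying groups.
centers = rng.normal(scale=2.0, size=(4, 10))
X = np.vstack([c + rng.normal(size=(17, 10)) for c in centers])[:66]

# 1) Principal components: uncorrelated inputs retaining most of the variance.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=0.80)               # keep components explaining ~80% of the variance
scores = pca.fit_transform(Z)

# 2) Hierarchical clustering (Ward) to judge a plausible number of clusters.
tree = linkage(scores, method="ward")
hier_labels = fcluster(tree, t=4, criterion="maxclust")

# 3) Final allocation of farms with k-means using the chosen k.
k = len(np.unique(hier_labels))
km_labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(scores)

print("principal components kept:", pca.n_components_)
print("farms per cluster:", np.bincount(km_labels).tolist())
```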

  7. How Students Learn from Multiple Contexts and Definitions: Proper Time as a Coordination Class

    ERIC Educational Resources Information Center

    Levrini, Olivia; diSessa, Andrea A.

    2008-01-01

    This article provides an empirical analysis of a single classroom episode in which students reveal difficulties with the concept of proper time in special relativity but slowly make progress in improving their understanding. The theoretical framework used is "coordination class theory," which is an evolving model of concepts and conceptual change.…

  8. Plant Nutrient Testing and Analysis in Forest and Conservation Nurseries

    Treesearch

    Thomas D. Landis; Diane L. Haase; R. Kasten Dumroese

    2005-01-01

    Supplying mineral nutrients at the proper rate and in the proper balance has a major effect on seedling growth rate but, more importantly, on seedling quality. In addition, mounting concerns about fertilizer pollution are increasing awareness of the benefits of precision fertilization. Because they reflect actual mineral nutrient uptake, plant tissue tests are the best...

  9. Applying causal mediation analysis to personality disorder research.

    PubMed

    Walters, Glenn D

    2018-01-01

    This article is designed to address fundamental issues in the application of causal mediation analysis to research on personality disorders. Causal mediation analysis is used to identify mechanisms of effect by testing variables as putative links between the independent and dependent variables. As such, it would appear to have relevance to personality disorder research. It is argued that proper implementation of causal mediation analysis requires that investigators take several factors into account. These factors are discussed under 5 headings: variable selection, model specification, significance evaluation, effect size estimation, and sensitivity testing. First, care must be taken when selecting the independent, dependent, mediator, and control variables for a mediation analysis. Some variables make better mediators than others and all variables should be based on reasonably reliable indicators. Second, the mediation model needs to be properly specified. This requires that the data for the analysis be prospectively or historically ordered and possess proper causal direction. Third, it is imperative that the significance of the identified pathways be established, preferably with a nonparametric bootstrap resampling approach. Fourth, effect size estimates should be computed or competing pathways compared. Finally, investigators employing the mediation method are advised to perform a sensitivity analysis. Additional topics covered in this article include parallel and serial multiple mediation designs, moderation, and the relationship between mediation and moderation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
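    The core computation, an indirect effect with a nonparametric bootstrap confidence interval, can be sketched in plain NumPy: regress the mediator on the predictor (a-path), the outcome on the mediator and predictor (b-path), and bootstrap the product a*b. The synthetic data, covariate, and path coefficients below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic data with a true indirect effect: X -> M -> Y, plus a direct path and a covariate C.
n = 400
C = rng.normal(size=n)
X = 0.3 * C + rng.normal(size=n)
M = 0.5 * X + 0.2 * C + rng.normal(size=n)              # a-path = 0.5
Y = 0.4 * M + 0.2 * X + 0.1 * C + rng.normal(size=n)    # b-path = 0.4, so indirect = 0.20

def ols(y, design):
    """Ordinary least squares; returns coefficients for the columns of `design`."""
    return np.linalg.lstsq(design, y, rcond=None)[0]

def indirect_effect(idx):
    x, m, y, c = X[idx], M[idx], Y[idx], C[idx]
    ones = np.ones_like(x)
    a = ols(m, np.column_stack([ones, x, c]))[1]         # M ~ X + C
    b = ols(y, np.column_stack([ones, m, x, c]))[1]      # Y ~ M + X + C
    return a * b

boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(2000)])
point = indirect_effect(np.arange(n))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```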

  10. Analysis of postoperative complications for superficial liposuction: a review of 2398 cases.

    PubMed

    Kim, Youn Hwan; Cha, Sang Myun; Naidu, Shenthilkumar; Hwang, Weon Jung

    2011-02-01

    Superficial liposuction has found application in creating and maximizing a lifting effect to achieve a better aesthetic result. Because of initially high complication rates, these procedures were generally regarded as risky. In response to the increasing concerns over the safety and efficacy of superficial liposuction, the authors describe their 14-year experience of performing superficial liposuction and an analysis of postoperative complications associated with these procedures. From March of 1995 to December of 2008, the authors performed superficial liposuction on 2398 patients. Three subgroups were formed according to liposuction method, as follows: power-assisted liposuction alone (subgroup 1), power-assisted liposuction combined with ultrasound energy (subgroup 2), and power-assisted liposuction combined with external ultrasound and postoperative Endermologie (subgroup 3). Statistical analyses of complications were performed among subgroups. The mean age was 42.8 years, mean body mass index was 27.9 kg/m2, and mean volume of total aspiration was 5045 cc. The overall complication rate was 8.6 percent (206 patients), including four cases of skin necrosis and two cases of infection. The most common complication was postoperative contour irregularity. Power-assisted liposuction combined with external ultrasound, with or without postoperative Endermologie, was seen to decrease the overall complication rate, contour irregularity, and skin necrosis. There were no statistical differences regarding other complications. Superficial liposuction carries a higher risk of complications than conventional suction techniques, especially postoperative contour irregularity, which can be minimized with proper selection of candidates for the procedure, avoiding overzealous suctioning of the superficial layer, and using a combination of ultrasound energy techniques.

  11. Optimization of Anodic Porous Alumina Fabricated from Commercial Aluminum Food Foils: A Statistical Approach

    PubMed Central

    Riccomagno, Eva; Shayganpour, Amirreza; Salerno, Marco

    2017-01-01

    Anodic porous alumina is a well-known material based on an old industry, yet with emerging applications in nanoscience and nanotechnology. This is promising, but the nanostructured alumina should be fabricated from inexpensive raw material. We fabricated porous alumina from commercial aluminum food foil in 0.4 M aqueous phosphoric acid, aiming to design an effective manufacturing protocol for the material used as nanoporous filler in dental restorative composites, an application demonstrated previously by our group. We identified the critical input parameters of anodization voltage, bath temperature and anodization time, and the main output parameters of pore diameter, pore spacing and oxide thickness. Scanning electron microscopy and grain analysis allowed us to assess the nanostructured material, and statistical design of experiments was used to optimize its fabrication. We analyzed a preliminary dataset, designed a second dataset aimed at clarifying the correlations between input and output parameters, and ran a confirmation dataset. Anodization conditions close to 125 V, 20 °C, and 7 h were identified as the best for obtaining, in the shortest possible time, pore diameters of 100–150 nm, pore spacings of 150–275 nm, and a thickness of 6–8 µm, which are desirable for the selected application according to previously published results. Our analysis confirmed the linear dependence of pore size on anodization voltage and of thickness on anodization time. The importance of proper control of the experiment was highlighted, since batch effects emerge when the experimental conditions are not exactly reproduced. PMID:28772776
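
    A minimal sketch of the reported linear dependences, fitted to made-up anodization runs; the voltages, times and measured values below are illustrative assumptions, not the paper's data.

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical anodization runs (invented numbers for illustration only).
      voltage  = np.array([100, 110, 120, 125, 130, 140])   # V
      time_h   = np.array([4, 5, 6, 7, 8, 9])               # h
      pore_nm  = np.array([105, 115, 124, 131, 137, 148])   # measured pore diameter, nm
      thick_um = np.array([3.6, 4.5, 5.3, 6.4, 7.1, 8.0])   # oxide thickness, µm

      # Linear dependence of pore diameter on anodization voltage.
      fit_d = sm.OLS(pore_nm, sm.add_constant(voltage)).fit()
      # Linear dependence of oxide thickness on anodization time.
      fit_t = sm.OLS(thick_um, sm.add_constant(time_h)).fit()
      print(fit_d.params, fit_d.rsquared)
      print(fit_t.params, fit_t.rsquared)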

  12. Postural awareness among dental students in Jizan, Saudi Arabia

    PubMed Central

    Kanaparthy, Aruna; Kanaparthy, Rosaiah; Boreak, Nezar

    2015-01-01

    Objective: The study was conducted to assess the postural awareness of dental students in Jizan, Saudi Arabia. Materials and Methods: Close-ended, self-administered questionnaires were used for data collection in the survey. The questionnaire was prepared by observing the positions of students working in the clinics and the common mistakes they make with regard to their postures. The questionnaires were distributed among the dental students who were present and reported to work in the clinics. Levels of postural awareness and the relationship between postural awareness and the degree of musculoskeletal disorder (MSD) among the students were evaluated. This study was carried out in the College of Dental Sciences and Hospital, Jizan. Statistical Analysis: The level of knowledge of postural awareness was evaluated and correlated with the presence or absence of MSDs. Categorical variables were compared using the Chi-square test. P values of less than 0.05 were considered statistically significant. Results: A total of 162 dental students aged 20–25 years were included in the survey, of whom 134 responded (83%). When their postural awareness was evaluated, results showed that 89% of the students had poor-to-medium levels of postural awareness. The relation between postural awareness and prevalence of MSDs indicated that 75% of the students with poor awareness, 49% of the students with average awareness, and 40% of the students with good awareness had MSDs. The results were statistically significant (P = 0.002127, which is <0.05), indicating that better awareness about proper postures while working helps to minimize the risk of MSDs. Conclusion: Evaluation of levels of postural awareness showed that 21% of the students had poor postural awareness, 67% had average awareness, and 11% had good postural awareness. The analysis of results showed that those students with low-to-average postural awareness had a significantly greater prevalence of MSDs. PMID:26942113
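
    The chi-square comparison described above can be sketched as follows; the 3x2 contingency table of awareness level versus MSD status uses invented counts, not the survey's data.

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical 3x2 table: awareness level (rows) vs. MSD present / absent (columns).
      # Counts are illustrative and are NOT the study's data.
      table = np.array([
          [21,  7],   # poor awareness
          [44, 46],   # average awareness
          [ 6,  9],   # good awareness
      ])
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")  # significant if p < 0.05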

  13. Statistical Methods for Proteomic Biomarker Discovery based on Feature Extraction or Functional Modeling Approaches.

    PubMed

    Morris, Jeffrey S

    2012-01-01

    In recent years, developments in molecular biotechnology have led to the increased promise of detecting and validating biomarkers, or molecular markers that relate to various biological or medical outcomes. Proteomics, the direct study of proteins in biological samples, plays an important role in the biomarker discovery process. These technologies produce complex, high-dimensional functional and image data that present many analytical challenges that must be addressed properly for effective comparative proteomics studies that can yield potential biomarkers. Specific challenges include experimental design, preprocessing, feature extraction, and statistical analysis accounting for the inherent multiple testing issues. This paper reviews various computational aspects of comparative proteomic studies and summarizes contributions that I, along with numerous collaborators, have made. First, there is an overview of comparative proteomics technologies, followed by a discussion of important experimental design and preprocessing issues that must be considered before statistical analysis can be done. Next, the two key approaches to analyzing proteomics data, feature extraction and functional modeling, are described. Feature extraction involves detection and quantification of discrete features like peaks or spots that theoretically correspond to different proteins in the sample. After an overview of the feature extraction approach, specific methods for mass spectrometry (Cromwell) and 2D gel electrophoresis (Pinnacle) are described. The functional modeling approach involves modeling the proteomic data in their entirety as functions or images. A general discussion of the approach is followed by the presentation of a specific method that can be applied, wavelet-based functional mixed models, and its extensions. All methods are illustrated by application to two example proteomic data sets, one from mass spectrometry and one from 2D gel electrophoresis. While the specific methods presented are applied to two specific proteomic technologies, MALDI-TOF and 2D gel electrophoresis, these methods and the other principles discussed in the paper apply much more broadly to other expression proteomics technologies.
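
    As a toy illustration of the feature-extraction idea (not the Cromwell or Pinnacle algorithms themselves), the sketch below detects peaks in a simulated mass-spectrum-like trace after a crude baseline correction; all signal parameters are assumptions.

      import numpy as np
      from scipy.signal import find_peaks

      # Toy trace: three Gaussian peaks on a decaying baseline plus noise.
      rng = np.random.default_rng(2)
      mz = np.linspace(1000, 12000, 5000)
      peaks_true = [(50, 3000, 15), (80, 5600, 20), (30, 9100, 25)]
      signal = sum(h * np.exp(-0.5 * ((mz - c) / w) ** 2) for h, c, w in peaks_true)
      trace = signal + 200 / np.sqrt(mz / 1000) + rng.normal(0, 2, mz.size)

      baseline = np.percentile(trace, 10)                          # crude baseline estimate
      peaks, props = find_peaks(trace - baseline, height=10, prominence=15)
      print(list(zip(mz[peaks].round(1), props["peak_heights"].round(1))))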

  14. Imputation approaches for animal movement modeling

    USGS Publications Warehouse

    Scharf, Henry; Hooten, Mevin B.; Johnson, Devin S.

    2017-01-01

    The analysis of telemetry data is common in animal ecological studies. While the collection of telemetry data for individual animals has improved dramatically, the methods to properly account for inherent uncertainties (e.g., measurement error, dependence, barriers to movement) have lagged behind. Still, many new statistical approaches have been developed to infer unknown quantities affecting animal movement or predict movement based on telemetry data. Hierarchical statistical models are useful to account for some of the aforementioned uncertainties, as well as provide population-level inference, but they often come with an increased computational burden. For certain types of statistical models, it is straightforward to provide inference if the latent true animal trajectory is known, but challenging otherwise. In these cases, approaches related to multiple imputation have been employed to account for the uncertainty associated with our knowledge of the latent trajectory. Despite the increasing use of imputation approaches for modeling animal movement, the general sensitivity and accuracy of these methods have not been explored in detail. We provide an introduction to animal movement modeling and describe how imputation approaches may be helpful for certain types of models. We also assess the performance of imputation approaches in two simulation studies. Our simulation studies suggest that inference for model parameters directly related to the location of an individual may be more accurate than inference for parameters associated with higher-order processes such as velocity or acceleration. Finally, we apply these methods to analyze a telemetry data set involving northern fur seals (Callorhinus ursinus) in the Bering Sea. Supplementary materials accompanying this paper appear online.
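
    The imputation idea can be sketched very roughly: draw several plausible versions of the latent track under an assumed measurement-error model, compute the quantity of interest on each, and pool the results. The error model and the simple perturbation used below are simplifying assumptions, not the methods evaluated in the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      T, sigma_obs = 200, 0.5
      true_path = np.cumsum(rng.normal(0, 1.0, size=(T, 2)), axis=0)   # latent track
      obs = true_path + rng.normal(0, sigma_obs, size=(T, 2))          # GPS-like error

      def mean_step_length(path):
          return np.mean(np.linalg.norm(np.diff(path, axis=0), axis=1))

      # Multiple imputation: each imputed path is the observations perturbed by a draw
      # from the assumed error distribution (a crude stand-in for a proper posterior
      # draw of the latent trajectory).
      M = 50
      estimates = [mean_step_length(obs + rng.normal(0, sigma_obs, size=obs.shape))
                   for _ in range(M)]
      pooled = np.mean(estimates)
      between_var = np.var(estimates, ddof=1)
      print(f"pooled mean step length = {pooled:.3f} (between-imputation var = {between_var:.4f})")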

  15. Occupational safety and health status of medical laboratories in Kajiado County, Kenya.

    PubMed

    Tait, Fridah Ntinyari; Mburu, Charles; Gikunju, Joseph

    2018-01-01

    Despite the increasing interest in Occupational Safety and Health (OSH), few studies on OSH in medical laboratories in developing countries are available, although a high number of injuries occur without proper documentation. It is estimated that every day 6,300 people die as a result of occupational accidents or work-related diseases, resulting in over 2.3 million deaths per year. Medical laboratories handle a wide range of materials, including potentially dangerous pathogenic agents, and expose health workers to numerous potential hazards. This study evaluated the status of OSH in medical laboratories in Kajiado County, Kenya. The objectives included identification of biological, chemical and physical hazards; review of the laboratories' control measures; and enumeration of factors hindering implementation of good OSH practices. The study used a cross-sectional descriptive research design. Observation checklists, interview schedules and structured questionnaires were used. The study was carried out in 108 medical laboratories among 204 sampled respondents. Data were analysed using Statistical Package for the Social Sciences (SPSS) version 20 software. The most common hazards in medical laboratories were bacteria (80%) among biological hazards, handling of unlabelled and unmarked chemicals (38.2%) among chemical hazards, and dangerously placed laboratory equipment (49.5%) among physical hazards. According to Pearson's product-moment correlation analysis, not wearing personal protective equipment was statistically associated with exposure to hazards. Individual control measures were statistically significant at the 0.01 significance level. Only 65.1% of the factors influencing implementation of OSH in medical laboratories were identified. Training made the highest contribution to good OSH practices.

  16. The Development of Strategic Air Command, 1946-1976

    DTIC Science & Technology

    1976-03-21

    branch green, and three lightning flashes red. SHIELD: Azure, two clouds proper, one issuing from sinister chief and one issuing from dexter base, a...Financial Status, since 1958. For security reasons, no statistics have been included for those types of reconnaissance aircraft currently assigned. The text ...was prepared by J. C. Hopkins, who was aided by Mr. Sheldon A. Goldberg, Command Archivist, who critically reviewed the text and selected the

  17. Facial trauma as physical violence markers against elderly Brazilians: A comparative analysis between genders.

    PubMed

    de Sousa, Rayanne Izabel Maciel; de Macedo Bernardino, Ítalo; Castro, Ricardo Dias; Cavalcanti, Alessandro Leite; Bento, Patricia Meira; d'Ávila, Sérgio

    2016-01-01

    The aim of this study was to characterize the profile of elderly Brazilians with injuries resulting from physical violence and identify victimization differences. A descriptive and exploratory study was conducted involving the analysis of medico-legal and social records of 259 elderly victims of physical violence treated at an Institute of Forensic Medicine and Dentistry over four years (from January 2008 to December 2011). The forensic service database was evaluated between January and March 2013 by researchers properly trained and calibrated to perform this function. Socio-demographic variables of victims, aggression characteristics, the aggressor's profile and types of lesions were evaluated. Descriptive and multivariate statistics using Multiple Correspondence Analysis (MCA) were performed. The prevalence of facial trauma was 42.9%. Based on the MCA results, two groups with different victimization profiles were identified: married men aged 70-79 years, victims of community violence at night, suffering facial injuries; and single, widowed or separated women aged 60-69 years, victims of domestic violence during the day, suffering trauma in other areas of the body. The results suggest that there is a high prevalence of facial injuries among elderly Brazilian victims of physical violence and that there are important differences in victimization characteristics according to gender. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
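
    Multiple Correspondence Analysis can be sketched as correspondence analysis of the one-hot (indicator) matrix of the categorical variables; the variables and records below are invented placeholders, not the forensic data.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(6)
      df = pd.DataFrame({
          "sex": rng.choice(["male", "female"], 200),
          "violence": rng.choice(["community", "domestic"], 200),
          "trauma_site": rng.choice(["face", "other"], 200),
      })

      dummies = pd.get_dummies(df)                      # indicator (disjunctive) matrix
      Z = dummies.to_numpy(dtype=float)
      P = Z / Z.sum()
      r, c = P.sum(axis=1), P.sum(axis=0)               # row and column masses
      S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
      U, s, Vt = np.linalg.svd(S, full_matrices=False)
      col_coords = (Vt.T * s) / np.sqrt(c)[:, None]     # principal coordinates of categories
      for name, coord in zip(dummies.columns, col_coords[:, :2]):
          print(f"{name}: dim1={coord[0]:+.2f}, dim2={coord[1]:+.2f}")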

  18. Least-Squares Regression and Spectral Residual Augmented Classical Least-Squares Chemometric Models for Stability-Indicating Analysis of Agomelatine and Its Degradation Products: A Comparative Study.

    PubMed

    Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A

    2016-01-01

    Two accurate, sensitive, and selective stability-indicating methods are developed and validated for simultaneous quantitative determination of agomelatine (AGM) and its forced degradation products (Deg I and Deg II), whether in pure form or in pharmaceutical formulations. Partial least-squares regression (PLSR) and spectral residual augmented classical least-squares (SRACLS) are two chemometric models that were subjected to a comparative study using UV spectral data in the range 215–350 nm. For proper analysis, a three-factor, four-level experimental design was established, resulting in a training set of 16 mixtures containing different ratios of the interfering species. An independent test set of eight mixtures was used to validate the prediction ability of the suggested models. The results presented indicate the ability of these multivariate calibration models to analyze AGM, Deg I, and Deg II with high selectivity and accuracy. The analysis results for the pharmaceutical formulations were statistically compared to those of the reference HPLC method, with no significant differences observed regarding accuracy and precision. The SRACLS model gives results comparable to the PLSR model; however, it retains the qualitative spectral information of the classical least-squares algorithm for the analyzed components.
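
    A minimal PLSR calibration sketch in the spirit of the design above (16 training and 8 test mixtures), using synthetic Gaussian "spectra" over 215–350 nm as stand-ins for AGM, Deg I and Deg II; the spectra and concentrations are simulated assumptions.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      wavelengths = np.linspace(215, 350, 136)
      pure = np.stack([np.exp(-0.5 * ((wavelengths - c) / 20) ** 2) for c in (240, 275, 310)])

      C_train = rng.uniform(0.2, 1.0, size=(16, 3))            # 16 training mixtures
      C_test  = rng.uniform(0.2, 1.0, size=(8, 3))             # 8 validation mixtures
      A_train = C_train @ pure + rng.normal(0, 0.005, (16, 136))
      A_test  = C_test  @ pure + rng.normal(0, 0.005, (8, 136))

      pls = PLSRegression(n_components=3).fit(A_train, C_train)
      pred = pls.predict(A_test)
      rmsep = np.sqrt(np.mean((pred - C_test) ** 2, axis=0))   # per-component prediction error
      print(rmsep)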

  19. Coupling News Sentiment with Web Browsing Data Improves Prediction of Intra-Day Price Dynamics.

    PubMed

    Ranco, Gabriele; Bordino, Ilaria; Bormetti, Giacomo; Caldarelli, Guido; Lillo, Fabrizio; Treccani, Michele

    2016-01-01

    The new digital revolution of big data is deeply changing our capability of understanding society and forecasting the outcome of many social and economic systems. Unfortunately, information can be very heterogeneous in the importance, relevance, and surprise it conveys, severely affecting the predictive power of semantic and statistical methods. Here we show that the aggregation of web users' behavior can be exploited to overcome this problem in a hard-to-predict complex system, namely the financial market. Specifically, our in-sample analysis shows that the combined use of sentiment analysis of news and browsing activity of users of Yahoo! Finance greatly helps in forecasting intra-day and daily price changes of a set of 100 highly capitalized US stocks traded in the period 2012-2013. Sentiment analysis and browsing activity, when taken alone, have very small or no predictive power. Conversely, when considering a news signal where in a given time interval we compute the average sentiment of the clicked news, weighted by the number of clicks, we show that for nearly 50% of the companies such a signal Granger-causes hourly price returns. Our result indicates a "wisdom-of-the-crowd" effect that allows users' activity to be exploited to identify and properly weight the relevant and surprising news, considerably enhancing the forecasting power of news sentiment.
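
    A Granger-causality check of the kind reported above can be sketched on simulated series; the weak lagged dependence and both series below are assumptions for illustration only.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.stattools import grangercausalitytests

      # Toy check of whether a sentiment-like series Granger-causes returns.
      rng = np.random.default_rng(5)
      n = 500
      sentiment = rng.normal(size=n)
      returns = np.r_[0.0, 0.3 * sentiment[:-1]] + rng.normal(scale=1.0, size=n)

      data = pd.DataFrame({"returns": returns, "sentiment": sentiment})
      # Tests H0: "sentiment does NOT Granger-cause returns" at lags 1..3
      # (column order matters: the second column is the candidate cause).
      res = grangercausalitytests(data[["returns", "sentiment"]], maxlag=3, verbose=False)
      for lag, (tests, _) in res.items():
          print(lag, round(tests["ssr_ftest"][1], 4))   # p-value of the F-test at each lag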

  20. contamDE: differential expression analysis of RNA-seq data for contaminated tumor samples.

    PubMed

    Shen, Qi; Hu, Jiyuan; Jiang, Ning; Hu, Xiaohua; Luo, Zewei; Zhang, Hong

    2016-03-01

    Accurate detection of differentially expressed (DE) genes between tumor and normal samples is a primary approach to cancer-related biomarker identification. Due to infiltration by surrounding normal cells, expression data derived from tumor samples are always contaminated with normal cells. Ignoring such cellular contamination deflates the power to detect DE genes and further confounds the biological interpretation of the analysis results. To date, no differential expression analysis approach for RNA-seq data in the literature can properly account for the contamination of tumor samples. Without appealing to any extra information, we develop a new method, 'contamDE', based on a novel statistical model that associates RNA-seq expression levels with cell types. It is demonstrated through simulation studies that contamDE can be much more powerful than existing methods that ignore the contamination. In application to two cancer studies, contamDE uniquely found several potential therapy and prognostic biomarkers of prostate cancer and non-small cell lung cancer. An R package contamDE is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. Contact: zhanghfd@fudan.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
