Sample records for statistical methods reveal

  1. Validation of a modification to Performance-Tested Method 070601: Reveal Listeria Test for detection of Listeria spp. in selected foods and selected environmental samples.

    PubMed

    Alles, Susan; Peng, Linda X; Mozola, Mark A

    2009-01-01

    A modification to Performance-Tested Method (PTM) 070601, Reveal Listeria Test (Reveal), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there was a statistically significant difference in performance between the Reveal and reference culture [U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA/BAM) or U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS)] methods for only a single food in one trial (pasteurized crab meat) at the 27 h enrichment time point, with more positive results obtained with the FDA/BAM reference method. No foods showed statistically significant differences in method performance at the 30 h time point. Independent laboratory testing of 3 foods again produced a statistically significant difference in results for crab meat at the 27 h time point; otherwise results of the Reveal and reference methods were statistically equivalent. Overall, considering both internal and independent laboratory trials, sensitivity of the Reveal method relative to the reference culture procedures in testing of foods was 85.9% at 27 h and 97.1% at 30 h. Results from 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the Reveal method was more productive than the reference USDA-FSIS culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the Reveal method at the 24 h time point. Overall, sensitivity of the Reveal method at 24 h relative to that of the USDA-FSIS method was 153%. The Reveal method exhibited extremely high specificity, with only a single false-positive result in all trials combined for overall specificity of 99.5%.

  2. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which carries enormous information, constitutes a complex component system. Applying mathematical statistics methods to research on the compatibility of TCM formulae has great significance for promoting the modernization of TCM and for improving the clinical efficacy and optimization of formulae. As a tool for quantitative analysis, data inference, and exploring the inherent rules of substances, mathematical statistics can reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. By reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes of chemical components, as well as the rules of incompatibility and contraindication of formulae, and will provide references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.

  3. Reveal Listeria 2.0 test for detection of Listeria spp. in foods and environmental samples.

    PubMed

    Alles, Susan; Curry, Stephanie; Almy, David; Jagadeesan, Balamurugan; Rice, Jennifer; Mozola, Mark

    2012-01-01

    A Performance Tested Method validation study was conducted for a new lateral flow immunoassay (Reveal Listeria 2.0) for detection of Listeria spp. in foods and environmental samples. Results of inclusivity testing showed that the test detects all species of Listeria, with the exception of L. grayi. In exclusivity testing conducted under nonselective growth conditions, all non-listeriae tested produced negative Reveal assay results, except for three strains of Lactobacillus spp. However, these lactobacilli are inhibited by the selective Listeria Enrichment Single Step broth enrichment medium used with the Reveal method. Six foods were tested in parallel by the Reveal method and the U.S. Food and Drug Administration/Bacteriological Analytical Manual (FDA/BAM) reference culture procedure. Considering data from both internal and independent laboratory trials, overall sensitivity of the Reveal method relative to that of the FDA/BAM procedure was 101%. Four foods were tested in parallel by the Reveal method and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference culture procedure. Overall sensitivity of the Reveal method relative to that of the USDA-FSIS procedure was 98.2%. There were no statistically significant differences in the number of positives obtained by the Reveal and reference culture procedures in any food trials. In testing of swab or sponge samples from four types of environmental surfaces, sensitivity of Reveal relative to that of the USDA-FSIS reference culture procedure was 127%. For two surface types, differences in the number of positives obtained by the Reveal and reference methods were statistically significant, with more positives by the Reveal method in both cases. Specificity of the Reveal assay was 100%, as there were no unconfirmed positive results obtained in any phase of the testing. Results of ruggedness experiments showed that the Reveal assay is tolerant of modest deviations in test sample volume and device incubation time.

  4. OSPAR standard method and software for statistical analysis of beach litter data.

    PubMed

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
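    The abstract names a specific ensemble of tests; as an illustrative sketch (synthetic data, not the Litter Analyst implementation), the Mann-Kendall trend test and the Theil-Sen slope estimate can be run with SciPy on an annual litter series:

```python
import numpy as np
from scipy.stats import kendalltau, theilslopes

def trend_summary(years, counts):
    """Mann-Kendall trend test (Kendall's tau against time) plus the
    Theil-Sen slope estimate with its 95% confidence interval."""
    tau, p_value = kendalltau(years, counts)
    slope, intercept, lo, hi = theilslopes(counts, years)
    return {"tau": tau, "p": p_value, "slope": slope, "ci": (lo, hi)}

# Synthetic annual counts of one litter type on one beach, 2009-2014.
years = np.arange(2009, 2015)
counts = np.array([120, 112, 101, 95, 88, 80])
result = trend_summary(years, counts)  # downward trend: tau = -1, small p
```

    The confidence interval returned by `theilslopes` is what such reports typically quote alongside the Mann-Kendall p-value.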

  5. Statistical methods used in the public health literature and implications for training of public health professionals

    PubMed Central

    Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design, and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals. PMID:28591190

  6. Statistical methods used in the public health literature and implications for training of public health professionals.

    PubMed

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design, and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals.

  7. Comparison of U-spatial statistics and C-A fractal models for delineating anomaly patterns of porphyry-type Cu geochemical signatures in the Varzaghan district, NW Iran

    NASA Astrophysics Data System (ADS)

    Ghezelbash, Reza; Maghsoudi, Abbas

    2018-05-01

    The delineation of populations of stream sediment geochemical data is a crucial task in regional exploration surveys. In this contribution, uni-element stream sediment geochemical data of Cu, Au, Mo, and Bi have been subjected to two reliable anomaly-background separation methods, namely, the concentration-area (C-A) fractal and the U-spatial statistics methods to separate geochemical anomalies related to porphyry-type Cu mineralization in northwest Iran. The quantitative comparison of the delineated geochemical populations using the modified success-rate curves revealed the superiority of the U-spatial statistics method over the fractal model. Moreover, geochemical maps of investigated elements revealed strongly positive correlations between strong anomalies and Oligocene-Miocene intrusions in the study area. Therefore, follow-up exploration programs should focus on these areas.
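    For illustration only (synthetic concentrations, not the Varzaghan data set), the concentration-area (C-A) fractal model reduces to fitting the slope of log(area above threshold) against log(threshold); slope breaks mark candidate anomaly/background thresholds:

```python
import numpy as np

def ca_slope(concentrations, thresholds):
    """Concentration-Area (C-A) model: the 'area' above a threshold s is
    the number of cells with concentration >= s; a straight segment of
    the log-log plot has slope -alpha."""
    areas = np.array([(concentrations >= s).sum() for s in thresholds])
    slope, _ = np.polyfit(np.log(thresholds), np.log(areas), 1)
    return slope

# Synthetic single-population concentrations c_k = (k/N)^(-1/2), for which
# the C-A plot is one straight line of slope -2 (no anomaly population).
N = 100_000
c = (np.arange(1, N + 1) / N) ** -0.5
slope = ca_slope(c, np.linspace(1.5, 5.0, 20))  # close to -2
```

    On real stream-sediment data the plot shows several straight segments, and the threshold at each slope break separates the geochemical populations.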

  8. Using Data from Climate Science to Teach Introductory Statistics

    ERIC Educational Resources Information Center

    Witt, Gary

    2013-01-01

    This paper shows how the application of simple statistical methods can reveal to students important insights from climate data. While the popular press is filled with contradictory opinions about climate science, teachers can encourage students to use introductory-level statistics to analyze data for themselves on this important issue in public…

  9. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Therefore, studying spatiotemporal problems is important to reduce the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, was applied to revise the time-frequency relationship in one of the most active regions of Asia, Eastern Iran and western Afghanistan (a and b were calculated as about 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce the uncertainties in the spatial domain, producing a representative synthetic catalogue of 5361 events. The synthetic database is classified using a Geographical Information System (GIS), based on simulated magnitudes, to reveal the underlying seismicity patterns. Although some regions with high seismicity correspond to known faults, as far as seismic patterns are concerned the new method significantly highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters of likely future strain release.

  10. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
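    As a small numerical check of the Jeffreys connection mentioned above (an illustration, not the C-MaxEnt construction itself): for a Bernoulli likelihood, the square root of the numerically estimated Fisher information reproduces Jeffreys's prior, the unnormalized Beta(1/2, 1/2) density:

```python
import numpy as np

def bernoulli_fisher_information(theta, eps=1e-6):
    """E[(d/dtheta log p(x|theta))^2] for a Bernoulli likelihood, with
    the score estimated by a central finite difference."""
    def score(x, t):
        logp = lambda u: np.log(u if x == 1 else 1.0 - u)
        return (logp(t + eps) - logp(t - eps)) / (2.0 * eps)
    return theta * score(1, theta) ** 2 + (1.0 - theta) * score(0, theta) ** 2

# Jeffreys's prior is proportional to sqrt(I(theta)); for the Bernoulli
# model the closed form is 1 / sqrt(theta * (1 - theta)).
theta = 0.3
numeric = np.sqrt(bernoulli_fisher_information(theta))
closed_form = 1.0 / np.sqrt(theta * (1.0 - theta))
```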

  11. A statistical method (cross-validation) for bone loss region detection after spaceflight

    PubMed Central

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying the specific regions that undergo the greatest losses (e.g., the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Methods for detecting such regions, however, remain an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to identify significant changes. PMID:20632144
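    A minimal sketch of the permutation-testing idea (synthetic paired data, not the spaceflight scans or the cluster-level test): sign-flipping paired differences yields a two-sided p-value for a longitudinal change:

```python
import numpy as np

rng = np.random.default_rng(0)

def paired_permutation_test(before, after, n_perm=5000):
    """Two-sided paired permutation test: randomly flip the sign of each
    paired difference and compare the permuted mean differences with the
    observed one."""
    diffs = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    observed = abs(diffs.mean())
    signs = rng.choice([-1.0, 1.0], size=(n_perm, diffs.size))
    permuted = np.abs((signs * diffs).mean(axis=1))
    return (np.sum(permuted >= observed) + 1) / (n_perm + 1)

# Synthetic paired scans: a consistent ~5% density loss in 12 subjects.
before = rng.normal(1.00, 0.05, size=12)
after = before * rng.normal(0.95, 0.01, size=12)
p = paired_permutation_test(before, after)  # small p: loss is significant
```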

  12. Fear and loathing: undergraduate nursing students' experiences of a mandatory course in applied statistics.

    PubMed

    Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie

    2013-04-23

    This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.

  13. Comparison of the Reveal 20-hour method and the BAM culture method for the detection of Escherichia coli O157:H7 in selected foods and environmental swabs: collaborative study.

    PubMed

    Bird, C B; Hoerner, R J; Restaino, L

    2001-01-01

    Four different food types along with environmental swabs were analyzed by the Reveal for E. coli O157:H7 test (Reveal) and the Bacteriological Analytical Manual (BAM) culture method for the presence of Escherichia coli O157:H7. Twenty-seven laboratories representing academia and private industry in the United States and Canada participated. Sample types were inoculated with E. coli O157:H7 at 2 different levels. Of the 1,095 samples and controls analyzed and confirmed, 459 were positive and 557 were negative by both methods. No statistical differences (p <0.05) were observed between the Reveal and BAM methods.

  14. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
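    The Nash-Sutcliffe efficiency used to verify predictive performance is a one-line formula; a minimal sketch on hypothetical discharge values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 means a perfect fit; values <= 0 mean
    the simulation is no better than predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    return 1.0 - sse / np.sum((observed - observed.mean()) ** 2)

# Hypothetical daily discharge values (m^3/s) and a close simulation.
obs = np.array([10.0, 12.0, 18.0, 25.0, 16.0, 11.0])
sim = np.array([11.0, 13.0, 17.0, 23.0, 17.0, 12.0])
score = nash_sutcliffe(obs, sim)  # close to, but below, 1
```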

  15. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
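    One simple non-Gaussian choice that captures the bounded nature of methylation beta values is a Beta distribution; a hypothetical sketch (synthetic data, maximum-likelihood fit with SciPy, support pinned to [0, 1]):

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(42)

# Methylation "beta values" live on (0, 1), so a Beta distribution
# respects the bounded support that a Gaussian ignores.
samples = rng.beta(2.0, 8.0, size=5000)

# Maximum-likelihood fit with the location/scale fixed to the unit interval.
a_hat, b_hat, _, _ = beta.fit(samples, floc=0, fscale=1)
# a_hat, b_hat closely recover the generating parameters (2, 8)
```

    Mixtures of such bounded-support components are a natural next step for the unsupervised clustering the abstract describes.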

  16. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  17. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    PubMed Central

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152

  18. A new universality class in corpus of texts; A statistical physics study

    NASA Astrophysics Data System (ADS)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system, and several methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of text and vocabulary size for texts and corpora. We also observed this behavior in studying biological data.

  19. A spatial scan statistic for survival data based on Weibull distribution.

    PubMed

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
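    A hedged sketch of the Weibull ingredient (synthetic, uncensored survival times; the full geographic zone search is not shown): separate in-zone/out-of-zone Weibull fits versus a single pooled fit give the log-likelihood ratio that a scan statistic maximizes over candidate zones:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)

def weibull_loglik(times):
    """Maximized Weibull log-likelihood for uncensored survival times."""
    shape, loc, scale = weibull_min.fit(times, floc=0)
    return np.sum(weibull_min.logpdf(times, shape, loc, scale))

# Synthetic survival times: a candidate zone with shorter survival
# (scale 10 months) than the rest of the study region (scale 20 months).
t_zone = rng.weibull(1.5, size=400) * 10.0
t_rest = rng.weibull(1.5, size=400) * 20.0

# Separate fits always fit at least as well as the pooled fit; a large
# ratio is evidence that the zone's survival distribution differs.
llr = (weibull_loglik(t_zone) + weibull_loglik(t_rest)
       - weibull_loglik(np.concatenate([t_zone, t_rest])))
```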

  20. Application of statistical methods to reveal and remove the causes of welding of coil laps upon annealing of cold-rolled steel strips

    NASA Astrophysics Data System (ADS)

    Garber, E. A.; Diligenskii, E. V.; Antonov, P. V.; Shalaevskii, D. L.; Dyatlov, I. A.

    2017-09-01

    The factors of the process of production of cold-rolled steel strips that promote and hinder the appearance of a coil lap welding defect upon annealing in bell-type furnaces are analyzed using statistical methods. The works dealing with this problem are analytically reviewed to reveal the problems to be studied and refined. The ranking of the technological factors according to the significance of their influence on the probability of appearance of this defect is determined and supported by industrial data, and a regression equation is derived to calculate this probability. The process of production is improved to minimize the rejection of strips caused by the welding of coil laps.

  21. Statistical methods to estimate treatment effects from multichannel electroencephalography (EEG) data in clinical trials.

    PubMed

    Ma, Junshui; Wang, Shubing; Raubertas, Richard; Svetnik, Vladimir

    2010-07-15

    With the increasing popularity of using electroencephalography (EEG) to reveal the treatment effect in drug development clinical trials, the vast volume and complex nature of EEG data compose an intriguing, but challenging, topic. In this paper the statistical analysis methods recommended by the EEG community, along with methods frequently used in the published literature, are first reviewed. A straightforward adjustment of the existing methods to handle multichannel EEG data is then introduced. In addition, based on the spatial smoothness property of EEG data, a new category of statistical methods is proposed. The new methods use a linear combination of low-degree spherical harmonic (SPHARM) basis functions to represent a spatially smoothed version of the EEG data on the scalp, which is close to a sphere in shape. In total, seven statistical methods, including both the existing and the newly proposed methods, are applied to two clinical datasets to compare their power to detect a drug effect. Contrary to the EEG community's recommendation, our results suggest that (1) the nonparametric method does not outperform its parametric counterpart; and (2) including baseline data in the analysis does not always improve the statistical power. In addition, our results recommend that (3) simple paired statistical tests should be avoided due to their poor power; and (4) the proposed spatially smoothed methods perform better than their unsmoothed versions. Copyright 2010 Elsevier B.V. All rights reserved.

  22. Research of facial feature extraction based on MMC

    NASA Astrophysics Data System (ADS)

    Xue, Donglin; Zhao, Jiufen; Tang, Qinhong; Shi, Shaokun

    2017-07-01

    Based on the maximum margin criterion (MMC), two new feature extraction algorithms were proposed: one producing statistically uncorrelated optimal discriminant vectors and one producing orthogonal optimal discriminant vectors. The purpose of the maximum margin criterion is to maximize the inter-class scatter while simultaneously minimizing the intra-class scatter after the projection. Compared with the original MMC method and the principal component analysis (PCA) method, the proposed methods are better at reducing or eliminating the statistical correlation between features and at improving the recognition rate. The experimental results on the Olivetti Research Laboratory (ORL) face database show that the new feature extraction method based on the statistically uncorrelated maximum margin criterion (SUMMC) is better in terms of recognition rate and stability. Besides, the relations between the maximum margin criterion and the Fisher criterion for feature extraction are revealed.
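    A minimal NumPy sketch of the underlying MMC projection (toy data; the ORL experiments and the uncorrelatedness/orthogonality constraints of the paper's algorithms are beyond this illustration):

```python
import numpy as np

def mmc_directions(X, y, n_components=1):
    """Maximum margin criterion (MMC): project onto the leading
    eigenvectors of Sb - Sw (between-class minus within-class scatter).
    Unlike Fisher LDA, no inversion of Sw is needed, which helps on
    small-sample, high-dimensional problems such as face images."""
    X = np.asarray(X, dtype=float)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
        Sw += (Xc - mc).T @ (Xc - mc)
    vals, vecs = np.linalg.eigh(Sb - Sw)    # eigenvalues ascending
    return vecs[:, ::-1][:, :n_components]  # top eigenvectors first

# Toy two-class data separated along the first coordinate axis.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.0, 0.0], 0.1, size=(50, 2)),
               rng.normal([5.0, 0.0], 0.1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
w = mmc_directions(X, y)  # dominant direction is ~ the first axis
```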

  23. Use of recurrence plots in the analysis of pupil diameter dynamics in narcoleptics

    NASA Astrophysics Data System (ADS)

    Keegan, Andrew P.; Zbilut, J. P.; Merritt, S. L.; Mercer, P. J.

    1993-11-01

    Recurrence plots were used to evaluate pupil dynamics of subjects with narcolepsy. Preliminary data indicate that this nonlinear method of analyses may be more useful in revealing underlying deterministic differences than traditional methods like FFT and counting statistics.
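    Recurrence plots themselves are simple to compute; a minimal sketch on a synthetic oscillatory "pupil diameter" trace (not the narcolepsy data):

```python
import numpy as np

def recurrence_plot(series, eps):
    """Binary recurrence matrix: R[i, j] = 1 when states i and j of the
    series are within eps of each other."""
    x = np.asarray(series, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

# Synthetic pupil-diameter trace (mm): a slow oscillation.  Periodic
# dynamics show up as lines parallel to the main diagonal, whereas
# stochastic dynamics scatter recurrences over the whole plot.
t = np.linspace(0.0, 4.0 * np.pi, 200)
R = recurrence_plot(4.0 + 0.5 * np.sin(t), eps=0.05)
```

    For multidimensional state-space reconstructions, the scalar distance would be replaced by a norm over delay-embedded vectors.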

  24. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings.

    PubMed

    De Luca, Carlo J; Kline, Joshua C

    2014-12-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles--a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. Copyright © 2014 the American Physiological Society.

  25. Statistical methods for detecting and comparing periodic data and their application to the nycthemeral rhythm of bodily harm: A population based study

    PubMed Central

    2010-01-01

    Background: Animals, including humans, exhibit a variety of biological rhythms. This article describes a method for the detection and simultaneous comparison of multiple nycthemeral rhythms. Methods: A statistical method for detecting periodic patterns in time-related data via harmonic regression is described. The method is particularly capable of detecting nycthemeral rhythms in medical data. Additionally, a method for simultaneously comparing two or more periodic patterns is described, which derives from the analysis of variance (ANOVA). This method statistically confirms or rejects equality of periodic patterns. Mathematical descriptions of the detection method and the comparison method are displayed. Results: Nycthemeral rhythms of incidents of bodily harm in Middle Franconia are analyzed in order to demonstrate both methods. Every day of the week showed a significant nycthemeral rhythm of bodily harm. These seven patterns of the week were compared to each other, revealing only two different nycthemeral rhythms, one for Friday and Saturday and one for the other weekdays. PMID:21059197
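    The harmonic-regression step can be sketched as an ordinary least-squares fit of 24-hour cosine and sine terms (synthetic counts; the paper's ANOVA-based comparison of patterns is not shown):

```python
import numpy as np

def harmonic_regression(t_hours, y, period=24.0):
    """Least-squares fit of y ~ m + a*cos(w*t) + b*sin(w*t) with
    w = 2*pi/period; sqrt(a^2 + b^2) is the amplitude of the fitted
    nycthemeral rhythm and m is its mean level (mesor)."""
    w = 2.0 * np.pi / period
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours), np.sin(w * t_hours)])
    m, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    return float(m), float(np.hypot(a, b))

# Synthetic hourly incident counts for one week, peaking at midnight.
hours = np.arange(0, 24 * 7, dtype=float)
y = 10.0 + 4.0 * np.cos(2.0 * np.pi * hours / 24.0)
mesor, amplitude = harmonic_regression(hours, y)  # ~10 and ~4
```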

  6. [Statistics for statistics?--Thoughts about psychological tools].

    PubMed

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' training. Known as difficult to understand and hard to learn, these contents are feared by students, and those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because it does not appear applicable to work with patients and other target groups at first glance, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only as a way of commanding respect from other professions, namely physicians. For their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics deals with numbers, while psychotherapy deals with subjects. So is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To this end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based on statistics. Being able to write and critically read original articles, the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research and, in the end, to surrender one's own professional work to arbitrariness.

  7. The influence of depression and anxiety in the development of heart failure after coronary angioplasty.

    PubMed

    Gegenava, T; Gegenava, M; Kavtaradze, G

    2009-03-01

    The aim of our study was to investigate the association between a history of depressive episodes and anxiety and complications in patients 6 months after coronary artery angioplasty. The research was conducted on 70 patients in whom a grade of coronary occlusion that would not respond to therapeutic treatment and required coronary angioplasty had been established. Complications were assessed in 60 patients 6 months after coronary angioplasty. Depression was evaluated with the Beck depression scale; anxiety was assessed with the Spielberger State-Trait Anxiety Inventory. Statistical analysis of the data was performed using the methods of variation statistics, Student's criterion, and the STATISTICA 5.0 software. Complications were discovered in 36 (60%) patients; 24 (40%) patients had no complications. No statistically significant differences in the degree of depression and anxiety were revealed between the time of coronary angioplasty and 6 months afterward. Our study demonstrated that complications were revealed in patients who had a high degree of depression and anxiety.

  8. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    PubMed

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.

  9. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    PubMed

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.

  10. Genetic assignment methods for gaining insight into the management of infectious disease by understanding pathogen, vector, and host movement.

    PubMed

    Remais, Justin V; Xiao, Ning; Akullian, Adam; Qiu, Dongchuan; Blair, David

    2011-04-01

    For many pathogens with environmental stages, or those carried by vectors or intermediate hosts, disease transmission is strongly influenced by pathogen, host, and vector movements across complex landscapes, and thus quantitative measures of movement rate and direction can reveal new opportunities for disease management and intervention. Genetic assignment methods are a set of powerful statistical approaches useful for establishing population membership of individuals. Recent theoretical improvements allow these techniques to be used to cost-effectively estimate the magnitude and direction of key movements in infectious disease systems, revealing important ecological and environmental features that facilitate or limit transmission. Here, we review the theory, statistical framework, and molecular markers that underlie assignment methods, and we critically examine recent applications of assignment tests in infectious disease epidemiology. Research directions that capitalize on use of the techniques are discussed, focusing on key parameters needing study for improved understanding of patterns of disease.

  11. Dealing with the Conflicting Results of Psycholinguistic Experiments: How to Resolve Them with the Help of Statistical Meta-analysis.

    PubMed

    Rákosi, Csilla

    2018-01-22

    This paper proposes the use of the tools of statistical meta-analysis as a method of conflict resolution with respect to experiments in cognitive linguistics. With the help of statistical meta-analysis, the effect size of similar experiments can be compared, a well-founded and robust synthesis of the experimental data can be achieved, and possible causes of any divergence(s) in the outcomes can be revealed. This application of statistical meta-analysis offers a novel method of how diverging evidence can be dealt with. The workability of this idea is exemplified by a case study dealing with a series of experiments conducted as non-exact replications of Thibodeau and Boroditsky (PLoS ONE 6(2):e16782, 2011. https://doi.org/10.1371/journal.pone.0016782 ).
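    A standard way to carry out the synthesis described, assuming each replication reports an effect size and its variance, is inverse-variance pooling with the DerSimonian-Laird random-effects estimator. The numbers below are hypothetical, and this sketch omits the moderator analyses a full meta-analysis of diverging replications would include.

```python
import math

def pool_effects(effects, variances):
    """Inverse-variance synthesis of study effect sizes, with the
    DerSimonian-Laird estimate of between-study variance (tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return fixed, pooled, (pooled - 1.96 * se, pooled + 1.96 * se), q

# Hypothetical effect sizes (Cohen's d) and variances from five replications
d = [0.45, 0.12, 0.60, 0.25, 0.33]
v = [0.04, 0.02, 0.09, 0.03, 0.05]
fixed, pooled, ci, q = pool_effects(d, v)
```

    A large Cochran's Q relative to its degrees of freedom signals real divergence among the replications, which is exactly the situation where inspecting possible causes of heterogeneity (as the paper advocates) becomes informative.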

  12. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
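    The preprocessing step can be sketched as a polish that strips row and column biases from a plate of readings, in the spirit of (but not identical to) the trimmed-mean polish the authors evaluate; the plate values, trim fraction, and iteration count are illustrative assumptions, and the RVM t-test benchmarking step is omitted.

```python
def trimmed_mean(values, frac=0.1):
    """Mean after discarding the top and bottom `frac` of sorted values."""
    s = sorted(values)
    k = int(len(s) * frac)
    core = s[k:len(s) - k] or s
    return sum(core) / len(core)

def polish(plate, n_iter=5, frac=0.1):
    """Iteratively subtract row and column effects, a trimmed-mean analogue
    of Tukey's median polish; the residuals are then scored for hits."""
    resid = [list(row) for row in plate]
    for _ in range(n_iter):
        for row in resid:
            eff = trimmed_mean(row, frac)
            for j in range(len(row)):
                row[j] -= eff
        for j in range(len(resid[0])):
            eff = trimmed_mean([row[j] for row in resid], frac)
            for row in resid:
                row[j] -= eff
    return resid

# A toy 4x6 plate: a +2 bias in column 0 and one true hit at row 2, column 4
plate = [[2, 0, 0, 0, 0, 0],
         [2, 0, 0, 0, 0, 0],
         [2, 0, 0, 0, 5, 0],
         [2, 0, 0, 0, 0, 0]]
res = polish(plate)  # the hit's residual stands out; the column bias is gone
```

    After the polish, the hit is judged against the residual noise (in the paper, via a formal statistical model over replicates) rather than against raw, bias-contaminated plate values.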

  13. Resonance Raman of BCC and normal skin

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-hui; Sriramoju, Vidyasagar; Boydston-White, Susie; Wu, Binlin; Zhang, Chunyuan; Pei, Zhe; Sordillo, Laura; Beckman, Hugh; Alfano, Robert R.

    2017-02-01

    The Resonance Raman (RR) spectra of basal cell carcinoma (BCC) and normal human skin tissues were analyzed using 532 nm laser excitation. Differences in the RR vibrational fingerprints distinguished normal from cancerous skin tissue. A standard diagnostic criterion for BCC tissue was created from native RR biomarkers and the changes in their peak intensities. Diagnostic algorithms for the classification of BCC and normal tissue were generated based on an SVM classifier and the PCA statistical method. These statistical methods were used to analyze the RR spectral data collected from skin tissues, yielding a diagnostic sensitivity of 98.7% and a specificity of 79% compared with pathological reports.

  14. A review of statistical methods to analyze extreme precipitation and temperature events in the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Lazoglou, Georgia; Anagnostopoulou, Christina; Tolika, Konstantia; Kolyva-Machera, Fotini

    2018-04-01

    The increasing trend in the intensity and frequency of temperature and precipitation extremes during the past decades has substantial environmental and socioeconomic impacts. Thus, the objective of the present study is the comparison of several statistical methods from extreme value theory (EVT) in order to identify which is the most appropriate for analyzing the behavior of extreme precipitation and of high and low temperature events in the Mediterranean region. The extremes were selected using both the block maxima and the peaks-over-threshold (POT) techniques, and consequently both the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD) were fitted to them. The results were compared in order to select the most appropriate distribution for characterizing the extremes. Moreover, this study evaluates the maximum likelihood estimation, L-moments, and Bayesian methods, based on both graphical and statistical goodness-of-fit tests. It was revealed that the GPD can accurately characterize both precipitation and temperature extreme events. Additionally, the GEV distribution with the Bayesian method proved appropriate, especially for the largest extreme values. Another important objective of this investigation was the estimation of precipitation and temperature return levels for three return periods (50, 100, and 150 years), classifying the data into groups with similar characteristics. Finally, the return level values were estimated with both the GEV and the GPD and with the three different estimation methods, revealing that the choice of method can affect the return level values for both precipitation and temperature.
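    Return levels under a GPD fit to threshold excesses follow from a standard formula. The sketch below is illustrative only; the threshold, scale, shape, and exceedance-rate values are hypothetical, not the study's fitted Mediterranean parameters.

```python
import math

def gpd_return_level(u, sigma, xi, zeta_u, n_per_year, period_years):
    """m-year return level under a GPD fit to excesses over threshold u:
    x_m = u + (sigma / xi) * ((m * n_per_year * zeta_u)**xi - 1) for xi != 0,
    where zeta_u is the probability that an observation exceeds u."""
    m = period_years * n_per_year * zeta_u  # expected number of exceedances
    if abs(xi) < 1e-9:                      # xi -> 0 limit: exponential tail
        return u + sigma * math.log(m)
    return u + sigma / xi * (m ** xi - 1)

# Hypothetical daily-rainfall fit: threshold u = 30 mm, sigma = 8, xi = 0.1,
# with 5% of daily observations exceeding the threshold
levels = {yrs: gpd_return_level(30.0, 8.0, 0.1, 0.05, 365.25, yrs)
          for yrs in (50, 100, 150)}
```

    A positive shape parameter xi gives a heavy tail, so return levels keep growing with the return period; the study's observation that the estimation method (MLE, L-moments, or Bayesian) shifts the return levels corresponds to its effect on the fitted sigma and xi here.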

  15. Methods to Approach Velocity Data Reduction and Their Effects on Conformation Statistics in Viscoelastic Turbulent Channel Flows

    NASA Astrophysics Data System (ADS)

    Samanta, Gaurab; Beris, Antony; Handler, Robert; Housiadas, Kostas

    2009-03-01

    Karhunen-Loeve (KL) analysis of DNS data of viscoelastic turbulent channel flows helps reveal more information on the time-dependent dynamics of the viscoelastic modification of turbulence [Samanta et al., J. Turbulence (in press), 2008]. A selected set of KL modes can be used for data reduction modeling of these flows. However, it is pertinent that verification be done against established DNS results. For this purpose, we compared velocity and conformation statistics and probability density functions (PDFs) of relevant quantities obtained from DNS and from fields reconstructed using selected KL modes and time-dependent coefficients. While the velocity statistics show good agreement between results from DNS and KL reconstructions even with just hundreds of KL modes, tens of thousands of KL modes are required to adequately capture the trace of the polymer conformation resulting from DNS. New modifications to the KL method have therefore been attempted to account for the differences in conformation statistics. The applicability and impact of these new modified KL methods will be discussed from the perspective of data reduction modeling.

  16. Statistical identification of stimulus-activated network nodes in multi-neuron voltage-sensitive dye optical recordings.

    PubMed

    Fathiazar, Elham; Anemuller, Jorn; Kretzberg, Jutta

    2016-08-01

    Voltage-Sensitive Dye (VSD) imaging is an optical imaging method that allows measuring the graded voltage changes of multiple neurons simultaneously. In neuroscience, this method is used to reveal networks of neurons involved in certain tasks. However, the recorded relative dye fluorescence changes are usually low and signals are superimposed by noise and artifacts. Therefore, establishing a reliable method to identify which cells are activated by specific stimulus conditions is the first step to identify functional networks. In this paper, we present a statistical method to identify stimulus-activated network nodes as cells, whose activities during sensory network stimulation differ significantly from the un-stimulated control condition. This method is demonstrated based on voltage-sensitive dye recordings from up to 100 neurons in a ganglion of the medicinal leech responding to tactile skin stimulation. Without relying on any prior physiological knowledge, the network nodes identified by our statistical analysis were found to match well with published cell types involved in tactile stimulus processing and to be consistent across stimulus conditions and preparations.

  17. The Profile of Creativity and Proposing Statistical Problem Quality Level Reviewed From Cognitive Style

    NASA Astrophysics Data System (ADS)

    Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli

    2018-01-01

    This research aims to reveal the profile of the creativity level and statistical problem-posing ability of students in the 2014 cohort of Mathematics Education at the State University of Makassar, in terms of their cognitive style. The research uses an explorative qualitative method, providing meta-cognitive scaffolding during the research. The research hypothesis is that students with a field independent (FI) cognitive style, when posing statistical problems from the provided information, are already able to propose solvable statistical problems that create new data, and such problems qualify as high-quality statistical problems, while students with a field dependent (FD) cognitive style are commonly still limited to posing solvable statistical problems that do not involve new data, and such problems qualify as medium-quality statistical problems.

  18. The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.

    PubMed

    Tukiendorf, Andrzej; Kaźmierski, Radosław; Michalak, Sławomir

    2013-01-01

    In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals that of the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of the methods revealed strong agreement: direct agreement between the M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of the Marczewski-Steinhaus taxonomic approach as a tool for the detection of novel patterns in data from ischemic stroke patients and the prediction of disease outcome. In terms of identifying fairly frequent types of stroke patients from rough characteristics such as age, National Institutes of Health Stroke Scale (NIHSS) score, and diabetes mellitus (DM) status, four particular types of patients were recognized that cannot be identified by means of routine clinical methods. Following the obtained taxonomic outcomes, a strong correlation between health status at the moment of admission to the emergency department (ED) and the subsequent recovery of patients was established. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.

  19. The impact of a scheduling change on ninth grade high school performance on biology benchmark exams and the California Standards Test

    NASA Astrophysics Data System (ADS)

    Leonardi, Marcelo

    The primary purpose of this study was to examine the impact of a scheduling change from a trimester 4x4 block schedule to a modified hybrid schedule on student achievement in ninth grade biology courses. This study examined the impact of the scheduling change on student achievement through teacher-created benchmark assessments in Genetics, DNA, and Evolution and on the California Standards Test (CST) in Biology. The secondary purpose of this study was to examine ninth grade biology teachers' perceptions of ninth grade biology student achievement. Using a mixed methods research approach, data were collected both quantitatively and qualitatively, aligned to the research questions. Quantitative methods included gathering data from departmental benchmark exams and the California Standards Test in Biology and conducting multiple analysis of covariance and analysis of covariance to determine significant differences. Qualitative methods included journal-entry questions and focus group interviews. The results revealed a statistically significant increase in scores on both the DNA and Evolution benchmark exams following the change in scheduling format; the scheduling change was responsible for 1.5% of the increase in DNA benchmark scores and 2% of the increase in Evolution benchmark scores. The results revealed a statistically significant decrease in scores on the Genetics benchmark exam, with the scheduling change responsible for 1% of the decrease in Genetics benchmark scores. The results also revealed a statistically significant increase in scores on the CST Biology exam, with the scheduling change responsible for 0.7% of the increase in CST Biology scores. Results of the focus group discussions indicated that all teachers preferred the modified hybrid schedule over the trimester schedule and felt that it improved student achievement.

  20. Primary Student-Teachers' Conceptual Understanding of the Greenhouse Effect: A mixed method study

    NASA Astrophysics Data System (ADS)

    Ratinen, Ilkka Johannes

    2013-04-01

    The greenhouse effect is a reasonably complex scientific phenomenon which can be used as a model to examine students' conceptual understanding in science. Primary student-teachers' understanding of global environmental problems, such as climate change and ozone depletion, indicates that they have many misconceptions. The present mixed method study examines Finnish primary student-teachers' understanding of the greenhouse effect based on the results obtained via open-ended and closed-form questionnaires. The open-ended questionnaire considers primary student-teachers' spontaneous ideas about the greenhouse effect depicted by concept maps. The present study also uses statistical analysis to reveal respondents' conceptualization of the greenhouse effect. The concept maps and statistical analysis reveal that the primary student-teachers' factual knowledge and their conceptual understanding of the greenhouse effect are incomplete and even misleading. In the light of the results of the present study, proposals for modifying the instruction of climate change in science, especially in geography, are presented.

  1. Statistical Methods for Detecting Differentially Abundant Features in Clinical Metagenomic Samples

    PubMed Central

    White, James Robert; Nagarajan, Niranjan; Pop, Mihai

    2009-01-01

    Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is the availability of computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely-sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. 
While designed for metagenomic applications, our software can also be applied to digital gene expression studies (e.g. SAGE). A web server implementation of our methods and freely available source code can be found at http://metastats.cbcb.umd.edu/. PMID:19360128
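    The sparse-feature branch of the method rests on Fisher's exact test, which can be written self-containedly from hypergeometric probabilities. This is an illustrative implementation, not the Metastats source; the read counts below are hypothetical.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of all tables with the same margins that are no
    more probable than the observed table."""
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)
    def p_table(x):  # hypergeometric probability of cell (1,1) = x
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-12))

# Hypothetical sparse feature: present in 9 of 20 reads from population 1
# but only 1 of 20 reads from population 2
p = fisher_exact_two_sided(9, 11, 1, 19)
print(round(p, 4))  # → 0.0084
```

    Summing all tables no more probable than the observed one is a common two-sided convention (also used by R's fisher.test); in a Metastats-style analysis, such per-feature p-values would then be corrected via the false discovery rate.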

  2. SU-F-J-217: Accurate Dose Volume Parameters Calculation for Revealing Rectum Dose-Toxicity Effect Using Deformable Registration in Cervical Cancer Brachytherapy: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhen, X; Chen, H; Liao, Y

    Purpose: To study the feasibility of employing deformable registration methods for accurate rectum dose volume parameter calculation and their potential in revealing rectum dose-toxicity differences between complication and non-complication cervical cancer patients treated with brachytherapy. Method and Materials: Data from 60 patients treated with BT, including planning images, treatment plans, and follow-up clinical exams, were retrospectively collected. Among them, 12 patients who complained of hematochezia were further examined with colonoscopy and scored as Grade 1-3 complication (CP). Meanwhile, another 12 non-complication (NCP) patients were selected as a reference group. To seek potential gains in rectum toxicity prediction when fractional anatomical deformations are accounted for, the rectum dose volume parameters D0.1/1/2cc of the selected patients were retrospectively computed by three different approaches: the simple "worst-case scenario" (WS) addition method, an intensity-based deformable image registration (DIR) algorithm (Demons), and a more accurate, recently developed local-topology-preserving non-rigid point matching algorithm (TOP). Statistical significance of the differences between rectum doses of the CP group and the NCP group was tested by a two-tailed t-test, and results were considered statistically significant if p < 0.05. Results: For D0.1cc, no statistical differences were found between the CP and NCP groups with any of the three methods. For D1cc, no dose difference was detected by the WS method; however, statistical differences between the two groups were observed with both Demons and TOP, and were more evident with TOP. For D2cc, the differences between the CP and NCP cases were statistically significant for all three methods but more pronounced with TOP. Conclusion: In this study, we calculated the rectum D0.1/1/2cc by simple WS addition and two DIR methods and sought gains in rectum toxicity prediction. 
    The results favor the claim that accurate dose deformation and summation tend to be more sensitive in unveiling the dose-toxicity relationship. This work is supported in part by a grant from VARIAN MEDICAL SYSTEMS INC, the National Natural Science Foundation of China (nos. 81428019 and 81301940), the Guangdong Natural Science Foundation (2015A030313302), and the 2015 Pearl River S&T Nova Program of Guangzhou (201506010096).

  3. Evaluation of the Kinetic Property of Single-Molecule Junctions by Tunneling Current Measurements.

    PubMed

    Harashima, Takanori; Hasegawa, Yusuke; Kiguchi, Manabu; Nishino, Tomoaki

    2018-01-01

    We investigated the formation and breaking of single-molecule junctions of two kinds of dithiol molecules by time-resolved tunneling current measurements in a metal nanogap. The resulting current trajectory was statistically analyzed to determine the single-molecule conductance and, more importantly, to reveal the kinetic property of the single-molecular junction. These results suggested that combining a measurement of the single-molecule conductance and statistical analysis is a promising method to uncover the kinetic properties of the single-molecule junction.

  4. Statistics attack on `quantum private comparison with a malicious third party' and its improvement

    NASA Astrophysics Data System (ADS)

    Gu, Jun; Ho, Chih-Yung; Hwang, Tzonelih

    2018-02-01

    Recently, Sun et al. (Quantum Inf Process:14:2125-2133, 2015) proposed a quantum private comparison protocol allowing two participants to compare the equality of their secrets via a malicious third party (TP). They designed an interesting trap comparison method to prevent the TP from knowing the final comparison result. However, this study shows that the malicious TP can use the statistics attack to reveal the comparison result. A simple modification is hence proposed to solve this problem.

  5. Non-Earth-centric life detection

    NASA Technical Reports Server (NTRS)

    Conrad, P. G.; Nealson, K. H.

    2000-01-01

    Our hope is that life will, bit by bit, reveal the clues that will allow us to piece together enough evidence to recognize it whenever and however it presents itself. Indisputable evidence is measurable, statistically meaningful and independent of the nature of the life it defines. That the evidence for life be measurable is a fundamental requirement of the scientific method, as is the requirement for statistical significance, and this quantitation is what enables us to differentiate the measurable criteria of candidate biosignatures from a background (host environment).

  6. Reveal Salmonella 2.0 test for detection of Salmonella spp. in foods and environmental samples. Performance Tested Method 960801.

    PubMed

    Hoerner, Rebecca; Feldpausch, Jill; Gray, R Lucas; Curry, Stephanie; Islam, Zahidul; Goldy, Tim; Klein, Frank; Tadese, Theodros; Rice, Jennifer; Mozola, Mark

    2011-01-01

    Reveal Salmonella 2.0 is an improved version of the original Reveal Salmonella lateral flow immunoassay and is applicable to the detection of Salmonella enterica serogroups A-E in a variety of food and environmental samples. A Performance Tested Method validation study was conducted to compare performance of the Reveal 2.0 method with that of the U.S. Department of Agriculture-Food Safety and Inspection Service or U.S. Food and Drug Administration/Bacteriological Analytical Manual reference culture methods for detection of Salmonella spp. in chicken carcass rinse, raw ground turkey, raw ground beef, hot dogs, raw shrimp, a ready-to-eat meal product, dry pet food, ice cream, spinach, cantaloupe, peanut butter, stainless steel surface, and sprout irrigation water. In a total of 17 trials performed internally and four trials performed in an independent laboratory, there were no statistically significant differences in performance of the Reveal 2.0 and reference culture procedures as determined by Chi-square analysis, with the exception of one trial with stainless steel surface and one trial with sprout irrigation water where there were significantly more positive results by the Reveal 2.0 method. Considering all data generated in testing food samples using enrichment procedures specifically designed for the Reveal method, overall sensitivity of the Reveal method relative to the reference culture methods was 99%. In testing environmental samples, sensitivity of the Reveal method relative to the reference culture method was 164%. For select foods, use of the Reveal test in conjunction with reference method enrichment resulted in overall sensitivity of 92%. There were no unconfirmed positive results on uninoculated control samples in any trials for specificity of 100%. In inclusivity testing, 102 different Salmonella serovars belonging to serogroups A-E were tested and 99 were consistently positive in the Reveal test. 
In exclusivity testing of 33 strains of non-salmonellae representing 14 genera, 32 were negative when tested with Reveal following nonselective enrichment, and the remaining strain was found to be substantially inhibited by the enrichment media used with the Reveal method. Results of ruggedness testing showed that the Reveal test produces accurate results even with substantial deviation in sample volume or device development time.

  7. Organizational downsizing and age discrimination litigation: the influence of personnel practices and statistical evidence on litigation outcomes.

    PubMed

    Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H

    2003-02-01

    The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were used to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in Age Discrimination in Employment Act (ADEA) disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.

  8. Nakagami-based total variation method for speckle reduction in thyroid ultrasound images.

    PubMed

    Koundal, Deepika; Gupta, Savita; Singh, Sukhwinder

    2016-02-01

    A good statistical model is necessary for the reduction of speckle noise. The Nakagami model is more general than the Rayleigh distribution for statistical modeling of speckle in ultrasound images. In this article, a Nakagami-based noise removal method is presented to enhance thyroid ultrasound images and to improve clinical diagnosis. The statistics of the log-compressed image are derived from the Nakagami distribution following a maximum a posteriori estimation framework. The minimization problem is solved using an augmented Lagrangian formulation together with Chambolle's projection method. The proposed method is evaluated on both artificial speckle-simulated and real ultrasound images. The experimental findings reveal the superiority of the proposed method both quantitatively and qualitatively in comparison with other speckle reduction methods reported in the literature. The proposed method yields an average signal-to-noise ratio gain of more than 2.16 dB over the non-convex regularizer-based speckle noise removal method, 3.83 dB over the Aubert-Aujol model, 1.71 dB over the Shi-Osher model and 3.21 dB over the Rudin-Lions-Osher model on speckle-simulated synthetic images. Furthermore, visual evaluation of the despeckled images shows that the proposed method suppresses speckle noise well while preserving textures and fine details. © IMechE 2015.
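For reference, the Nakagami amplitude distribution underlying the model above is commonly written as follows (a standard form from the ultrasound speckle literature; the shape parameter m and spread parameter Ω are generic symbols, not values taken from this record):

```latex
f(x \mid m, \Omega) \;=\; \frac{2\,m^{m}}{\Gamma(m)\,\Omega^{m}}\, x^{2m-1}
\exp\!\left(-\frac{m}{\Omega}\,x^{2}\right), \qquad x \ge 0,\; m \ge \tfrac{1}{2},
```

which reduces to the Rayleigh distribution at m = 1, consistent with the abstract's remark that the Nakagami model generalizes the Rayleigh.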

  9. Tooth-size discrepancy: A comparison between manual and digital methods

    PubMed Central

    Correia, Gabriele Dória Cabral; Habib, Fernando Antonio Lima; Vogel, Carlos Jorge

    2014-01-01

    Introduction: Technological advances in Dentistry have emerged primarily in the area of diagnostic tools. One example is the 3D scanner, which can transform plaster models into three-dimensional digital models. Objective: This study aimed to assess the reliability of tooth size-arch length discrepancy analysis measurements performed on three-dimensional digital models, and to compare these measurements with those obtained from plaster models. Material and Methods: Plaster models of lower dental arches and their corresponding three-dimensional digital models acquired with a 3Shape R700T scanner were used, all with lower permanent dentition. Four different tooth size-arch length discrepancy calculations were performed on each model: two by manual methods using calipers and brass wire, and two by digital methods using linear measurements and parabolas. Results: Data were statistically assessed using the Friedman test; no statistically significant differences were found among the methods (P > 0.05), with only the linear digital method showing a slight, statistically non-significant deviation. Conclusions: Based on the results, it is reasonable to assert that any of these resources used by orthodontists to clinically assess tooth size-arch length discrepancy can be considered reliable. PMID:25279529
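The tooth size-arch length discrepancy being compared above reduces to simple arithmetic: required space (the sum of mesiodistal tooth widths) subtracted from the measured available arch length. A minimal sketch, with invented measurements that are not from the study:

```python
# Hedged sketch of a tooth size-arch length discrepancy (ALD) calculation.
# All measurements (mm) are hypothetical illustration values.

def arch_length_discrepancy(available_mm, tooth_widths_mm):
    """Positive result = excess space (spacing); negative = crowding."""
    return available_mm - sum(tooth_widths_mm)

# Hypothetical mesiodistal widths for ten lower teeth, first molar to first molar.
widths = [5.2, 6.1, 7.0, 7.2, 6.8, 6.8, 7.2, 7.0, 6.1, 5.2]
print(round(arch_length_discrepancy(62.0, widths), 1))  # -2.6 -> 2.6 mm crowding
```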

  10. Evaluation of Integrated Planning Systems in California Community Colleges

    ERIC Educational Resources Information Center

    Buckley, Jerry L.; Piland, William E.

    2012-01-01

    California community colleges are experiencing unprecedented levels of sanctions from their accrediting agency. A survey of planners in these colleges reveals a wide gap between current practice and perceived importance of integrated planning practices, as well as misalignment in budgeting methods. Statistically significant gaps were identified…

  11. Investigating Student Understanding of Histograms

    ERIC Educational Resources Information Center

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  12. Using genetic data to strengthen causal inference in observational research.

    PubMed

    Pingault, Jean-Baptiste; O'Reilly, Paul F; Schoeler, Tabea; Ploubidis, George B; Rijsdijk, Frühling; Dudbridge, Frank

    2018-06-05

    Causal inference is essential across the biomedical, behavioural and social sciences. By progressing from confounded statistical associations to evidence of causal relationships, causal inference can reveal complex pathways underlying traits and diseases and help to prioritize targets for intervention. Recent progress in genetic epidemiology - including statistical innovation, massive genotyped data sets and novel computational tools for deep data mining - has fostered the intense development of methods exploiting genetic data and relatedness to strengthen causal inference in observational research. In this Review, we describe how such genetically informed methods differ in their rationale, applicability and inherent limitations, and outline how they should be integrated in the future to offer a rich causal inference toolbox.

  13. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

    We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance-voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in the literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height, or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.

  14. Seasonality of climate change and oscillations in the Northeast Asia and Northwest Pacific

    NASA Astrophysics Data System (ADS)

    Ponomarev, V.; Salomatin, A.; Kaplunenko, D.; Krokhin, V.

    2003-04-01

    The main goals of this study are to estimate and compare the seasonality of centennial/semi-centennial climatic tendencies and dominant oscillations in surface air temperature and precipitation over continental and marginal areas of Northeast Asia, as well as in the Northwest Pacific SST. We use monthly mean data for the 20th century from the NOAA Global History Climatic Network, the JMA data base and the WMU/COADS World Atlas of Surface Marine Data. Details of climate change/oscillations associated with cooling or warming in different areas and periods of the year are revealed. Wavelet analyses and two methods of linear trend estimation are applied. The first is the least-squares (LS) method with Fisher's test for statistical significance. The second is a nonparametric robust (NR) method based on Theil's rank regression with Kendall's test for statistical significance. The NR method is appropriate for time series with non-normal distributions, which are typical of precipitation records. Application of the NR method increases the statistical significance of both positive and negative linear trends in all cases of non-normal distributions with negative/positive skewness and low/high kurtosis. Using this method, we have determined spatial patterns of statistically significant climatic trends in surface air temperature and precipitation in Northeast Asia, and in the Northwest Pacific SST. The most substantial centennial warming in the vast continental area of the mid-latitude band is found mainly for December-March. The semi-centennial/centennial cooling occurs in South Siberia and the subarctic mid-continental area in June-September. Opposite tendencies were also revealed in precipitation and SST. A positive semi-centennial tendency in the SST in the second half of the 20th century predominates in the Kuroshio region and in the northwestern area of the subarctic gyre in winter. 
A negative tendency in the SST dominates in the southwestern subarctic gyre and the offshore area of the subtropic gyre in summer. Comparison of air temperature, precipitation and SST trends and oscillations in different seasons over marginal and continental land areas, as well as in the subarctic and subtropic zones, indicates general features of the Northeast Asian Monsoon change/oscillation in the 20th century and its second half. Similar features of seasonality in centennial and semi-centennial trends and dominant oscillations are manifested. Climate change and oscillation in the Northwest Pacific marginal seas revealed for the 20th century are explained.
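The nonparametric robust trend approach described above can be sketched as follows: Theil's slope estimator (the median of all pairwise slopes) together with the Mann-Kendall S statistic that underlies Kendall-type significance tests. The time series below is invented for illustration.

```python
# Hedged sketch: Theil slope + Mann-Kendall S for a short synthetic series.
from statistics import median

def theil_slope(t, y):
    """Median of all pairwise slopes (y[j]-y[i])/(t[j]-t[i]), i < j."""
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i in range(len(t)) for j in range(i + 1, len(t))]
    return median(slopes)

def mann_kendall_s(y):
    """Mann-Kendall S: sum of signs of all forward differences."""
    sign = lambda v: (v > 0) - (v < 0)
    return sum(sign(y[j] - y[i])
               for i in range(len(y)) for j in range(i + 1, len(y)))

t = list(range(10))
y = [0.1, 0.3, 0.2, 0.5, 0.4, 0.7, 0.6, 0.9, 0.8, 1.1]  # noisy upward trend
print(round(theil_slope(t, y), 3), mann_kendall_s(y))  # positive slope, S = 37
```

Because both estimators depend only on ranks and medians, they are robust to the skewed, non-normal distributions the abstract notes for precipitation series.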

  15. Seasonality of climate change and oscillations in the Northeast Asia and Northwest Pacific

    NASA Astrophysics Data System (ADS)

    Ponomarev, V.; Salomatin, A.; Kaplunenko, D.; Krokhin, V.

    2003-04-01

    The main goals of this study are to estimate and compare the centennial/semi-centennial climatic tendencies and oscillations in surface air temperature and precipitation over continental and marginal areas of the Northeast Asian, as well as in the Northwest Pacific SST for all months of a year. We use monthly mean data for the 20th century from the NOAA Global History Climatic Network, JMA data base and WMU/COADS World Atlas of Surface Marine Data. Details of climate change/oscillations associated with cooling or warming in different areas and periods of a year are revealed. Wavelet analyses and two methods of the linear trend estimation are applied. First one is least-squares (LS) method with Fisher’s test for statistical significance level. Second one is nonparametric robust (NR) method, based on Theil's rank regression and Kendall's test for statistical significance level. The NR method should be applied to time series with abnormal distribution function typical for precipitation time series. Application of the NR method result in increase the statistical significance of both positive and negative linear trends in all cases of abnormal distribution with negative/positive skewness and low/high kurtosis. Using this method, we have determined spatial patterns of statistically significant climatic trends in surface air temperature, precipitation in the Northeast Asia, and in the Northwest Pacific SST. The most substantial centennial warming in the vast continental area of the mid-latitude band is found mainly for December March. The semi-centennial/ centennial cooling occurs in South Siberia and the subarctic mid-continental area in June September. Opposite tendencies were also revealed in precipitation and SST. Positive semi-centennial tendency in the SST in the second half of the 20th century predominates in the Kuroshio region and in the northwestern area of the subarctic gyre in winter. 
Negative tendency in the SST dominates in the southwestern subarctic gyre and the offshore area of the subtropic gyre in summer. Comparison of air temperature, precipitation, SST trends and oscillations in different seasons over land marginal and continental areas, as well as in the subarctic and subtropic zones indicates general features of the Northeast Asian Monsoon change/oscillation in 20th century and its second half. Similar features of seasonality in centennial, semi-centennial trends and dominated oscillations are manifested. Climate change and oscillation in the Northwest Pacific marginal seas revealed for the 20th century are explained.

  16. Introducing Students to Plant Geography: Polar Ordination Applied to Hanging Gardens.

    ERIC Educational Resources Information Center

    Malanson, George P.; And Others

    1993-01-01

    Reports on a research study in which college students used a statistical ordination method to reveal relationships among plant community structures and physical, disturbance, and spatial variables. Concludes that polar ordination helps students understand the methodology of plant geography and encourages further student research. (CFR)

  17. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
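The simple binomial methodology mentioned above can be illustrated with the classic "29 of 29" demonstration. This is a generic sketch of the binomial logic, not NASA's specific procedure.

```python
# If the true per-trial POD were only 0.90, observing 29 hits in 29
# independent trials would have probability below 5%; a 29/29 result
# therefore demonstrates POD > 0.90 at the 95% confidence level (one-sided).

def prob_all_hits(pod, n):
    """Probability of n hits in n independent trials with per-trial POD."""
    return pod ** n

p = prob_all_hits(0.90, 29)
print(round(p, 4))  # 0.0471
```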

  18. Finding Statistically Significant Communities in Networks

    PubMed Central

    Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo

    2011-01-01

    Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a pressing need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method performs comparably to the best existing algorithms on artificial benchmark graphs. Several applications on real networks are shown as well. OSLOM is implemented as freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480

  19. Robust Statistical Detection of Power-Law Cross-Correlation.

    PubMed

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-06-02

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  20. Robust Statistical Detection of Power-Law Cross-Correlation

    PubMed Central

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  1. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate for objectively presenting the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the control mechanisms and influential parameters of heat transfer in nanofluids.

  2. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate for objectively presenting the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the control mechanisms and influential parameters of heat transfer in nanofluids.

  3. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate for objectively presenting the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  4. Evaluation on the use of cerium in the NBL Titrimetric Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.

    An alternative to potassium dichromate as titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but it was physically small (0.015%) and was only detected because of the within-day precision of the method. The mean standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effects of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance were determined; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.

  5. History, rare, and multiple events of mechanical unfolding of repeat proteins

    NASA Astrophysics Data System (ADS)

    Sumbul, Fidan; Marchesi, Arin; Rico, Felix

    2018-03-01

    Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest carries important limitations for studying unfolding history and may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty about the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable complex dockerin/cohesin III, revealing its advantages and limitations for assessing the unfolding history and for investigating rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust and reproducible, and provides larger statistics, than conventional unspecific methods. We show that the method is optimal for revealing the history of unfolding from the very first domain and for detecting rare events, while being more limited for assessing intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, which is difficult with unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.

  6. Visualizations of Travel Time Performance Based on Vehicle Reidentification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Stanley Ernest; Sharifi, Elham; Day, Christopher M.

    This paper provides a visual reference of the breadth of arterial performance phenomena based on travel time measures obtained from reidentification technology that has proliferated in the past 5 years. These performance measures are presented through overlay charts and through statistical distributions visualized as cumulative frequency diagrams (CFDs). With overlays of vehicle travel times from multiple days, dominant traffic patterns over a 24-h period are reinforced and reveal the traffic behavior induced primarily by the operation of traffic control at signalized intersections. A cumulative distribution function provides a method for comparing traffic patterns from various time frames or locations in a compact visual format that gives intuitive feedback on arterial performance. The CFD may be accumulated hourly, by peak periods, or by time periods specific to the signal timing plans in effect. Combined, overlay charts and CFDs provide visual tools with which to assess the quality and consistency of traffic movement for various periods throughout the day efficiently, without sacrificing detail, whose loss is a typical byproduct of numeric-based performance measures. These methods are particularly effective for comparing before-and-after median travel times, as well as changes in interquartile range, to assess travel time reliability.
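The CFD comparison described above can be sketched as an empirical cumulative distribution of travel times, from which the median and interquartile range (IQR) are read for before/after comparison. The travel times below are invented for illustration.

```python
# Hedged sketch: empirical CDF (a CFD as point pairs) plus nearest-rank
# percentiles for median/IQR comparison. Travel times (seconds) are made up.
import math

def empirical_cdf(samples):
    """Return (value, cumulative frequency) pairs of the sorted sample."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

def percentile(samples, q):
    """Nearest-rank percentile: smallest value with cumulative freq >= q%."""
    xs = sorted(samples)
    k = math.ceil(q / 100 * len(xs))
    return xs[k - 1]

before = [60, 62, 65, 70, 90, 95, 100, 120]  # e.g. before signal retiming
after = [55, 58, 60, 62, 64, 66, 70, 75]     # e.g. after signal retiming
for name, tt in (("before", before), ("after", after)):
    med = percentile(tt, 50)
    iqr = percentile(tt, 75) - percentile(tt, 25)
    print(name, med, iqr)  # median falls and IQR shrinks: more reliable travel
```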

  7. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit the data on personal income for the United States remarkably well, and the analysis of inequality in terms of its parameters proves to be very powerful.
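For reference, the κ-generalized model the authors build on is usually written as follows in the κ-statistics literature (a sketch; α, β and κ are the generic shape and scale parameters of the model, not values quoted in this record):

```latex
P(X > x) \;=\; \exp_\kappa\!\left(-\beta x^{\alpha}\right),
\qquad
\exp_\kappa(t) \;=\; \left(\sqrt{1+\kappa^{2}t^{2}} + \kappa t\right)^{1/\kappa},
\qquad 0 \le \kappa < 1 .
```

As κ → 0, exp_κ reduces to the ordinary exponential, giving a Weibull-like bulk at low-middle incomes; for large x the survival function behaves as a power law, P(X > x) ∝ x^(-α/κ), matching the Pareto regime mentioned in the abstract.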

  8. Modelling gene expression profiles related to prostate tumor progression using binary states

    PubMed Central

    2013-01-01

    Background: Cancer is a complex disease commonly characterized by the disrupted activity of several cancer-related genes such as oncogenes and tumor-suppressor genes. Previous studies suggest that the process of tumor progression to malignancy is dynamic and can be traced by changes in gene expression. Despite the enormous efforts made for differential expression detection and biomarker discovery, few methods have been designed to relate gene expression levels to tumor stage during malignancy progression. Such models could help us understand the dynamics and simplify or reveal the complexity of tumor progression. Methods: We modeled an on-off state of gene activation per sample and then per stage to select gene expression profiles associated with tumor progression. The selection is guided by the statistical significance of profiles, assessed against randomly permuted datasets. Results: We show that our method identifies expected profiles corresponding to oncogenes and tumor suppressor genes in a prostate tumor progression dataset. Comparisons with other methods support our findings and indicate that a considerable proportion of significant profiles is found neither by statistical tests commonly used to detect differential expression between tumor stages nor by other tailored methods. Ontology and pathway analysis concurred with these findings. Conclusions: Results suggest that our methodology may be a valuable tool to study tumor malignancy progression, which might reveal novel cancer therapies. PMID:23721350
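The binary-state idea described above can be sketched as follows: expression is binarized per sample (on/off), collapsed to a per-stage profile, and the profile is scored against stage-label permutations. The data, threshold, and majority rule are invented for illustration; this is not the authors' exact pipeline.

```python
# Hedged sketch: on-off stage profiles with a permutation-based significance.
import random

def stage_profile(values, stages, threshold):
    """Binarize each sample, then call a stage 'on' if most samples are on."""
    by_stage = {}
    for v, s in zip(values, stages):
        by_stage.setdefault(s, []).append(int(v > threshold))
    return tuple(sum(bits) * 2 >= len(bits) for _, bits in sorted(by_stage.items()))

def permutation_p(values, stages, threshold, target, n_perm=2000, seed=0):
    """Fraction of stage-label permutations reproducing the target profile."""
    rng = random.Random(seed)
    labels = list(stages)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labels)
        hits += stage_profile(values, labels, threshold) == target
    return hits / n_perm

# Hypothetical gene: low in stages 0-1, high in stages 2-3.
values = [1.0, 1.2, 0.9, 1.1, 3.0, 3.2, 2.9, 3.1]
stages = [0, 0, 1, 1, 2, 2, 3, 3]
profile = stage_profile(values, stages, threshold=2.0)
print(profile)  # (False, False, True, True): switched on at stage 2
p = permutation_p(values, stages, 2.0, profile)
print(p)  # small: such a clean on-off pattern is rare under shuffled labels
```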

  9. Revealing how network structure affects accuracy of link prediction

    NASA Astrophysics Data System (ADS)

    Yang, Jin-Xuan; Zhang, Xiao-Dong

    2017-08-01

    Link prediction plays an important role in network reconstruction and network evolution. How network structure affects the accuracy of link prediction is an interesting open problem. In this paper we use common neighbors and the Gini coefficient to reveal the relation between network structure and prediction accuracy, which can provide a good reference for choosing a link prediction algorithm suited to the network structure. Moreover, statistical analysis reveals correlations between the common-neighbors index, the Gini coefficient index and other indices that describe network structure, such as Laplacian eigenvalues, clustering coefficient, degree heterogeneity, and assortativity. Furthermore, a new method to predict missing links is proposed. The experimental results show that the proposed algorithm yields better prediction accuracy and robustness to the network structure than currently used methods across a variety of real-world networks.
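    The two structural quantities the abstract pairs can each be computed in a few lines. A hedged sketch (toy graph and function names are illustrative, not the authors' code; the Gini coefficient is applied here to node degrees as one measure of structural heterogeneity):

```python
def common_neighbors(adj, u, v):
    """Common-neighbors link-prediction score: |N(u) & N(v)|."""
    return len(adj[u] & adj[v])

def gini(values):
    """Gini coefficient of a sequence (e.g. node degrees):
    0 = perfectly homogeneous, values near 1 = highly heterogeneous."""
    n = len(values)
    mean = sum(values) / n
    diff_sum = sum(abs(a - b) for a in values for b in values)
    return diff_sum / (2 * n * n * mean)

# Toy undirected graph with edges A-B, A-C, B-C, B-D:
adj = {"A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B"}, "D": {"B"}}
print(common_neighbors(adj, "A", "D"))                 # 1 (via B)
print(round(gini([len(v) for v in adj.values()]), 3))  # 0.188
```

    Ranking all non-adjacent pairs by such a score, and relating accuracy to the degree-heterogeneity measure, is the kind of analysis the paper performs at scale.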

  10. Using Performance Methods to Enhance Students' Reading Fluency

    ERIC Educational Resources Information Center

    Young, Chase; Valadez, Corinne; Gandara, Cori

    2016-01-01

    The quasi-experimental study examined the effects of pairing Rock and Read with Readers Theater and only Rock and Read on second grade students' reading fluency scores. The 51 subjects were pre- and post-tested on five different reading fluency measures. A series of 3 × 2 repeated measures ANOVAs revealed statistically significant interaction…

  11. Impact of E-Learning and Digitalization in Primary and Secondary Schools

    ERIC Educational Resources Information Center

    Tunmibi, Sunday; Aregbesola, Ayooluwa; Adejobi, Pascal; Ibrahim, Olaniyi

    2015-01-01

    This study examines the impact of e-learning and digitalization in primary and secondary schools, using Greensprings School in Lagos State, Nigeria as a case study. A questionnaire was used as the data collection instrument, and descriptive statistical methods were adopted for analysis. Responses from students and teachers reveal that application…

  12. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to identifying important literature in endocrinology reveal that the association between the rankings of cited items produced by the two methods is not statistically significant, and that citation or historical analysis alone will not yield the same set of literature. Forty-two sources are appended. (EJS)

  13. Thou Shalt Not Bear False Witness against Null Hypothesis Significance Testing

    ERIC Educational Resources Information Center

    García-Pérez, Miguel A.

    2017-01-01

    Null hypothesis significance testing (NHST) has been the subject of debate for decades and alternative approaches to data analysis have been proposed. This article addresses this debate from the perspective of scientific inquiry and inference. Inference is an inverse problem and application of statistical methods cannot reveal whether effects…

  14. QSAR Study of p56lck Protein Tyrosine Kinase Inhibitory Activity of Flavonoid Derivatives Using MLR and GA-PLS

    PubMed Central

    Fassihi, Afshin; Sabet, Razieh

    2008-01-01

    Quantitative relationships between molecular structure and p56lck protein tyrosine kinase inhibitory activity of 50 flavonoid derivatives are discovered by MLR and GA-PLS methods. Different QSAR models revealed that substituent electronic descriptors (SED) parameters have significant impact on protein tyrosine kinase inhibitory activity of the compounds. Between the two statistical methods employed, GA-PLS gave superior results. The resultant GA-PLS model had a high statistical quality (R2 = 0.74 and Q2 = 0.61) for predicting the activity of the inhibitors. The models proposed in the present work are more useful in describing QSAR of flavonoid derivatives as p56lck protein tyrosine kinase inhibitors than those provided previously. PMID:19325836
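    The R²/Q² pair quoted for the GA-PLS model contrasts ordinary fit with cross-validated predictive ability. A minimal sketch of that relationship (toy data and univariate least squares standing in for PLS; the leave-one-out residual uses the standard hat-value shortcut):

```python
def r2_q2(x, y):
    """Fit y = a + b*x by least squares; return (R2, leave-one-out Q2).
    LOO residuals use the identity e_loo_i = e_i / (1 - h_ii)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    tss = sum((yi - my) ** 2 for yi in y)
    sse = sum(e * e for e in resid)
    press = sum((e / (1 - (1 / n + (xi - mx) ** 2 / sxx))) ** 2
                for e, xi in zip(resid, x))
    return 1 - sse / tss, 1 - press / tss

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.1, 1.9, 3.2, 3.8, 5.1, 5.9, 7.2, 7.8]
r2, q2 = r2_q2(x, y)
print(round(r2, 3), round(q2, 3))  # Q2 <= R2 always holds for least squares
```

    Because PRESS is never smaller than the ordinary residual sum of squares, Q² ≤ R²; a large gap between the two flags an overfitted model.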

  15. Analysis of data collected from right and left limbs: Accounting for dependence and improving statistical efficiency in musculoskeletal research.

    PubMed

    Stewart, Sarah; Pearson, Janet; Rome, Keith; Dalbeth, Nicola; Vandal, Alain C

    2018-01-01

    Statistical techniques currently used in musculoskeletal research often inefficiently account for paired-limb measurements or the relationship between measurements taken from multiple regions within limbs. This study compared three commonly used analysis methods with a mixed-models approach that appropriately accounted for the association between limbs, regions, and trials and that utilised all information available from repeated trials. Four analysis methods were applied to an existing data set containing plantar pressure data, which was collected for seven masked regions on right and left feet, over three trials, across three participant groups. Methods 1-3 averaged data over trials and analysed right foot data (Method 1), data from a randomly selected foot (Method 2), and averaged right and left foot data (Method 3). Method 4 used all available data in a mixed-effects regression that accounted for repeated measures taken for each foot, foot region and trial. Confidence interval widths for the mean differences between groups for each foot region were used as a criterion for comparison of statistical efficiency. Mean differences in pressure between groups were similar across methods for each foot region, while the confidence interval widths were consistently smaller for Method 4. Method 4 also revealed significant between-group differences that were not detected by Methods 1-3. A mixed-effects linear model approach generates improved efficiency and power by producing more precise estimates compared to alternative approaches that discard information in the process of accounting for paired-limb measurements. This approach is recommended for generating more clinically sound and statistically efficient research outputs. Copyright © 2017 Elsevier B.V. All rights reserved.
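    The efficiency argument rests on the positive correlation between limbs: an estimate that pools both feet has lower variance than one that discards a limb. A hedged simulation sketch (synthetic normal data with a shared subject effect, not the study's plantar pressure measurements):

```python
import random
import statistics

random.seed(42)

# Paired-limb measurements share a subject-level effect z, so right and
# left are positively correlated (rho = 0.5 with these variances).
right, left = [], []
for _ in range(20000):
    z = random.gauss(0, 1)                 # subject-level effect
    right.append(z + random.gauss(0, 1))   # right-foot measurement
    left.append(z + random.gauss(0, 1))    # left-foot measurement

var_single = statistics.variance(right)                          # ~2.0
var_pooled = statistics.variance([(r + l) / 2
                                  for r, l in zip(right, left)])  # ~1.5
print(var_single, var_pooled)  # pooling both limbs shrinks the variance
```

    This corresponds to Method 3 versus Method 1; the mixed-effects regression (Method 4) goes further by also retaining the trial-level replicates instead of averaging them away.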

  16. Analysis and interpretation of cost data in randomised controlled trials: review of published studies

    PubMed Central

    Barber, Julie A; Thompson, Simon G

    1998-01-01

    Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. 
    Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854

  17. Experimental and environmental factors affect spurious detection of ecological thresholds

    USGS Publications Warehouse

    Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.

    2012-01-01

    Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.
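    The data-partitioning idea behind nonparametric change point analysis can be sketched in a few lines: scan every candidate partition point and keep the one minimizing the pooled within-segment sum of squares (illustrative code, not the authors' implementation; real use adds a permutation test so that, per the paper's warning, noise-only data is not declared a threshold):

```python
def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def change_point(ys):
    """Index k that best splits ys into two homogeneous segments."""
    n = len(ys)
    return min(range(1, n), key=lambda k: sse(ys[:k]) + sse(ys[k:]))

# A clean step change at index 5 is recovered exactly:
series = [1.0] * 5 + [5.0] * 5
print(change_point(series))  # 5
```

    The paper's point is that on purely linear (threshold-free) responses this machinery still returns *some* best split, so false-detection control and the sample-environment distribution matter.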

  18. Robust hierarchical state-space models reveal diel variation in travel rates of migrating leatherback turtles.

    PubMed

    Jonsen, Ian D; Myers, Ransom A; James, Michael C

    2006-09-01

    1. Biological and statistical complexity are features common to most ecological data that hinder our ability to extract meaningful patterns using conventional tools. Recent work on implementing modern statistical methods for analysis of such ecological data has focused primarily on population dynamics but other types of data, such as animal movement pathways obtained from satellite telemetry, can also benefit from the application of modern statistical tools. 2. We develop a robust hierarchical state-space approach for analysis of multiple satellite telemetry pathways obtained via the Argos system. State-space models are time-series methods that allow unobserved states and biological parameters to be estimated from data observed with error. We show that the approach can reveal important patterns in complex, noisy data where conventional methods cannot. 3. Using the largest Atlantic satellite telemetry data set for critically endangered leatherback turtles, we show that the diel pattern in travel rates of these turtles changes over different phases of their migratory cycle. While foraging in northern waters the turtles show similar travel rates during day and night, but on their southward migration to tropical waters travel rates are markedly faster during the day. These patterns are generally consistent with diving data, and may be related to changes in foraging behaviour. Interestingly, individuals that migrate southward to breed generally show higher daytime travel rates than individuals that migrate southward in a non-breeding year. 4. Our approach is extremely flexible and can be applied to many ecological analyses that use complex, sequential data.

  19. Power-law behaviour evaluation from foreign exchange market data using a wavelet transform method

    NASA Astrophysics Data System (ADS)

    Wei, H. L.; Billings, S. A.

    2009-09-01

    Numerous studies in the literature have shown that the dynamics of many time series including observations in foreign exchange markets exhibit scaling behaviours. A simple new statistical approach, derived from the concept of the continuous wavelet transform correlation function (WTCF), is proposed for the evaluation of power-law properties from observed data. The new method reveals that foreign exchange rates obey power-laws and thus belong to the class of self-similarity processes.

  20. The critical role of NIR spectroscopy and statistical process control (SPC) strategy towards captopril tablets (25 mg) manufacturing process understanding: a case study.

    PubMed

    Curtivo, Cátia Panizzon Dal; Funghi, Nathália Bitencourt; Tavares, Guilherme Diniz; Barbosa, Sávio Fujita; Löbenberg, Raimar; Bou-Chacra, Nádia Araci

    2015-05-01

    In this work, a near-infrared spectroscopy (NIRS) method was used to evaluate the uniformity of dosage units of three commercial batches of captopril 25 mg tablets. The performance of the calibration method was assessed by determining the Q value (0.9986), the standard error of estimation (C-set SEE = 1.956), the standard error of prediction (V-set SEP = 2.076), and the consistency (106.1%). These results indicated the adequacy of the selected model. Method validation revealed agreement between the reference high-pressure liquid chromatography (HPLC) method and the NIRS method. Process evaluation using the NIRS method showed that the variability was due to common causes and that the process delivered predictable results consistently. Cp and Cpk values were 2.05 and 1.80, respectively. These results revealed a process that was not centered on the target average (100% w/w) within the specified range (85-115%). The predicted probability of failure was 21 per 100 million captopril tablets. NIRS, in combination with multivariate calibration by partial least squares (PLS) regression, allowed the development of a methodology for evaluating the uniformity of dosage units of captopril 25 mg tablets. The statistical process control strategy associated with the NIRS method as PAT played a critical role in understanding the sources and degree of variation and their impact on the process. This approach led towards better process understanding and provided a sound scientific basis for continuous improvement.
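    The capability indices reported (Cp = 2.05, Cpk = 1.80, with Cpk < Cp indicating the off-center process) follow the standard definitions. A sketch with illustrative numbers (the mean and standard deviation below are made up; only the 85-115% specification range comes from the abstract):

```python
def capability(mean, sd, lsl, usl):
    """Standard process-capability indices for a normally distributed output."""
    cp = (usl - lsl) / (6 * sd)                    # spec width vs. process spread
    cpk = min(usl - mean, mean - lsl) / (3 * sd)   # penalizes an off-center mean
    return cp, cpk

# Spec limits from the abstract (85-115% of label claim); mean/sd illustrative:
cp, cpk = capability(mean=102.0, sd=2.5, lsl=85.0, usl=115.0)
print(cp, cpk)  # cpk < cp whenever the process mean is off-center
```

    Cp measures only spread against the specification width, while Cpk also accounts for centering, which is why the two values diverge for a non-centered process like the one described.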

  1. Multifactor-Dimensionality Reduction Reveals High-Order Interactions among Estrogen-Metabolism Genes in Sporadic Breast Cancer

    PubMed Central

    Ritchie, Marylyn D.; Hahn, Lance W.; Roodi, Nady; Bailey, L. Renee; Dupont, William D.; Parl, Fritz F.; Moore, Jason H.

    2001-01-01

    One of the greatest challenges facing human geneticists is the identification and characterization of susceptibility genes for common complex multifactorial human diseases. This challenge is partly due to the limitations of parametric-statistical methods for detection of gene effects that are dependent solely or partially on interactions with other genes and with environmental exposures. We introduce multifactor-dimensionality reduction (MDR) as a method for reducing the dimensionality of multilocus information, to improve the identification of polymorphism combinations associated with disease risk. The MDR method is nonparametric (i.e., no hypothesis about the value of a statistical parameter is made), is model-free (i.e., it assumes no particular inheritance model), and is directly applicable to case-control and discordant-sib-pair studies. Using simulated case-control data, we demonstrate that MDR has reasonable power to identify interactions among two or more loci in relatively small samples. When it was applied to a sporadic breast cancer case-control data set, in the absence of any statistically significant independent main effects, MDR identified a statistically significant high-order interaction among four polymorphisms from three different estrogen-metabolism genes. To our knowledge, this is the first report of a four-locus interaction associated with a common complex multifactorial disease. PMID:11404819
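    The core MDR step, collapsing multilocus genotype cells into high-risk and low-risk groups by comparing each cell's case:control ratio to the overall ratio, can be sketched as follows (toy counts chosen to mimic a pure two-locus interaction with no marginal main effects; not the study's breast cancer data):

```python
def mdr_pool(cells, total_cases, total_controls):
    """Label a genotype cell high-risk if its case:control ratio meets or
    exceeds the overall ratio (the standard MDR pooling threshold)."""
    threshold = total_cases / total_controls
    return {g for g, (cases, controls) in cells.items()
            if controls == 0 or cases / controls >= threshold}

# Two-locus XOR-style interaction: each locus alone shows a 50:50 split,
# yet two joint genotype cells are clearly enriched for cases.
cells = {(0, 0): (10, 40), (0, 1): (40, 10),
         (1, 0): (40, 10), (1, 1): (10, 40)}
print(mdr_pool(cells, total_cases=100, total_controls=100))
# high-risk cells: (0, 1) and (1, 0) -- visible only jointly
```

    Reducing the multilocus table to this single high/low-risk dimension is what lets MDR detect interactions, as in the abstract, even when no single polymorphism has a significant main effect.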

  2. Use of dichotomous choice nonmarket methods to value the whooping crane resource

    Treesearch

    J. Michael Bowker; John R. Stoll

    1985-01-01

    A dichotomous choice form of contingent valuation is applied to quantify individuals' economic surplus associated with preservation of the whooping crane resource. Specific issues and limitations of the empirical approach are discussed. The results of this case study reveal that models with similar statistical fits can lead to very disparate measures of economic...

  3. How to Use Value-Added Analysis to Improve Student Learning: A Field Guide for School and District Leaders

    ERIC Educational Resources Information Center

    Kennedy, Kate; Peters, Mary; Thomas, Mike

    2012-01-01

    Value-added analysis is the most robust, statistically significant method available for helping educators quantify student progress over time. This powerful tool also reveals tangible strategies for improving instruction. Built around the work of Battelle for Kids, this book provides a field-tested continuous improvement model for using…

  4. Integrative Analysis of Salmonellosis Outbreaks in Israel 1999-2012 Revealed an Invasive S. enterica Serovar 9,12:l,v:- and Endemic S. Typhimurium DT104 strain

    USDA-ARS?s Scientific Manuscript database

    Salmonella enterica is the leading etiologic agent of bacterial foodborne outbreaks worldwide. Methods. Laboratory-based statistical surveillance, molecular and genomics analyses were applied to characterize Salmonella outbreaks pattern in Israel. 65,087 Salmonella isolates reported to the National ...

  5. A Systematic Comparison of Linear Regression-Based Statistical Methods to Assess Exposome-Health Associations.

    PubMed

    Agier, Lydiane; Portengen, Lützen; Chadeau-Hyam, Marc; Basagaña, Xavier; Giorgis-Allemand, Lise; Siroux, Valérie; Robinson, Oliver; Vlaanderen, Jelle; González, Juan R; Nieuwenhuijsen, Mark J; Vineis, Paolo; Vrijheid, Martine; Slama, Rémy; Vermeulen, Roel

    2016-12-01

    The exposome constitutes a promising framework to improve understanding of the effects of environmental exposures on health by explicitly considering multiple testing and avoiding selective reporting. However, exposome studies are challenged by the simultaneous consideration of many correlated exposures. We compared the performances of linear regression-based statistical methods in assessing exposome-health associations. In a simulation study, we generated 237 exposure covariates with a realistic correlation structure and with a health outcome linearly related to 0 to 25 of these covariates. Statistical methods were compared primarily in terms of false discovery proportion (FDP) and sensitivity. On average over all simulation settings, the elastic net and sparse partial least-squares regression showed a sensitivity of 76% and an FDP of 44%; Graphical Unit Evolutionary Stochastic Search (GUESS) and the deletion/substitution/addition (DSA) algorithm revealed a sensitivity of 81% and an FDP of 34%. The environment-wide association study (EWAS) underperformed these methods in terms of FDP (average FDP, 86%) despite a higher sensitivity. Performances decreased considerably when assuming an exposome exposure matrix with high levels of correlation between covariates. Correlation between exposures is a challenge for exposome research, and the statistical methods investigated in this study were limited in their ability to efficiently differentiate true predictors from correlated covariates in a realistic exposome context. Although GUESS and DSA provided a marginally better balance between sensitivity and FDP, they did not outperform the other multivariate methods across all scenarios and properties examined, and computational complexity and flexibility should also be considered when choosing between these methods. 
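    The two performance criteria used to compare the methods are simple set operations on the selected versus truly predictive exposures. A minimal sketch (hypothetical variable names, not the simulation's 237-covariate design):

```python
def fdp_and_sensitivity(selected, true_predictors):
    """False discovery proportion and sensitivity of a variable-selection run."""
    tp = len(selected & true_predictors)
    fdp = (len(selected) - tp) / max(len(selected), 1)
    sensitivity = tp / len(true_predictors)
    return fdp, sensitivity

true_predictors = {"X1", "X2", "X3", "X4", "X5"}
selected = {"X4", "X5", "X90", "X171"}   # two hits, two false discoveries
print(fdp_and_sensitivity(selected, true_predictors))  # (0.5, 0.4)
```

    The trade-off the abstract reports, EWAS gaining sensitivity at the cost of a very high FDP, is exactly a movement along these two quantities.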
Citation: Agier L, Portengen L, Chadeau-Hyam M, Basagaña X, Giorgis-Allemand L, Siroux V, Robinson O, Vlaanderen J, González JR, Nieuwenhuijsen MJ, Vineis P, Vrijheid M, Slama R, Vermeulen R. 2016. A systematic comparison of linear regression-based statistical methods to assess exposome-health associations. Environ Health Perspect 124:1848-1856; http://dx.doi.org/10.1289/EHP172.

  6. Statistical universals reveal the structures and functions of human music.

    PubMed

    Savage, Patrick E; Brown, Steven; Sakai, Emi; Currie, Thomas E

    2015-07-21

    Music has been called "the universal language of mankind." Although contemporary theories of music evolution often invoke various musical universals, the existence of such universals has been disputed for decades and has never been empirically demonstrated. Here we combine a music-classification scheme with statistical analyses, including phylogenetic comparative methods, to examine a well-sampled global set of 304 music recordings. Our analyses reveal no absolute universals but strong support for many statistical universals that are consistent across all nine geographic regions sampled. These universals include 18 musical features that are common individually as well as a network of 10 features that are commonly associated with one another. They span not only features related to pitch and rhythm that are often cited as putative universals but also rarely cited domains including performance style and social context. These cross-cultural structural regularities of human music may relate to roles in facilitating group coordination and cohesion, as exemplified by the universal tendency to sing, play percussion instruments, and dance to simple, repetitive music in groups. Our findings highlight the need for scientists studying music evolution to expand the range of musical cultures and musical features under consideration. The statistical universals we identified represent important candidates for future investigation.

  7. Statistical universals reveal the structures and functions of human music

    PubMed Central

    Savage, Patrick E.; Brown, Steven; Sakai, Emi; Currie, Thomas E.

    2015-01-01

    Music has been called “the universal language of mankind.” Although contemporary theories of music evolution often invoke various musical universals, the existence of such universals has been disputed for decades and has never been empirically demonstrated. Here we combine a music-classification scheme with statistical analyses, including phylogenetic comparative methods, to examine a well-sampled global set of 304 music recordings. Our analyses reveal no absolute universals but strong support for many statistical universals that are consistent across all nine geographic regions sampled. These universals include 18 musical features that are common individually as well as a network of 10 features that are commonly associated with one another. They span not only features related to pitch and rhythm that are often cited as putative universals but also rarely cited domains including performance style and social context. These cross-cultural structural regularities of human music may relate to roles in facilitating group coordination and cohesion, as exemplified by the universal tendency to sing, play percussion instruments, and dance to simple, repetitive music in groups. Our findings highlight the need for scientists studying music evolution to expand the range of musical cultures and musical features under consideration. The statistical universals we identified represent important candidates for future investigation. PMID:26124105

  8. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method for characterizing the size distribution of tin fragments produced in the laser shock-loaded dynamic fragmentation process. In the shock experiments, the ejecta from a tin sample with a V-shaped groove etched into its free surface are collected using a soft-recovery technique. The produced fragments are then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, which yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison shows that the proposed model provides a far more reasonable fit for laser shock-loaded tin.
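    A linear combination of exponentials, as the Poisson-mixture model predicts, has density f(s) = Σ w_i λ_i exp(−λ_i s). A quick numerical sketch with illustrative weights and rates (not the fitted values from the experiments):

```python
import math

def mixture_pdf(s, weights, rates):
    """Linear combination of exponential densities (weights sum to 1)."""
    return sum(w * lam * math.exp(-lam * s) for w, lam in zip(weights, rates))

weights, rates = [0.6, 0.4], [1.0, 0.2]   # illustrative, not fitted values

# Sanity checks by trapezoidal integration over [0, 200]:
h = 0.01
grid = [i * h for i in range(20001)]
f = [mixture_pdf(s, weights, rates) for s in grid]
total = h * (sum(f) - 0.5 * (f[0] + f[-1]))          # should be ~1
mean = h * sum(s * fs for s, fs in zip(grid, f))     # ~0.6/1.0 + 0.4/0.2 = 2.6
print(round(total, 4), round(mean, 4))
```

    Each exponential component corresponds to one fragmentation scale, so the mixture naturally accommodates the heterogeneous (multi-scale) fragmentation the abstract describes.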

  9. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics

    PubMed Central

    Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. 
For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an “answer.” Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities. PMID:26909064

  10. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics.

    PubMed

    Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. 
For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an "answer." Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities.

  11. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    PubMed

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, the prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
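    The P charts used here place 3-sigma binomial control limits around the pooled prevalence; points inside the limits are attributed to common-cause variation. A sketch with made-up subgroup counts (not the Dutch survey data; equal subgroup sizes assumed for simplicity):

```python
import math

def p_chart_limits(event_counts, subgroup_size):
    """Pooled proportion and 3-sigma control limits for a P chart."""
    p_bar = sum(event_counts) / (len(event_counts) * subgroup_size)
    margin = 3 * math.sqrt(p_bar * (1 - p_bar) / subgroup_size)
    return p_bar, max(0.0, p_bar - margin), p_bar + margin

# Five annual surveys, 100 at-risk patients each, counts of ulcer cases:
p_bar, lcl, ucl = p_chart_limits([5, 7, 4, 6, 8], subgroup_size=100)
print(p_bar, lcl, ucl)
# Yearly proportions inside (lcl, ucl) vary by chance only --
# no evidence of a real change in practice.
```

    This is the contrast drawn in the abstract: a chi-square trend test may flag a "significant" downward trend even when every yearly point stays inside the control limits.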

  12. Statistics of dislocation pinning at localized obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.

    2014-10-14

    Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.

  13. SU-F-T-386: Analysis of Three QA Methods for Predicting Dose Deviation Pass Percentage for Lung SBRT VMAT Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, M; To, D; Giaddui, T

    2016-06-15

    Purpose: To investigate the significance of using pinpoint ionization chambers (IC) and RadCalc (RC) in determining the quality of lung SBRT VMAT plans with low dose deviation pass percentage (DDPP) as reported by ScandiDos Delta4 (D4). To quantify the relationship between DDPP and point dose deviations determined by IC (ICDD), RadCalc (RCDD), and median dose deviation reported by D4 (D4DD). Methods: Point dose deviations and D4 DDPP were compiled for 45 SBRT VMAT plans. Eighteen patients were treated on Varian Truebeam linear accelerators (linacs); the remaining 27 were treated on Elekta Synergy linacs with Agility collimators. A one-way analysis of variance (ANOVA) was performed to determine if there were any statistically significant differences between D4DD, ICDD, and RCDD. Tukey’s test was used to determine which pair of means was statistically different from each other. Multiple regression analysis was performed to determine if D4DD, ICDD, or RCDD are statistically significant predictors of DDPP. Results: Median DDPP, D4DD, ICDD, and RCDD were 80.5% (47.6%–99.2%), −0.3% (−2.0%–1.6%), 0.2% (−7.5%–6.3%), and 2.9% (−4.0%–19.7%), respectively. The ANOVA showed a statistically significant difference between D4DD, ICDD, and RCDD for a 95% confidence interval (p < 0.001). Tukey’s test revealed a statistically significant difference between two pairs of groups, RCDD-D4DD and RCDD-ICDD (p < 0.001), but no difference between ICDD-D4DD (p = 0.485). Multiple regression analysis revealed that ICDD (p = 0.04) and D4DD (p = 0.03) are statistically significant predictors of DDPP with an adjusted r² of 0.115. Conclusion: This study shows ICDD predicts trends in D4 DDPP; however, this trend is highly variable as shown by our low r². This work suggests that ICDD can be used as a method to verify DDPP in delivery of lung SBRT VMAT plans. RCDD may not validate low DDPP discovered in D4 QA for small field SBRT treatments.
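The analysis pipeline described here (an omnibus one-way ANOVA followed by pairwise post-hoc comparisons) can be sketched in Python. The sample values below are synthetic stand-ins loosely matching the reported medians, and Bonferroni-corrected Welch t-tests are used as a simple stand-in for Tukey's test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical dose-deviation samples (%) for the three QA methods;
# means and spreads loosely follow the medians reported in the abstract.
d4dd = rng.normal(-0.3, 0.8, 45)
icdd = rng.normal(0.2, 2.0, 45)
rcdd = rng.normal(2.9, 3.0, 45)

# Omnibus one-way ANOVA across the three methods
f_stat, p_anova = stats.f_oneway(d4dd, icdd, rcdd)

# Bonferroni-corrected pairwise Welch t-tests, a simple stand-in for
# the Tukey HSD post-hoc test used in the study
pairs = {"RC-D4": (rcdd, d4dd), "RC-IC": (rcdd, icdd), "IC-D4": (icdd, d4dd)}
p_pairs = {name: min(1.0, 3 * stats.ttest_ind(a, b, equal_var=False).pvalue)
           for name, (a, b) in pairs.items()}
```

With effect sizes like those reported, the omnibus test and the two RCDD contrasts come out significant while ICDD versus D4DD need not, mirroring the pattern in the abstract.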

  14. [Role of redox- and hormonal metabolism in the mechanisms of skin aging].

    PubMed

    Berianidze, K; Katsitadze, A; Jalaghania, N; Sanikidze, T

    2014-10-01

    The aim of the study was to investigate the role of redox balance in the pathogenesis of skin aging in menopausal women. 30 menopausal women aged 40 to 55 years and 30 reproductive-age women aged 25 to 35 years were studied. Qualitative assessment of the skin (moisture, fat, elasticity) was performed; in venous blood, hormonal metabolism indicators (estradiol - E, testosterone - T, follicle-stimulating hormone - FSH) and redox parameters - oxygen and lipid free radical content (EPR method) and the activity of the antioxidant enzymes catalase, superoxide dismutase (SOD), and glutathione reductase (GR) (spectroscopic method) - were studied. According to the results of the study, a statistically significant loss of skin elasticity and increase in the number of pores was revealed in menopausal women in comparison to the reproductive-age women. These changes occur against the background of a statistically significant increase in blood testosterone and FSH content; estradiol in menopausal women tends to decrease. Blood redox indicators did not differ statistically significantly between women of reproductive and menopausal age, although there was a tendency toward increased catalase and GR activity in menopausal women, indicating an intensification of oxidative processes in this age group. A statistically significant negative correlation between blood estradiol content and SOD activity (r=-0.413, p=0.0017) and a positive correlation between blood estradiol content and GR activity (r=0.565, p=0.002) were revealed. The decrease in estradiol concentration and the imbalance of the redox system in the women's blood correlate with the rate of growth of pigmented spots and the decrease in skin moisture. It is concluded that estrogen-dependent alterations in redox balance play an important role in the mechanisms of skin aging in menopausal women.

  15. Statistical modeling of temperature, humidity and wind fields in the atmospheric boundary layer over the Siberian region

    NASA Astrophysics Data System (ADS)

    Lomakina, N. Ya.

    2017-11-01

    The work presents the results of applied climatic division of the Siberian region into districts, based on a methodology of objective classification of atmospheric boundary layer climates by the "temperature-moisture-wind" complex, realized using the method of principal components and special similarity criteria for the average profiles and the eigenvalues of the correlation matrices. On the territory of Siberia, 14 homogeneous regions were identified for the winter season and 10 for summer. Local statistical models were constructed for each region. These include vertical profiles of mean values, mean square deviations, and matrices of interlevel correlation of temperature, specific humidity, and zonal and meridional wind velocity. The advantage of the obtained local statistical models over the regional models is shown.
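The classification step rests on an eigendecomposition of the interlevel correlation matrix of the vertical profiles. A hedged numpy sketch of that principal-component step (the function name and data layout are assumptions for illustration, not from the paper):

```python
import numpy as np

def profile_pca(profiles):
    """Principal components of vertical profiles (illustrative sketch).

    profiles: (n_stations, n_levels) array of, e.g., temperature values.
    Returns eigenvalues and eigenvectors of the interlevel correlation
    matrix, sorted by explained variance, as used in objective
    climate classification.
    """
    corr = np.corrcoef(profiles, rowvar=False)   # interlevel correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)      # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]            # sort by explained variance
    return eigvals[order], eigvecs[:, order]
```

The leading eigenvectors summarize the dominant vertical structure of each variable; districts can then be grouped by similarity of these components.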

  16. Detection of Bursts and Pauses in Spike Trains

    PubMed Central

    Ko, D.; Wilson, C. J.; Lobb, C. J.; Paladini, C. A.

    2012-01-01

    Midbrain dopaminergic neurons in vivo exhibit a wide range of firing patterns. They normally fire constantly at a low rate, and speed up, firing a phasic burst when reward exceeds prediction, or pause when an expected reward does not occur. Therefore, the detection of bursts and pauses from spike train data is a critical problem when studying the role of phasic dopamine (DA) in reward related learning, and other DA dependent behaviors. However, few statistical methods have been developed that can identify bursts and pauses simultaneously. We propose a new statistical method, the Robust Gaussian Surprise (RGS) method, which performs an exhaustive search of bursts and pauses in spike trains simultaneously. We found that the RGS method is adaptable to various patterns of spike trains recorded in vivo, and is not influenced by baseline firing rate, making it applicable to all in vivo spike trains where baseline firing rates vary over time. We compare the performance of the RGS method to other methods of detecting bursts, such as the Poisson Surprise (PS), Rank Surprise (RS), and Template methods. Analysis of data using the RGS method reveals potential mechanisms underlying how bursts and pauses are controlled in DA neurons. PMID:22939922
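For context, the Poisson Surprise statistic that the RGS method is compared against scores a candidate burst by the improbability of observing that many spikes under baseline Poisson firing. A minimal sketch of that comparator statistic (this is the PS baseline, not the RGS method itself):

```python
import numpy as np
from scipy import stats

def poisson_surprise(n_spikes, duration_s, baseline_rate_hz):
    """Poisson Surprise: -log P(N >= n_spikes) under baseline firing.

    Large values mean the burst is very unlikely to arise by chance
    from a Poisson process at the baseline rate.
    """
    expected = baseline_rate_hz * duration_s
    # sf(n - 1) gives P(N >= n) for a Poisson count
    p_tail = stats.poisson.sf(n_spikes - 1, expected)
    return -np.log(p_tail)
```

A burst of 10 spikes in 100 ms against a 5 Hz baseline yields a large surprise, while a single spike in a second at the same rate yields a small one.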

  17. Evaluation of graphical and statistical representation of analytical signals of spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam Mahmoud; Fayez, Yasmin Mohammed; Tawakkol, Shereen Mostafa; Fahmy, Nesma Mahmoud; Shehata, Mostafa Abd El-Atty

    2017-09-01

    Simultaneous determination of miconazole (MIC), mometasone furaoate (MF), and gentamicin (GEN) in their pharmaceutical combination. Gentamicin determination is based on derivatization with of o-phthalaldehyde reagent (OPA) without any interference of other cited drugs, while the spectra of MIC and MF are resolved using both successive and progressive resolution techniques. The first derivative spectrum of MF is measured using constant multiplication or spectrum subtraction, while its recovered zero order spectrum is obtained using derivative transformation. Beside the application of constant value method. Zero order spectrum of MIC is obtained by derivative transformation after getting its first derivative spectrum by derivative subtraction method. The novel method namely, differential amplitude modulation is used to get the concentration of MF and MIC, while the novel graphical method namely, concentration value is used to get the concentration of MIC, MF, and GEN. Accuracy and precision testing of the developed methods show good results. Specificity of the methods is ensured and is successfully applied for the analysis of pharmaceutical formulation of the three drugs in combination. ICH guidelines are used for validation of the proposed methods. Statistical data are calculated, and the results are satisfactory revealing no significant difference regarding accuracy and precision.

  18. Physical and genetic-interaction density reveals functional organization and informs significance cutoffs in genome-wide screens

    PubMed Central

    Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.

    2013-01-01

    Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890

  19. Reliability of intestinal temperature using an ingestible telemetry pill system during exercise in a hot environment.

    PubMed

    Ruddock, Alan D; Tew, Garry A; Purvis, Alison J

    2014-03-01

    Ingestible telemetry pill systems are increasingly used to assess intestinal temperature during exercise in hot environments. The purpose of this investigation was to assess the interday reliability of intestinal temperature during an exercise-heat challenge. Intestinal temperature was recorded as 12 physically active men (25 ± 4 years, stature 181.7 ± 7.0 cm, body mass 81.1 ± 10.6 kg) performed two 60-minute bouts of recumbent cycling (50% of peak aerobic power [watts]) in an environmental chamber set at 35°C and 50% relative humidity, 3-10 days apart. A range of statistics was used to assess reliability, including a paired t-test, 95% limits of agreement (LOA), coefficient of variation (CV), standard error of measurement (SEM), Pearson's correlation coefficient (r), intraclass correlation coefficient (ICC), and Cohen's d. Statistical significance was set at p ≤ 0.05. The method indicated good overall reliability (LOA = ±0.61°C, CV = 0.58%, SEM = 0.12°C, Cohen's d = 0.12, r = 0.84, ICC = 0.84). Analysis revealed a statistically significant (p = 0.02) mean systematic bias of -0.07 ± 0.31°C, and investigation of the Bland-Altman plot suggested the presence of heteroscedasticity. Further analysis revealed the minimum "likely" change in intestinal temperature to be 0.34°C. Although the method demonstrates good reliability, researchers should be aware of heteroscedasticity. Changes in intestinal temperature >0.34°C as a result of exercise or an intervention in a hot environment are likely real changes, less influenced by error associated with the method.
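The bias and 95% limits of agreement reported above come from a standard Bland-Altman calculation on the paired day-to-day differences. A short sketch with hypothetical temperatures (the helper name and data are illustrative):

```python
import numpy as np

def bland_altman(day1, day2):
    """Mean systematic bias and 95% limits of agreement between two
    repeated measurements (illustrative sketch)."""
    day1 = np.asarray(day1, dtype=float)
    day2 = np.asarray(day2, dtype=float)
    diff = day1 - day2
    bias = diff.mean()                     # mean systematic bias
    half_width = 1.96 * diff.std(ddof=1)   # half-width of the 95% LOA
    return bias, (bias - half_width, bias + half_width)
```

Heteroscedasticity shows up on the Bland-Altman plot as differences that fan out with the mean; when present, log-transforming the data before computing the limits is a common remedy.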

  20. Statistical characterization of Earth’s heterogeneities from seismic scattering

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Wu, R.

    2009-12-01

    The distortion of a teleseismic wavefront carries information about the heterogeneities through which the wave propagates, and it is manifested as logarithmic amplitude (logA) and phase fluctuations of the direct P wave recorded by a seismic network. By cross-correlating the fluctuations (e.g., logA-logA or phase-phase), we obtain coherence functions, which depend on the spatial lags between stations and the incidence angles of the incident waves. We have mathematically related the depth-dependent heterogeneity spectrum to the observable coherence functions using seismic scattering theory. We will show that our method has sharp depth resolution. Using HiNet seismic network data from Japan, we have inverted power spectra for two depth ranges, ~0-120 km and below ~120 km. The coherence functions formed by different groups of stations or by different groups of earthquakes at different back azimuths are similar. This demonstrates that the method is statistically stable and the inhomogeneities are statistically stationary. In both depth intervals, the trend of the spectral amplitude decays from large scales to small scales in a power-law fashion, with exceptions at ~50 km for the logA data. Due to the spatial spacing of the seismometers, only information from length scales of 15 km to 200 km is inverted. However, our scattering method provides new information on small to intermediate scales that are comparable to the scales of recycled materials and is thus complementary to global seismic tomography, which reveals mainly large-scale heterogeneities on the order of ~1000 km. The small-scale heterogeneities revealed here are not likely of purely thermal origin. Therefore, the length scale and strength of heterogeneities as a function of depth may provide important constraints on the mechanical mixing of various components in mantle convection.

  1. Estimation of signal coherence threshold and concealed spectral lines applied to detection of turbofan engine combustion noise.

    PubMed

    Miles, Jeffrey Hilton

    2011-05-01

    Combustion noise from turbofan engines has become important as the noise from sources like the fan and jet is reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics-based coherence threshold estimation method. In addition, the un-aligned coherence procedure at the same time reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation, this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics-based coherence threshold. Examples are presented showing that the underlying tonal and coherent broadband structure buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise.
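A common closed form for the statistics-based coherence threshold mentioned here: with n_d independent averaged segments, magnitude-squared coherence below 1 - (1 - P)^(1/(n_d - 1)) cannot be distinguished from zero at confidence P. A sketch of that standard formula (the paper's exact procedure, including the Fisher z-transform step, may differ):

```python
def coherence_threshold(n_segments, confidence=0.95):
    """Significance threshold for magnitude-squared coherence estimated
    from n_segments independent averages. Estimated coherence values
    below this level are indistinguishable from zero coherence."""
    return 1.0 - (1.0 - confidence) ** (1.0 / (n_segments - 1))
```

This makes the trade-off in the abstract concrete: counting overlapped records as if they were independent inflates n_segments and lowers the threshold, so more spectral content is (perhaps wrongly) declared coherent.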

  2. Genomic Data Quality Impacts Automated Detection of Lateral Gene Transfer in Fungi

    PubMed Central

    Dupont, Pierre-Yves; Cox, Murray P.

    2017-01-01

    Lateral gene transfer (LGT, also known as horizontal gene transfer), an atypical mechanism of transferring genes between species, has almost become the default explanation for genes that display an unexpected composition or phylogeny. Numerous methods of detecting LGT events all rely on two fundamental strategies: primary structure composition or gene tree/species tree comparisons. Discouragingly, the results of these different approaches rarely coincide. With the wealth of genome data now available, detection of laterally transferred genes is increasingly being attempted in large uncurated eukaryotic datasets. However, detection methods depend greatly on the quality of the underlying genomic data, which are typically complex for eukaryotes. Furthermore, given the automated nature of genomic data collection, it is typically impractical to manually verify all protein or gene models, orthology predictions, and multiple sequence alignments, requiring researchers to accept a substantial margin of error in their datasets. Using a test case comprising plant-associated genomes across the fungal kingdom, this study reveals that composition- and phylogeny-based methods have little statistical power to detect laterally transferred genes. In particular, phylogenetic methods reveal extreme levels of topological variation in fungal gene trees, the vast majority of which show departures from the canonical species tree. Therefore, it is inherently challenging to detect LGT events in typical eukaryotic genomes. This finding is in striking contrast to the large number of claims for laterally transferred genes in eukaryotic species that routinely appear in the literature, and questions how many of these proposed examples are statistically well supported. PMID:28235827

  3. Comparison of two fractal interpolation methods

    NASA Astrophysics Data System (ADS)

    Fu, Yang; Zheng, Zeyu; Xiao, Rui; Shi, Haibo

    2017-03-01

    As a tool for studying complex shapes and structures in nature, fractal theory plays a critical role in revealing the organizational structure of complex phenomena. Numerous fractal interpolation methods have been proposed over the past few decades, but they differ substantially in form features and statistical properties. In this study, we simulated one- and two-dimensional fractal surfaces using the midpoint displacement method and the Weierstrass-Mandelbrot fractal function method, and observed great differences between the two methods in statistical characteristics and autocorrelation features. In terms of form features, the simulations of the midpoint displacement method showed a relatively flat surface with peaks of differing height as the fractal dimension increases, while the simulations of the Weierstrass-Mandelbrot fractal function method showed a rough surface with dense and highly similar peaks as the fractal dimension increases. In terms of statistical properties, the peak heights from the Weierstrass-Mandelbrot simulations are greater than those of the midpoint displacement method at the same fractal dimension, and the variances are approximately two times larger. When the fractal dimension equals 1.2, 1.4, 1.6, or 1.8, the skewness is positive with the midpoint displacement method and the peaks are all convex, but with the Weierstrass-Mandelbrot fractal function method the skewness takes both positive and negative values, fluctuating in the vicinity of zero. The kurtosis is less than one with the midpoint displacement method, and generally less than that of the Weierstrass-Mandelbrot fractal function method. The autocorrelation analysis indicated that the simulation of the midpoint displacement method is not periodic, with prominent randomness, making it suitable for simulating aperiodic surfaces, while the simulation of the Weierstrass-Mandelbrot fractal function method has strong periodicity, making it suitable for simulating periodic surfaces.
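A minimal one-dimensional midpoint-displacement generator illustrates the first of the two methods compared (parameter names are illustrative; the fractal dimension D of the profile relates to the Hurst exponent as D = 2 - H):

```python
import numpy as np

def midpoint_displacement(n_iterations, hurst=0.5, seed=0):
    """1-D midpoint displacement sketch: repeatedly insert perturbed
    midpoints, halving the perturbation scale by 2**(-hurst) per level."""
    rng = np.random.default_rng(seed)
    points = np.array([0.0, 0.0])   # endpoint heights, never perturbed
    scale = 1.0
    for _ in range(n_iterations):
        mids = (points[:-1] + points[1:]) / 2.0
        mids += rng.normal(0.0, scale, mids.size)   # displace new midpoints
        # interleave existing points with the new midpoints
        out = np.empty(points.size + mids.size)
        out[0::2] = points
        out[1::2] = mids
        points = out
        scale *= 2.0 ** (-hurst)    # reduce roughness each refinement level
    return points
```

Lower Hurst exponents (higher fractal dimension) slow the decay of the perturbation scale, producing the rougher, more peaked profiles described above.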

  4. [Triple-type theory of statistics and its application in the scientific research of biomedicine].

    PubMed

    Hu, Liang-ping; Liu, Hui-gang

    2005-07-20

    The aim is to point out the crux of why so many people fail to grasp statistics, and to bring forth a "triple-type theory of statistics" to solve the problem in a creative way. Based on long experience in teaching and research in statistics, the triple-type theory was raised and clarified. Examples are provided to demonstrate that the three types, i.e., the expressive type, the prototype, and the standardized type, are essential for people to apply statistics rationally both in theory and practice; moreover, it is demonstrated by instances that the three types are correlated with each other. The theory can help people see the essence when interpreting and analyzing problems of experimental design and statistical analysis in medical research. Investigations reveal that for some questions the three types are mutually identical; for some, the prototype is also the standardized type; for others, the three types are distinct from each other. It is shown that in some multifactor experimental research, no standardized type corresponding to the prototype exists at all, because the researchers have committed the mistake of "incomplete control" in setting up experimental groups; this is a problem that should be solved by the concept and method of "division". Once the triple type for each question is clarified, a proper experimental design and statistical method can be arrived at easily. The triple-type theory of statistics can help people avoid statistical mistakes, or at least decrease the misuse rate dramatically, and improve the quality, level, and speed of biomedical research when applying statistics. It can also help improve the quality of statistical textbooks and the teaching of statistics, and it demonstrates how to advance biomedical statistics.

  5. Development of uncertainty-based work injury model using Bayesian structural equation modelling.

    PubMed

    Chatterjee, Snehamoy

    2014-01-01

    This paper proposed a Bayesian structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading and structural parameters of the SEM. In the first approach, the prior distributions were taken as fixed distribution functions with specific parameter values, whereas in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov chain Monte Carlo sampling, in the form of Gibbs sampling, was applied to sample from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed prior-based distributions are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit of work injury, with a high coefficient of determination (0.91) and lower mean squared error compared to traditional SEM.
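As a toy analogue of the Gibbs-sampling step described here, the conjugate normal model below alternates draws of a mean and a variance from their full conditional distributions. It is a sketch of the MCMC mechanics only, under vague priors, not the SEM itself:

```python
import numpy as np

def gibbs_normal(data, n_iter=2000, seed=0):
    """Toy Gibbs sampler for the mean and variance of normal data
    under vague priors (illustration of the sampling mechanics)."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    n = data.size
    var = data.var()                 # starting value for the variance
    mus, variances = [], []
    for _ in range(n_iter):
        # mu | var, data ~ Normal(sample mean, var / n)
        mu = rng.normal(data.mean(), np.sqrt(var / n))
        # var | mu, data ~ Inverse-Gamma(n/2, SSE/2), drawn via a Gamma
        sse = np.sum((data - mu) ** 2)
        var = sse / (2.0 * rng.gamma(n / 2.0))
        mus.append(mu)
        variances.append(var)
    return np.array(mus), np.array(variances)
```

Each parameter is updated conditional on the current value of the other, exactly the alternating scheme a Gibbs sampler applies to the loading and structural parameters of a Bayesian SEM.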

  6. Statistics of Weighted Brain Networks Reveal Hierarchical Organization and Gaussian Degree Distribution

    PubMed Central

    Ivković, Miloš; Kuceyeski, Amy; Raj, Ashish

    2012-01-01

    Whole brain weighted connectivity networks were extracted from high resolution diffusion MRI data of 14 healthy volunteers. A statistically robust technique was proposed for the removal of questionable connections. Unlike most previous studies our methods are completely adapted for networks with arbitrary weights. Conventional statistics of these weighted networks were computed and found to be comparable to existing reports. After a robust fitting procedure using multiple parametric distributions it was found that the weighted node degree of our networks is best described by the normal distribution, in contrast to previous reports which have proposed heavy tailed distributions. We show that post-processing of the connectivity weights, such as thresholding, can influence the weighted degree asymptotics. The clustering coefficients were found to be distributed either as gamma or power-law distribution, depending on the formula used. We proposed a new hierarchical graph clustering approach, which revealed that the brain network is divided into a regular base-2 hierarchical tree. Connections within and across this hierarchy were found to be uncommonly ordered. The combined weight of our results supports a hierarchically ordered view of the brain, whose connections have heavy tails, but whose weighted node degrees are comparable. PMID:22761649

  7. Statistics of weighted brain networks reveal hierarchical organization and Gaussian degree distribution.

    PubMed

    Ivković, Miloš; Kuceyeski, Amy; Raj, Ashish

    2012-01-01

    Whole brain weighted connectivity networks were extracted from high resolution diffusion MRI data of 14 healthy volunteers. A statistically robust technique was proposed for the removal of questionable connections. Unlike most previous studies our methods are completely adapted for networks with arbitrary weights. Conventional statistics of these weighted networks were computed and found to be comparable to existing reports. After a robust fitting procedure using multiple parametric distributions it was found that the weighted node degree of our networks is best described by the normal distribution, in contrast to previous reports which have proposed heavy tailed distributions. We show that post-processing of the connectivity weights, such as thresholding, can influence the weighted degree asymptotics. The clustering coefficients were found to be distributed either as gamma or power-law distribution, depending on the formula used. We proposed a new hierarchical graph clustering approach, which revealed that the brain network is divided into a regular base-2 hierarchical tree. Connections within and across this hierarchy were found to be uncommonly ordered. The combined weight of our results supports a hierarchically ordered view of the brain, whose connections have heavy tails, but whose weighted node degrees are comparable.

  8. Method of analysis of local neuronal circuits in the vertebrate central nervous system.

    PubMed

    Reinis, S; Weiss, D S; McGaraughty, S; Tsoukatos, J

    1992-06-01

    Although a considerable amount of knowledge has been accumulated about the activity of individual nerve cells in the brain, little is known about their mutual interactions at the local level. The method presented in this paper allows the reconstruction of functional relations within a group of neurons as recorded by a single microelectrode. Data are sampled at 10 or 13 kHz. Prominent spikes produced by one or more single cells are selected and sorted by K-means cluster analysis. The activities of single cells are then related to the background firing of neurons in their vicinity. Auto-correlograms of the leading cells, auto-correlograms of the background cells (mass correlograms) and cross-correlograms between these two levels of firing are computed and evaluated. The statistical probability of mutual interactions is determined, and the statistically significant, most common interspike intervals are stored and attributed to real pairs of spikes in the original record. Selected pairs of spikes, characterized by statistically significant intervals between them, are then assembled into a working model of the system. This method has revealed substantial differences between the information processing in the visual cortex, the inferior colliculus, the rostral ventromedial medulla and the ventrobasal complex of the thalamus. Even short 1-s records of the multiple neuronal activity may provide meaningful and statistically significant results.
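The cross-correlograms described here are histograms of time lags between the spikes of a leading cell and surrounding activity. A hedged numpy sketch (the bin width and window are arbitrary illustrative choices, not the paper's settings):

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, bin_width=0.001, window=0.05):
    """Histogram of time lags (spikes_b - spikes_a) within +/- window.

    spikes_a, spikes_b: spike times in seconds. A peak at a nonzero lag
    suggests a consistent temporal relationship between the two trains.
    """
    edges = np.arange(-window, window + bin_width, bin_width)
    lags = (np.asarray(spikes_b)[None, :] - np.asarray(spikes_a)[:, None]).ravel()
    counts, _ = np.histogram(lags, bins=edges)
    return counts, edges
```

Auto-correlograms are the special case where both arguments are the same train (with the zero-lag self-pairs excluded).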

  9. Normalization, bias correction, and peak calling for ChIP-seq

    PubMed Central

    Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.

    2012-01-01

    Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706

  10. SAFER, an Analysis Method of Quantitative Proteomic Data, Reveals New Interactors of the C. elegans Autophagic Protein LGG-1.

    PubMed

    Yi, Zhou; Manil-Ségalen, Marion; Sago, Laila; Glatigny, Annie; Redeker, Virginie; Legouis, Renaud; Mucchielli-Giorgi, Marie-Hélène

    2016-05-06

    Affinity purifications followed by mass spectrometric analysis are used to identify protein-protein interactions. Because quantitative proteomic data are noisy, it is necessary to develop statistical methods to eliminate false-positives and identify true partners. We present here a novel approach for filtering false interactors, named "SAFER" for mass Spectrometry data Analysis by Filtering of Experimental Replicates, which is based on the reproducibility of the replicates and the fold-change of the protein intensities between bait and control. To identify regulators or targets of autophagy, we characterized the interactors of LGG-1, a ubiquitin-like protein involved in autophagosome formation in C. elegans. LGG-1 partners were purified by affinity, analyzed by nanoLC-MS/MS, and quantified by a label-free proteomic approach based on the mass spectrometric signal intensity of peptide precursor ions. Because the selection of confident interactions depends on the method used for statistical analysis, we compared SAFER with several statistical tests and different scoring algorithms on this data set. We show that SAFER recovers high-confidence interactors that have been ignored by the other methods and identifies new candidates involved in the autophagy process. We further validated our method on a public data set and conclude that SAFER notably improves the identification of protein interactors.
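    The two filtering criteria named in the abstract, reproducibility across replicates and bait-versus-control fold-change, can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' SAFER implementation; the protein names, intensities, and thresholds below are invented:

```python
import statistics

def safer_like_filter(bait_runs, control_runs, min_detected=3, min_fold=2.0):
    """Toy filter in the spirit of SAFER: keep proteins that are
    reproducibly detected in bait replicates and enriched over control.

    bait_runs / control_runs map protein -> list of intensities, one per
    replicate; 0.0 means not detected in that replicate.
    """
    kept = []
    for protein, intensities in bait_runs.items():
        detected = [x for x in intensities if x > 0]
        if len(detected) < min_detected:        # reproducibility criterion
            continue
        bait_mean = statistics.mean(detected)
        ctrl = [x for x in control_runs.get(protein, []) if x > 0]
        ctrl_mean = statistics.mean(ctrl) if ctrl else 1.0  # pseudo-count
        if bait_mean / ctrl_mean >= min_fold:   # fold-change criterion
            kept.append(protein)
    return kept

bait = {"ATG7": [900.0, 1100.0, 1000.0], "KEEL": [50.0, 0.0, 0.0]}
control = {"ATG7": [100.0, 120.0, 90.0], "KEEL": [40.0, 60.0, 50.0]}
print(safer_like_filter(bait, control))  # -> ['ATG7']; KEEL fails reproducibility
```

    A real pipeline would add a significance test on the fold-change; this sketch only shows how the two criteria compose.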

  11. Study of deformation evolution during failure of rock specimens using laser-based vibration measurements

    NASA Astrophysics Data System (ADS)

    Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.

    2017-12-01

    The aim of the paper is to analyze experimental data on the dynamic response of a marble specimen under uniaxial compression. To do this, we use methods of mathematical statistics. The evolution of the lateral surface velocity, obtained with a laser Doppler vibrometer, provides the data for analysis. The registered data were regarded as a time series reflecting the deformation evolution of the specimen loaded up to failure. The revealed changes in statistical parameters were considered as precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects states of dynamic chaos and self-organized criticality.

  12. Statistical Properties of Online Auctions

    NASA Astrophysics Data System (ADS)

    Namazi, Alireza; Schadschneider, Andreas

    We characterize the statistical properties of a large number of online auctions run on eBay. Both stationary and dynamic properties, like distributions of prices, number of bids etc., as well as relations between these quantities are studied. The analysis of the data reveals surprisingly simple distributions and relations, typically of power-law form. Based on these findings we introduce a simple method to identify suspicious auctions that could be influenced by a form of fraud known as shill bidding. Furthermore the influence of bidding strategies is discussed. The results indicate that the observed behavior is related to a mixture of agents using a variety of strategies.

  13. Hyperspectral Analysis of Soil Nitrogen, Carbon, Carbonate, and Organic Matter Using Regression Trees

    PubMed Central

    Gmur, Stephan; Vogt, Daniel; Zabowski, Darlene; Moskal, L. Monika

    2012-01-01

    The characterization of soil attributes using hyperspectral sensors has revealed patterns in soil spectra that are known to respond to mineral composition, organic matter, soil moisture and particle size distribution. Soil samples from different soil horizons of replicated soil series from sites located within Washington and Oregon were analyzed with the FieldSpec Spectroradiometer to measure their spectral signatures across the electromagnetic range of 400 to 1,000 nm. Similarity rankings of individual soil samples reveal differences between replicate series as well as between samples within the same replicate series. Using classification and regression tree statistical methods, regression trees were fitted to each spectral response using concentrations of nitrogen, carbon, carbonate and organic matter as the response variables. Statistics resulting from fitted trees were: nitrogen R2 0.91 (p < 0.01) at 403, 470, 687, and 846 nm spectral band widths, carbonate R2 0.95 (p < 0.01) at 531 and 898 nm band widths, total carbon R2 0.93 (p < 0.01) at 400, 409, 441 and 907 nm band widths, and organic matter R2 0.98 (p < 0.01) at 300, 400, 441, 832 and 907 nm band widths. Use of the 400 to 1,000 nm electromagnetic range with regression trees provided a powerful, rapid and inexpensive means of assessing nitrogen, carbon, carbonate and organic matter nondestructively for upper soil horizons.

  14. Use of Statistical Analysis of Acoustic Emission Data on Carbon-Epoxy COPV Materials-of-Construction for Enhanced Felicity Ratio Onset Determination

    NASA Technical Reports Server (NTRS)

    Abraham, Arick Reed A.; Johnson, Kenneth L.; Nichols, Charles T.; Saulsberry, Regor L.; Waller, Jess M.

    2012-01-01

    Broadband modal acoustic emission (AE) data were acquired during intermittent load hold tensile test profiles on Toray T1000G carbon fiber-reinforced epoxy (C/Ep) single tow specimens. A novel trend seeking statistical method to determine the onset of significant AE was developed, resulting in more linear decreases in the Felicity ratio (FR) with load, potentially leading to more accurate failure prediction. The method developed uses an exponentially weighted moving average (EWMA) control chart. Comparison of the EWMA with previously used FR onset methods, namely the discrete (n), mean (n̄), normalized (n%) and normalized mean (n̄%) methods, revealed the EWMA method yields more consistently linear FR versus load relationships between specimens. Other findings include a correlation between AE data richness and FR linearity based on the FR methods discussed in this paper, and evidence of premature failure at lower than expected loads. Application of the EWMA method should be extended to other composite materials and, eventually, composite components such as composite overwrapped pressure vessels. Furthermore, future experiments should attempt to uncover the factors responsible for infant mortality in C/Ep strands.
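    The abstract's core tool, an EWMA control chart, can be sketched as follows. This is a generic textbook EWMA chart applied to hypothetical AE hit counts, not the authors' calibrated procedure; the baseline length and the λ and L values are illustrative assumptions:

```python
import statistics

def ewma_onset(counts, lam=0.2, L=3.0):
    """Return the first index where an EWMA of the counts exceeds the
    upper control limit, or None. Baseline mean/sigma are estimated from
    the first five points (assumes len(counts) >= 5); the control limit
    uses the standard time-varying EWMA variance factor.
    """
    baseline = counts[:5]
    mu = statistics.mean(baseline)
    sigma = statistics.pstdev(baseline) or 1.0
    z = mu
    for i, x in enumerate(counts):
        z = lam * x + (1 - lam) * z
        var_factor = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (i + 1)))
        ucl = mu + L * sigma * var_factor ** 0.5
        if z > ucl:
            return i
    return None

# A quiet baseline followed by a burst of activity: the chart flags the burst.
print(ewma_onset([1, 2, 1, 2, 1, 1, 9, 10, 12]))  # -> 6
```

    In the paper's setting, the flagged index would mark the load at which significant AE begins, from which the Felicity ratio is computed.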

  15. Learn the game but don't play it: nurses' perspectives on learning and applying statistics in practice.

    PubMed

    Gaudet, Julie; Singh, Mina D; Epstein, Iris; Santa Mina, Elaine; Gula, Taras

    2014-07-01

    An integrative review of undergraduate-level statistics pedagogy for nurses revealed a paucity of research to inform curriculum development and delivery. The aim of the study was to explore alumni nurses' perspectives on statistics education and its application to practice. A mixed-method approach was used whereby a quantitative approach was used to complement and develop the qualitative aspect. This study was conducted in Toronto, Ontario, Canada. Participants were nursing alumni who graduated from four types of nursing degree programs (BScN) in two Ontario universities between the years 2005-2009. Data were collected via surveys (n=232) followed by interviews (n=36). Participants reported that they did not fear statistics and that they thought their math skills were very good or excellent. They felt that statistics courses were important to their nursing practice but they were not required to use statistics. Qualitative findings emerged in two major themes: 1) nurses value statistics and 2) nurses do not feel comfortable using statistics. Nurses recognize the inherent value of statistics for improving their professional image and interprofessional communication, yet they feel denied full participation in applying statistics to their practice. Our findings have major implications for changes in pedagogy and practice. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Revealing physical interaction networks from statistics of collective dynamics

    PubMed Central

    Nitzan, Mor; Casadiego, Jose; Timme, Marc

    2017-01-01

    Revealing physical interactions in complex systems from observed collective dynamics constitutes a fundamental inverse problem in science. Current reconstruction methods require access to a system’s model or dynamical data at a level of detail often not available. We exploit changes in invariant measures, in particular distributions of sampled states of the system in response to driving signals, and use compressed sensing to reveal physical interaction networks. Dynamical observations following driving suffice to infer physical connectivity even if they are temporally disordered, are acquired at large sampling intervals, and stem from different experiments. Testing various nonlinear dynamic processes emerging on artificial and real network topologies indicates high reconstruction quality for existence as well as type of interactions. These results advance our ability to reveal physical interaction networks in complex synthetic and natural systems. PMID:28246630

  17. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics.

    PubMed

    Lyon, Yana A; Riggs, Dylan; Fornelli, Luca; Compton, Philip D; Julian, Ryan R

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.
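    The counting argument behind the "at least 50% with three or fewer dissociations" figure is simple: k internal cleavages cut a linear chain into k + 1 contiguous fragments, of which exactly two retain the N- or C-terminus. A short sketch of that fragment count (intensities and charge states ignored):

```python
def terminal_fraction(n_cleavages):
    """Fraction of contiguous fragments that retain the N- or C-terminus
    after cleaving a linear backbone at n_cleavages internal sites: the
    chain breaks into n_cleavages + 1 pieces, of which 2 are terminal."""
    if n_cleavages == 0:
        return 1.0  # the intact chain carries both termini
    return 2 / (n_cleavages + 1)

# Three cleavages -> 4 fragments, 2 terminal and 2 internal: exactly 50%,
# consistent with the abstract's claim for three or fewer dissociations.
print(terminal_fraction(3))  # -> 0.5
```

    The same formula also shows why repeated cleavage eventually favors internal ions: the two terminal fragments are fixed while internal fragments grow as k − 1.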

  18. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics

    NASA Astrophysics Data System (ADS)

    Lyon, Yana A.; Riggs, Dylan; Fornelli, Luca; Compton, Philip D.; Julian, Ryan R.

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.

  19. Statistical Methods, Some Old, Some New: A Tutorial Survey.

    DTIC Science & Technology

    1981-01-01

    The quartile (eighth, extreme) means MQ (ME, MExt) can be quickly compared to M to detect systematic asymmetry or ... The numbers suggest positive skewness, but closer examination reveals

  20. Ubiquity of Polynucleobacter necessarius subsp. asymbioticus in lentic freshwater habitats of a heterogenous 2000 km2 area

    PubMed Central

    Jezberová, Jitka; Jezbera, Jan; Brandt, Ulrike; Lindström, Eva S.; Langenheder, Silke; Hahn, Martin W.

    2010-01-01

    Summary We present a survey on the distribution and habitat range of P. necessarius subspecies asymbioticus (PnecC), an important taxon in the water column of freshwater systems. We systematically sampled stagnant freshwater habitats in a heterogeneous 2000 km2 area, together with ecologically different habitats outside this area. In total, 137 lakes, ponds and puddles were investigated, representing an enormous diversity of habitats differing, e.g., in depth (<10 cm – 171 m) and pH (3.9 – 8.5). PnecC was detected by cultivation-independent methods in all investigated habitats, and its presence was confirmed by cultivation of strains from selected habitats, including the most extreme ones. The determined relative abundance of the subspecies ranged from slightly above 0% to 67% (average 14.5% ± 14.3%), and the highest observed absolute abundance was 5.3×106 cells mL−1. Statistical analyses revealed that the abundance of PnecC is partially controlled by low conductivity and pH and by factors linked to concentrations of humic substances, which might support the hypothesis that these bacteria utilize photodegradation products of humic substances. Based on the revealed statistical relationships, an average relative abundance of this subspecies of 20% in global freshwater habitats was extrapolated. Our study provides important implications for the current debate on ubiquity and biogeography in microorganisms. PMID:20041938

  1. Climatic change on the Gulf of Fonseca (Central America) using two-step statistical downscaling of CMIP5 model outputs

    NASA Astrophysics Data System (ADS)

    Ribalaygua, Jaime; Gaitán, Emma; Pórtoles, Javier; Monjo, Robert

    2018-05-01

    A two-step statistical downscaling method has been reviewed and adapted to simulate twenty-first-century climate projections for the Gulf of Fonseca (Central America, Pacific Coast) using Coupled Model Intercomparison Project (CMIP5) climate models. The downscaling methodology is adjusted after looking for good predictor fields for this area (where the geostrophic approximation fails and the real wind fields are the most applicable). The method's performance for daily precipitation and maximum and minimum temperature was analysed and revealed suitable results for all variables. For instance, the method is able to simulate the characteristic cycle of the wet season for this area, which includes a mid-summer drought between two peaks. Future projections show a gradual temperature increase throughout the twenty-first century and a change in the features of the wet season (the first peak and mid-summer rainfall being reduced relative to the second peak, earlier onset of the wet season and a broader second peak).

  2. Using statistical process control methods to trace small changes in perinatal mortality after a training program in a low-resource setting.

    PubMed

    Mduma, Estomih R; Ersdal, Hege; Kvaloy, Jan Terje; Svensen, Erling; Mdoe, Paschal; Perlman, Jeffrey; Kidanto, Hussein Lessio; Soreide, Eldar

    2018-05-01

    To trace and document smaller changes in perinatal survival over time. Prospective observational study, with retrospective analysis. Labor ward and operating theater at Haydom Lutheran Hospital in rural north-central Tanzania. All women giving birth and birth attendants. Helping Babies Breathe (HBB) simulation training on newborn care and resuscitation and some other efforts to improve perinatal outcome. Perinatal survival, including fresh stillbirths and early (24-h) newborn survival. The variable life-adjusted plot and cumulative sum chart revealed a steady improvement in survival over time, after the baseline period. There were some variations throughout the study period, and some of these could be linked to different interventions and events. To our knowledge, this is the first time statistical process control methods have been used to document changes in perinatal mortality over time in a rural Sub-Saharan hospital, showing a steady increase in survival. These methods can be utilized to continuously monitor and describe changes in patient outcomes.
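    The cumulative sum idea behind the charts mentioned above can be sketched as a running observed-minus-expected tally; a sustained downward drift indicates survival better than the baseline rate. This is a minimal illustration with invented data, not the study's actual VLAD/CUSUM parameterization:

```python
def cusum(outcomes, baseline_rate):
    """Cumulative observed-minus-expected deaths. outcomes: 1 = death,
    0 = survival; baseline_rate: expected death probability per birth.
    A downward-trending curve means better-than-baseline survival."""
    total, curve = 0.0, []
    for death in outcomes:
        total += death - baseline_rate
        curve.append(round(total, 3))
    return curve

# One death followed by four survivals against a 20% baseline rate:
# the curve drifts back down toward zero as survival outpaces expectation.
print(cusum([1, 0, 0, 0, 0], 0.2))  # -> [0.8, 0.6, 0.4, 0.2, 0.0]
```

    Control limits (the decision thresholds that make such a chart a formal test) are omitted here; the paper's point is that even this simple accumulation reveals gradual mortality changes that raw annual rates hide.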

  3. Simulation analysis of air flow and turbulence statistics in a rib grit roughened duct.

    PubMed

    Vogiatzis, I I; Denizopoulou, A C; Ntinas, G K; Fragos, V P

    2014-01-01

    The implementation of variable artificial roughness patterns on a surface is an effective technique to enhance the rate of heat transfer to fluid flow in the ducts of solar air heaters. The different geometries of roughness elements investigated have demonstrated the pivotal role that vortices and the associated turbulence play in the heat transfer characteristics of solar air heater ducts by increasing the convective heat transfer coefficient. In this paper we investigate the two-dimensional, turbulent, unsteady flow around rectangular ribs of variable aspect ratios by directly solving the transient Navier-Stokes and continuity equations using the finite element method. Flow characteristics and several aspects of turbulent flow are presented and discussed, including velocity components and statistics of turbulence. The results reveal the impact that different rib lengths have on the computed mean quantities and turbulence statistics of the flow. The computed turbulence parameters show a clear tendency to diminish downstream with increasing rib length. Furthermore, the applied numerical method is capable of capturing small-scale flow structures resulting from the direct solution of the Navier-Stokes and continuity equations.

  4. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. To propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts and the type of process deviation was diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  5. Dynamic thiol/disulphide homeostasis in patients with basal cell carcinoma.

    PubMed

    Demirseren, Duriye Deniz; Cicek, Cagla; Alisik, Murat; Demirseren, Mustafa Erol; Aktaş, Akın; Erel, Ozcan

    2017-09-01

    The aim of this study is to measure and compare the dynamic thiol/disulphide homeostasis of patients with basal cell carcinoma and healthy subjects using a newly developed and original method. Thirty-four patients attending our outpatient clinic who were clinically and histopathologically diagnosed with nodular basal cell carcinoma, and 30 age- and gender-matched healthy individuals, were involved in the study. Thiol/disulphide homeostasis was measured with a novel automatic spectrophotometric method and the results were compared statistically. Serum native thiol and disulphide levels differed significantly between the patient and control groups (p = 0.028 and 0.039, respectively). Total thiol levels did not differ significantly (p = 0.094). The disulphide/native thiol and native thiol/total thiol ratios also differed significantly (p = 0.012, 0.013, 0.010, respectively). Thiol/disulphide homeostasis in patients with basal cell carcinoma is altered such that disulphide gets lower and thiols get higher. The thiol/disulphide level is likely to have a role in basal cell carcinoma pathogenesis.

  6. Statistical analyses of influence of solar and geomagnetic activities on car accident events

    NASA Astrophysics Data System (ADS)

    Alania, M. V.; Gil, A.; Wieliczuk, R.

    2001-01-01

    Statistical analyses of the influence of solar and geomagnetic activity, the sector structure of the interplanetary magnetic field and galactic cosmic ray Forbush effects on car accident events in Poland for the period 1990-1999 have been carried out. Using auto-correlation, cross-correlation, spectral analysis and superposed epoch methods, it has been shown that there are separate periods when car accident events correlate directly with the Ap index of geomagnetic activity, the sector structure of the interplanetary magnetic field and Forbush decreases of galactic cosmic rays. Nevertheless, no single-valued direct correlation could be revealed for the whole period of 1990-1999. A periodicity of 7 days and its second harmonic (3.5 days) has been reliably revealed in the car accident data in Poland for each year of the period 1990-1999. It is shown that the maximum number of car accident events in Poland occurs on Fridays and practically does not depend on the level of solar and geomagnetic activity.
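    The 7-day periodicity reported above is exactly the kind of feature a sample autocorrelation picks out. A minimal stdlib sketch on an invented weekly accident-count series (not the study's data), using the naive autocorrelation estimator:

```python
def autocorr(xs, lag):
    """Naive sample autocorrelation of xs at the given lag."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag))
    return cov / var

# Hypothetical daily accident counts with a Friday peak, repeated for 8 weeks.
series = [3, 3, 4, 3, 9, 4, 3] * 8
peak_lag = max(range(1, 15), key=lambda k: autocorr(series, k))
print(peak_lag)  # -> 7: the weekly cycle dominates lags 1..14
```

    Spectral analysis, the other tool named in the abstract, would show the same cycle as a peak at a frequency of 1/7 per day and its harmonic at 2/7.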

  7. Views of elementary school teachers towards students with cochlear implants inclusion in the process of education.

    PubMed

    Dulcić, Adinda; Bakota, Koraljka

    2009-06-01

    The paper reveals the views of teachers in some regular elementary schools in the Republic of Croatia where students with cochlear implants, who are also rehabilitants of the SUVAG Polyclinic, are educated. The survey aimed to research the views of teachers towards inclusive education and included 98 teachers. A Likert-type scale was applied in order to identify the views of teachers towards students with hearing impairment. The survey was carried out in May 2007. Data were processed with the SPSS for Windows program, version 13. Methods of descriptive statistics were applied to process the frequencies of responses on variables for the total sample, and 3 statistically significant factors emerged from factor analysis. Results of this survey reveal that teachers have positive views towards inclusive education as a process which offers students with cochlear implants the possibility to socialize and achieve intellectual and emotional development. The survey suggests that the way inclusion is enforced mostly satisfies the criteria specified for successful inclusion.

  8. Laser speckle imaging of rat retinal blood flow with hybrid temporal and spatial analysis method

    NASA Astrophysics Data System (ADS)

    Cheng, Haiying; Yan, Yumei; Duong, Timothy Q.

    2009-02-01

    Noninvasive monitoring of blood flow (BF) in the retinal circulation can reveal the progression and treatment of ocular disorders such as diabetic retinopathy, age-related macular degeneration and glaucoma. A non-invasive and direct BF measurement technique with high spatio-temporal resolution is needed for retinal imaging. Laser speckle imaging (LSI) is such a method. Currently, there are two analysis methods for LSI: spatial statistics LSI (SS-LSI) and temporal statistics LSI (TS-LSI). Comparing these two analysis methods, SS-LSI has a higher signal-to-noise ratio (SNR) and TS-LSI is less susceptible to artifacts from stationary speckle. We proposed a hybrid temporal and spatial analysis method (HTS-LSI) to measure retinal blood flow. A gas challenge experiment was performed and images were analyzed by HTS-LSI. Results showed that HTS-LSI can not only remove the stationary speckle but also increase the SNR. Under 100% O2, retinal BF decreased by 20-30%. This was consistent with the results observed with the laser Doppler technique. As retinal blood flow is a critical physiological parameter and its perturbation has been implicated in the early stages of many retinal diseases, HTS-LSI will be an efficient method for early detection of retinal diseases.

  9. Ensemble stacking mitigates biases in inference of synaptic connectivity.

    PubMed

    Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N

    2018-01-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.

  10. Statistical distributions of ultra-low dose CT sinograms and their fundamental limits

    NASA Astrophysics Data System (ADS)

    Lee, Tzu-Cheng; Zhang, Ruoqiao; Alessio, Adam M.; Fu, Lin; De Man, Bruno; Kinahan, Paul E.

    2017-03-01

    Low-dose CT imaging is typically constrained to be diagnostic. However, there are applications for even lower-dose CT imaging, including image registration across multi-frame CT images and attenuation correction for PET/CT imaging. We define this as the ultra-low-dose (ULD) CT regime, where the exposure level is a factor of 10 lower than current low-dose CT technique levels. In the ULD regime it is possible to use statistically principled image reconstruction methods that make full use of the raw data information. Since most statistical iterative reconstruction methods are based on the assumption that the post-log noise distribution is close to Poisson or Gaussian, our goal is to understand the statistical distribution of ULD CT data with different non-positivity correction methods, and to understand when iterative reconstruction methods may be effective in producing images that are useful for image registration or attenuation correction in PET/CT imaging. We first used phantom measurements and calibrated simulation to reveal how the noise distribution deviates from the normal assumption in the ULD CT flux environment. In summary, our results indicate that there are three general regimes: (1) diagnostic CT, where post-log data are well modeled by a normal distribution; (2) low-dose CT, where a normal distribution remains a reasonable approximation and statistically principled (post-log) methods that assume a normal distribution have an advantage; and (3) a ULD regime that is photon-starved, where the quadratic approximation is no longer effective. For instance, a total integral density of 4.8 (ideal pi for 24 cm of water) for a 120 kVp, 0.5 mAs radiation source is the maximum pi value for which a definitive maximum likelihood value could be found. This leads to fundamental limits in the estimation of ULD CT data when using a standard data processing stream.
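    The photon-starved regime can be illustrated with a toy simulation: at low mean counts a Poisson detector frequently records zero photons, and the post-log value ln(I0/I) does not exist there, which is what breaks the normal-distribution approximation. A stdlib sketch with invented flux levels (Knuth's Poisson sampler, adequate for small means):

```python
import math
import random

def zero_fraction(mean_counts, n=10000, seed=1):
    """Fraction of simulated Poisson photon counts that are zero, i.e.
    measurements where the post-log value ln(I0/I) is undefined."""
    random.seed(seed)
    zeros = 0
    for _ in range(n):
        # Knuth's sampler: multiply uniforms until the product drops
        # below exp(-mean); the number of multiplications - 1 is Poisson.
        limit, k, p = math.exp(-mean_counts), 0, 1.0
        while p > limit:
            k += 1
            p *= random.random()
        if k - 1 == 0:
            zeros += 1
    return zeros / n

# The zero fraction is about exp(-mean): negligible at a mean of 10 counts,
# but over 13% of measurements at a mean of 2 counts.
print(zero_fraction(10.0), zero_fraction(2.0))
```

    This is why non-positivity correction methods are needed before any post-log processing in the ULD regime: a substantial fraction of the sinogram simply has no logarithm.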

  11. Complex patterns of abnormal heartbeats

    NASA Technical Reports Server (NTRS)

    Schulte-Frohlinde, Verena; Ashkenazy, Yosef; Goldberger, Ary L.; Ivanov, Plamen Ch; Costa, Madalena; Morley-Davies, Adrian; Stanley, H. Eugene; Glass, Leon

    2002-01-01

    Individuals having frequent abnormal heartbeats interspersed with normal heartbeats may be at an increased risk of sudden cardiac death. However, mechanistic understanding of such cardiac arrhythmias is limited. We present a visual and qualitative method to display statistical properties of abnormal heartbeats. We introduce dynamical "heartprints" which reveal characteristic patterns in long clinical records encompassing approximately 10(5) heartbeats and may provide information about underlying mechanisms. We test if these dynamics can be reproduced by model simulations in which abnormal heartbeats are generated (i) randomly, (ii) at a fixed time interval following a preceding normal heartbeat, or (iii) by an independent oscillator that may or may not interact with the normal heartbeat. We compare the results of these three models and test their limitations to comprehensively simulate the statistical features of selected clinical records. This work introduces methods that can be used to test mathematical models of arrhythmogenesis and to develop a new understanding of underlying electrophysiologic mechanisms of cardiac arrhythmia.

  12. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
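    For the least-squares calibration setting used here, the efficient information criteria reduce to closed-form expressions in the residual sum of squares. A minimal sketch of the Gaussian-error forms of AICc and BIC; the observation counts, parameter counts, and SSE values below are invented for illustration:

```python
import math

def aicc_bic(n, k, sse):
    """Corrected Akaike (AICc) and Bayesian (BIC) information criteria
    for a least-squares fit: n observations, k estimated parameters,
    sse = sum of squared weighted residuals. Lower values are better."""
    aic = n * math.log(sse / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = n * math.log(sse / n) + k * math.log(n)
    return aicc, bic

# Two hypothetical alternative models of the same 50 observations: the
# second adds a parameter but barely reduces the misfit, so both criteria
# prefer the more parsimonious model 1.
m1 = aicc_bic(50, 3, sse=12.0)
m2 = aicc_bic(50, 4, sse=11.8)
print(m1[0] < m2[0], m1[1] < m2[1])  # -> True True
```

    This is the sense in which the paper finds the criteria "computationally efficient": each candidate model needs only one calibration, whereas cross-validation refits the model once per left-out data group.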

  13. Prioritizing GWAS Results: A Review of Statistical Methods and Recommendations for Their Application

    PubMed Central

    Cantor, Rita M.; Lange, Kenneth; Sinsheimer, Janet S.

    2010-01-01

    Genome-wide association studies (GWAS) have rapidly become a standard method for disease gene discovery. A substantial number of recent GWAS indicate that for most disorders, only a few common variants are implicated and the associated SNPs explain only a small fraction of the genetic risk. This review is written from the viewpoint that findings from the GWAS provide preliminary genetic information that is available for additional analysis by statistical procedures that accumulate evidence, and that these secondary analyses are very likely to provide valuable information that will help prioritize the strongest constellations of results. We review and discuss three analytic methods to combine preliminary GWAS statistics to identify genes, alleles, and pathways for deeper investigations. Meta-analysis seeks to pool information from multiple GWAS to increase the chances of finding true positives among the false positives and provides a way to combine associations across GWAS, even when the original data are unavailable. Testing for epistasis within a single GWAS study can identify the stronger results that are revealed when genes interact. Pathway analysis of GWAS results is used to prioritize genes and pathways within a biological context. Following a GWAS, association results can be assigned to pathways and tested in aggregate with computational tools and pathway databases. Reviews of published methods with recommendations for their application are provided within the framework for each approach. PMID:20074509
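
Of the three approaches reviewed, meta-analysis is the most mechanical to sketch. A minimal fixed-effect, inverse-variance pooling of one SNP's effect estimates across studies might look like this (the effect sizes and standard errors are invented for illustration):

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance weighted pooled effect across studies for one SNP."""
    w = [1.0 / s ** 2 for s in ses]            # precision weights
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    se = math.sqrt(1.0 / sum(w))               # pooled standard error
    return beta, se, beta / se                 # pooled effect, SE, z statistic

# Hypothetical per-study log odds ratios and standard errors from three GWAS
beta, se, z = fixed_effect_meta([0.12, 0.09, 0.15], [0.05, 0.04, 0.06])
```

Because only summary statistics (effect and standard error) are needed, the pooling works even when the original genotype data are unavailable, which is the point made above.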

  14. Nonlinear multivariate and time series analysis by neural network methods

    NASA Astrophysics Data System (ADS)

    Hsieh, William W.

    2004-03-01

    Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
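
The linear base of the hierarchy is compact enough to sketch; the nonlinear variants discussed in the paper replace the fixed linear projection below with a neural-network bottleneck. A minimal PCA-by-SVD sketch on synthetic data (not one of the paper's data sets):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic field: one dominant linear mode (direction [1, 2, -1]) plus noise
t = rng.normal(size=200)                      # mode time series
X = np.column_stack([t, 2.0 * t, -t]) + 0.1 * rng.normal(size=(200, 3))
Xc = X - X.mean(axis=0)                       # remove the mean state

# PCA via SVD: rows of Vt are the empirical orthogonal modes
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)           # variance fraction per mode
pc1 = Xc @ Vt[0]                              # leading principal component series
```

A nonlinear oscillation would be scattered across several such linear modes or higher harmonics, which is the limitation NLPCA and NLSSA address.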

  15. Optimized statistical parametric mapping procedure for NIRS data contaminated by motion artifacts: Neurometric analysis of body schema extension.

    PubMed

    Suzuki, Satoshi

    2017-09-01

    This study investigated the spatial distribution of brain activity on body schema (BS) modification induced by natural body motion using two versions of a hand-tracing task. In Task 1, participants traced Japanese Hiragana characters using the right forefinger, requiring no BS expansion. In Task 2, participants performed the tracing task with a long stick, requiring BS expansion. Spatial distribution was analyzed using general linear model (GLM)-based statistical parametric mapping of near-infrared spectroscopy data contaminated with motion artifacts caused by the hand-tracing task. Three methods were utilized in series to counter the artifacts, and optimal conditions and modifications were investigated: a model-free method (Step 1), a convolution matrix method (Step 2), and a boxcar-function-based Gaussian convolution method (Step 3). The results revealed four methodological findings: (1) Deoxyhemoglobin was suitable for the GLM because both Akaike information criterion and the variance against the averaged hemodynamic response function were smaller than for other signals, (2) a high-pass filter with a cutoff frequency of .014 Hz was effective, (3) the hemodynamic response function computed from a Gaussian kernel function and its first- and second-derivative terms should be included in the GLM model, and (4) correction of non-autocorrelation and use of effective degrees of freedom were critical. Investigating z-maps computed according to these guidelines revealed that contiguous areas of BA7-BA40-BA21 in the right hemisphere became significantly activated ([Formula: see text], [Formula: see text], and [Formula: see text], respectively) during BS modification while performing the hand-tracing task.
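
Step 3 of the artifact handling (a boxcar convolved with a Gaussian kernel, plus derivative terms) amounts to building a GLM design matrix. A schematic sketch with invented timing parameters (the sampling rate, block onset, and kernel widths are not the paper's values):

```python
import numpy as np

fs = 10.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)                # one 60 s run
boxcar = ((t >= 10) & (t < 30)).astype(float)   # task block from 10 s to 30 s

# Gaussian kernel standing in for the hemodynamic response function
peak, width = 6.0, 3.0                      # illustrative values, in seconds
kt = np.arange(0, 20, 1 / fs)
kernel = np.exp(-0.5 * ((kt - peak) / width) ** 2)
kernel /= kernel.sum()                      # unit area

hrf_reg = np.convolve(boxcar, kernel)[: t.size]  # predicted response regressor
d1 = np.gradient(hrf_reg, 1 / fs)           # first temporal derivative
d2 = np.gradient(d1, 1 / fs)                # second temporal derivative

# Design matrix: response, its two derivative terms, and a constant column
X = np.column_stack([hrf_reg, d1, d2, np.ones_like(t)])
```

The derivative columns absorb small timing and shape mismatches between the assumed kernel and the measured response, which is why the abstract recommends including them.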

  16. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  17. Performance comparison of LUR and OK in PM2.5 concentration mapping: a multidimensional perspective

    PubMed Central

    Zou, Bin; Luo, Yanqing; Wan, Neng; Zheng, Zhong; Sternberg, Troy; Liao, Yilan

    2015-01-01

    Methods of Land Use Regression (LUR) modeling and Ordinary Kriging (OK) interpolation have been widely used to offset the shortcomings of PM2.5 data observed at sparse monitoring sites. However, the traditional point-based performance evaluation strategy for these methods remains stagnant, which could cause unreasonable mapping results. To address this challenge, this study employs ‘information entropy’, an area-based statistic, along with traditional point-based statistics (e.g. error rate, RMSE) to evaluate the performance of the LUR model and OK interpolation in mapping PM2.5 concentrations in Houston from a multidimensional perspective. The point-based validation reveals significant differences between LUR and OK at different test sites despite the similar end-result accuracy (e.g. error rate 6.13% vs. 7.01%). Meanwhile, the area-based validation demonstrates that the PM2.5 concentrations simulated by the LUR model exhibit more detailed variations than those interpolated by the OK method (i.e. information entropy, 7.79 vs. 3.63). Results suggest that LUR modeling could better refine the spatial distribution scenario of PM2.5 concentrations compared to OK interpolation. The significance of this study primarily lies in promoting the integration of point- and area-based statistics for model performance evaluation in air pollution mapping. PMID:25731103
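
The area-based statistic used here is the Shannon entropy of the mapped surface. A minimal sketch of that computation on two invented surfaces, one over-smoothed (kriging-like) and one more variable (regression-like):

```python
import math
from collections import Counter

def info_entropy(values, bins=16):
    """Shannon entropy (bits) of concentration values binned into equal intervals."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0        # guard against a constant surface
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Invented surfaces: an over-smoothed map clusters values near one level,
# while a more detailed map spreads them across many levels
smooth = [10.0] * 200 + [10 + 0.1 * i for i in range(56)]
varied = [10 + (i * 37 % 97) / 10 for i in range(256)]
```

Higher entropy indicates a surface carrying more distinct concentration levels, the sense in which the LUR map above (7.79) is more detailed than the OK map (3.63).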

  18. Data series embedding and scale invariant statistics.

    PubMed

    Michieli, I; Medved, B; Ristov, S

    2010-06-01

    Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. Several methods have been proposed for quantifying the scale invariant parameters of physiological signals. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and recently, in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility of the method falsely detecting long-range dependence in artificially generated short-range-dependence series was investigated. © 2009 Elsevier B.V. All rights reserved.
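
The embedding step itself is simple: each point of the series is mapped to a delay vector. A minimal sketch (the dimension and lag are arbitrary here, not the paper's choices):

```python
def embed(series, dim, lag):
    """Map a 1-D series into dim-dimensional time-delay vectors."""
    n = len(series) - (dim - 1) * lag      # vectors that fit before the end
    return [tuple(series[i + j * lag] for j in range(dim)) for i in range(n)]

x = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
vecs = embed(x, dim=3, lag=2)              # two delay vectors survive end effects
```

Scale-invariant behavior is then assessed from how statistics computed on the embedded point cloud change with scale; long-memory series and random series separate in that view, which is the paper's claim.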

  19. The effect of teaching medical ethics on medical students' moral reasoning.

    PubMed

    Self, D J; Wolinsky, F D; Baldwin, D C

    1989-12-01

    A study assessed the effect of incorporating medical ethics into the medical curriculum and the relative effects of two methods of implementing that curriculum, namely, lecture and case-study discussions. Results indicate a statistically significant increase (p less than or equal to .0001) in the level of moral reasoning of students exposed to the medical ethics course, regardless of format. Moreover, the unadjusted posttest scores indicated that the case-study method was significantly (p less than or equal to .03) more effective than the lecture method in increasing students' level of moral reasoning. When adjustments were made for the pretest scores, however, this difference was not statistically significant (p less than or equal to .18). Regression analysis by linear panel techniques revealed that age, gender, undergraduate grade-point average, and scores on the Medical College Admission Test were not related to the changes in moral-reasoning scores. All of the variance that could be explained was due to the students' being in one of the two experimental groups. In comparison with the control group, the change associated with each experimental format was statistically significant (lecture, p less than or equal to .004; case study, p less than or equal to .0001). Various explanations for these findings and their implications are given.

  20. Proposal for a recovery prediction method for patients affected by acute mediastinitis

    PubMed Central

    2012-01-01

    Background An attempt to find a method of predicting death risk in patients affected by acute mediastinitis; no such tool for this serious disease is described in the available literature. Methods The study comprised 44 consecutive cases of acute mediastinitis. General anamnesis and biochemical data were included. Factor analysis was used to extract the risk characteristic for the patients. The most valuable results were obtained for 8 parameters, which were selected for further statistical analysis (all collected within a few hours of admission). Three factors reached an Eigenvalue >1. Clinical explanations of these combined statistical factors are: Factor 1, proteinic status (serum total protein, albumin, and hemoglobin level); Factor 2, inflammatory status (white blood cells, CRP, procalcitonin); and Factor 3, general risk (age, number of coexisting diseases). Threshold values of the prediction factors were estimated by means of statistical analysis (factor analysis, Statgraphics Centurion XVI). Results The final prediction result for a patient is constructed as a simultaneous evaluation of all factor scores. High probability of death should be predicted if the Factor 1 value decreases with a simultaneous increase of Factors 2 and 3. The diagnostic power of the proposed method was revealed to be high overall [sensitivity = 90%, specificity = 64%]; for Factor 1 [SNC = 87%, SPC = 79%], for Factor 2 [SNC = 87%, SPC = 50%], and for Factor 3 [SNC = 73%, SPC = 71%]. Conclusion The proposed prediction method seems to be a useful emergency signal during acute mediastinitis control in affected patients. PMID:22574625

  1. Dental enamel defect diagnosis through different technology-based devices.

    PubMed

    Kobayashi, Tatiana Yuriko; Vitor, Luciana Lourenço Ribeiro; Carrara, Cleide Felício Carvalho; Silva, Thiago Cruvinel; Rios, Daniela; Machado, Maria Aparecida Andrade Moreira; Oliveira, Thais Marchini

    2018-06-01

    Dental enamel defects (DEDs) are faulty or deficient enamel formations of primary and permanent teeth. Changes during tooth development result in hypoplasia (a quantitative defect) and/or hypomineralisation (a qualitative defect). To compare technology-based diagnostic methods for detecting DEDs. Two-hundred and nine dental surfaces of anterior permanent teeth were selected in patients, 6-11 years of age, with cleft lip with/without cleft palate. First, a conventional clinical examination was conducted according to the modified Developmental Defects of Enamel Index (DDE Index). Dental surfaces were evaluated using an operating microscope and a fluorescence-based device. Interexaminer reproducibility was determined using the kappa test. To compare groups, McNemar's test was used. Cramer's V test was used for comparing the distribution of index codes obtained after classification of all dental surfaces. Cramer's V test revealed statistically significant differences (P < .0001) in the distribution of index codes obtained using the different methods; the coefficients were 0.365 for conventional clinical examination versus fluorescence, 0.961 for conventional clinical examination versus operating microscope and 0.358 for operating microscope versus fluorescence. The sensitivity of the operating microscope and fluorescence method was statistically significant (P = .008 and P < .0001, respectively). Otherwise, the results did not show statistically significant differences in accuracy and specificity for either the operating microscope or the fluorescence methods. This study suggests that the operating microscope performed better than the fluorescence-based device and could be an auxiliary method for the detection of DEDs. © 2017 FDI World Dental Federation.
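
Cramer's V, used above to compare the index-code distributions, is derived from the chi-squared statistic of a contingency table. A self-contained sketch (the tables are illustrative, not the study's data):

```python
import math

def cramers_v(table):
    """Cramer's V for a two-way contingency table given as a list of rows of counts."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    chi2 = sum(
        (table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
        for i in range(len(rows)) for j in range(len(cols))
    )
    k = min(len(rows), len(cols)) - 1
    return math.sqrt(chi2 / (n * k))

perfect = cramers_v([[20, 0], [0, 20]])        # two methods always agree -> V = 1
independent = cramers_v([[10, 10], [10, 10]])  # no association -> V = 0
```

On this scale, the reported 0.961 (clinical examination vs. operating microscope) indicates near-identical classifications, while 0.358-0.365 against fluorescence indicates weak agreement.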

  2. Neurosphere and adherent culture conditions are equivalent for malignant glioma stem cell lines.

    PubMed

    Rahman, Maryam; Reyner, Karina; Deleyrolle, Loic; Millette, Sebastien; Azari, Hassan; Day, Bryan W; Stringer, Brett W; Boyd, Andrew W; Johns, Terrance G; Blot, Vincent; Duggal, Rohit; Reynolds, Brent A

    2015-03-01

    Certain limitations of the neurosphere assay (NSA) have resulted in a search for alternative culture techniques for brain tumor-initiating cells (TICs). Recently, reports have described growing glioblastoma (GBM) TICs as a monolayer using laminin. We performed a side-by-side analysis of the NSA and laminin (adherent) culture conditions to compare the growth and expansion of GBM TICs. GBM cells were grown using the NSA and adherent culture conditions. Comparisons were made using growth in culture, apoptosis assays, protein expression, limiting dilution clonal frequency assay, genetic affymetrix analysis, and tumorigenicity in vivo. In vitro expansion curves for the NSA and adherent culture conditions were virtually identical (P=0.24) and the clonogenic frequencies (5.2% for NSA vs. 5.0% for laminin, P=0.9) were similar as well. Likewise, markers of differentiation (glial fibrillary acidic protein and beta tubulin III) and proliferation (Ki67 and MCM2) revealed no statistical difference between the sphere and attachment methods. Several different methods were used to determine the numbers of dead or dying cells (trypan blue, DiIC, caspase-3, and annexin V) with none of the assays noting a meaningful variance between the two methods. In addition, genetic expression analysis with microarrays revealed no significant differences between the two groups. Finally, glioma cells derived from both methods of expansion formed large invasive tumors exhibiting GBM features when implanted in immune-compromised animals. A detailed functional, protein and genetic characterization of human GBM cells cultured in serum-free defined conditions demonstrated no statistically meaningful differences when grown using sphere (NSA) or adherent conditions. Hence, both methods are functionally equivalent and remain suitable options for expanding primary high-grade gliomas in tissue culture.

  3. Neurosphere and adherent culture conditions are equivalent for malignant glioma stem cell lines

    PubMed Central

    Reyner, Karina; Deleyrolle, Loic; Millette, Sebastien; Azari, Hassan; Day, Bryan W.; Stringer, Brett W.; Boyd, Andrew W.; Johns, Terrance G.; Blot, Vincent; Duggal, Rohit; Reynolds, Brent A.

    2015-01-01

    Certain limitations of the neurosphere assay (NSA) have resulted in a search for alternative culture techniques for brain tumor-initiating cells (TICs). Recently, reports have described growing glioblastoma (GBM) TICs as a monolayer using laminin. We performed a side-by-side analysis of the NSA and laminin (adherent) culture conditions to compare the growth and expansion of GBM TICs. GBM cells were grown using the NSA and adherent culture conditions. Comparisons were made using growth in culture, apoptosis assays, protein expression, limiting dilution clonal frequency assay, genetic affymetrix analysis, and tumorigenicity in vivo. In vitro expansion curves for the NSA and adherent culture conditions were virtually identical (P=0.24) and the clonogenic frequencies (5.2% for NSA vs. 5.0% for laminin, P=0.9) were similar as well. Likewise, markers of differentiation (glial fibrillary acidic protein and beta tubulin III) and proliferation (Ki67 and MCM2) revealed no statistical difference between the sphere and attachment methods. Several different methods were used to determine the numbers of dead or dying cells (trypan blue, DiIC, caspase-3, and annexin V) with none of the assays noting a meaningful variance between the two methods. In addition, genetic expression analysis with microarrays revealed no significant differences between the two groups. Finally, glioma cells derived from both methods of expansion formed large invasive tumors exhibiting GBM features when implanted in immune-compromised animals. A detailed functional, protein and genetic characterization of human GBM cells cultured in serum-free defined conditions demonstrated no statistically meaningful differences when grown using sphere (NSA) or adherent conditions. Hence, both methods are functionally equivalent and remain suitable options for expanding primary high-grade gliomas in tissue culture. PMID:25806119

  4. Identifying Social Learning in Animal Populations: A New ‘Option-Bias’ Method

    PubMed Central

    Kendal, Rachel L.; Kendal, Jeremy R.; Hoppitt, Will; Laland, Kevin N.

    2009-01-01

    Background Studies of natural animal populations reveal widespread evidence for the diffusion of novel behaviour patterns, and for intra- and inter-population variation in behaviour. However, claims that these are manifestations of animal ‘culture’ remain controversial because alternative explanations to social learning remain difficult to refute. This inability to identify social learning in social settings has also contributed to the failure to test evolutionary hypotheses concerning the social learning strategies that animals deploy. Methodology/Principal Findings We present a solution to this problem, in the form of a new means of identifying social learning in animal populations. The method is based on the well-established premise of social learning research that, when ecological and genetic differences are accounted for, social learning will generate greater homogeneity in behaviour between animals than expected in its absence. Our procedure compares the observed level of homogeneity to a sampling distribution generated utilizing randomization and other procedures, allowing claims of social learning to be evaluated according to consensual standards. We illustrate the method on data from groups of monkeys provided with novel two-option extractive foraging tasks, demonstrating that social learning can indeed be distinguished from unlearned processes and asocial learning, and revealing that the monkeys only employed social learning for the more difficult tasks. The method is further validated against published datasets and through simulation, and exhibits higher statistical power than conventional inferential statistics. Conclusions/Significance The method is potentially a significant technological development, which could prove of considerable value in assessing the validity of claims for culturally transmitted behaviour in animal groups. 
It will also be of value in enabling investigation of the social learning strategies deployed in captive and natural animal populations. PMID:19657389
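
The core of the option-bias idea, comparing observed behavioural homogeneity against a randomization distribution, can be sketched compactly. The homogeneity statistic and the two-option group data below are simplified stand-ins for the published procedure:

```python
import random

def homogeneity(groups):
    """Within-group bias toward one of two options, summed across groups."""
    return sum(abs(g.count("A") - g.count("B")) for g in groups)

def option_bias_p(groups, n_perm=2000, seed=1):
    """One-sided randomization p-value for excess homogeneity."""
    pool = [choice for g in groups for choice in g]
    sizes = [len(g) for g in groups]
    observed = homogeneity(groups)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pool)                  # break any group structure
        it = iter(pool)
        permuted = [[next(it) for _ in range(s)] for s in sizes]
        hits += homogeneity(permuted) >= observed
    return hits / n_perm

# Hypothetical monkey groups, each strongly biased toward one option
groups = [list("AAAAAB"), list("BBBBBA"), list("AAAAAA")]
p = option_bias_p(groups)                  # small p suggests social learning
```

A small p-value says the groups are more internally uniform than random assortment of the same choices would produce, the signature attributed to social learning once ecological and genetic explanations are excluded.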

  5. Emerging technologies for pediatric and adult trauma care.

    PubMed

    Moulton, Steven L; Haley-Andrews, Stephanie; Mulligan, Jane

    2010-06-01

    Current Emergency Medical Service protocols rely on provider-directed care for evaluation, management and triage of injured patients from the field to a trauma center. New methods to quickly diagnose, support and coordinate the movement of trauma patients from the field to the most appropriate trauma center are in development. These methods will enhance trauma care and promote trauma system development. Recent advances in machine learning, statistical methods, device integration and wireless communication are giving rise to new methods for vital sign data analysis and a new generation of transport monitors. These monitors will collect and synchronize exponentially growing amounts of vital sign data with electronic patient care information. The application of advanced statistical methods to these complex clinical data sets has the potential to reveal many important physiological relationships and treatment effects. Several emerging technologies are converging to yield a new generation of smart sensors and tightly integrated transport monitors. These technologies will assist prehospital providers in quickly identifying and triaging the most severely injured children and adults to the most appropriate trauma centers. They will enable the development of real-time clinical support systems of increasing complexity, able to provide timelier, more cost-effective, autonomous care.

  6. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, e.g., by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
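
One of the suggested tests, running the unchanged analysis on simulated null data, is easy to sketch. The toy "pipeline" below (a majority-class classifier under leave-one-out) is not the authors' decoding analysis and deliberately ignores the features, yet the null-data check exposes a systematic below-chance bias of exactly the kind discussed:

```python
import random

def decode_accuracy(labels):
    """Toy stand-in pipeline: majority-class prediction under leave-one-out."""
    correct = 0
    for i in range(len(labels)):
        rest = labels[:i] + labels[i + 1:]
        pred = max(set(rest), key=rest.count)   # majority class of training folds
        correct += pred == labels[i]
    return correct / len(labels)

# Same Analysis Approach step: feed the identical pipeline null data (random,
# class-balanced labels) and check that accuracy sits near chance (0.5)
rng = random.Random(0)
null_accs = []
for _ in range(200):
    labels = [0] * 10 + [1] * 10
    rng.shuffle(labels)
    null_accs.append(decode_accuracy(labels))
mean_null = sum(null_accs) / len(null_accs)
```

With balanced classes, removing the test sample always leaves its own class in the minority, so this pipeline scores 0% rather than 50%; running the same analysis on null data reveals the bias before it contaminates the main result.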

  7. Gene selection for microarray cancer classification using a new evolutionary method employing artificial intelligence concepts.

    PubMed

    Dashtban, M; Balafar, Mohammadali

    2017-03-01

    Gene selection is a demanding task in microarray data analysis. The diverse complexity of different cancers keeps this issue challenging. In this study, a novel evolutionary method based on genetic algorithms and artificial intelligence is proposed to identify predictive genes for cancer classification. A filter method was first applied to reduce the dimensionality of the feature space, followed by an integer-coded genetic algorithm with dynamic-length genotype, intelligent parameter settings, and modified operators. The algorithmic behaviors, including convergence trends, mutation and crossover rate changes, and running time, were studied, conceptually discussed, and shown to be coherent with literature findings. Two well-known filter methods, Laplacian and Fisher score, were examined considering similarities, the quality of selected genes, and their influences on the evolutionary approach. Several statistical tests concerning choice of classifier, choice of dataset, and choice of filter method were performed, and they revealed some significant differences between the performance of different classifiers and filter methods over the datasets. The proposed method was benchmarked on five popular high-dimensional cancer datasets; for each, the top explored genes were reported. Comparing the experimental results with several state-of-the-art methods revealed that the proposed method outperforms previous methods on the DLBCL dataset. Copyright © 2017 Elsevier Inc. All rights reserved.
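
Of the two filter methods examined, the Fisher score is the simpler to sketch: it ranks each gene by between-class separation relative to within-class spread. A minimal two-class version on invented expression values:

```python
def fisher_score(values, labels):
    """Fisher criterion for one gene over two classes (labels 0/1)."""
    a = [v for v, y in zip(values, labels) if y == 0]
    b = [v for v, y in zip(values, labels) if y == 1]
    mean = lambda xs: sum(xs) / len(xs)
    var = lambda xs: sum((x - mean(xs)) ** 2 for x in xs) / len(xs)
    denom = var(a) + var(b)
    return (mean(a) - mean(b)) ** 2 / denom if denom else float("inf")

labels = [0, 0, 0, 1, 1, 1]
separated = fisher_score([1.0, 1.1, 0.9, 3.0, 3.2, 2.8], labels)  # informative gene
noisy = fisher_score([1.0, 3.0, 2.0, 2.0, 1.0, 3.0], labels)      # uninformative gene
```

The filter keeps only the top-scoring genes before the genetic algorithm searches over them, which is the dimensionality-reduction step described above.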

  8. The Legacy of Florence Nightingale's Environmental Theory: Nursing Research Focusing on the Impact of Healthcare Environments.

    PubMed

    Zborowsky, Terri

    2014-01-01

    The purpose of this paper is to explore nursing research that is focused on the impact of healthcare environments and that has resonance with the aspects of Florence Nightingale's environmental theory. Nurses have a unique ability to apply their observational skills to understand the role of the designed environment to enable healing in their patients. This affords nurses the opportunity to engage in research studies that have immediate impact on the act of nursing. Descriptive statistics were performed on 67 healthcare design-related research articles from 25 nursing journals to discover the topical areas of interest of nursing research today. Data were also analyzed to reveal the research designs, research methods, and research settings. These data are part of an ongoing study. Descriptive statistics reveal that topics and settings most frequently cited are in keeping with the current healthcare foci of patient care quality and safety in acute and intensive care environments. Research designs and methods most frequently cited are in keeping with the early progression of a knowledge area. A few assertions can be made as a result of this study. First, education is important to continue the knowledge development in this area. Second, multiple method research studies should continue to be considered as important to healthcare research. Finally, bedside nurses are in the best position possible to begin to help us all, through research, understand how the design environment impacts patients during the act of nursing. Evidence-based design, literature review, nursing.

  9. A Rapid Colorimetric Method Reveals Fraudulent Substitutions in Sea Urchin Roe Marketed in Sardinia (Italy).

    PubMed

    Meloni, Domenico; Spina, Antonio; Satta, Gianluca; Chessa, Vittorio

    2016-06-25

    In recent years, besides the consumption of fresh sea urchin specimens, the demand for minimally-processed roe has grown considerably. This product has made frequent consumption in restaurants possible, and fraud is becoming widespread, with the partial replacement of sea urchin roe by surrogates that are similar in colour. One of the main factors that determines the quality of the roe is its colour, and small differences in colour scale cannot be easily discerned by consumers. In this study we applied a rapid colorimetric method to reveal the fraudulent partial substitution of semi-solid sea urchin roe with liquid egg yolk. Objective assessment of lightness (L*), redness (a*), yellowness (b*), hue (h*), and chroma (C*) was carried out with a digital spectrophotometer using the CIE L*a*b* colour measurement system. The colorimetric method highlighted statistically significant differences between sea urchin roe and liquid egg yolk that could be easily discerned quantitatively.
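
In the CIE L*a*b* system, chroma, hue, and the overall colour difference between two readings follow directly from the three coordinates. A sketch with invented readings (not the paper's measurements):

```python
import math

def delta_e(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) readings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical spectrophotometer readings
roe = (52.3, 18.4, 41.2)     # (L*, a*, b*)
yolk = (61.8, 10.1, 55.6)

chroma_roe = math.hypot(roe[1], roe[2])             # C* = sqrt(a*^2 + b*^2)
hue_roe = math.degrees(math.atan2(roe[2], roe[1]))  # h* in degrees
dE = delta_e(roe, yolk)
```

Differences of a few ΔE units are generally taken as perceptible; an instrument resolves far smaller gaps than a consumer's eye, which is the method's point.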

  10. An application of statistics to comparative metagenomics

    PubMed Central

    Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A

    2006-01-01

    Background Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Results Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. Conclusion The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems. PMID:16549025

  11. An application of statistics to comparative metagenomics.

    PubMed

    Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A

    2006-03-20

    Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems.
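
The per-subsystem comparison described in both records reduces to testing whether a subsystem's share of annotated sequences differs between two metagenomes. A minimal two-proportion z-test sketch (the counts are invented; the published work uses its own statistics):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for a difference in subsystem representation between two samples."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: hits to one subsystem out of all annotated sequences
z = two_proportion_z(120, 10000, 45, 8000)
significant = abs(z) > 1.96   # ~5% two-sided threshold, before any
                              # multiple-testing correction across subsystems
```

Subsystems flagged this way are the "more, or less, represented" categories from which the testable physiological hypotheses are drawn.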

  12. Stiffening of fluid membranes due to thermal undulations: density-matrix renormalization-group study.

    PubMed

    Nishiyama, Yoshihiro

    2002-12-01

    It has been considered that the effective bending rigidity of fluid membranes should be reduced by thermal undulations. However, recent thorough investigation by Pinnow and Helfrich revealed the significance of measure factors for the partition sum. Accepting the local curvature as a statistical measure, they found that fluid membranes are stiffened macroscopically. In order to examine this remarkable idea, we performed extensive ab initio simulations for a fluid membrane. We set up a transfer matrix that is diagonalized by means of the density-matrix renormalization group. Our method has an advantage, in that it allows us to survey various statistical measures. As a consequence, we found that the effective bending rigidity flows toward strong coupling under the choice of local curvature as a statistical measure. On the contrary, for other measures such as normal displacement and tilt angle, we found a clear tendency toward softening.

  13. Statistical Methods for the Analysis of Discrete Choice Experiments: A Report of the ISPOR Conjoint Analysis Good Research Practices Task Force.

    PubMed

    Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P

    2016-06-01

    Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
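
    The conditional logit mentioned above models the probability of choosing alternative j in a task as the exponentiated utility of its attribute levels, normalised over the alternatives presented. A minimal sketch; the attribute levels and part-worth coefficients below are invented for illustration:

```python
import math

def conditional_logit_probs(attribute_rows, beta):
    """Choice probabilities for one DCE choice task under conditional logit:
    P(alternative j) = exp(x_j . beta) / sum_k exp(x_k . beta)."""
    utilities = [sum(b * x for b, x in zip(beta, row)) for row in attribute_rows]
    m = max(utilities)                      # subtract max for numerical stability
    expu = [math.exp(u - m) for u in utilities]
    total = sum(expu)
    return [e / total for e in expu]

# Hypothetical task: 3 alternatives described by (efficacy, risk, cost) levels,
# with illustrative part-worth coefficients beta.
probs = conditional_logit_probs(
    [[0.8, 0.1, 10.0], [0.6, 0.05, 5.0], [0.4, 0.2, 2.0]],
    beta=[2.0, -8.0, -0.1],
)
```

    Estimation then chooses beta to maximise the likelihood of the observed choices across all tasks and respondents.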

  14. [Interlaboratory Study on Evaporation Residue Test for Food Contact Products (Report 1)].

    PubMed

    Ohno, Hiroyuki; Mutsuga, Motoh; Abe, Tomoyuki; Abe, Yutaka; Amano, Homare; Ishihara, Kinuyo; Ohsaka, Ikue; Ohno, Haruka; Ohno, Yuichiro; Ozaki, Asako; Kakihara, Yoshiteru; Kobayashi, Hisashi; Sakuragi, Hiroshi; Shibata, Hiroshi; Shirono, Katsuhiro; Sekido, Haruko; Takasaka, Noriko; Takenaka, Yu; Tajima, Yoshiyasu; Tanaka, Aoi; Tanaka, Hideyuki; Tonooka, Hiroyuki; Nakanishi, Toru; Nomura, Chie; Haneishi, Nahoko; Hayakawa, Masato; Miura, Toshihiko; Yamaguchi, Miku; Watanabe, Kazunari; Sato, Kyoko

    2018-01-01

    An interlaboratory study was performed to evaluate the equivalence between an official method and a modified method of evaporation residue test using three food-simulating solvents (water, 4% acetic acid and 20% ethanol), based on the Japanese Food Sanitation Law for food contact products. Twenty-three laboratories participated, and tested the evaporation residues of nine test solutions as blind duplicates. For evaporation, a water bath was used in the official method, and a hot plate in the modified method. In most laboratories, the test solutions were heated until just prior to evaporation to dryness, and then allowed to dry under residual heat. Statistical analysis revealed that there was no significant difference between the two methods, regardless of the heating equipment used. Accordingly, the modified method provides performance equal to the official method, and is available as an alternative method.

  15. The Math Problem: Advertising Students' Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Kendrick, Alice

    2013-01-01

    This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…

  16. Quantifying the influences of various ecological factors on land surface temperature of urban forests.

    PubMed

    Ren, Yin; Deng, Lu-Ying; Zuo, Shu-Di; Song, Xiao-Dong; Liao, Yi-Lan; Xu, Cheng-Dong; Chen, Qi; Hua, Li-Zhong; Li, Zheng-Wei

    2016-09-01

    Identifying factors that influence the land surface temperature (LST) of urban forests can help improve simulations and predictions of spatial patterns of urban cool islands. This requires a quantitative analytical method that combines spatial statistical analysis with multi-source observational data. The purpose of this study was to reveal how human activities and ecological factors jointly influence LST in clustering regions (hot or cool spots) of urban forests. Using Xiamen City, China from 1996 to 2006 as a case study, we explored the interactions between human activities and ecological factors, as well as their influences on urban forest LST. Population density was selected as a proxy for human activity. We integrated multi-source data (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) to develop a database on a unified urban scale. The driving mechanism of urban forest LST was revealed through a combination of multi-source spatial data and spatial statistical analysis of clustering regions. The results showed that the main factors contributing to urban forest LST were dominant tree species and elevation. The interactions between human activity and specific ecological factors linearly or nonlinearly increased LST in urban forests. Strong interactions between elevation and dominant species were generally observed and were prevalent in either hot or cold spot areas in different years. In conclusion, quantitative studies based on spatial statistics and GeogDetector models should be conducted in urban areas to reveal interactions between human activities, ecological factors, and LST. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Separation and confirmation of showers

    NASA Astrophysics Data System (ADS)

    Neslušan, L.; Hajduková, M.

    2017-02-01

    Aims: Using IAU MDC photographic, IAU MDC CAMS video, SonotaCo video, and EDMOND video databases, we aim to separate all provable annual meteor showers from each of these databases. We intend to reveal the problems inherent in this procedure and answer the question whether the databases are complete and the methods of separation used are reliable. We aim to evaluate the statistical significance of each separated shower. In this respect, we intend to give a list of reliably separated showers rather than a list of the maximum possible number of showers. Methods: To separate the showers, we simultaneously used two methods. The use of two methods enables us to compare their results, and this can indicate the reliability of the methods. To evaluate the statistical significance, we suggest a new method based on the ideas of the break-point method. Results: We give a compilation of the showers from all four databases using both methods. Using the first (second) method, we separated 107 (133) showers, which are in at least one of the databases used. These relatively low numbers are a consequence of discarding any candidate shower with a poor statistical significance. Most of the separated showers were identified as meteor showers from the IAU MDC list of all showers. Many of them were identified as several of the showers in the list. This proves that many showers have been named multiple times with different names. Conclusions: At present, a prevailing share of existing annual showers can be found in the data and confirmed when we use a combination of results from large databases. However, to gain a complete list of showers, we need more-complete meteor databases than the most extensive databases currently are. We also still need a more sophisticated method to separate showers and evaluate their statistical significance. 
Tables A.1 and A.2 are also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A40

  18. A Statistical Approach to Provide Individualized Privacy for Surveys

    PubMed Central

    Esponda, Fernando; Huerta, Kael; Guerrero, Victor M.

    2016-01-01

    In this paper we propose an instrument for collecting sensitive data that allows for each participant to customize the amount of information that she is comfortable revealing. Current methods adopt a uniform approach where all subjects are afforded the same privacy guarantees; however, privacy is a highly subjective property with intermediate points between total disclosure and non-disclosure: each respondent has a different criterion regarding the sensitivity of a particular topic. The method we propose empowers respondents in this respect while still allowing for the discovery of interesting findings through the application of well-known inferential procedures. PMID:26824758
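
    One classical building block for this kind of instrument is randomized response with a per-respondent honesty probability; the paper's instrument is more general, so the following is only a hedged sketch of how individualized noise still permits an unbiased population estimate:

```python
import random

def estimate_prevalence(answers, honesty_probs):
    """Unbiased prevalence estimate under a forced-choice randomized-response
    scheme: respondent i answers truthfully with probability p_i, otherwise
    reports a fair coin flip, so E[answer_i] = p_i * x_i + (1 - p_i) / 2."""
    adjusted = [(y - (1 - p) / 2) / p for y, p in zip(answers, honesty_probs)]
    return sum(adjusted) / len(adjusted)

random.seed(42)
true_rate = 0.30                             # simulated sensitive-trait prevalence
answers, probs = [], []
for _ in range(20_000):
    p = random.choice([0.5, 0.7, 0.9])       # each respondent picks own privacy level
    x = 1 if random.random() < true_rate else 0
    truthful = random.random() < p
    y = x if truthful else (1 if random.random() < 0.5 else 0)
    answers.append(y)
    probs.append(p)

estimate = estimate_prevalence(answers, probs)
```

    Lower honesty probabilities give stronger individual deniability at the cost of a noisier aggregate estimate.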

  19. A Conway-Maxwell-Poisson (CMP) model to address data dispersion on positron emission tomography.

    PubMed

    Santarelli, Maria Filomena; Della Latta, Daniele; Scipioni, Michele; Positano, Vincenzo; Landini, Luigi

    2016-10-01

    Positron emission tomography (PET) in medicine exploits the properties of positron-emitting unstable nuclei. The pairs of γ-rays emitted after annihilation are revealed by coincidence detectors and stored as projections in a sinogram. It is well known that radioactive decay follows a Poisson distribution; however, deviation from Poisson statistics occurs on PET projection data prior to reconstruction due to physical effects, measurement errors, correction of deadtime, scatter, and random coincidences. A model that describes the statistical behavior of measured and corrected PET data can aid in understanding the statistical nature of the data: it is a prerequisite to develop efficient reconstruction and processing methods and to reduce noise. The deviation from Poisson statistics in PET data could be described by the Conway-Maxwell-Poisson (CMP) distribution model, which is characterized by the centring parameter λ and the dispersion parameter ν, the latter quantifying the deviation from a Poisson distribution model. In particular, the parameter ν allows quantifying over-dispersion (ν<1) or under-dispersion (ν>1) of data. A simple and efficient method for λ and ν parameters estimation is introduced and assessed using Monte Carlo simulation for a wide range of activity values. The application of the method to simulated and experimental PET phantom data demonstrated that the CMP distribution parameters could detect deviation from the Poisson distribution both in raw and corrected PET data. It may be usefully implemented in image reconstruction algorithms and quantitative PET data analysis, especially in low counting emission data, as in dynamic PET data, where the method demonstrated the best accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
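
    Since the CMP pmf is proportional to λ^k/(k!)^ν, its dispersion behaviour can be checked directly by truncated summation. The parameter values below are arbitrary, and this is a numerical sketch, not the authors' estimation procedure:

```python
import math

def cmp_pmf(lam, nu, max_k=200):
    """Truncated Conway-Maxwell-Poisson pmf: weights lam**k / (k!)**nu,
    computed in log space for stability and normalised over k = 0..max_k."""
    logw = [k * math.log(lam) - nu * math.lgamma(k + 1) for k in range(max_k + 1)]
    m = max(logw)
    w = [math.exp(v - m) for v in logw]
    z = sum(w)
    return [v / z for v in w]

def mean_var(pmf):
    mean = sum(k * p for k, p in enumerate(pmf))
    var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
    return mean, var

# nu = 1 recovers the Poisson (variance equals mean); nu < 1 is over-dispersed
m1, v1 = mean_var(cmp_pmf(lam=10.0, nu=1.0))
m2, v2 = mean_var(cmp_pmf(lam=10.0, nu=0.8))
```

    With ν = 0.8 the variance-to-mean ratio exceeds 1, the over-dispersion the abstract associates with corrected PET data.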

  20. Estimation of the vortex length scale and intensity from two-dimensional samples

    NASA Technical Reports Server (NTRS)

    Reuss, D. L.; Cheng, W. P.

    1992-01-01

    A method is proposed for estimating flow features that influence flame wrinkling in reciprocating internal combustion engines, where traditional statistical measures of turbulence are suspect. Candidate methods were tested in a computed channel flow where traditional turbulence measures are valid and performance can be rationally evaluated. Two concepts are tested. First, spatial filtering is applied to the two-dimensional velocity distribution and found to reveal structures corresponding to the vorticity field. Decreasing the spatial-frequency cutoff of the filter locally changes the character and size of the flow structures that are revealed by the filter. Second, vortex length scale and intensity are estimated by computing the ensemble-average velocity distribution conditionally sampled on the vorticity peaks. The resulting conditionally sampled 'average vortex' has a peak velocity less than half the rms velocity and a size approximately equal to the two-point-correlation integral-length scale.

  1. Methods of comparing associative models and an application to retrospective revaluation.

    PubMed

    Witnauer, James E; Hutchings, Ryan; Miller, Ralph R

    2017-11-01

    Contemporary theories of associative learning are increasingly complex, which necessitates the use of computational methods to reveal predictions of these models. We argue that comparisons across multiple models in terms of goodness of fit to empirical data from experiments often reveal more about the actual mechanisms of learning and behavior than do simulations of only a single model. Such comparisons are best made when the values of free parameters are discovered through some optimization procedure based on the specific data being fit (e.g., hill climbing), so that the comparisons hinge on the psychological mechanisms assumed by each model rather than being biased by using parameters that differ in quality across models with respect to the data being fit. Statistics like the Bayesian information criterion facilitate comparisons among models that have different numbers of free parameters. These issues are examined using retrospective revaluation data. Copyright © 2017 Elsevier B.V. All rights reserved.
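
    The BIC mentioned above penalises each free parameter by ln(n), so a better-fitting but more complex model can still lose the comparison. A toy comparison with invented log-likelihoods and trial counts:

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k * ln(n) - 2 * ln(L); lower is better."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Hypothetical fits of two associative-learning models to the same 400 trials:
# the 4-parameter model fits slightly better but pays a larger complexity penalty.
bic_simple = bic(log_likelihood=-520.0, n_params=2, n_obs=400)
bic_complex = bic(log_likelihood=-516.0, n_params=4, n_obs=400)
```

    Here the 4 log-likelihood units gained do not repay the two extra parameters, so the simpler model is preferred.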

  2. [Population genetics of the inhabitants of Northern European USSR. II. Blood group distribution and antropogenetic characteristics in 6 villages in Archangel Oblast].

    PubMed

    Revazov, A A; Pasekov, V P; Lukasheva, I D

    1975-01-01

    The paper deals with the distribution of genetic markers (the ABO, MN, Rh (D), Hp, and PTC systems) and a number of anthropogenetic characters (arm folding, hand clasping, tongue rolling, right- and left-handedness, ear lobe type, and dermatoglyphic pattern types) in the inhabitants of 6 villages in the Mezen District of the Archangelsk Region of the RSFSR (river Peosa basin). The data were obtained in the course of examining over 800 persons. Differences in the interpretation of results from generally adopted methods of statistical analysis of samples from small populations are discussed. Among the systems analysed, one third of all cases showed a statistically significant deviation from Hardy-Weinberg ratios; for the MN blood groups and haptoglobins this was caused by an excess of heterozygotes. Testing Hardy-Weinberg ratios at the level of two-locus phenotypes revealed no statistically significant deviations either in separate villages or in all villages taken together. Analysis of heterogeneity with respect to Mendelian markers revealed statistically significant differences between villages in all systems except haptoglobins. Considerable heterogeneity was also found in the distribution of family names, with the frequencies of some varying from 0 to 90% between villages. Statistically significant differences between villages were shown for all anthropogenetic characters except arm folding, hand clasping, and right- and left-handedness. Considering the uniformity of environmental pressure in the region examined, the heterogeneity of the population studied is apparently associated with random genetic differentiation (genetic drift) and, possibly, with a founder (progenitor) effect.

  3. Measurement of turbulent spatial structure and kinetic energy spectrum by exact temporal-to-spatial mapping

    NASA Astrophysics Data System (ADS)

    Buchhave, Preben; Velte, Clara M.

    2017-08-01

    We present a method for converting a time record of turbulent velocity measured at a point in a flow to a spatial velocity record consisting of consecutive convection elements. The spatial record allows computation of dynamic statistical moments such as turbulent kinetic wavenumber spectra and spatial structure functions in a way that completely bypasses the need for Taylor's hypothesis. The spatial statistics agree with the classical counterparts, such as the total kinetic energy spectrum, at least for spatial extents up to the Taylor microscale. The requirements for applying the method are access to the instantaneous velocity magnitude, in addition to the desired flow quantity, and a high temporal resolution in comparison to the relevant time scales of the flow. We map, without distortion and bias, notoriously difficult developing turbulent high intensity flows using three main aspects that distinguish these measurements from previous work in the field: (1) The measurements are conducted using laser Doppler anemometry and are therefore not contaminated by directional ambiguity (in contrast to, e.g., frequently employed hot-wire anemometers); (2) the measurement data are extracted using a correctly and transparently functioning processor and are analysed using methods derived from first principles to provide unbiased estimates of the velocity statistics; (3) the exact mapping proposed herein has been applied to the high turbulence intensity flows investigated to avoid the significant distortions caused by Taylor's hypothesis. The method is first confirmed to produce the correct statistics using computer simulations and later applied to measurements in some of the most difficult regions of a round turbulent jet—the non-equilibrium developing region and the outermost parts of the developed jet. 
The proposed mapping is successfully validated using corresponding directly measured spatial statistics in the fully developed jet, even in the difficult outer regions of the jet where the average convection velocity is negligible and turbulence intensities increase dramatically. The measurements in the developing region reveal interesting features of an incomplete Richardson-Kolmogorov cascade under development.

  4. An Investigation of Dental Luting Cement Solubility as a Function of the Marginal Gap.

    DTIC Science & Technology

    1988-05-01

    way ANOVA for the Phase 1 Diffusion Study revealed that there were statistically significant differences between the test groups. A Duncan's Multiple...cement. The 25, 50, and 75 micron groups demonstrated no statistically significant differences in the amount of remaining luting cement (p < 0.05). A...one-way ANOVA was also performed on the Phase 2 Dynamic Study. This test revealed that there were statistically significant differences among the test

  5. Extracting fingerprint of wireless devices based on phase noise and multiple level wavelet decomposition

    NASA Astrophysics Data System (ADS)

    Zhao, Weichen; Sun, Zhuo; Kong, Song

    2016-10-01

    Wireless devices can be identified by a fingerprint extracted from the transmitted signal, which is useful in wireless communication security and other fields. This paper presents a method that extracts a fingerprint based on the phase noise of the signal and multiple-level wavelet decomposition. The phase of the signal is extracted first and then decomposed by multiple-level wavelet decomposition. A statistic of each wavelet coefficient vector is used to construct the fingerprint. The relationship between wavelet decomposition level and recognition accuracy is also simulated, and an advisable decomposition level is identified. Compared with previous methods, our method is simpler, and recognition accuracy remains high when the signal-to-noise ratio (SNR) is low.
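
    As a rough sketch of the decomposition step only: the Haar wavelet and the mean absolute detail value as the per-level statistic are both assumptions of this example, not choices stated in the abstract:

```python
def haar_decompose(signal, levels):
    """Multi-level Haar wavelet decomposition: at each level, split the current
    approximation into pairwise averages (next approximation) and pairwise
    differences (detail coefficients). len(signal) must be divisible by 2**levels."""
    approx = list(signal)
    details = []
    for _ in range(levels):
        avg = [(approx[2 * i] + approx[2 * i + 1]) / 2 for i in range(len(approx) // 2)]
        dif = [(approx[2 * i] - approx[2 * i + 1]) / 2 for i in range(len(approx) // 2)]
        details.append(dif)
        approx = avg
    return approx, details

def fingerprint(signal, levels):
    """One simple statistic (mean absolute value) per detail level,
    forming a crude per-device feature vector."""
    _, details = haar_decompose(signal, levels)
    return [sum(abs(d) for d in lev) / len(lev) for lev in details]

# Toy phase-noise record, two decomposition levels
fp = fingerprint([1.0, 3.0, 2.0, 2.0, 4.0, 0.0, 1.0, 1.0], levels=2)
```

    A classifier would then compare such feature vectors across devices.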

  6. Probabilistic analysis and fatigue damage assessment of offshore mooring system due to non-Gaussian bimodal tension processes

    NASA Astrophysics Data System (ADS)

    Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng

    2017-08-01

    Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.

  7. A novel approach to detect test-seeking behaviour in the blood donor population: making the invisible visible.

    PubMed

    de Vos, A S; Lieshout-Krikke, R W; Slot, E; Cator, E A; Janssen, M P

    2016-10-01

    Individuals may donate blood in order to determine their infection status after exposure to an increased infection risk. Such test-seeking behaviour decreases transfusion safety. Instances of test seeking are difficult to substantiate as donors are unlikely to admit to such behaviour. However, manifestation in a population of repeat donors may be determined using statistical inference. Test-seeking donors would be highly motivated to donate following infection risk, influencing the timing of their donation. Donation intervals within 2005-2014 of all Dutch blood donors who acquired syphilis (N = 50), HIV (N = 13), HTLV (N = 4) or HCV (N = 2) were compared to donation intervals of uninfected blood donors (N = 7 327 836) using the Anderson-Darling test. We adjusted for length bias as well as for age, gender and donation type of the infected. Additionally, the power of the proposed method was investigated by simulation. Among the Dutch donors who acquired infection, we found only a non-significant overrepresentation of short donation intervals (P = 0·54). However, we show by simulation that both relatively short and long donation intervals among infected donors can reveal test seeking. The power of the method is >90% if among 69 infected donors >35 (51%) are test seeking, or if among 320 infected donors >90 (30%) are test seeking. We show how statistical analysis may be used to reveal the extent of test seeking in repeat blood donor populations. In the Dutch setting, indications for test-seeking behaviour were not statistically significant. This may, however, be due to the low number of infected individuals. © 2016 International Society of Blood Transfusion.
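
    The Anderson-Darling comparison of donation intervals can be illustrated with the unadjusted two-sample rank statistic (Pettitt's form, assuming no ties); the authors' analysis additionally corrects for length bias and donor covariates:

```python
def anderson_darling_2samp(x, y):
    """Two-sample Anderson-Darling statistic (Pettitt's rank form, no ties):
    A2 = (1/(n*m)) * sum_{i=1}^{N-1} (M_i*N - n*i)**2 / (i*(N-i)),
    where M_i counts first-sample values among the i smallest pooled values.
    Larger A2 means the two empirical distributions differ more."""
    n, m = len(x), len(y)
    N = n + m
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    a2, M = 0.0, 0
    for i in range(1, N):
        if pooled[i - 1][1] == 0:   # the i-th smallest came from the first sample
            M += 1
        a2 += (M * N - n * i) ** 2 / (i * (N - i))
    return a2 / (n * m)

# Hypothetical donation intervals (days): well-separated vs interleaved samples
a2_sep = anderson_darling_2samp([30, 40, 50, 60, 70], [110, 120, 130, 140, 150])
a2_mix = anderson_darling_2samp([30, 50, 70, 90, 110], [40, 60, 80, 100, 120])
```

    The statistic depends only on ranks, so shifting all intervals by a constant leaves it unchanged.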

  8. Failure of antiarrhythmic drugs to prevent experimental reperfusion ventricular fibrillation.

    PubMed

    Naito, M; Michelson, E L; Kmetzo, J J; Kaplinsky, E; Dreifus, L S

    1981-01-01

    Ninety-nine adult mongrel dogs underwent acute ligation of the proximal left anterior descending coronary artery. Thirty minutes later, the occlusion was released to evaluate the effectiveness of five antiarrhythmic protocols in eliminating reperfusion ventricular fibrillation. The five protocols included: protocol 1 -- i.v. lidocaine, preligation and prerelease (n = 19); protocol 2 -- i.v. lidocaine, prereperfusion only (n = 22); protocol 3 -- chronic, oral, daily amiodarone for 2 weeks preligation (n = 19); protocol 4 -- i.v. procainamide, preligation and prereperfusion (n = 21); and protocol 5 -- i.v. verapamil, prereperfusion (n = 18). Each regimen was evaluated with respect to the incidence of reperfusion ventricular fibrillation in dogs that survived to reperfusion, and the results were compared to 77 control dogs that underwent identical coronary artery occlusion and release procedures without drug therapy. The incidence of reperfusion ventricular fibrillation was as follows: protocol 1 -- seven of 15 dogs (47%); protocol 2 -- six of 18 dogs (33%); protocol 3 -- 11 of 16 dogs (69%); protocol 4 -- eight of 17 dogs (47%); and protocol 5 -- 10 of 17 dogs (59%), compared with 36 of 60 (60%) in control dogs. Using chi-square analysis, protocol 2 was beneficial (p < 0.05). The dogs were then stratified into high- and low-risk subgroups based on the arrhythmic events of the antecedent coronary artery ligation periods, and predictive risk indexes for the occurrence of reperfusion ventricular fibrillation were developed. The Mantel-Haenszel method of statistical analysis revealed that none of these protocols resulted in a statistically significant reduction in the incidence of reperfusion ventricular fibrillation. Thus, use of these predictive indexes plus appropriate statistical methods has revealed, unexpectedly, limitations in the efficacy of a spectrum of antiarrhythmic agents in preventing reperfusion ventricular fibrillation.
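
    The Mantel-Haenszel analysis pools treatment-control 2x2 tables across risk strata. A stdlib-only sketch with invented strata in which event rates barely differ between arms, mirroring the null result reported above:

```python
import math

def mantel_haenszel_chi2(strata):
    """Mantel-Haenszel chi-square (1 df, no continuity correction) over a set
    of 2x2 tables (a, b, c, d): a/b = events/non-events under treatment,
    c/d = events/non-events in controls, one table per risk stratum."""
    diff, var = 0.0, 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        diff += a - (a + b) * (a + c) / n                      # observed - expected
        var += (a + b) * (c + d) * (a + c) * (b + d) / (n ** 2 * (n - 1))
    chi2 = diff ** 2 / var
    p_value = math.erfc(math.sqrt(chi2 / 2))                   # chi-square, 1 df
    return chi2, p_value

# Hypothetical high- and low-risk strata with similar event rates in both arms
chi2, p = mantel_haenszel_chi2([(10, 2, 11, 3), (3, 9, 5, 10)])
```

    Stratifying first prevents an imbalance in risk-group membership from masquerading as a drug effect.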

  9. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.

  10. Estimation of Signal Coherence Threshold and Concealed Spectral Lines Applied to Detection of Turbofan Engine Combustion Noise

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2010-01-01

    Combustion noise from turbofan engines has become important, as the noise from sources like the fan and jet are reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help to separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics based coherence threshold estimation method. In addition, the un-aligned coherence procedure at the same time also reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics based coherence threshold. Examples are presented showing that the underlying tonal and coherent broad band structure which is buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise. Copyright 2011 Acoustical Society of America. This article may be downloaded for personal use only. Any other use requires prior permission of the author and the Acoustical Society of America.
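
    For magnitude-squared coherence estimated from n independent segment averages, there is a standard closed-form null threshold, 1 - alpha**(1/(n-1)). This textbook result illustrates why the choice between independent and overlapped record counts matters, though it is not the paper's exact aligned/un-aligned procedure:

```python
def coherence_threshold(n_averages, alpha=0.05):
    """Null-hypothesis threshold for magnitude-squared coherence estimated
    from n independent (non-overlapping) segment averages:
    P(estimate > threshold | true coherence = 0) = alpha
    gives threshold = 1 - alpha**(1/(n-1))."""
    return 1.0 - alpha ** (1.0 / (n_averages - 1))

# More averaged segments push the spurious-coherence floor down
thr_20 = coherence_threshold(20)
thr_200 = coherence_threshold(200)
```

    Counting overlapped records as if independent would overstate n and set the threshold too low, flagging spurious coherence.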

  11. Evaluation of dissolution profile similarity - Comparison between the f2, the multivariate statistical distance and the f2 bootstrapping methods.

    PubMed

    Paixão, Paulo; Gouveia, Luís F; Silva, Nuno; Morais, José A G

    2017-03-01

    A simulation study is presented, evaluating the performance of the f2, the model-independent multivariate statistical distance and the f2 bootstrap methods in the ability to conclude similarity between two dissolution profiles. Different dissolution profiles, based on the Noyes-Whitney equation and ranging from theoretical f2 values between 100 and 40, were simulated. Variability was introduced in the dissolution model parameters in an increasing order, ranging from a situation complying with the European guidelines requirements for the use of the f2 metric to several situations where the f2 metric could not be used anymore. Results have shown that the f2 is an acceptable metric when used according to the regulatory requirements, but loses its applicability when variability increases. The multivariate statistical distance presented contradictory results in several of the simulation scenarios, which makes it an unreliable metric for dissolution profile comparisons. The bootstrap f2, although conservative in its conclusions, is a suitable alternative method. Overall, as variability increases, all of the discussed methods reveal problems that can only be solved by increasing the number of dosage form units used in the comparison, which is usually not practical or feasible. Additionally, experimental corrective measures may be undertaken in order to reduce the overall variability, particularly when it is shown that it is mainly due to the dissolution assessment instead of being intrinsic to the dosage form. Copyright © 2016. Published by Elsevier B.V.
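
    The f2 similarity factor itself is a simple log-transformed mean squared difference between percent-dissolved profiles. A sketch with invented profiles (f2 >= 50 is the conventional similarity cut-off, corresponding to an average difference of about 10%):

```python
import math

def f2_similarity(reference, test):
    """Dissolution similarity factor:
    f2 = 50 * log10( 100 / sqrt(1 + mean squared difference) ).
    Identical profiles give f2 = 100; f2 >= 50 is conventionally 'similar'."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Hypothetical mean percent-dissolved profiles at common time points
ref = [20.0, 45.0, 70.0, 85.0, 92.0]
tst = [18.0, 40.0, 65.0, 82.0, 90.0]
f2 = f2_similarity(ref, tst)
```

    The bootstrap variant discussed above resamples units and takes a lower confidence bound of this same statistic.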

  12. A comparison of the accuracy of intraoral scanners using an intraoral environment simulator

    PubMed Central

    Park, Hye-Nan; Lim, Young-Jun; Yi, Won-Jin

    2018-01-01

    PURPOSE The aim of this study was to design an intraoral environment simulator and to assess the accuracy of two intraoral scanners using the simulator. MATERIALS AND METHODS A box-shaped intraoral environment simulator was designed to simulate two specific intraoral environments. The cast was scanned 10 times by Identica Blue (MEDIT, Seoul, South Korea), TRIOS (3Shape, Copenhagen, Denmark), and CS3500 (Carestream Dental, Georgia, USA) scanners in the two simulated groups. The distances between the left and right canines (D3), first molars (D6), second molars (D7), and the left canine and left second molar (D37) were measured. The distance data were analyzed by the Kruskal-Wallis test. RESULTS The differences in intraoral environments were not statistically significant (P>.05). Between intraoral scanners, statistically significant differences (P<.05) were revealed by the Kruskal-Wallis test with regard to D3 and D6. CONCLUSION No difference due to the intraoral environment was revealed. The simulator will contribute to the higher accuracy of intraoral scanners in the future. PMID:29503715
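
    The Kruskal-Wallis test used here compares mean ranks across groups. A stdlib-only sketch without tie correction, on invented distance-error data for three scanners (the data below are chosen to be tie-free):

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic, no tie correction: rank all observations
    jointly, then H = 12 / (N*(N+1)) * sum_i n_i * (mean_rank_i - (N+1)/2)**2."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    group_ranks = {gi: [] for gi in range(len(groups))}
    for rank, (_, gi) in enumerate(pooled, start=1):
        group_ranks[gi].append(rank)
    N = len(pooled)
    s = sum(len(rs) * (sum(rs) / len(rs) - (N + 1) / 2) ** 2
            for rs in group_ranks.values())
    return 12.0 / (N * (N + 1)) * s

# Invented distance errors (mm) for three intraoral scanners, four casts each
h = kruskal_wallis_h([
    [0.02, 0.05, 0.04, 0.03],
    [0.10, 0.12, 0.09, 0.11],
    [0.06, 0.07, 0.08, 0.13],
])
```

    Here H is 8.0, above the 5.99 critical value of the chi-square distribution with 2 degrees of freedom, so the three scanners would be judged to differ.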

  13. Experience and limited lighting may affect sleepiness of tunnel workers

    PubMed Central

    2014-01-01

    Background Working in shifts, especially on a night shift, influences the endogenous sleep regulation system, leading to diminished sleep time and increased somnolence. We attempted to evaluate the impact of shifts on sleepiness and to correlate the sleepiness score with experience in a shift schedule. Materials and methods This cross-sectional study included 42 male and 2 female workers involved in tunnel construction. They underwent spirometry and pulse oximetry and were asked to complete the Epworth Sleepiness Scale questionnaire. Results Statistical analysis revealed that workers with lower Epworth scores had a mean age of 43.6 years, compared with a mean age of 36.4 years for workers with higher Epworth scores. Furthermore, workers with lower Epworth scores had a mean of 14.8 shift years, while those with higher Epworth scores had a mean of 8 shift years. The shift schedule did not reveal any statistically significant correlation. Conclusions Workers employed for a longer time had diminished sleepiness. However, there was no relationship between night shifts and sleepiness, possibly because of exposure to artificial lighting at the construction site. PMID:24993796

  14. Hierarchical Commensurate and Power Prior Models for Adaptive Incorporation of Historical Information in Clinical Trials

    PubMed Central

    Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.

    2011-01-01

    Summary Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However, when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
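    The traditional power prior being elaborated here can be illustrated for a Gaussian mean with known unit variance and a flat initial prior: the historical likelihood is raised to a fixed power a0 in [0, 1], so roughly a0 times n0 historical observations are borrowed. A minimal sketch with hypothetical numbers (the paper's commensurate priors instead let the data drive this weight adaptively):

```python
def power_prior_mean(hist_mean, n_hist, curr_mean, n_curr, a0):
    """Posterior mean for a Gaussian mean with known unit variance,
    flat initial prior, and the historical likelihood raised to the
    power a0. a0 = 0 ignores history; a0 = 1 pools both datasets."""
    w = a0 * n_hist
    return (w * hist_mean + n_curr * curr_mean) / (w + n_curr)

print(power_prior_mean(1.0, 100, 2.0, 50, 0.0))  # -> 2.0 (history ignored)
print(power_prior_mean(1.0, 100, 2.0, 50, 1.0))  # full pooling of 150 obs
```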

  15. Exploring and accounting for publication bias in mental health: a brief overview of methods.

    PubMed

    Mavridis, Dimitris; Salanti, Georgia

    2014-02-01

    OBJECTIVE Publication bias undermines the integrity of published research. The aim of this paper is to present a synopsis of methods for exploring and accounting for publication bias. METHODS We discussed the main features of the following methods to assess publication bias: funnel plot analysis; trim-and-fill methods; regression techniques and selection models. We applied these methods to a well-known example of antidepressant trials, comparing published trials with those submitted to the Food and Drug Administration (FDA) for regulatory approval. RESULTS The funnel plot-related methods (visual inspection, trim-and-fill, regression models) revealed an association between effect size and SE. Contours of statistical significance showed that asymmetry in the funnel plot is probably due to publication bias. The selection model found a significant correlation between effect size and propensity for publication. CONCLUSIONS Researchers should always consider the possible impact of publication bias. Funnel plot-related methods should be seen as a means of examining for small-study effects and not be directly equated with publication bias. Possible causes of funnel plot asymmetry should be explored. Contours of statistical significance may help disentangle whether asymmetry in a funnel plot is caused by publication bias or not. Selection models, although underused, can be a useful resource when publication bias and heterogeneity are suspected, because they address directly the problem of publication bias rather than that of small-study effects.
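    One of the funnel-plot regression techniques in this family, Egger's test, regresses each study's standardized effect (effect/SE) on its precision (1/SE); an intercept far from zero signals small-study effects. A minimal sketch with hypothetical data (the usual test also needs the intercept's standard error, omitted here):

```python
def egger_intercept(effects, ses):
    """Intercept of the simple OLS regression of effect/SE on 1/SE
    (Egger's regression). An intercept far from zero suggests
    funnel-plot asymmetry, i.e. small-study effects."""
    z = [e / s for e, s in zip(effects, ses)]
    p = [1.0 / s for s in ses]
    n = len(z)
    mp, mz = sum(p) / n, sum(z) / n
    slope = sum((pi - mp) * (zi - mz) for pi, zi in zip(p, z)) \
        / sum((pi - mp) ** 2 for pi in p)
    return mz - slope * mp

# a symmetric funnel (same true effect at every precision) gives ~0
print(egger_intercept([0.5, 0.5, 0.5], [0.1, 0.2, 0.4]))  # approximately 0
```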

  16. Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    USGS Publications Warehouse

    Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.

    2015-01-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
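    The higher-order moments in the first approach have simple sample estimators; a minimal sketch (population-style central moments, without the small-sample bias corrections some packages apply), using hypothetical stream-temperature values:

```python
def skew_kurt(xs):
    """Sample skewness and excess kurtosis from central moments.
    Symmetric data give skewness 0; a normal sample gives excess
    kurtosis near 0."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

s, k = skew_kurt([10.0, 11.0, 12.0, 13.0, 14.0])  # symmetric toy data
print(s)  # -> 0.0
print(k)  # negative: flatter than a normal distribution
```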

  17. Non-Born-Oppenheimer molecular dynamics of the spin-forbidden reaction O(3P) + CO(X 1Σ+) → CO2(X̃ 1Σg+)

    NASA Astrophysics Data System (ADS)

    Jasper, Ahren W.; Dawes, Richard

    2013-10-01

    The lowest-energy singlet (1¹A′) and two lowest-energy triplet (1³A′ and 1³A″) electronic states of CO2 are characterized using dynamically weighted multireference configuration interaction (dw-MRCI+Q) electronic structure theory calculations extrapolated to the complete basis set (CBS) limit. Global analytic representations of the dw-MRCI+Q/CBS singlet and triplet surfaces and of their CASSCF/aug-cc-pVQZ spin-orbit coupling surfaces are obtained via the interpolated moving least squares (IMLS) semiautomated surface fitting method. The spin-forbidden kinetics of the title reaction is calculated using the coupled IMLS surfaces and coherent switches with decay of mixing non-Born-Oppenheimer molecular dynamics. The calculated spin-forbidden association rate coefficient (corresponding to the high-pressure limit of the rate coefficient) is 7-35 times larger at 1000-5000 K than the rate coefficient used in many detailed chemical models of combustion. A dynamical analysis of the multistate trajectories is presented. The trajectory calculations reveal direct (nonstatistical) and indirect (statistical) spin-forbidden reaction mechanisms and may be used to test the suitability of transition-state-theory-like statistical methods for spin-forbidden kinetics. Specifically, we consider the appropriateness of the "double passage" approximation, of assuming statistical distributions of seam crossings, and of applications of the unified statistical model for spin-forbidden reactions.

  18. Individual Movement Strategies Revealed through Novel Clustering of Emergent Movement Patterns

    NASA Astrophysics Data System (ADS)

    Valle, Denis; Cvetojevic, Sreten; Robertson, Ellen P.; Reichert, Brian E.; Hochmair, Hartwig H.; Fletcher, Robert J.

    2017-03-01

    Understanding movement is critical in several disciplines but analysis methods often neglect key information by adopting each location as sampling unit, rather than each individual. We introduce a novel statistical method that, by focusing on individuals, enables better identification of temporal dynamics of connectivity, traits of individuals that explain emergent movement patterns, and sites that play a critical role in connecting subpopulations. We apply this method to two examples that span movement networks that vary considerably in size and questions: movements of an endangered raptor, the snail kite (Rostrhamus sociabilis plumbeus), and human movement in Florida inferred from Twitter. For snail kites, our method reveals substantial differences in movement strategies for different bird cohorts and temporal changes in connectivity driven by the invasion of an exotic food resource, illustrating the challenge of identifying critical connectivity sites for conservation in the presence of global change. For human movement, our method is able to reliably determine the origin of Florida visitors and identify distinct movement patterns within Florida for visitors from different places, providing near real-time information on the spatial and temporal patterns of tourists. These results emphasize the need to integrate individual variation to generate new insights when modeling movement data.

  19. Task-Related Edge Density (TED)—A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain

    PubMed Central

    Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus

    2016-01-01

    The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective times series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach “Task-related Edge Density” (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function. PMID:27341204
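    The edge weights described above can be caricatured as a between-condition change in pairwise synchrony. A deliberately simplified sketch using Pearson correlation as the synchrony measure (the paper's actual measure of dynamic synchronisation is more elaborate), with hypothetical voxel time series:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def edge_weight(x_task, y_task, x_ctrl, y_ctrl):
    """Toy task-related edge weight: synchrony of a voxel pair during
    the task condition minus synchrony during the control condition."""
    return pearson(x_task, y_task) - pearson(x_ctrl, y_ctrl)

# two voxels that synchronise only during the task (hypothetical data)
w = edge_weight([1, 2, 3, 4], [2, 4, 6, 8],   # task: tightly coupled
                [1, 2, 3, 4], [5, 1, 4, 2])   # control: weakly related
print(w > 0.5)  # -> True: a candidate task-related edge
```

    TED would then look for dense packs of such edges with similar characteristics rather than thresholding single edges.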

  20. Task-Related Edge Density (TED)-A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain.

    PubMed

    Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus

    2016-01-01

    The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective times series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach "Task-related Edge Density" (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function.

  1. Statistical physics approaches to Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Peng, Shouyong

    Alzheimer's disease (AD) is the most common cause of late life dementia. In the brain of an AD patient, neurons are lost and spatial neuronal organizations (microcolumns) are disrupted. An adequate quantitative analysis of microcolumns requires that we automate the neuron recognition stage in the analysis of microscopic images of human brain tissue. We propose a recognition method based on statistical physics. Specifically, Monte Carlo simulations of an inhomogeneous Potts model are applied for image segmentation. Unlike most traditional methods, this method improves the recognition of overlapped neurons, and thus improves the overall recognition percentage. Although the exact causes of AD are unknown, as experimental advances have revealed the molecular origin of AD, they have continued to support the amyloid cascade hypothesis, which states that early stages of aggregation of amyloid beta (Abeta) peptides lead to neurodegeneration and death. X-ray diffraction studies reveal the common cross-beta structural features of the final stable aggregates-amyloid fibrils. Solid-state NMR studies also reveal structural features for some well-ordered fibrils. But currently there is no feasible experimental technique that can reveal the exact structure or the precise dynamics of assembly and thus help us understand the aggregation mechanism. Computer simulation offers a way to understand the aggregation mechanism on the molecular level. Because traditional all-atom continuous molecular dynamics simulations are not fast enough to investigate the whole aggregation process, we apply coarse-grained models and discrete molecular dynamics methods to increase the simulation speed. First we use a coarse-grained two-bead (two beads per amino acid) model. Simulations show that peptides can aggregate into multilayer beta-sheet structures, which agree with X-ray diffraction experiments. 
To better represent the secondary structure transition happening during aggregation, we refine the model to four beads per amino acid. Typical essential interactions, such as backbone hydrogen bond, hydrophobic and electrostatic interactions, are incorporated into our model. We study the aggregation of Abeta16-22, a peptide that can aggregate into a well-ordered fibrillar structure in experiments. Our results show that randomly-oriented monomers can aggregate into fibrillar subunits, which agree not only with X-ray diffraction experiments but also with solid-state NMR studies. Our findings demonstrate that coarse-grained models and discrete molecular dynamics simulations can help researchers understand the aggregation mechanism of amyloid peptides.
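    The Potts-model segmentation described at the start of this record assigns each pixel a label and favors neighbouring pixels that share a label; Monte Carlo moves then lower an energy of the following form. A minimal sketch with a single uniform coupling (the inhomogeneous model would derive pixel-pair couplings from image intensities, an assumption not shown here):

```python
def potts_energy(labels, coupling):
    """Potts energy E = -J * (number of like-labelled 4-neighbour pairs)
    on a 2-D grid of labels. Lower energy means smoother segmentation."""
    h, w = len(labels), len(labels[0])
    e = 0.0
    for i in range(h):
        for j in range(w):
            if i + 1 < h and labels[i][j] == labels[i + 1][j]:
                e -= coupling
            if j + 1 < w and labels[i][j] == labels[i][j + 1]:
                e -= coupling
    return e

# a uniform patch is lower in energy than a checkerboard, so Metropolis
# moves that align neighbouring labels are accepted preferentially
print(potts_energy([[0, 0], [0, 0]], 1.0))  # -> -4.0
print(potts_energy([[0, 1], [1, 0]], 1.0))  # -> 0.0
```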

  2. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

    Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  3. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Besides embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  4. Diagnosis of students' ability in a statistical course based on Rasch probabilistic outcome

    NASA Astrophysics Data System (ADS)

    Mahmud, Zamalia; Ramli, Wan Syahira Wan; Sapri, Shamsiah; Ahmad, Sanizah

    2017-06-01

    Measuring students' ability and performance is important in assessing how well students have learned and mastered statistical courses. Any improvement in learning will depend on the students' approaches to learning, which are relevant to several factors of learning, namely assessment methods comprising quizzes, tests, assignments and a final examination. This study attempts an alternative approach to measuring students' ability in an undergraduate statistics course based on the Rasch probabilistic model. Firstly, this study aims to explore the learning outcome patterns of students in a statistics course (Applied Probability and Statistics) based on an Entrance-Exit survey. This is followed by investigating students' perceived learning ability based on four Course Learning Outcomes (CLOs) and students' actual learning ability based on their final examination scores. Rasch analysis revealed that students perceived themselves as lacking the ability to understand about 95% of the statistics concepts at the beginning of the class, but eventually they had a good understanding at the end of the 14-week class. In terms of students' performance in their final examination, their ability in understanding the topics varies at different probability values, given the ability of the students and the difficulty of the questions. The majority found the probability and counting rules topic to be the most difficult to learn.
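    In the dichotomous Rasch model underlying this kind of analysis, the probability of a correct response depends only on the gap between person ability θ and item difficulty b; a minimal sketch:

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: P(correct) = exp(theta - b) /
    (1 + exp(theta - b)). Ability equal to difficulty gives 0.5."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

print(rasch_p(0.0, 0.0))         # -> 0.5
print(rasch_p(1.0, -1.0) > 0.8)  # able student, easy item -> True
```

    A hard topic such as the probability and counting rules above corresponds to a large b, pushing P(correct) down for students of every ability level.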

  5. Unbiased estimators for spatial distribution functions of classical fluids

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Jarzynski, Christopher

    2005-01-01

    We use a statistical-mechanical identity closely related to the familiar virial theorem, to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
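    For contrast with the unbiased estimators above, the traditional histogram method bins minimum-image pair distances and normalizes by the ideal-gas expectation in each radial shell; a minimal sketch for a cubic periodic box (toy positions, not a physical fluid):

```python
import math

def g_of_r(positions, box, nbins, r_max):
    """Traditional histogram estimator of the pair correlation g(r):
    bin minimum-image pair distances, then divide each bin by the
    ideal-gas count rho * (shell volume) per particle."""
    n = len(positions)
    dr = r_max / nbins
    hist = [0] * nbins
    for a in range(n):
        for b in range(a + 1, n):
            d2 = 0.0
            for k in range(3):
                d = positions[a][k] - positions[b][k]
                d -= box * round(d / box)  # minimum-image convention
                d2 += d * d
            r = math.sqrt(d2)
            if r < r_max:
                hist[int(r / dr)] += 2  # count the pair for both particles
    rho = n / box ** 3
    g = []
    for i in range(nbins):
        shell = 4.0 / 3.0 * math.pi * (((i + 1) * dr) ** 3 - (i * dr) ** 3)
        g.append(hist[i] / (n * rho * shell))
    return g

# two particles 1.0 apart in a 10x10x10 box: only the bin covering
# r = 1.0 is populated
g = g_of_r([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], 10.0, 5, 2.5)
print([round(x, 2) for x in g])
```

    The bias the paper's virial-based estimators avoid comes from this finite-width binning, which smears g(r) over each shell.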

  6. The Relationship between Psychological Wellbeing and Body Image in Pregnant Women

    PubMed Central

    Fahami, Fariba; Amini-Abchuyeh, Maryam; Aghaei, Asghar

    2018-01-01

    Background: The aim of the present study was to determine the association between body image and psychological wellbeing during pregnancy. Materials and Methods: This descriptive correlational study was conducted on 320 pregnant women who were referred to health centers in Isfahan, Iran, during 2016 and met the inclusion criteria. They were selected by nonprobability convenience sampling. Data were gathered using standard psychological wellbeing and body image satisfaction questionnaires. The data were analyzed using Statistical Package for the Social Sciences software by descriptive and inferential statistical methods. Results: The results showed that the mean (SD) score of psychological wellbeing among participants was 77.50 (10.10) and their mean (SD) score of satisfaction with body image was 89.30 (14.60). Moreover, the results revealed a positive and significant relationship between the scores of psychological wellbeing and body image satisfaction (r = 0.354, p < 0.001). The results of regression analysis showed that the two variables of self-acceptance (t = 5.6, p < 0.001) and personal growth (t = 2.06, p = 0.04) can predict body image in pregnant women. Conclusions: The findings revealed a significant positive relationship between body image satisfaction and psychological wellbeing. Therefore, training a positive attitude with respect to body image or increasing the level of knowledge on psychological wellbeing can create a positive cycle for these variables and, thus, make the pregnancy more enjoyable and acceptable. PMID:29861752

  7. [Effect of occupational stress on mental health].

    PubMed

    Yu, Shan-fa; Zhang, Rui; Ma, Liang-qing; Gu, Gui-zhen; Yang, Yan; Li, Kui-rong

    2003-02-01

    To study the effect of job psychological demands and job control on mental health and their interaction, 93 male freight train dispatchers were evaluated using the revised Job Demand-Control Scale and 7 strain scales. Stepwise regression analysis, univariate ANOVA, and the Kruskal-Wallis H and Median tests were used in the statistical analysis. The Kruskal-Wallis H and Median test analysis revealed a difference in mental health scores among groups of decision latitude (mean rank 55.57, 47.95, 48.42, 33.50, P < 0.05); the differences in scores of mental health (37.45, 40.01, 58.35), job satisfaction (53.18, 46.91, 32.43), daily life strains (33.00, 44.96, 56.12) and depression (36.45, 42.25, 53.61) among groups of job time demands (P < 0.05) were all statistically significant. ANOVA showed that job time demands and decision latitude had interaction effects on physical complaints (R(2) = 0.24), state anxiety (R(2) = 0.26), and daytime fatigue (R(2) = 0.28) (P < 0.05). Regression analysis revealed a significant job time demands by job decision latitude interaction effect as well as significant main effects of some independent variables on different job strains (R(2) > 0.05). Job time demands and job decision latitude have direct and interactive effects on psychosomatic health: the greater the time demands, the greater the psychological strain, and the effect of job time demands is greater than that of job decision latitude.

  8. Nanoparticles Prepared From N,N-Dimethyl-N-Octyl Chitosan as the Novel Approach for Oral Delivery of Insulin: Preparation, Statistical Optimization and In-vitro Characterization

    PubMed Central

    Shamsa, Elnaz Sadat; Mahjub, Reza; Mansoorpour, Maryam; Rafiee-Tehrani, Morteza; Abedin Dorkoosh, Farid

    2018-01-01

    In this study, N,N-Dimethyl-N-Octyl chitosan was synthesized. Nanoparticles containing insulin were prepared using the PEC method and were statistically optimized using the Box-Behnken response surface methodology. The independent factors were the insulin concentration and the concentration and pH of the polymer solution, while the dependent factors were the size, zeta potential, PdI and entrapment efficiency. The optimized nanoparticles were morphologically studied using SEM. The cytotoxicity of the nanoparticles on the Caco-2 cell culture was studied using the MTT cytotoxicity assay, and the permeation of the insulin nanoparticles across the Caco-2 cell monolayer was also determined. The optimized nanoparticles possessed appropriate physicochemical properties. The SEM morphological studies showed spherical to sub-spherical nanoparticles with no sign of aggregation. The in-vitro release study showed that 95.5 ± 1.40% of the loaded insulin was released in 400 min. The permeability studies revealed significant enhancement in insulin permeability using nanoparticles prepared from octyl chitosan at 240 min (11.3 ± 0.78%). The obtained data revealed that insulin nanoparticles prepared from N,N-Dimethyl-N-Octyl chitosan can be considered a good candidate for oral delivery of insulin compared with nanoparticles prepared from N,N,N-trimethyl chitosan.

  9. Morphological study of the palatal rugae in western Indian population.

    PubMed

    Gondivkar, Shailesh M; Patel, Swetal; Gadbail, Amol R; Gaikwad, Rahul N; Chole, Revant; Parikh, Rima V

    2011-10-01

    The aim of this study was to identify and compare the different morphological rugae patterns in males and females of a western Indian population, which may serve as an additional method of identification in cases of crime or aircraft accidents. A total of 108 plaster casts, equally distributed between the sexes and belonging to a similar age group, were examined for different biometric characteristics of the palatal rugae, including number, shape, length, direction and unification, and their incidence was recorded. Associations between these rugae biometric characteristics and sex were tested using chi-square analysis, and statistical descriptors were identified for each of these parameters using SPSS 15.0. The study revealed a statistically significant difference in the total number of rugae between the two sexes (P = 0.000). The different types of rugae in males and females were statistically compared. Females showed a highly significant difference in the sinuous (P = 0.002) and primary types (P = 0.000), while males showed a significant difference in unification (P = 0.005). The predominant direction of the rugae was found to be forward rather than backward. It may be concluded that the rugae pattern can be an additional method of differentiation between males and females in conjunction with other methods such as visual identification, fingerprints, and dental characteristics in forensic science. Copyright © 2011 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
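    The chi-square test of association used here compares observed cell counts with the expectations implied by the row and column totals; a minimal sketch with a hypothetical sex-by-rugae-shape table:

```python
def chi_square(table):
    """Pearson chi-squared statistic for an r x c contingency table:
    sum over cells of (observed - expected)^2 / expected, where
    expected counts come from the row and column marginals."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total
            stat += (obs - exp) ** 2 / exp
    return stat

# hypothetical counts: males vs females across two rugae shapes
print(chi_square([[10, 20], [20, 10]]))  # ~6.67; df=1 critical value 3.84
```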

  10. A survey and evaluations of histogram-based statistics in alignment-free sequence comparison.

    PubMed

    Luczak, Brian B; James, Benjamin T; Girgis, Hani Z

    2017-12-06

    Since the dawn of the bioinformatics field, sequence alignment scores have been the main method for comparing sequences. However, alignment algorithms have quadratic time complexity, requiring long execution times. As alternatives, scientists have developed tens of alignment-free statistics for measuring the similarity between two sequences. We surveyed tens of alignment-free k-mer statistics. Additionally, we evaluated 33 statistics and multiplicative combinations between the statistics and/or their squares. These statistics are calculated on two k-mer histograms representing two sequences. Our evaluations using global alignment scores revealed that the majority of the statistics are sensitive and capable of finding sequences similar to a query sequence. Therefore, any of these statistics can filter out dissimilar sequences quickly. Further, we observed that multiplicative combinations of the statistics are highly correlated with the identity score. Furthermore, combinations involving sequence length difference or Earth Mover's distance, which takes the length difference into account, are always among the paired statistics most highly correlated with identity scores. Similarly, paired statistics including length difference or Earth Mover's distance are among the best performers in finding the K closest sequences. Interestingly, similar performance can be obtained using histograms of shorter words, reducing the memory requirement and increasing the speed remarkably. Moreover, we found that simple single statistics are sufficient for processing next-generation sequencing reads and for applications relying on local alignment. Finally, we measured the time requirement of each statistic. The survey and the evaluations will help scientists identify efficient alternatives to the costly alignment algorithm, saving thousands of computational hours. The source code of the benchmarking tool is available as Supplementary Materials. © The Author 2017. Published by Oxford University Press.
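    The k-mer histogram statistics surveyed above all start from word-count vectors; a minimal sketch of one representative statistic (cosine similarity between 3-mer histograms), with hypothetical sequences:

```python
import math
from collections import Counter

def kmer_hist(seq, k):
    """Histogram of overlapping k-mers of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(h1, h2):
    """Cosine similarity between two k-mer histograms: 1.0 for
    identical sequences, near 0 when no k-mers are shared."""
    dot = sum(h1[w] * h2[w] for w in set(h1) | set(h2))
    n1 = math.sqrt(sum(v * v for v in h1.values()))
    n2 = math.sqrt(sum(v * v for v in h2.values()))
    return dot / (n1 * n2)

a = kmer_hist("ACGTACGTAC", 3)   # hypothetical query sequence
b = kmer_hist("TTTTGGGGTT", 3)   # hypothetical dissimilar sequence
print(cosine(a, a))       # -> 1.0
print(cosine(a, b) < 0.5) # -> True: quickly filtered out
```

    Computing such a statistic is linear in sequence length, which is the speed advantage over quadratic alignment that the survey quantifies.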

  11. A system of registration and statistics.

    PubMed

    Blayo, C

    1993-06-01

    In 1971, WHO recommended obligatory reporting to countries preparing to legalize induced abortion; however, there is no registration of abortions in Austria, Greece, Luxembourg, and Portugal, or in Northern Ireland, Ireland, and Malta, where abortion is prohibited, or in Switzerland, where it is limited. Albania is preparing to institute such a provision. Registration is not always complete in Germany, France, Italy, Poland, and Spain, or in the republics of the former USSR, particularly Lithuania. The data gathered are often further impoverished at the stage of publication of the statistics. Certain estimates, or even survey results, make up for these shortcomings. A retrospective survey of a sample representing all women aged 15 years or older would allow the reconstruction of statistics on abortions in past years. Systematic registration must be accompanied by the publication of a statistical record. Sterilization appears to be spreading in Europe, but it is only very rarely registered. The proportion of couples sterilized is sometimes obtained by surveys, but there is hardly any information on the characteristics of this group. On the other hand, the practice of contraception can be easily assessed, as in the majority of countries contraceptives are dispensed through pharmacies, public family planning centers, and private practitioners. Family planning centers are sometimes sources of statistical data. In some countries, producers' associations make statistics available on the sale of contraceptives. Detailed surveys facilitate the characterization of users and reveal the methods they employ. Many countries carried out such surveys at the end of the 1970s under the aegis of the world fertility surveys. It is urgent to invest in data collection suitable for determining the proportion of women who use each method of contraception in all the countries of Europe.

  12. Corrosion resistance assessment of Co-Cr alloy frameworks fabricated by CAD/CAM milling, laser sintering, and casting methods.

    PubMed

    Tuna, Süleyman Hakan; Özçiçek Pekmez, Nuran; Kürkçüoğlu, Işin

    2015-11-01

    The effects of fabrication methods on the corrosion resistance of frameworks produced with Co-Cr alloys are not clear. The purpose of this in vitro study was to evaluate the electrochemical corrosion resistance of Co-Cr alloy specimens that were fabricated by conventional casting, milling, and laser sintering. The specimens fabricated with the 3 different methods were investigated by potentiodynamic tests and electrochemical impedance spectroscopy in artificial saliva. Ions released into the artificial saliva were estimated with inductively coupled plasma-mass spectrometry, and the results were statistically analyzed. The specimen surfaces were investigated with scanning electron microscopy before and after the tests. In terms of corrosion current and Rct properties, statistically significant differences were found both among the means of the methods and among the means of the material groups (P<.05). With regard to ions released, a statistically significant difference was found among the material groups (P<.05); however, no difference was found among the methods. Scanning electron microscopic imaging revealed that the specimens produced by conventional casting were affected to a greater extent by etching and electrochemical corrosion than those produced by milling and laser sintering. The corrosion resistance of the Co-Cr alloy specimens fabricated by milling or laser sintering was greater than that of the conventionally cast alloy specimens. The Co-Cr specimens produced by the same method also differed from one another in terms of corrosion resistance. These differences may be related to variations in the alloy compositions. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  13. Empirical analysis of online human dynamics

    NASA Astrophysics Data System (ADS)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on behaviors of Internet users. Six large-scale systems are studied in our experiments, including the movie-watching in Netflix and MovieLens, the transaction in Ebay, the bookmark-collecting in Delicious, and the posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strong positive correlation between a user's activity and the total number of the user's actions, and a significant negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method can, to some extent, eliminate the differences in statistics among users caused by their different activity levels, yet its effectiveness depends on the data sets.
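    The heavy-tailed interevent-time distributions described above are usually summarized by a power-law exponent. As a rough, self-contained sketch (not the authors' code, and with purely synthetic data), the exponent of Pareto-distributed interevent times can be estimated with the Hill maximum-likelihood estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic interevent times from a Pareto law: P(X > x) = (x / x_min)^-1.5,
# i.e. a density tail exponent of 2.5 (values here are illustrative only).
x_min = 1.0
samples = x_min * (1.0 - rng.random(10_000)) ** (-1.0 / 1.5)

def hill_alpha(x, x_min):
    """Maximum-likelihood (Hill) estimate of the power-law density exponent."""
    tail = x[x >= x_min]
    return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

alpha = hill_alpha(samples, x_min)  # should be close to 2.5
```

    With 10,000 samples the estimate is tight; in practice the choice of x_min is itself a modeling decision and is often selected by a goodness-of-fit scan.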

  14. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples.

    PubMed

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-07-05

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.

  15. Multiresolution multiscale active mask segmentation of fluorescence microscope images

    NASA Astrophysics Data System (ADS)

    Srinivasa, Gowri; Fickus, Matthew; Kovačević, Jelena

    2009-08-01

    We propose an active mask segmentation framework that combines the advantages of statistical modeling, smoothing, speed and flexibility offered by the traditional methods of region-growing, multiscale, multiresolution and active contours respectively. At the crux of this framework is a paradigm shift from evolving contours in the continuous domain to evolving multiple masks in the discrete domain. Thus, the active mask framework is particularly suited to segment digital images. We demonstrate the use of the framework in practice through the segmentation of punctate patterns in fluorescence microscope images. Experiments reveal that statistical modeling helps the multiple masks converge from a random initial configuration to a meaningful one. This obviates the need for an involved initialization procedure germane to most of the traditional methods used to segment fluorescence microscope images. While we provide the mathematical details of the functions used to segment fluorescence microscope images, this is only an instantiation of the active mask framework. We suggest some other instantiations of the framework to segment different types of images.

  16. On vital aid: the why, what and how of validation

    PubMed Central

    Kleywegt, Gerard J.

    2009-01-01

    Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori by potential users to assess the (relative) merits of a model. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes, or predictions that are based on the model and that can be tested experimentally. PMID:19171968

  17. Structural parameters of young star clusters: fractal analysis

    NASA Astrophysics Data System (ADS)

    Hetem, A.

    2017-07-01

    A unified view of star formation in the Universe demands detailed and in-depth studies of young star clusters. This work is related to our previous study of fractal statistics estimated for a sample of young stellar clusters (Gregorio-Hetem et al. 2015, MNRAS 448, 2504). The structural properties can lead to significant conclusions about the early stages of cluster formation: 1) virial conditions can be used to distinguish warm collapse; 2) bound or unbound behaviour can lead to conclusions about expansion; and 3) fractal statistics are correlated with dynamical evolution and age. The error-bar estimation technique most used in the literature is to adopt inferential methods (such as bootstrap) to estimate deviation and variance, which are valid only for an artificially generated cluster. In this paper, we expanded the number of studied clusters in order to enhance the investigation of cluster properties and dynamical evolution. The structural parameters were compared with fractal statistics and reveal that the clusters' radial density profiles show a tendency for the mean separation of the stars to increase with the average surface density. The sample can be divided into two groups showing different dynamical behaviour but the same dynamical evolution, since the entire sample was revealed to consist of expanding objects for which the substructures do not seem to have been completely erased. These results are in agreement with simulations adopting low surface densities and supervirial conditions.

  18. A spatial cluster analysis of tractor overturns in Kentucky from 1960 to 2002

    USGS Publications Warehouse

    Saman, D.M.; Cole, H.P.; Odoi, A.; Myers, M.L.; Carey, D.I.; Westneat, S.C.

    2012-01-01

    Background: Agricultural tractor overturns without rollover protective structures are the leading cause of farm fatalities in the United States. To our knowledge, no studies have incorporated the spatial scan statistic in identifying high-risk areas for tractor overturns. The aim of this study was to determine whether tractor overturns cluster in certain parts of Kentucky and identify factors associated with tractor overturns. Methods: A spatial statistical analysis using Kulldorff's spatial scan statistic was performed to identify county clusters at greatest risk for tractor overturns. A regression analysis was then performed to identify factors associated with tractor overturns. Results: The spatial analysis revealed a cluster of higher than expected tractor overturns in four counties in northern Kentucky (RR = 2.55) and 10 counties in eastern Kentucky (RR = 1.97). Higher rates of tractor overturns were associated with steeper average percent slope of pasture land by county (p = 0.0002) and a greater percent of total tractors with less than 40 horsepower by county (p<0.0001). Conclusions: This study reveals that geographic hotspots of tractor overturns exist in Kentucky and identifies factors associated with overturns. This study provides policymakers a guide to targeted county-level interventions (e.g., rollover protective structures promotion interventions) with the intention of reducing tractor overturns in the highest risk counties in Kentucky. © 2012 Saman et al.
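    Kulldorff's spatial scan statistic scores each candidate geographic window by a Poisson log-likelihood ratio and takes the maximizing window as the most likely cluster, with significance assessed by Monte Carlo replication. A minimal sketch of the per-window score, with illustrative numbers rather than the study's data:

```python
import math

def poisson_llr(c, e_c, C):
    """Log-likelihood ratio of Kulldorff's Poisson scan statistic for one
    candidate window: c observed cases inside the window, e_c expected
    cases inside, C total cases in the study region. Only windows with
    elevated risk (c > e_c) receive a positive score."""
    if c <= e_c or c >= C:
        return 0.0
    inside = c * math.log(c / e_c)
    outside = (C - c) * math.log((C - c) / (C - e_c))
    return inside + outside

# Illustrative window: 20 overturns observed where 10 were expected,
# out of 100 overturns region-wide.
llr = poisson_llr(20, 10.0, 100)
```

    The full method evaluates this score over many circular windows of varying center and radius and compares the maximum against maxima from randomized replications of the data.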

  19. The Essential Genome of Escherichia coli K-12.

    PubMed

    Goodall, Emily C A; Robinson, Ashley; Johnston, Iain G; Jabbari, Sara; Turner, Keith A; Cunningham, Adam F; Lund, Peter A; Cole, Jeffrey A; Henderson, Ian R

    2018-02-20

    Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single-gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. IMPORTANCE Incentives to define lists of genes that are essential for bacterial survival include the identification of potential targets for antibacterial drug development, genes required for rapid growth for exploitation in biotechnology, and discovery of new biochemical pathways. To identify essential genes in Escherichia coli, we constructed a transposon mutant library of unprecedented density. Initial automated analysis of the resulting data revealed many discrepancies compared to the literature. We now report more extensive statistical analysis supported by both literature searches and detailed inspection of high-density TraDIS sequencing data for each putative essential gene for the E. coli model laboratory organism. This paper is important because it provides a better understanding of the essential genes of E. coli, reveals the limitations of relying on automated analysis alone, and provides a new standard for the analysis of TraDIS data. Copyright © 2018 Goodall et al.
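    TraDIS analyses commonly summarize each gene by an insertion index (insertions per base pair), so that genes of different lengths are comparable, and flag genes below a threshold as essentiality candidates. The sketch below illustrates only this idea; the gene names, counts, and cutoff are hypothetical and the authors' actual pipeline is more elaborate:

```python
# Hypothetical per-gene transposon insertion counts and lengths (bp).
genes = {
    "geneA": (2, 1200),    # very few insertions: essentiality candidate
    "geneB": (150, 900),
    "geneC": (60, 2400),
}

CUTOFF = 0.01  # insertions per bp; an illustrative threshold, not the paper's

def insertion_index(insertions, length_bp):
    """Insertions per base pair; low values suggest essentiality."""
    return insertions / length_bp

candidates = [name for name, (n, length_bp) in genes.items()
              if insertion_index(n, length_bp) < CUTOFF]
```

    In practice the cutoff is derived from the bimodal distribution of insertion indices across all genes, not fixed a priori.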

  20. PA02.22. Hypolipidemic effect of different coconut oil extracts of vyosakatvivaradi formulation in wistar rats.

    PubMed Central

    Mahapatra, Anita; Rajurkar, Sudhir; Eranezhath, Sujith; Manohar, Ram

    2013-01-01

    Purpose: To study the preventive and therapeutic hypolipidemic effect of different coconut oil extracts of Vyosakatvivaradi formulation. Method: A high-fat diet was fed for 21 days to induce hyperlipidemia. One hundred ten weanling Wistar rats were randomly divided into eleven groups: four treatment groups, four preventive groups, two control groups, and one standard group. Four test drugs – 1. VCO (virgin coconut oil) + HERBS, 2. TCO (traditional coconut oil) + HERBS, 3. CCO (commercial coconut oil) + HERBS, 4. TCO + Coconut Milk + HERBS – were administered at a dose of 0.06 ml tid orally for 28 days in the treatment groups and for 28 days in the preventive groups from day one of the experiment, and the results were compared with simvastatin 10 mg. All the animals were anesthetized using anesthetic ether, and pooled blood samples from each group were collected on day 0, day 21, and on the termination day (day 28) after the start of actual treatment. Result: Animals in all groups did not reveal any change in their behavior or any visible adverse reaction throughout the experimental period. A statistically significant reduction in mean triglyceride values in test drug 4 animals revealed a preventive effect. A statistically significant reduction in the mean cholesterol level (mg/dl) was observed in test drug 4 animals. A statistically significant increase in the mean HDL level at the 5% level of significance was observed with the preventive dose of test drug 3. Microscopic observations of liver, kidney, and aorta revealed no significant changes. Conclusion: The medicated oils “CCO + HERBS” and “TCO + Coconut Milk + HERBS” showed encouraging therapeutic and preventive effects on hyperlipidemia; the oil “TCO + Coconut Milk + HERBS” was observed to be better than “CCO + HERBS”, while the oils “VCO + HERBS” and “TCO + HERBS” exhibited only moderate hypolipidemic action.

  1. Imaging of Al/Fe ratios in synthetic Al-goethite revealed by nanoscale secondary ion mass spectrometry.

    PubMed

    Pohl, Lydia; Kölbl, Angelika; Werner, Florian; Mueller, Carsten W; Höschen, Carmen; Häusler, Werner; Kögel-Knabner, Ingrid

    2018-04-30

    Aluminium (Al)-substituted goethite is ubiquitous in soils and sediments. The extent of Al-substitution affects the physicochemical properties of the mineral and influences its macroscale properties. Bulk analysis only provides total Al/Fe ratios without providing information with respect to the Al-substitution of single minerals. Here, we demonstrate that nanoscale secondary ion mass spectrometry (NanoSIMS) enables the precise determination of Al content in single minerals, while simultaneously visualising the variation of the Al/Fe ratio. Al-substituted goethite samples were synthesized with increasing Al concentrations of 0.1, 3, and 7% and analysed by NanoSIMS in combination with established bulk spectroscopic methods (XRD, FTIR, Mössbauer spectroscopy). The high spatial resolution (50-150 nm) of NanoSIMS is accompanied by a high number of single-point measurements. We statistically evaluated the Al/Fe ratios derived from NanoSIMS, while maintaining the spatial information and reassigning it to its original localization. XRD analyses confirmed increasing concentrations of incorporated Al within the goethite structure. Mössbauer spectroscopy revealed that 11% of the goethite samples generated at high Al concentrations consisted of hematite. The NanoSIMS data show that the Al/Fe ratios are in agreement with bulk data derived from total digestion and demonstrate small spatial variability between single-point measurements. More importantly, statistical analysis and reassignment of single-point measurements allowed us to identify distinct spots with significantly higher or lower Al/Fe ratios. NanoSIMS measurements confirmed the capacity to produce images indicating the uniform increase in Al concentration in goethite. Using a combination of statistical analysis with information from complementary spectroscopic techniques (XRD, FTIR and Mössbauer spectroscopy), we were further able to reveal spots with lower Al/Fe ratios as hematite. 
Copyright © 2018 John Wiley & Sons, Ltd.

  2. Statistical software applications used in health services research: analysis of published studies in the U.S

    PubMed Central

    2011-01-01

    Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used software applications (employed in 46.0% and 42.6% of articles, respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information is important because different software packages might produce varying results, owing to differences in their underlying estimation methods. PMID:21977990

  3. Improving suicide mortality statistics in Tarragona (Catalonia, Spain) between 2004-2012.

    PubMed

    Barbería, Eneko; Gispert, Rosa; Gallo, Belén; Ribas, Gloria; Puigdefàbregas, Anna; Freitas, Adriana; Segú, Elena; Torralba, Pilar; García-Sayago, Francisco; Estarellas, Aina

    2016-07-20

    Monitoring and preventing suicidal behaviour requires, among other data, precise knowledge of suicide deaths. They often appear under-reported or misclassified in official mortality statistics. The aim of this study is to analyse the under-reporting found in the suicide mortality statistics of Tarragona (a province of Catalonia, Spain). The analysis takes into account all suicide deaths that occurred in the Tarragona Area of the Catalan Institute of Legal Medicine and Forensic Sciences (TA-CILMFS) between 2004 and 2012. The sources of information were the death data files of the Catalan Mortality Register and the autopsy files of the TA-CILMFS. Suicide rates and socio-demographic profiles were statistically compared between the initially reported suicides and the final ones. The mean percentage of non-reported cases in the period was 16.2%, with a minimum of 2.2% in 2005 and a maximum of 26.8% in 2009. The crude mortality rate by suicide rose from 6.6 to 7.9 per 100,000 inhabitants once forensic data were incorporated. Small differences were detected between the socio-demographic profiles of the initially reported suicides and the final ones. Supplementary information was obtained on the suicide method, which revealed a significant increase in poisoning and suicides involving trains. An exhaustive review of suicide death data from forensic sources has led to an improvement in the under-reported statistical information. It also improves knowledge of the method of suicide and personal characteristics. Copyright © 2016 SEP y SEPB. Published by Elsevier España, S.L.U. All rights reserved.
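    The corrected rate follows from a simple first-order adjustment: if a fraction u of cases is missing from the registry, the reported rate understates the true rate by a factor of (1 − u). A sketch using the figures above, assuming under-reporting is uniform across the period:

```python
reported_rate = 6.6   # suicides per 100,000 inhabitants, registry only
u = 0.162             # mean fraction of cases absent from the registry

# Dividing by the coverage fraction recovers the corrected rate,
# consistent with the ~7.9 per 100,000 reported in the study.
corrected_rate = reported_rate / (1 - u)
```

    This back-of-the-envelope check reproduces the study's corrected figure to within rounding.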

  4. Comparative analysis of targeted metabolomics: dominance-based rough set approach versus orthogonal partial least square-discriminant analysis.

    PubMed

    Blasco, H; Błaszczyński, J; Billaut, J C; Nadal-Desbarats, L; Pradat, P F; Devos, D; Moreau, C; Andres, C R; Emond, P; Corcia, P; Słowiński, R

    2015-02-01

    Metabolomics is an emerging field that involves ascertaining a metabolic profile from a combination of small molecules, and it has health applications. Metabolomic methods are currently applied to discover diagnostic biomarkers and to identify pathophysiological pathways involved in pathology. However, metabolomic data are complex and are usually analyzed by statistical methods. Although the methods have been widely described, most have been neither standardized nor validated. Data analysis is the foundation of a robust methodology, so new mathematical methods need to be developed to assess and complement current methods. We therefore applied, for the first time, the dominance-based rough set approach (DRSA) to metabolomics data; we also assessed the complementarity of this method with standard statistical methods. Some attributes were transformed in a way allowing us to discover global and local monotonic relationships between condition and decision attributes. We used previously published metabolomics data (18 variables) for amyotrophic lateral sclerosis (ALS) and non-ALS patients. Principal Component Analysis (PCA) and Orthogonal Partial Least Square-Discriminant Analysis (OPLS-DA) allowed satisfactory discrimination (72.7%) between ALS and non-ALS patients. Some discriminant metabolites were identified: acetate, acetone, pyruvate and glutamine. The concentrations of acetate and pyruvate were also identified by univariate analysis as significantly different between ALS and non-ALS patients. DRSA correctly classified 68.7% of the cases and established rules involving some of the metabolites highlighted by OPLS-DA (acetate and acetone). Some rules identified potential biomarkers not revealed by OPLS-DA (beta-hydroxybutyrate). We also found a large number of common discriminating metabolites after Bayesian confirmation measures, particularly acetate, pyruvate, acetone and ascorbate, consistent with the pathophysiological pathways involved in ALS. DRSA provides a complementary method for improving the predictive performance of the multivariate data analysis usually used in metabolomics. This method could help in the identification of metabolites involved in disease pathogenesis. Interestingly, these different strategies mostly identified the same metabolites as being discriminant. The selection of strong decision rules with high values of Bayesian confirmation provides useful information about relevant condition-decision relationships not otherwise revealed in metabolomics data. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    PubMed

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
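    For reference, the two tests most often found in the review can be run in a few lines with SciPy; the data below are invented purely for illustration:

```python
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# Hypothetical 2x2 table: complication counts (yes/no) in two surgical groups.
table = np.array([[12, 38],
                  [25, 25]])
chi2, p_chi, dof, expected = chi2_contingency(table)

# Hypothetical ordinal satisfaction scores in two independent groups.
group_a = [3, 5, 4, 6, 7, 5]
group_b = [2, 3, 4, 3, 5, 2]
u_stat, p_u = mannwhitneyu(group_a, group_b, alternative="two-sided")
```

    Reporting the exact test, the test statistic, and the precise P-value (rather than "P < 0.05") addresses the most common description errors the review identifies.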

  6. Reticulate evolutionary history and extensive introgression in mosquito species revealed by phylogenetic network analysis

    PubMed Central

    Wen, Dingqiao; Yu, Yun; Hahn, Matthew W.; Nakhleh, Luay

    2016-01-01

    The role of hybridization and subsequent introgression has been demonstrated in an increasing number of species. Recently, Fontaine et al. (Science, 347, 2015, 1258524) conducted a phylogenomic analysis of six members of the Anopheles gambiae species complex. Their analysis revealed a reticulate evolutionary history and pointed to extensive introgression on all four autosomal arms. The study further highlighted the complex evolutionary signals that the co-occurrence of incomplete lineage sorting (ILS) and introgression can give rise to in phylogenomic analyses. While tree-based methodologies were used in the study, phylogenetic networks provide a more natural model to capture reticulate evolutionary histories. In this work, we reanalyse the Anopheles data using a recently devised framework that combines the multispecies coalescent with phylogenetic networks. This framework allows us to capture ILS and introgression simultaneously, and forms the basis for statistical methods for inferring reticulate evolutionary histories. The new analysis reveals a phylogenetic network with multiple hybridization events, some of which differ from those reported in the original study. To elucidate the extent and patterns of introgression across the genome, we devise a new method that quantifies the use of reticulation branches in the phylogenetic network by each genomic region. Applying the method to the mosquito data set reveals the evolutionary history of all the chromosomes. This study highlights the utility of ‘network thinking’ and the new insights it can uncover, in particular in phylogenomic analyses of large data sets with extensive gene tree incongruence. PMID:26808290

  7. Research Education in Undergraduate Occupational Therapy Programs.

    ERIC Educational Resources Information Center

    Petersen, Paul; And Others

    1992-01-01

    Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)

  8. Double-survey estimates of bald eagle populations in Oregon

    USGS Publications Warehouse

    Anthony, R.G.; Garrett, Monte G.; Isaacs, F.B.

    1999-01-01

    The literature on abundance of birds of prey is almost devoid of population estimates with statistical rigor. Therefore, we surveyed bald eagle (Haliaeetus leucocephalus) populations on the Crooked and lower Columbia rivers of Oregon and used the double-survey method to estimate populations and sighting probabilities for different survey methods (aerial, boat, vehicle) and bald eagle ages (adults vs. subadults). Sighting probabilities were consistently 20%. The results revealed variable and negative bias (percent relative bias = -9 to -70%) of direct counts and emphasized the importance of estimating populations where some measure of precision and ability to conduct inference tests are available. We recommend use of the double-survey method to estimate abundance of bald eagle populations and other raptors in open habitats.
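    The double-survey method treats two simultaneous, independent counts like a mark-recapture experiment; a standard estimator of this kind is Chapman's bias-corrected Lincoln-Petersen formula, sketched here with made-up counts rather than the Oregon data:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's bias-corrected Lincoln-Petersen population estimate:
    n1 animals detected by survey platform 1, n2 by platform 2,
    m detected by both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

def sighting_probability(m, n_other):
    """Estimated detection probability of one platform, given the other
    platform's total count (assumes detections are independent)."""
    return m / n_other

# Illustrative counts only:
N_hat = chapman_estimate(40, 50, 20)
p_platform1 = sighting_probability(20, 50)
```

    The estimator's value lies in attaching a variance (and hence confidence intervals) to abundance, which direct counts alone cannot provide.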

  9. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
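    The Bayesian inference at the heart of AUTOCLASS reduces to Bayes' rule: posterior class probability is proportional to prior times likelihood. A toy two-class version with Gaussian class-conditional likelihoods, purely illustrative and not the AUTOCLASS model itself:

```python
import math

def posterior(x, priors, means, sigmas):
    """Posterior class probabilities for a scalar observation x under
    Gaussian class-conditional likelihoods (Bayes' rule, normalized)."""
    likes = [math.exp(-((x - m) ** 2) / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))
             for m, s in zip(means, sigmas)]
    joint = [p * lk for p, lk in zip(priors, likes)]
    z = sum(joint)  # marginal probability of the data
    return [j / z for j in joint]

# An observation at 0.0 is far more plausible under class 0 (mean 0)
# than class 1 (mean 5), so the posterior concentrates on class 0.
probs = posterior(0.0, priors=[0.5, 0.5], means=[0.0, 5.0], sigmas=[1.0, 1.0])
```

    The subjective element debated in the abstract enters through the priors; here they are set to 0.5 each, i.e. no prior preference between classes.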

  10. A new statistical distance scale for planetary nebulae

    NASA Astrophysics Data System (ADS)

    Ali, Alaa; Ismail, H. A.; Alsolami, Z.

    2015-05-01

    In the first part of the present article we discuss the consistency among different individual distance methods for Galactic planetary nebulae, while in the second part we develop a new statistical distance scale based on a calibrating sample of well-determined distances. A set of 315 planetary nebulae with individual distances was extracted from the literature. Inspection of the data set indicates that the accuracy of the distances varies among the different individual methods and also among the different sources where the same individual method was applied. Therefore, we derive a reliable weighted mean distance for each object by considering the influence of the distance error and the weight of each individual method. The results reveal that the individual methods discussed are consistent with each other, except for the gravity method, which produces larger distances than the other individual methods. From the initial data set, we construct a standard calibrating sample consisting of 82 objects. This sample is restricted to objects with distances determined from at least two different individual methods, except for a few objects with trusted distances determined from the trigonometric, spectroscopic, and cluster membership methods. In addition to its well-determined distances, this sample shows several advantages over those used in prior distance scales. The sample is used to recalibrate the mass-radius and radio surface brightness temperature-radius relationships. An average error of ˜30% is estimated for the new distance scale. The new distance scale is compared with the most widely used statistical scales in the literature; the results show that it is roughly similar to the majority of them, within a ˜±20% difference. Furthermore, the new scale yields a weighted mean distance to the Galactic center of 7.6±1.35 kpc, in good agreement with the very recent measurement of Malkin (2013).
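    A weighted mean distance of the kind described is commonly computed as an inverse-variance weighted average; the sketch below uses invented distances and errors rather than the calibrating sample, and the paper's exact weighting scheme may differ:

```python
import numpy as np

def weighted_mean_distance(distances, errors):
    """Inverse-variance weighted mean of individual distance estimates,
    with the standard error of the weighted mean."""
    d = np.asarray(distances, dtype=float)
    w = 1.0 / np.asarray(errors, dtype=float) ** 2
    mean = np.sum(w * d) / np.sum(w)
    err = np.sqrt(1.0 / np.sum(w))
    return mean, err

# Three individual-method distances (kpc) for one hypothetical nebula:
mean, err = weighted_mean_distance([1.2, 1.5, 1.1], [0.1, 0.4, 0.2])
```

    Inverse-variance weighting gives the minimum-variance unbiased combination when the individual errors are independent and Gaussian, which is why precise methods dominate the result.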

  11. A probabilistic approach to photovoltaic generator performance prediction

    NASA Astrophysics Data System (ADS)

    Khallat, M. A.; Rahman, S.

    1986-09-01

    A method for predicting the performance of a photovoltaic (PV) generator based on long-term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data are fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are then used to evaluate the capacity factors of PV panels or arrays. An example is presented to demonstrate the applicability of the procedure.
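
    As a rough sketch of the capacity-factor step, one can model insolation with a Weibull density and integrate the per-unit power output against it. The parameter values and the saturating linear power model below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def weibull_pdf(x, k, lam):
    """Weibull probability density with shape k and scale lam."""
    return (k / lam) * (x / lam) ** (k - 1) * np.exp(-((x / lam) ** k))

def capacity_factor(k, lam, rated_insolation):
    """Expected PV output as a fraction of rated output, assuming output
    rises linearly with insolation and saturates at the rated level."""
    x = np.linspace(1e-6, 5 * lam, 200_000)
    dx = x[1] - x[0]
    per_unit_power = np.minimum(x / rated_insolation, 1.0)
    # Riemann-sum approximation of E[min(X / rated, 1)]
    return float(np.sum(per_unit_power * weibull_pdf(x, k, lam)) * dx)

cf = capacity_factor(k=2.0, lam=500.0, rated_insolation=1000.0)
```

    Raising the rated insolation lowers the capacity factor, since the same insolation distribution then covers a smaller fraction of the panel's rating.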

  12. Statistical methods used in articles published by the Journal of Periodontal and Implant Science.

    PubMed

    Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young

    2014-12-01

    The purposes of this study were to assess the trend in the use of statistical methods, including parametric and nonparametric methods, and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 statistical methods counted). We found an increasing trend towards the application of statistical methods, and of nonparametric methods in particular, in recent periodontal studies, and thus concluded that increased use of complex statistical methodology might be preferred by researchers in the fields covered by JPIS.

  13. Transmit Designs for the MIMO Broadcast Channel With Statistical CSI

    NASA Astrophysics Data System (ADS)

    Wu, Yongpeng; Jin, Shi; Gao, Xiqi; McKay, Matthew R.; Xiao, Chengshan

    2014-09-01

    We investigate the multiple-input multiple-output broadcast channel with statistical channel state information available at the transmitter. The so-called linear assignment operation is employed, and necessary conditions are derived for the optimal transmit design under general fading conditions. Based on this, we introduce an iterative algorithm to maximize the linear assignment weighted sum-rate by applying a gradient descent method. To reduce complexity, we derive an upper bound on the linear assignment achievable rate of each receiver, from which a simplified closed-form expression for a near-optimal linear assignment matrix is derived. This reveals an interesting construction analogous to that of dirty-paper coding. In light of this, a low-complexity transmission scheme is provided. Numerical examples illustrate the strong performance of the proposed low-complexity scheme.

  14. What Can Be Learned from Inverse Statistics?

    NASA Astrophysics Data System (ADS)

    Ahlgren, Peter Toke Heden; Dahl, Henrik; Jensen, Mogens Høgh; Simonsen, Ingve

    One stylized fact of financial markets is an asymmetry between the most likely time to profit and to loss. This gain-loss asymmetry is revealed by inverse statistics, a method closely related to empirically finding first passage times. Many papers have presented evidence about the asymmetry, where it appears and where it does not. Also, various interpretations and explanations for the results have been suggested. In this chapter, we review the published results and explanations. We also examine the results and show that some are at best fragile. Similarly, we discuss the suggested explanations and propose a new model based on Gaussian mixtures. Apart from explaining the gain-loss asymmetry, this model also has the potential to explain other stylized facts such as volatility clustering, fat tails, and power law behavior of returns.

  15. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments, and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We then examined a quantitative approach based on an empirical equation and used multivariate procedures to validate the proposed quantitative methodology. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  16. Statistical Analysis of Human Body Movement and Group Interactions in Response to Music

    NASA Astrophysics Data System (ADS)

    Desmet, Frank; Leman, Marc; Lesaffre, Micheline; de Bruyn, Leen

    Quantification of time series that relate to physiological data is challenging for empirical music research. Up to now, most studies have focused on time-dependent responses of individual subjects in controlled environments. However, little is known about time-dependent responses of between-subject interactions in an ecological context. This paper provides new findings on the statistical analysis of group synchronicity in response to musical stimuli. Different statistical techniques were applied to time-dependent data obtained from an experiment on embodied listening in individual and group settings. Analyses of inter-group synchronicity are described. Dynamic Time Warping (DTW) and the Cross Correlation Function (CCF) were found to be valid methods for estimating group coherence of the resulting movements. It was found that synchronicity of movements between individuals (human-human interactions) increases significantly in the social context. Moreover, Analysis of Variance (ANOVA) revealed that the type of music is the predominant factor in both the individual and the social context.
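
    The DTW step used above to compare movement series can be illustrated with a textbook dynamic-programming implementation; this is a generic sketch with absolute difference as the local cost, not the authors' code:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.
    Builds the cumulative-cost table and returns the cost of the
    cheapest monotone alignment between the two series."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local cost
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

    Unlike plain cross-correlation, DTW gives a zero distance for a series and a locally time-stretched copy of it, which is what makes it suitable for comparing movements that are synchronized but not perfectly aligned in time.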

  17. A comparative study on effect of e-learning and instructor-led methods on nurses’ documentation competency

    PubMed Central

    Abbaszadeh, Abbas; Sabeghi, Hakimeh; Borhani, Fariba; Heydari, Abbas

    2011-01-01

    BACKGROUND: Accurate recording of nursing care reflects care performance and its quality; any failure in documentation can be a cause of inadequate patient care. Therefore, improving nurses’ skills in this field through effective educational methods is of high importance. Since traditional teaching methods are not well suited to communities with rapid knowledge expansion and constant change, e-learning methods can be a viable alternative. To show the importance of e-learning methods for nurses’ care-reporting skills, this study compared e-learning with traditional instructor-led methods. METHODS: This was a quasi-experimental study aimed at comparing the effect of two teaching methods (e-learning and lecture) on nursing documentation and examining differences in documentation competency between nurses who participated in e-learning (n = 30) and nurses in a lecture group (n = 31). RESULTS: The results of the present study indicated no statistically significant difference between the two groups. The findings also revealed no statistically significant correlation between the two groups with respect to demographic variables. However, given the benefits of e-learning over the traditional instructor-led method, and their equal effect on nurses’ documentation competency, e-learning can be a suitable substitute for the traditional instructor-led method. CONCLUSIONS: E-learning as a student-centered method, like the lecture method, promotes nurses’ documentation competency. Therefore, e-learning can be used to facilitate the implementation of nursing educational programs. PMID:22224113

  18. SEM method for direct visual tracking of nanoscale morphological changes of platinum based electrocatalysts on fixed locations upon electrochemical or thermal treatments.

    PubMed

    Zorko, Milena; Jozinović, Barbara; Bele, Marjan; Hodnik, Nejc; Gaberšček, Miran

    2014-05-01

    A general method for tracking morphological surface changes on a nanometer scale with scanning electron microscopy (SEM) is introduced. We exemplify the usefulness of the method by showing consecutive SEM images of an identical location before and after electrochemical and thermal treatments of platinum-based nanoparticles deposited on a high-surface-area carbon. The observations give insight into platinum-based catalyst degradation occurring during potential cycling. The presence of chloride clearly increases the rate of degradation. Under these conditions, the dominant degradation mechanism appears to be platinum dissolution with some subsequent redeposition on top of the catalyst film. By contrast, at a temperature of 60°C under potentiostatic conditions, some carbon corrosion and particle aggregation were observed. Temperature treatment simulating the annealing step of the synthesis reveals sintering of small platinum-based composite aggregates into uniform spherical particles. The method provides direct proof of induced surface phenomena at a chosen location, without the statistical uncertainty inherent in random SEM observations across relatively large surface areas. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. [Cognitive impairments in persons exposed to radiation during the period of prenatal development].

    PubMed

    Burtovaya, E Yu; Kantina, T E; Belova, M V; Akleyev, A V

    2015-01-01

    The aim was to assess the cognitive status of persons exposed to ionizing radiation in the prenatal period. The study included in-utero exposed people (n = 77) and a comparison group (n = 73) of people who lived in territories of the Chelyabinsk Oblast that were not radioactively contaminated. The following methods were used: clinical, clinical-psychological (Mini-Mental State Examination (MMSE), the WAIS test, the proverb interpretation task), neurophysiological (EEG), laboratory-based (cholesterol, high- and low-density lipoproteins, triglycerides, cortisol, melatonin), and statistical data processing. The number of people with non-psychotic mental disorders, with a prevalence of organic mental disorders (cognitive and asthenic), was significantly higher among in-utero exposed subjects. A neurophysiological study revealed more severe changes in bioelectric brain activity, with the presence of pathological theta-rhythms, in exposed persons. The clinical-psychological study revealed a significant decrease in analytic/synthetic ability in exposed people and significantly lower general and verbal IQ. These changes were accompanied by higher levels of cortisol and melatonin, which point to activation and strain of the adaptation mechanisms in in-utero exposed subjects.

  20. Grand average ERP-image plotting and statistics: A method for comparing variability in event-related single-trial EEG activities across subjects and conditions

    PubMed Central

    Delorme, Arnaud; Miyakoshi, Makoto; Jung, Tzyy-Ping; Makeig, Scott

    2014-01-01

    With the advent of modern computing methods, modeling trial-to-trial variability in biophysical recordings, including electroencephalography (EEG), has become of increasing interest. Yet no widely used method exists for comparing variability in ordered collections of single-trial data epochs across conditions and subjects. We have developed a method based on an ERP-image visualization tool in which the potential, spectral power, or some other measure at each time point in a set of event-related single-trial data epochs is represented as a color-coded horizontal line; the lines are then stacked to form a 2-D colored image. Moving-window smoothing across trial epochs can make otherwise hidden event-related features in the data more perceptible. Stacking trials in different orders, for example by subject reaction time, by context-related information such as inter-stimulus interval, or by some other characteristic of the data (e.g., latency-window mean power or phase of some EEG source), can reveal aspects of the multifold complexities of trial-to-trial EEG data variability. This study demonstrates new methods for computing and visualizing grand ERP-image plots across subjects and for performing robust statistical testing on the resulting images. These methods have been implemented and made freely available in the EEGLAB signal-processing environment that we maintain and distribute. PMID:25447029

  1. Statistical Validation of Automatic Methods for Hippocampus Segmentation in MR Images of Epileptic Patients

    PubMed Central

    Hosseini, Mohammad-Parsa; Nazem-Zadeh, Mohammad R.; Pompili, Dario; Soltanian-Zadeh, Hamid

    2015-01-01

    Hippocampus segmentation is a key step in the evaluation of mesial Temporal Lobe Epilepsy (mTLE) on MR images. Several automated segmentation methods have been introduced for medical image segmentation. Because of multiple edges, missing boundaries, and shape changes along its longitudinal axis, manual outlining still remains the benchmark for hippocampus segmentation; it is, however, impractical for large datasets due to time constraints. In this study, four automatic methods, namely FreeSurfer, Hammer, Automatic Brain Structure Segmentation (ABSS), and LocalInfo segmentation, are evaluated to find the most accurate and applicable method that best matches the manual benchmark. Results from these four methods are compared against those obtained using manual segmentation for T1-weighted images of 157 symptomatic mTLE patients. For performance evaluation of automatic segmentation, the Dice coefficient, Hausdorff distance, precision, and Root Mean Square (RMS) distance are extracted and compared. Among these four automated methods, ABSS generates the most accurate results, and its reproducibility is closest to that of expert manual outlining by statistical validation. Considering p-value < 0.05, the performance measurements for ABSS reveal that Dice is 4%, 13%, and 17% higher, Hausdorff is 23%, 87%, and 70% lower, precision is 5%, -5%, and 12% higher, and RMS is 19%, 62%, and 65% lower compared to LocalInfo, FreeSurfer, and Hammer, respectively. PMID:25571043
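
    The Dice coefficient used above as the primary overlap measure has a simple definition, 2|A∩B| / (|A| + |B|); a minimal sketch for binary masks (the tie-breaking convention for two empty masks is an assumption):

```python
import numpy as np

def dice_coefficient(seg_a, seg_b):
    """Dice overlap between two binary segmentation masks:
    2|A∩B| / (|A| + |B|); 1.0 means perfect agreement, 0.0 no overlap."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return float(2.0 * np.logical_and(a, b).sum() / denom)
```

    The masks can be flattened 3-D volumes; Dice weights agreement on the structure itself rather than the (much larger) background, which is why it is preferred over plain voxel accuracy for small structures like the hippocampus.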

  2. Precise Evaluation of Anthropometric 2D Software Processing of Hand in Comparison with Direct Method

    PubMed Central

    Habibi, Ehsanollah; Soury, Shiva; Zadeh, Akbar Hasan

    2013-01-01

    Various studies have been carried out on photo anthropometry, but each had deficiencies that have been resolved over the years. The objective of this paper is to test the efficiency of two-dimensional image-processing software in photo anthropometry of the hand. In this applied research, 204 office and industrial workers were selected. Their hands were measured by both manual and photo anthropometric methods. By designing the “Hand Photo Anthropometry Set,” we fixed the angle and distance of the camera in all of the photos, so that some of the common mistakes in the photo anthropometric method were controlled. The photos were analyzed with Digimizer software, version 4.1.1.0, and a digital caliper (Mitutoyo Corp., Tokyo, Japan) was used for the manual method. A t-test on the data revealed no significant difference between the manual and photo anthropometric results (P > 0.05), and the correlation coefficients for hand dimensions were similar in both methods, in the range 0.71-0.95. The statistical analyses showed that photo anthropometry can replace manual methods. Furthermore, it can be of great help in developing an anthropometric database for work-glove manufacturers. Since hand anthropometry is a necessary input for tool design, this survey can be used to determine the percentiles of workers’ hands. PMID:24696802

  3. Why Is Statistics Perceived as Difficult and Can Practice during Training Change Perceptions? Insights from a Prospective Mathematics Teacher

    ERIC Educational Resources Information Center

    Fitzmaurice, Olivia; Leavy, Aisling; Hannigan, Ailish

    2014-01-01

    An investigation into prospective mathematics/statistics teachers' (n = 134) conceptual understanding of statistics and attitudes to statistics carried out at the University of Limerick revealed an overall positive attitude to statistics but a perception that it can be a difficult subject, in particular that it requires a great deal of discipline…

  4. Downscaling Thermal Infrared Radiance for Subpixel Land Surface Temperature Retrieval

    PubMed Central

    Liu, Desheng; Pu, Ruiliang

    2008-01-01

    Land surface temperature (LST) retrieved from satellite thermal sensors often consists of mixed temperature components. Retrieving subpixel LST is therefore needed in various environmental and ecological studies. In this paper, we developed two methods for downscaling coarse-resolution thermal infrared (TIR) radiance for the purpose of subpixel temperature retrieval. The first method was developed on the basis of a scale-invariant physical model of TIR radiance. The second method was based on a statistical relationship between TIR radiance and land cover fraction at high spatial resolution. The two methods were applied to downscale simulated 990-m ASTER TIR data to 90-m resolution. When validated against the original 90-m ASTER TIR data, the results revealed that both downscaling methods were successful in capturing the general patterns of the original data and resolving considerable spatial details. Further quantitative assessments indicated a strong agreement between the true values and the estimated values by both methods. PMID:27879844

  5. Downscaling Thermal Infrared Radiance for Subpixel Land Surface Temperature Retrieval.

    PubMed

    Liu, Desheng; Pu, Ruiliang

    2008-04-06

    Land surface temperature (LST) retrieved from satellite thermal sensors often consists of mixed temperature components. Retrieving subpixel LST is therefore needed in various environmental and ecological studies. In this paper, we developed two methods for downscaling coarse-resolution thermal infrared (TIR) radiance for the purpose of subpixel temperature retrieval. The first method was developed on the basis of a scale-invariant physical model of TIR radiance. The second method was based on a statistical relationship between TIR radiance and land cover fraction at high spatial resolution. The two methods were applied to downscale simulated 990-m ASTER TIR data to 90-m resolution. When validated against the original 90-m ASTER TIR data, the results revealed that both downscaling methods were successful in capturing the general patterns of the original data and resolving considerable spatial details. Further quantitative assessments indicated a strong agreement between the true values and the estimated values by both methods.

  6. A method of 2D/3D registration of a statistical mouse atlas with a planar X-ray projection and an optical photo.

    PubMed

    Wang, Hongkai; Stout, David B; Chatziioannou, Arion F

    2013-05-01

    The development of sophisticated and high-throughput whole body small animal imaging technologies has created a need for improved image analysis and increased automation. The registration of a digital mouse atlas to individual images is a prerequisite for automated organ segmentation and uptake quantification. This paper presents a fully automatic method for registering a statistical mouse atlas with individual subjects based on an anterior-posterior X-ray projection and a lateral optical photo of the mouse silhouette. The mouse atlas was trained as a statistical shape model based on 83 organ-segmented micro-CT images. For registration, a hierarchical approach is applied that first registers high-contrast organs and then estimates low-contrast organs based on the registered high-contrast organs. To register the high-contrast organs, a 2D-registration-back-projection strategy is used that deforms the 3D atlas based on the 2D registrations of the atlas projections. For validation, this method was evaluated using 55 subjects from preclinical mouse studies. The results showed that this method can compensate for moderate variations of animal postures and organ anatomy. Two different metrics, the Dice coefficient and the average surface distance, were used to assess the registration accuracy of major organs. The Dice coefficients vary from 0.31 ± 0.16 for the spleen to 0.88 ± 0.03 for the whole body, and the average surface distance varies from 0.54 ± 0.06 mm for the lungs to 0.85 ± 0.10 mm for the skin. The method was compared with a direct 3D deformation optimization (without 2D-registration-back-projection) and a single-subject atlas registration (instead of using the statistical atlas). The comparison revealed that the 2D-registration-back-projection strategy significantly improved the registration accuracy, and the use of the statistical mouse atlas led to more plausible organ shapes than the single-subject atlas. 
This method was also tested with shoulder xenograft tumor-bearing mice, and the results showed that the registration accuracy of most organs was not significantly affected by the presence of shoulder tumors, except for the lungs and the spleen. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Application of short-data methods on extreme surge levels

    NASA Astrophysics Data System (ADS)

    Feng, X.

    2014-12-01

    Tropical cyclone-induced storm surges are among the most destructive natural hazards impacting the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short. The limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets spanning less than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. Verified water levels from water gauges along the Southwest and Southeast Florida coasts, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically for short data sets. Incorporating the influence of global warming, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced levels of extreme surge activity are observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict extreme surge levels under warming sea surface temperatures. The non-stationary GEV distribution indicates that with 1 °C of sea surface temperature warming relative to the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The statistical approaches considered for extreme surge estimation from short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and the hurricane and storm surge forecasting and warning system.
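
    Return levels such as the 100-year surge quoted above follow directly from inverting the GEV distribution function fitted to annual maxima. A sketch of the standard formula (the parameter values in the example are illustrative, not the study's fitted values):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) distribution of annual
    maxima: the level exceeded on average once every T years, obtained by
    solving F(z) = 1 - 1/T for z."""
    y = -math.log(1.0 - 1.0 / T)  # -log of the non-exceedance probability
    if abs(xi) < 1e-12:           # Gumbel limit as the shape -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)
```

    A positive shape parameter xi gives a heavy upper tail, so return levels keep growing substantially with T; in a non-stationary fit like the one above, mu (and possibly sigma) would be made functions of sea surface temperature.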

  8. Exercise reduces depressive symptoms in adults with arthritis: Evidential value

    PubMed Central

    Kelley, George A; Kelley, Kristi S

    2016-01-01

    AIM To determine whether evidential value exists that exercise reduces depression in adults with arthritis and other rheumatic conditions. METHODS Utilizing data derived from a prior meta-analysis of 29 randomized controlled trials comprising 2449 participants (1470 exercise, 979 control) with fibromyalgia, osteoarthritis, rheumatoid arthritis, or systemic lupus erythematosus, a new method, P-curve, was utilized to assess evidential value as well as to rule out selective reporting of statistically significant results regarding exercise and depression in adults with arthritis and other rheumatic conditions. Using the method of Stouffer, Z-scores were calculated to examine selective-reporting bias. An alpha (P) value < 0.05 was deemed statistically significant. In addition, the average power of the tests included in the P-curve, adjusted for publication bias, was calculated. RESULTS Fifteen of 29 studies (51.7%) reported statistically significant (P < 0.05) results for exercise reducing depression, while none of the results were statistically significant with respect to exercise increasing depression in adults with arthritis and other rheumatic conditions. Right-skew, dismissing selective reporting, was identified (Z = −5.28, P < 0.0001). In addition, the included studies did not lack evidential value (Z = 2.39, P = 0.99), nor did they lack evidential value and were P-hacked (Z = 5.28, P > 0.99). The relative frequencies of P-values were 66.7% at 0.01, 6.7% each at 0.02 and 0.03, 13.3% at 0.04, and 6.7% at 0.05. The average power of the tests included in the P-curve, corrected for publication bias, was 69%. Diagnostic plot results revealed that the observed power estimate was a better fit than the alternatives. CONCLUSION The evidential value results provide additional support that exercise reduces depression in adults with arthritis and other rheumatic conditions. PMID:27489782
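
    The Stouffer combination used above sums the normal quantiles of the one-sided p-values and rescales by the square root of their count. A minimal sketch (the p-values passed in any example call are hypothetical, not the trial results):

```python
import math
from statistics import NormalDist

def stouffer_z(p_values):
    """Combine one-sided p-values into a single Z via Stouffer's method:
    Z = sum(z_i) / sqrt(k), where z_i is the standard normal quantile
    of 1 - p_i for the i-th test."""
    nd = NormalDist()
    zs = [nd.inv_cdf(1.0 - p) for p in p_values]
    return sum(zs) / math.sqrt(len(zs))
```

    Several moderately significant results combine into a strongly significant Z because independent evidence accumulates, which is what lets P-curve distinguish genuine evidential value from selective reporting.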

  9. Statistics provide guidance for indigenous organic carbon detection on Mars missions.

    PubMed

    Sephton, Mark A; Carter, Jonathan N

    2014-08-01

    Data from the Viking and Mars Science Laboratory missions indicate the presence of organic compounds that are not definitively martian in origin. Both contamination and confounding mineralogies have been suggested as alternatives to indigenous organic carbon. Intuitive thought suggests that we are repeatedly obtaining data that confirms the same level of uncertainty. Bayesian statistics may suggest otherwise. If an organic detection method has a true positive to false positive ratio greater than one, then repeated organic matter detection progressively increases the probability of indigeneity. Bayesian statistics also reveal that methods with higher ratios of true positives to false positives give higher overall probabilities and that detection of organic matter in a sample with a higher prior probability of indigenous organic carbon produces greater confidence. Bayesian statistics, therefore, provide guidance for the planning and operation of organic carbon detection activities on Mars. Suggestions for future organic carbon detection missions and instruments are as follows: (i) On Earth, instruments should be tested with analog samples of known organic content to determine their true positive to false positive ratios. (ii) On the mission, for an instrument with a true positive to false positive ratio above one, it should be recognized that each positive detection of organic carbon will result in a progressive increase in the probability of indigenous organic carbon being present; repeated measurements, therefore, can overcome some of the deficiencies of a less-than-definitive test. (iii) For a fixed number of analyses, the highest true positive to false positive ratio method or instrument will provide the greatest probability that indigenous organic carbon is present. 
(iv) On Mars, analyses should concentrate on samples with highest prior probability of indigenous organic carbon; intuitive desires to contrast samples of high prior probability and low prior probability of indigenous organic carbon should be resisted.
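
    The core Bayesian argument, that repeated positive detections with a true-positive to false-positive ratio above one progressively raise the probability of indigenous organic carbon, can be sketched in odds form. The rates and prior below are hypothetical, and the sketch assumes the detections are independent:

```python
def posterior_indigenous(prior, tpr, fpr, n_detections):
    """Posterior probability that organic carbon is indigenous after
    n positive detections, for a test with the given true-positive and
    false-positive rates, via Bayes' rule in odds form:
    posterior odds = prior odds * (tpr / fpr) ** n."""
    prior_odds = prior / (1.0 - prior)
    likelihood_ratio = (tpr / fpr) ** n_detections
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

    With tpr/fpr > 1 the posterior climbs toward 1 with each detection, so a less-than-definitive test still becomes convincing when repeated; with tpr == fpr the detections carry no information and the posterior stays at the prior.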

  10. Comparative evaluation of topographical data of dental implant surfaces applying optical interferometry and scanning electron microscopy.

    PubMed

    Kournetas, N; Spintzyk, S; Schweizer, E; Sawada, T; Said, F; Schmid, P; Geis-Gerstorfer, J; Eliades, G; Rupp, F

    2017-08-01

    Comparability of topographical data of implant surfaces in the literature is low, and their clinical relevance is often equivocal. The aim of this study was to investigate the ability of scanning electron microscopy and optical interferometry to produce statistically similar three-dimensional roughness parameters and to evaluate these data against predefined criteria regarded as relevant for a favorable biological response. Four different commercial dental screw-type implants (NanoTite Certain Prevail, TiUnite Brånemark Mk III, XiVE S Plus and SLA Standard Plus) were analyzed by stereo scanning electron microscopy and white-light interferometry. Surface height, spatial, and hybrid roughness parameters (Sa, Sz, Ssk, Sku, Sal, Str, Sdr) were assessed from raw and filtered data (Gaussian 50 μm and 5 μm cut-off filters), respectively. Data were statistically compared by one-way ANOVA and the Tukey-Kramer post-hoc test. For a clinically relevant interpretation, a categorizing evaluation approach based on predefined threshold criteria for each roughness parameter was used. The two methods exhibited predominantly statistical differences. Depending on roughness parameters and filter settings, both methods showed variations in rankings of the implant surfaces and differed in their ability to discriminate the different topographies. Overall, the analyses revealed scale-dependent roughness data. Compared to the purely statistical approach, the categorizing evaluation resulted in many more similarities between the two methods. This study suggests reconsidering current approaches to the topographical evaluation of implant surfaces and further seeking appropriate experimental settings. Furthermore, the specific role of different roughness parameters in the bioresponse has to be studied in detail in order to better define clinically relevant, scale-dependent, and parameter-specific thresholds and ranges. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  11. A comparative study on effect of e-learning and instructor-led methods on nurses' documentation competency.

    PubMed

    Abbaszadeh, Abbas; Sabeghi, Hakimeh; Borhani, Fariba; Heydari, Abbas

    2011-01-01

    Accurate recording of nursing care reflects care performance and its quality; any failure in documentation can be a cause of inadequate patient care. Therefore, improving nurses' skills in this field through effective educational methods is of high importance. Since traditional teaching methods are not well suited to communities with rapid knowledge expansion and constant change, e-learning methods can be a viable alternative. To show the importance of e-learning methods for nurses' care-reporting skills, this study compared e-learning with traditional instructor-led methods. This was a quasi-experimental study aimed at comparing the effect of two teaching methods (e-learning and lecture) on nursing documentation and examining differences in documentation competency between nurses who participated in e-learning (n = 30) and nurses in a lecture group (n = 31). The results of the present study indicated no statistically significant difference between the two groups. The findings also revealed no statistically significant correlation between the two groups with respect to demographic variables. However, given the benefits of e-learning over the traditional instructor-led method, and their equal effect on nurses' documentation competency, e-learning can be a suitable substitute for the traditional instructor-led method. E-learning as a student-centered method, like the lecture method, promotes nurses' documentation competency. Therefore, e-learning can be used to facilitate the implementation of nursing educational programs.

  12. Statistical inference methods for sparse biological time series data.

    PubMed

    Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita

    2011-04-25

    Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values <0.0001). We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. 
The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
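    As a rough illustration of the model class the authors found best, the sketch below fits a three-parameter logistic to synthetic sparse time-series data with SciPy. It is a fixed-effects toy, not the full nonlinear mixed-effects analysis of the paper, and all parameter values and noise levels are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic3(t, a, b, c):
    """Three-parameter logistic: asymptote a, growth rate b, inflection time c."""
    return a / (1.0 + np.exp(-b * (t - c)))

rng = np.random.default_rng(1)
t = np.linspace(0, 30, 15)            # sparse sampling times (arbitrary units)
true_params = (10.0, 0.4, 12.0)       # hypothetical "true" parameters
y = logistic3(t, *true_params) + rng.normal(0, 0.2, t.size)

# Nonlinear least-squares fit; p0 is a rough starting guess
popt, pcov = curve_fit(logistic3, t, y, p0=(8.0, 0.5, 10.0))
print("fitted a, b, c:", popt)
```

A full analysis would add subject-level random effects and compare nested models with likelihood ratio tests, as the abstract describes.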

  13. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods.

    PubMed

    May, Michael R; Moore, Brian R

    2016-11-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of: (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts; (2) the ad hoc algorithm used to visit diversification models; and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified ≈30% of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers-in order to clarify whether these methods can make reliable inferences from empirical datasets-and to theoretical biologists-in order to clarify the specific problems that need to be solved in order to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
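    The AIC step at the core of MEDUSA-style model selection can be illustrated generically. The sketch below compares polynomial regression models under a Gaussian likelihood (an assumption chosen for illustration; MEDUSA itself uses a birth-death likelihood on phylogenies):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 0.3, x.size)  # data generated by a linear model

def gaussian_aic(y, yhat, k):
    """AIC = 2k - 2 ln L for a least-squares fit with estimated noise variance."""
    n = y.size
    rss = np.sum((y - yhat) ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * k - 2 * loglik

aics = {}
for degree in (0, 1, 5):                 # candidate models of increasing complexity
    coef = np.polyfit(x, y, degree)
    yhat = np.polyval(coef, x)
    aics[degree] = gaussian_aic(y, yhat, k=degree + 2)  # coefficients + variance
print("best degree by AIC:", min(aics, key=aics.get))
```

The penalty term 2k is exactly what the simulation study interrogates: with many candidate models, the penalty may not suffice to prevent spurious "discoveries".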

  14. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods

    PubMed Central

    May, Michael R.; Moore, Brian R.

    2016-01-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of: (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts; (2) the ad hoc algorithm used to visit diversification models; and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified ≈30% of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers—in order to clarify whether these methods can make reliable inferences from empirical datasets—and to theoretical biologists—in order to clarify the specific problems that need to be solved in order to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] PMID:27037081

  15. The critical period hypothesis in second language acquisition: a statistical critique and a reanalysis.

    PubMed

    Vanhove, Jan

    2013-01-01

    In second language acquisition research, the critical period hypothesis (cph) holds that the function between learners' age and their susceptibility to second language input is non-linear. This paper revisits the indistinctness found in the literature with regard to this hypothesis's scope and predictions. Even when its scope is clearly delineated and its predictions are spelt out, however, empirical studies-with few exceptions-use analytical (statistical) tools that are irrelevant with respect to the predictions made. This paper discusses statistical fallacies common in cph research and illustrates an alternative analytical method (piecewise regression) by means of a reanalysis of two datasets from a 2010 paper purporting to have found cross-linguistic evidence in favour of the cph. This reanalysis reveals that the specific age patterns predicted by the cph are not cross-linguistically robust. Applying the principle of parsimony, it is concluded that age patterns in second language acquisition are not governed by a critical period. To conclude, this paper highlights the role of confirmation bias in the scientific enterprise and appeals to second language acquisition researchers to reanalyse their old datasets using the methods discussed in this paper. The data and R commands that were used for the reanalysis are provided as supplementary materials.
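    The piecewise (segmented) regression advocated in this paper can be sketched with a hinge-term design matrix on synthetic age-of-acquisition data. The breakpoint is assumed known here for simplicity; a full analysis would estimate it (all values below are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic data: attainment declines steeply before a breakpoint at age 15,
# then flattens -- the kind of non-linear age pattern the cph predicts.
age = rng.uniform(1, 40, 200)
bp = 15.0
score = 100 - 3.0 * np.minimum(age, bp) + rng.normal(0, 2, age.size)

# Design matrix: intercept, age, and a hinge term max(age - bp, 0)
X = np.column_stack([np.ones_like(age), age, np.maximum(age - bp, 0)])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
slope_before = beta[1]
slope_after = beta[1] + beta[2]
print("slope before/after breakpoint:", slope_before, slope_after)
```

A significant hinge coefficient (beta[2]) indicates a change of slope at the breakpoint, which is the pattern-level prediction the reanalysis tests.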

  16. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    PubMed

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen the emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
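    Stationary, linear, Gaussian surrogate time series of the kind mentioned above are commonly generated by phase randomization (an assumption about the construction, which the abstract does not specify). A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 256
# Toy signal standing in for a voxel/component time course
x = np.sin(np.linspace(0, 20 * np.pi, n)) + rng.normal(0, 0.5, n)

# Phase-randomized surrogate: identical power spectrum (hence identical linear,
# stationary, Gaussian structure), but scrambled phase relationships.
X = np.fft.rfft(x)
phases = rng.uniform(0, 2 * np.pi, X.size)
phases[0] = 0.0     # keep the DC component real
phases[-1] = 0.0    # keep the Nyquist component real (n is even)
surrogate = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n)

# The surrogate preserves the amplitude spectrum of the original signal
print(np.allclose(np.abs(np.fft.rfft(surrogate)), np.abs(X)))
```

Summary measures computed on the real data can then be compared against their distribution over many such surrogates, which is the null-model logic the abstract describes.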

  17. Replicability of time-varying connectivity patterns in large resting state fMRI samples

    PubMed Central

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L.; Stephen, Julia M.; Claus, Eric D.; Mayer, Andrew R.; Calhoun, Vince D.

    2018-01-01

    The past few years have seen the emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain’s inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. PMID:28916181

  18. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  19. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    PubMed Central

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  20. Measuring Student Learning in Social Statistics: A Pretest-Posttest Study of Knowledge Gain

    ERIC Educational Resources Information Center

    Delucchi, Michael

    2014-01-01

    This study used a pretest-posttest design to measure student learning in undergraduate statistics. Data were derived from 185 students enrolled in six different sections of a social statistics course taught over a seven-year period by the same sociology instructor. The pretest-posttest instrument reveals statistically significant gains in…
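    A pretest-posttest knowledge-gain analysis of this kind typically reduces to a paired t-test. The sketch below uses invented scores; only the sample size of 185 is taken from the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 185                                    # class size reported in the study
pretest = rng.normal(55, 10, n)            # hypothetical pretest scores
posttest = pretest + rng.normal(8, 6, n)   # hypothetical average gain of ~8 points

# Paired (dependent-samples) t-test on the pre/post difference
t_stat, p_value = stats.ttest_rel(posttest, pretest)
gain = (posttest - pretest).mean()
print(f"mean gain = {gain:.1f}, t = {t_stat:.1f}, p = {p_value:.2g}")
```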

  1. An Improved LC-ESI-MS/MS Method to Quantify Pregabalin in Human Plasma and Dry Plasma Spot for Therapeutic Monitoring and Pharmacokinetic Applications.

    PubMed

    Dwivedi, Jaya; Namdev, Kuldeep K; Chilkoti, Deepak C; Verma, Surajpal; Sharma, Swapnil

    2018-06-06

    Therapeutic drug monitoring (TDM) of anti-epileptic drugs provides a valid clinical tool in optimization of overall therapy. However, TDM is challenging due to the high storage/shipment costs of biological samples (plasma/blood) and the limited availability of laboratories providing TDM services. Sampling in the form of dry plasma spots (DPS) or dry blood spots (DBS) is a suitable alternative to overcome these issues. An improved, simple, rapid, and stability-indicating method for quantification of pregabalin in human plasma and DPS has been developed and validated. Analyses were performed on a liquid chromatography tandem mass spectrometer under positive ionization mode of the electrospray interface. Pregabalin-d4 was used as the internal standard, and the chromatographic separations were performed on a Poroshell 120 EC-C18 column using an isocratic mobile phase at a flow rate of 1 mL/min. Stability of pregabalin in DPS was evaluated under simulated real-time conditions. Extraction procedures from plasma and DPS samples were compared using statistical tests. The method was validated in accordance with the FDA method validation guideline. The method was linear over the concentration ranges of 20-16000 ng/mL and 100-10000 ng/mL in plasma and DPS, respectively. DPS samples were found stable for only one week upon storage at room temperature and for at least four weeks at freezing temperature (-20 ± 5 °C). The method was applied for quantification of pregabalin in over 600 samples of a clinical study. Statistical analyses revealed that the two extraction procedures in plasma and DPS samples showed no statistically significant difference and can be used interchangeably without bias. The proposed method involves simple and rapid sample-processing steps that do not require a pre- or post-column derivatization procedure. The method is suitable for routine pharmacokinetic analysis and therapeutic monitoring of pregabalin.
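    The calibration step behind such a quantification range can be sketched as a weighted linear fit with back-calculation. The concentration levels follow the reported plasma range, but the response factor, the 1/x² weighting scheme, and the noise level are assumptions made for illustration:

```python
import numpy as np

# Hypothetical calibration standards spanning the reported plasma range (ng/mL)
conc = np.array([20, 50, 200, 1000, 4000, 8000, 16000], dtype=float)
rng = np.random.default_rng(5)
# Simulated detector response (peak-area ratio) with 2% proportional noise
response = 0.002 * conc * rng.normal(1.0, 0.02, conc.size)

# polyfit minimizes sum((w*(y - fit))**2), so w = 1/conc gives the common
# 1/x^2 weighting used for wide LC-MS/MS calibration ranges
w = 1.0 / conc
slope, intercept = np.polyfit(conc, response, 1, w=w)

# Back-calculate each standard and express recovery in percent
back_calc = (response - intercept) / slope
recovery = 100 * back_calc / conc
print("per-level recovery (%):", np.round(recovery, 1))
```

In validation practice, back-calculated recoveries are checked against acceptance limits at every calibration level.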

  2. Nigerian pharmacists’ self-perceived competence and confidence to plan and conduct pharmacy practice research

    PubMed Central

    Usman, Mohammad N.; Umar, Muhammad D.

    2018-01-01

    Background: Recent studies have revealed that pharmacists have an interest in conducting research. However, lack of confidence is a major barrier. Objective: This study evaluated pharmacists’ self-perceived competence and confidence to plan and conduct health-related research. Method: This cross-sectional study was conducted during the 89th Annual National Conference of the Pharmaceutical Society of Nigeria in November 2016. An adapted questionnaire was validated and administered to 200 pharmacist delegates during the conference. Result: Overall, 127 questionnaires were included in the analysis. At least 80% of the pharmacists had previous health-related research experience. Pharmacists’ competence and confidence scores were lowest for research skills such as using software for statistical analysis, choosing and applying appropriate inferential statistical tests and methods, and outlining a detailed statistical plan to be used in data analysis. The highest competence and confidence scores were observed for conception of a research idea, literature search and critical appraisal of literature. Pharmacists with previous research experience had higher competence and confidence scores than those with no previous research experience (p<0.05). The only predictor of moderate-to-extreme self-competence and confidence was having at least one journal article publication during the last 5 years. Conclusion: Nigerian pharmacists indicated interest in participating in health-related research. However, self-competence and confidence to plan and conduct research were low, particularly for skills related to statistical analysis. Training programs and the building of a Pharmacy Practice Research Network are recommended to enhance pharmacists’ research capacity. PMID:29619141

  3. Imaging Young Stellar Objects with VLTi/PIONIER

    NASA Astrophysics Data System (ADS)

    Kluska, J.; Malbet, F.; Berger, J.-P.; Benisty, M.; Lazareff, B.; Le Bouquin, J.-B.; Baron, F.; Dominik, C.; Isella, A.; Juhasz, A.; Kraus, S.; Lachaume, R.; Ménard, F.; Millan-Gabet, R.; Monnier, J.; Pinte, C.; Soulez, F.; Tallon, M.; Thi, W.-F.; Thiébaut, É.; Zins, G.

    2014-04-01

    Optical interferometry imaging is designed to help us reveal complex astronomical sources without a prior model. Among these complex objects are young stars and their environments, which have a typical morphology with a point-like source surrounded by circumstellar material of unknown morphology. To image them, we have developed a numerical method that completely removes the stellar point source and reconstructs the rest of the image, using the differences in spectral behavior between the star and its circumstellar material. We aim to reveal the first astronomical units of these objects, where many physical phenomena could interplay: dust sublimation causing a puffed-up inner rim, a dusty halo, a dusty wind or an inner gaseous component. To investigate these regions more deeply, we carried out the first Large Program survey of HAeBe stars with two main goals: statistics on the geometry of these objects at the scale of the first astronomical unit, and imaging of their very close environment. The images reveal the environment, which is not polluted by the star, and allow us to derive the best fit for the flux ratio and the spectral slope. We present the first images from this survey and the application of the imaging method to other astronomical objects.

  4. 28 CFR 22.27 - Notification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL...: (1) That the information will only be used or revealed for research or statistical purposes; and (2... or statistical purposes; and (3) That participation in the project in question is voluntary and may...

  5. 28 CFR 22.27 - Notification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE CONFIDENTIALITY OF IDENTIFIABLE RESEARCH AND STATISTICAL...: (1) That the information will only be used or revealed for research or statistical purposes; and (2... or statistical purposes; and (3) That participation in the project in question is voluntary and may...

  6. Dental and Chronological Ages as Determinants of Peak Growth Period and Its Relationship with Dental Calcification Stages

    PubMed Central

    Litsas, George; Lucchese, Alessandra

    2016-01-01

    Purpose: To investigate the relationship between dental, chronological, and cervical vertebral maturation growth in the peak growth period, as well as to study the association between the dental calcification phases and the skeletal maturity stages during the same growth period. Methods: Subjects were selected from orthodontic pre-treatment cohorts consisting of 420 subjects, of whom 255 (145 girls and 110 boys) were identified and enrolled in the study. The lateral cephalometric and panoramic radiographs were examined from the archives of the Department of Orthodontics, Aristotle University of Thessaloniki, Greece. Dental age was assessed according to the method of Demirjian, and skeletal maturation according to the Cervical Vertebral Maturation Method. Statistical analysis included the Spearman-Brown formula, descriptive statistics, Pearson’s correlation coefficient and regression analysis, paired-samples t-test, and Spearman’s rho correlation coefficient. Results: Chronological and dental age showed a high correlation for both genders (r = 0.741 for boys, r = 0.770 for girls, p<0.001). The strongest correlation was for CVM Stage IV for both males (r = 0.554) and females (r = 0.68). The lowest correlation was for CVM Stage III in males (r = 0.433, p<0.001) and for CVM Stage II in females (r = 0.393, p>0.001). The t-test revealed statistically significant differences between these variables (p<0.001) during the peak period. A statistically significant correlation (p<0.001) between tooth calcification and CVM stages was determined. The second molars showed the highest correlation with CVM stages (CVMS) (r = 0.65 for boys, r = 0.72 for girls). Conclusion: Dental age was more advanced than chronological age for both boys and girls for all CVMS. During the peak period these differences were more pronounced. Moreover, all correlations between skeletal and dental stages were statistically significant. The second molars showed the highest correlation, whereas the canines showed the lowest, for both genders. PMID:27335610
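    The correlation-plus-paired-t-test analysis reported above can be sketched generically. The data below are synthetic, with dental age running slightly ahead of chronological age as the abstract describes; only the girls' sample size of 145 is taken from the record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical chronological vs. Demirjian dental ages (years)
chron_age = rng.uniform(8, 16, 145)
dental_age = chron_age + 0.8 + rng.normal(0, 1.0, chron_age.size)

# Pearson correlation between the two age estimates
r, p = stats.pearsonr(chron_age, dental_age)
# Paired t-test: is dental age systematically advanced?
t_stat, p_paired = stats.ttest_rel(dental_age, chron_age)
advance = np.mean(dental_age - chron_age)
print(f"r = {r:.2f} (p = {p:.1g}); mean dental advance = {advance:.2f} y")
```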

  7. In pursuit of a science of agriculture: the role of statistics in field experiments.

    PubMed

    Parolini, Giuditta

    2015-09-01

    Since the beginning of the twentieth century, statistics has reshaped the experimental cultures of agricultural research, taking part in the subtle dialectic between the epistemic and the material that is proper to experimental systems. This transformation has become especially relevant in field trials, and the paper will examine the British agricultural institution, Rothamsted Experimental Station, where statistical methods nowadays popular in the planning and analysis of field experiments were developed in the 1920s. At Rothamsted statistics promoted randomisation over systematic arrangements, factorisation over one-question trials, and emphasised the importance of the experimental error in assessing field trials. These changes in methodology also transformed the material culture of agricultural science, and a new body, the Field Plots Committee, was created to manage the field research of the agricultural institution. Although successful, the vision of field experimentation proposed by the Rothamsted statisticians was not unproblematic. Experimental scientists closely linked to the farming community questioned it in favour of field research that could be more easily understood by farmers. The clash between the two agendas reveals how the role attributed to statistics in field experimentation defined different pursuits of agricultural research, alternately conceived of as a scientists' science or as a farmers' science.

  8. Novel insights into the interplay between ventral neck muscles in individuals with whiplash-associated disorders.

    PubMed

    Peterson, Gunnel; Nilsson, David; Trygg, Johan; Falla, Deborah; Dedering, Åsa; Wallman, Thorne; Peolsson, Anneli

    2015-10-16

    Chronic whiplash-associated disorder (WAD) is common after whiplash injury, with considerable personal, social, and economic burden. Despite decades of research, factors responsible for continuing pain and disability are largely unknown, and diagnostic tools are lacking. Here, we report a novel model of mechanical ventral neck muscle function derived from non-invasive, real-time ultrasound measurements. We calculated the deformation area and deformation rate in 23 individuals with persistent WAD and compared them to 23 sex- and age-matched controls. Multivariate statistics were used to analyse interactions between ventral neck muscles, revealing different interplay between muscles in individuals with WAD and healthy controls. Although cause and effect cannot be established from these data, for the first time we reveal a novel method capable of detecting different neck muscle interplay in people with WAD. This non-invasive method stands to make a major breakthrough in the assessment and diagnosis of people following whiplash trauma.

  9. Predicting Physical Interactions between Protein Complexes*

    PubMed Central

    Clancy, Trevor; Rødland, Einar Andreas; Nygard, Ståle; Hovig, Eivind

    2013-01-01

    Protein complexes enact most biochemical functions in the cell. Dynamic interactions between protein complexes are frequent in many cellular processes. As they are often of a transient nature, they may be difficult to detect using current genome-wide screens. Here, we describe a method to computationally predict physical interactions between protein complexes, applied to both humans and yeast. We integrated manually curated protein complexes and physical protein interaction networks, and we designed a statistical method to identify pairs of protein complexes where the number of protein interactions between a complex pair is due to an actual physical interaction between the complexes. An evaluation against manually curated physical complex-complex interactions in yeast revealed that 50% of these interactions could be predicted in this manner. A community network analysis of the highest scoring pairs revealed a biologically sensible organization of physical complex-complex interactions in the cell. Such analyses of proteomes may serve as a guide to the discovery of novel functional cellular relationships. PMID:23438732
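    The abstract does not name its test statistic. One common way to sketch the question it poses, whether two complexes share more protein interactions than expected by chance, is a hypergeometric tail test; all counts below are invented for illustration:

```python
from scipy import stats

# Hypothetical numbers: a network with 10,000 possible protein pairs and 400
# observed interactions overall; complexes A (8 proteins) and B (10 proteins)
# offer 8 * 10 = 80 possible cross-pairs, of which 12 are observed edges.
M = 10_000   # all protein pairs under consideration
K = 400      # interacting pairs anywhere in the network
n = 80       # possible pairs between complex A and complex B
k = 12       # observed A-B interactions

# P(X >= k) if the network's interactions were placed at random
p_value = stats.hypergeom.sf(k - 1, M, K, n)
print(f"enrichment p-value = {p_value:.2g}")
```

Under the null, only about n*K/M ≈ 3.2 cross-pair interactions would be expected, so observing 12 yields a small p-value.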

  10. Harnessing the complexity of gene expression data from cancer: from single gene to structural pathway methods

    PubMed Central

    2012-01-01

    High-dimensional gene expression data provide a rich source of information because they capture the expression level of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are suitable for revealing systems-related properties inside a cell, e.g., in order to elucidate molecular mechanisms of complex diseases like breast or prostate cancer. However, this depends strongly not only on the sample size and the correlation structure of a data set, but also on the statistical hypotheses tested. Many different approaches have been developed over the years to analyze gene expression data to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure in pathways. In this paper, we review statistical methods for all three types of approaches, including subtypes, in the context of cancer data, provide links to software implementations and tools, and also address the general problem of multiple hypothesis testing. Further, we provide recommendations for the selection of such analysis methods. Reviewers: This article was reviewed by Arcady Mushegian, Byung-Soo Kim and Joel Bader. PMID:23227854
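    Multiple hypothesis testing, flagged above as a general problem in genome-wide analyses, is often handled with the Benjamini-Hochberg false-discovery-rate adjustment. A minimal self-contained implementation:

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up FDR control)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    # raw BH ratios p_(i) * m / i on the sorted p-values
    ranked = p[order] * m / np.arange(1, m + 1)
    # enforce monotonicity from the largest p-value downwards
    adj = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(adj, 0, 1)
    return out

pvals = [0.01, 0.04, 0.03, 0.02, 0.9]
print(benjamini_hochberg(pvals))
```

Genes with adjusted p-values below the chosen FDR level (e.g. 0.05) are reported as differentially expressed.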

  11. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of the northeast/post-monsoon rainfall that occurs during October, November and December (OND) over the Indian peninsula is a challenging task due to the dynamic and chaotic nature of the climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors such as Sea Surface Temperature (SST) and Sea Level Pressure (SLP) with b) empirical prediction based on a time series analysis of past rainfall data without any other predictors. ML techniques were first applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR monthly mean reanalysis provided by the NOAA ESRL PSD. The study then investigated the applicability of ML methods to the OND rainfall time series for 1948-2014, with forecasts extended to 2018. The predicted values of the aforementioned methods were verified against observed time series data from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Both statistical and empirical methods were thus found useful for long-range climatic projections.

  12. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020
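    The basic statistical step underlying most Cochrane meta-analyses is inverse-variance pooling of study-level effect estimates. A minimal fixed-effect sketch (the study estimates below are invented):

```python
import numpy as np

def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance fixed-effect pooling: the basic meta-analysis step."""
    y = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(std_errors, dtype=float) ** 2  # weight = 1 / SE^2
    pooled = np.sum(w * y) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Two hypothetical trials reporting log-odds-ratio estimates with SE = 1
pooled, se = fixed_effect_meta([1.0, 3.0], [1.0, 1.0])
print(pooled, se)
```

Random-effects models, heterogeneity statistics, and bias assessments, core topics of the Statistical Methods Group, all build on this weighted average.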

  13. MEG Frequency Analysis Depicts the Impaired Neurophysiological Condition of Ischemic Brain

    PubMed Central

    Ikeda, Hidetoshi; Tsuyuguchi, Naohiro; Uda, Takehiro; Okumura, Eiichi; Asakawa, Takashi; Haruta, Yasuhiro; Nishiyama, Hideki; Okada, Toyoji; Kamada, Hajime; Ohata, Kenji; Miki, Yukio

    2016-01-01

    Purpose Quantitative imaging of neuromagnetic fields based on automated region of interest (ROI) setting was analyzed to determine the characteristics of cerebral neural activity in ischemic areas. Methods Magnetoencephalography (MEG) was used to evaluate spontaneous neuromagnetic fields in the ischemic areas of 37 patients with unilateral internal carotid artery (ICA) occlusive disease. Voxel-based time-averaged intensity of slow waves was obtained in two frequency bands (0.3–4 Hz and 4–8 Hz) using standardized low-resolution brain electromagnetic tomography (sLORETA) modified for a quantifiable method (sLORETA-qm). ROIs were automatically applied to the anterior cerebral artery (ACA), anterior middle cerebral artery (MCAa), posterior middle cerebral artery (MCAp), and posterior cerebral artery (PCA) using statistical parametric mapping (SPM). Positron emission tomography with 15O-gas inhalation (15O-PET) was also performed to evaluate cerebral blood flow (CBF) and oxygen extraction fraction (OEF). Statistical analyses were performed using laterality index of MEG and 15O-PET in each ROI with respect to distribution and intensity. Results MEG revealed statistically significant laterality in affected MCA regions, including 4–8 Hz waves in MCAa, and 0.3–4 Hz and 4–8 Hz waves in MCAp (95% confidence interval: 0.020–0.190, 0.030–0.207, and 0.034–0.213), respectively. We found that 0.3–4 Hz waves in MCAp were highly correlated with CBF in MCAa and MCAp (r = 0.74, r = 0.68, respectively), whereas 4–8 Hz waves were moderately correlated with CBF in both the MCAa and MCAp (r = 0.60, r = 0.63, respectively). We also found that 4–8 Hz waves in MCAp were statistically significant for misery perfusion identified on 15O-PET (p<0.05). Conclusions Quantitatively imaged spontaneous neuromagnetic fields using the automated ROI setting enabled clear depiction of cerebral ischemic areas. 
Frequency analysis may reveal unique neural activity that is distributed in the impaired vascular metabolic territory, in which the cerebral infarction has not yet been completed. PMID:27992543

  14. Skeletal maturation in individuals with Down's syndrome: Comparison between PGS curve, cervical vertebrae and bones of the hand and wrist

    PubMed Central

    Carinhena, Glauber; Siqueira, Danilo Furquim; Sannomiya, Eduardo Kazuo

    2014-01-01

    Introduction This study was conducted with the aim of adapting the methods developed by Martins and Sakima for assessing skeletal maturation by cervical vertebrae to the pubertal growth spurt (PGS) curve. It also aimed to test the reliability of and agreement between those methods and the hand-wrist radiograph method, compared two by two and all together. Methods The sample comprised 72 radiographs, 36 lateral head radiographs and 36 hand-wrist radiographs, of 36 subjects with Down's syndrome (DS), 13 female and 23 male, aged between 8 years 6 months and 18 years 7 months, with an average age of 13 years 10 months. Results and Conclusions Results revealed that adapting the methods developed by Martins and Sakima for assessing skeletal maturation by cervical vertebrae to the PGS curve is practical and useful in determining the stage of growth and development of individuals. The stages of maturation evaluated by cervical vertebrae and by ossification centers observed in hand-wrist radiographs were considered reliable, with an excellent level of agreement among the methods of Hassel and Farman; of Baccetti, Franchi and McNamara Jr; and of Martins and Sakima. Additionally, results revealed agreement ranging from reasonable to good for the three methods used to assess skeletal maturation, with statistical significance. PMID:25279522

  15. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect them. Since missing data are often encountered in longitudinal HRQOL data, effective strategies for dealing with them are important to consider. This study aims to compare different imputation methods for the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. 
Our simulations revealed that the relative importance test procedures were least powerful under complete-case analysis and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
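    Of the strategies compared above, mean imputation is the simplest to state: each missing value is replaced by the observed mean of its column. The sketch below is illustrative only; the study worked with SF-36 domain scores, while the numbers here are invented.

```python
import numpy as np

def mean_impute(data):
    """Replace each NaN with the observed mean of its column."""
    data = data.astype(float).copy()
    col_means = np.nanmean(data, axis=0)
    rows, cols = np.where(np.isnan(data))
    data[rows, cols] = col_means[cols]
    return data

# Invented 3-caregiver, 2-domain score matrix with one missing value per domain.
scores = np.array([[70.0, np.nan],
                   [np.nan, 55.0],
                   [80.0, 65.0]])
filled = mean_impute(scores)
print(filled)
```

    Multiple imputation instead draws several plausible values per gap and pools the analyses, which is why it preserves uncertainty that single mean imputation discards.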

  16. Enhancing residents’ neonatal resuscitation competency through unannounced simulation-based training

    PubMed Central

    Surcouf, Jeffrey W.; Chauvin, Sheila W.; Ferry, Jenelle; Yang, Tong; Barkemeyer, Brian

    2013-01-01

    Background Almost half of pediatric third-year residents surveyed in 2000 had never led a resuscitation event. With increasing restrictions on residency work hours and a decline in patient volume in some hospitals, there is potential for even fewer opportunities. Purpose Our primary purpose was to test the hypothesis that an unannounced mock resuscitation in a high-fidelity in-situ simulation training program would improve both residents' self-confidence and observed performance of adopted best practices in neonatal resuscitation. Methods Each pediatric and medicine–pediatric resident in one pediatric residency program responded to an unannounced scenario that required resuscitation of a high-fidelity infant simulator. Structured debriefing followed in the same setting, and a second cycle of scenario response and debriefing occurred before ending the 1-hour training experience. Measures included pre- and post-program confidence questionnaires and trained-observer assessments of live and videotaped performances. Results Statistically significant pre–post gains in self-confidence were observed for 8 of the 14 NRP critical behaviors (p=0.00–0.03), reflecting knowledge, technical, and non-technical (teamwork) skills. The pre–post gain in overall confidence score was statistically significant (p=0.00). With a maximum possible assessment score of 41, the average pre–post gain was 8.28 and statistically significant (p<0.001). Results of the video-based assessments revealed statistically significant performance gains (p<0.0001). Correlations between live and video-based assessments were strong for both pre- and post-training scenario performances (pre: r=0.64, p<0.0001; post: r=0.75, p<0.0001). Conclusions Results revealed high receptivity to in-situ, simulation-based training and significant positive gains in confidence and observed competency-related abilities. Results support the potential for other applications in residency and continuing education. PMID:23522399

  17. High-resolution modeling of thermal thresholds and environmental influences on coral bleaching for local and regional reef management.

    PubMed

    Kumagai, Naoki H; Yamano, Hiroya

    2018-01-01

    Coral reefs are one of the world's most threatened ecosystems, with global and local stressors contributing to their decline. Excessive sea-surface temperatures (SSTs) can cause coral bleaching, resulting in coral death and decreases in coral cover. An SST threshold of 1 °C over the climatological maximum is widely used to predict coral bleaching. In this study, we refined thermal indices predicting coral bleaching at high spatial resolution (1 km) by statistically optimizing thermal thresholds, as well as considering other environmental influences on bleaching such as ultraviolet (UV) radiation, water turbidity, and cooling effects. We used a coral bleaching dataset derived from the web-based monitoring system Sango Map Project, at scales appropriate for the local and regional conservation of Japanese coral reefs, recording coral bleaching events in Japan in the years 2004-2016. We revealed the influence of multiple factors on the ability to predict coral bleaching, including the selection of thermal indices, statistical optimization of thermal thresholds, quantification of multiple environmental influences, and use of multiple modeling methods (generalized linear models and random forests). After optimization, differences in predictive ability among thermal indices were negligible. Thermal index, UV radiation, water turbidity, and cooling effects were important predictors of the occurrence of coral bleaching. Predictions based on the best model revealed that coral reefs in Japan have experienced recent and widespread bleaching. A practical method to reduce bleaching frequency by screening UV radiation is also demonstrated in this paper.
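    The generalized-linear-model side of the analysis above amounts to a logistic regression of bleaching occurrence on predictors such as the SST anomaly. The sketch below is a minimal, hypothetical version on synthetic data (the actual study used many more predictors and a 1 km observational dataset); it recovers the anomaly at which the predicted bleaching probability crosses 0.5, i.e. an estimated thermal threshold.

```python
import numpy as np

def fit_logistic(x, y, iters=3000, lr=0.5):
    """Fit a one-predictor logistic regression by gradient ascent; returns (b0, b1)."""
    Xb = np.column_stack([np.ones(x.size), x])
    w = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / y.size
    return w

# Synthetic data: bleaching becomes likely once the SST anomaly exceeds ~1 degree C.
rng = np.random.default_rng(1)
sst_anom = rng.uniform(-1, 3, 200)
p_true = 1 / (1 + np.exp(-3 * (sst_anom - 1.0)))
bleached = (rng.uniform(size=200) < p_true).astype(float)

b0, b1 = fit_logistic(sst_anom, bleached)
threshold = -b0 / b1  # anomaly where predicted bleaching probability crosses 0.5
print(round(threshold, 2))
```

    Statistically optimizing the threshold, rather than fixing it at 1 °C, is exactly the refinement the abstract describes.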

  19. Report on von Willebrand Disease in Malaysia

    PubMed Central

    Periayah, Mercy Halleluyah; Halim, Ahmad Sukari; Saad, Arman Zaharil Mat; Yaacob, Nik Soriani; Karim, Faraizah Abdul

    2016-01-01

    BACKGROUND: Von Willebrand disease (vWD) is an inherited hemostatic disorder that affects the hemostasis pathway. The worldwide prevalence of vWD is estimated at 1% of the general population, but at only 0.002% in Malaysia. AIM: This paper discloses the statistical counts of vWD cases reported from 2011 to 2013. MATERIAL AND METHODS: This article is based on the sociodemographic data, diagnoses and laboratory findings of vWD in Malaysia. A total of 92 patients were reported to have vWD in Malaysia from 2011 to 2013. RESULTS: Sociodemographic analysis revealed that 60% were female, 63% were of Malay ethnicity, 41.3% were in the 19-44 year age group and 15.2% were from Sabah, the East region having the highest registered number of vWD cases. In Malaysia, patients are predominantly affected by vWD type 1 (77.2%). Factor VIII, von Willebrand factor antigen (vWF:Ag) and vWF collagen-binding (vWF:CB) were the strongest determinants in the laboratory profiles of vWD. CONCLUSION: This report provides a contribution from Malaysia by revealing the statistical counts on vWD for 2011-2013. PMID:27275342

  20. Drug-nutrient interaction counseling programs in upper midwestern hospitals: 1986 survey results.

    PubMed

    Jones, C M; Reddick, J E

    1989-02-01

    A mail survey was conducted to determine the characteristics of drug-nutrient counseling programs provided to hospitalized patients. The survey population included general medical-surgical hospitals with a capacity of 175 or more beds in five upper Midwestern states: Illinois, Iowa, Michigan, Minnesota, and Wisconsin. The return rate for the 289 surveys was 75%. On average, 64% of responding hospitals provided patient counseling on drug-nutrient interactions. All statistical analysis was by chi-square. Calculations revealed that fewer than 50% of hospitals require a physician's order to provide drug-nutrient interaction counseling, while more than 50% involve a registered dietitian in providing such counseling. The monoamine oxidase inhibitor drugs were cited most frequently as the group of drugs for which counseling was needed. Other drug groups for which patient counseling is needed include diuretics, anticoagulants, tetracyclines, oral hypoglycemics, insulin, antihypertensives and/or cardiac drugs, antibiotics, and corticosteroids. Having the dietitian or other dietary personnel scan the patient chart was cited most often as the preferred method for identifying patients taking these drugs. A final statistical calculation revealed no difference between teaching and nonteaching hospitals in the number providing a drug-nutrient counseling program.

  1. A signal detection method for temporal variation of adverse effect with vaccine adverse event reporting system data.

    PubMed

    Cai, Yi; Du, Jingcheng; Huang, Jing; Ellenberg, Susan S; Hennessy, Sean; Tao, Cui; Chen, Yong

    2017-07-05

    To identify safety signals by manual review of individual report in large surveillance databases is time consuming; such an approach is very unlikely to reveal complex relationships between medications and adverse events. Since the late 1990s, efforts have been made to develop data mining tools to systematically and automatically search for safety signals in surveillance databases. Influenza vaccines present special challenges to safety surveillance because the vaccine changes every year in response to the influenza strains predicted to be prevalent that year. Therefore, it may be expected that reporting rates of adverse events following flu vaccines (number of reports for a specific vaccine-event combination/number of reports for all vaccine-event combinations) may vary substantially across reporting years. Current surveillance methods seldom consider these variations in signal detection, and reports from different years are typically collapsed together to conduct safety analyses. However, merging reports from different years ignores the potential heterogeneity of reporting rates across years and may miss important safety signals. Reports of adverse events between years 1990 to 2013 were extracted from the Vaccine Adverse Event Reporting System (VAERS) database and formatted into a three-dimensional data array with types of vaccine, groups of adverse events and reporting time as the three dimensions. We propose a random effects model to test the heterogeneity of reporting rates for a given vaccine-event combination across reporting years. The proposed method provides a rigorous statistical procedure to detect differences of reporting rates among years. We also introduce a new visualization tool to summarize the result of the proposed method when applied to multiple vaccine-adverse event combinations. We applied the proposed method to detect safety signals of FLU3, an influenza vaccine containing three flu strains, in the VAERS database. 
We showed that it had high statistical power to detect variation in reporting rates across years. The vaccine-event combinations identified as having significantly different reporting rates over the years suggested potential safety issues due to changes in the vaccines, which require further investigation. We developed a statistical model to detect safety signals arising from heterogeneity in the reporting rates of a given vaccine-event combination across reporting years. This method detects variation in reporting rates over years with high power. The temporal trend of the reporting rate across years may reveal the impact of vaccine updates on the occurrence of adverse events and provide evidence for further investigations.
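    The heterogeneity being tested above can be illustrated with a simpler classical statistic: a chi-square test of whether one vaccine-event combination's reporting rate is constant across years. This is a hedged stand-in for the paper's random effects model, with invented counts; a year with an unusually high rate inflates the statistic.

```python
import numpy as np

def chi2_homogeneity(events, totals):
    """Chi-square statistic for equal event proportions across reporting years."""
    p_pooled = events.sum() / totals.sum()
    exp_events = totals * p_pooled
    exp_non = totals - exp_events
    obs_non = totals - events
    return np.sum((events - exp_events) ** 2 / exp_events
                  + (obs_non - exp_non) ** 2 / exp_non)

# Invented counts: reports of one vaccine-event combination vs. all reports, per year.
events = np.array([12, 15, 40, 11])
totals = np.array([1000, 1100, 1050, 990])
stat = chi2_homogeneity(events, totals)
print(stat > 7.81)  # 7.81 is the 0.95 chi-square quantile with 3 degrees of freedom
```

    A random effects model goes further by estimating the between-year variance itself rather than only rejecting homogeneity.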

  2. Pain Elimination during Injection with Newer Electronic Devices: A Comparative Evaluation in Children

    PubMed Central

    Saha, Sonali; Jaiswal, JN; Samadi, Firoza

    2014-01-01

    ABSTRACT Aim: The present study was undertaken to clinically evaluate and compare the effectiveness of the transcutaneous electrical nerve stimulator (TENS) and the comfort control syringe (CCS) in various pediatric dental procedures as alternatives to the conventional method of local anesthesia (LA) administration. Materials and methods: Ninety healthy children aged 6 to 10 years, each having at least one deciduous molar indicated for extraction in either the maxillary right or left quadrant, were randomly divided into three equal groups of 30 subjects each. Group I: LA administration using a conventional syringe; group II: LA administration using TENS along with the conventional syringe; group III: LA administration using CCS. After LA by the three techniques, pain, anxiety and heart rate were measured. Statistical analysis: The observations thus obtained were subjected to statistical analysis using analysis of variance (ANOVA), Student's t-test and the paired t-test. Results: The mean pain score was maximum in group I followed by group II, while group III, where LA was administered using CCS, revealed the minimum pain. The mean anxiety score was maximum in group I followed by group II, while group III revealed the minimum score. The mean heart rate was maximum in group I, followed in descending order by groups II and III. Conclusion: The study supports the belief that CCS could be a viable alternative to the other two methods of LA delivery in children. How to cite this article: Bansal N, Saha S, Jaiswal JN, Samadi F. Pain Elimination during Injection with Newer Electronic Devices: A Comparative Evaluation in Children. Int J Clin Pediatr Dent 2014;7(2):71-76. PMID:25356003
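    The one-way ANOVA used in the statistical analysis above can be sketched from first principles: compare between-group to within-group variability via the F statistic. The group pain scores below are invented for illustration; the study's actual data are not reproduced here.

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given sample groups."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = all_vals.size - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented pain scores for the three LA-delivery groups (30 children each).
rng = np.random.default_rng(5)
conventional = rng.normal(6.0, 1.0, 30)  # group I: conventional syringe
tens = rng.normal(4.5, 1.0, 30)          # group II: TENS + conventional syringe
ccs = rng.normal(3.0, 1.0, 30)           # group III: comfort control syringe

F = one_way_anova_f(conventional, tens, ccs)
print(F > 3.1)  # ~critical value of F(2, 87) at alpha = 0.05
```

    A significant F only says the group means differ somewhere; the pairwise t-tests mentioned in the record localize which groups differ.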

  3. Estimation of Salivary Glucose and Glycogen Content in Exfoliated Buccal Mucosal Cells of Patients with Type II Diabetes Mellitus

    PubMed Central

    Gopinathan, Deepa Moothedathu; Sukumaran, Sunil

    2015-01-01

    Background Diabetes mellitus is a common metabolic disorder with an increasing incidence worldwide. Constant monitoring of blood glucose in diabetic patients is required, which involves painful invasive techniques. Saliva, which can be collected noninvasively and by individuals with limited training, is gaining acceptance as a diagnostic tool for various systemic diseases. Aim The aim of the present study was to analyse the possibility of using salivary glucose and the glycogen content of exfoliated buccal mucosal cells as diagnostic markers in Type II diabetes mellitus patients, as adjuvant diagnostic tools to the gold standards. Materials and Methods The sample consisted of 30 subjects in the study group and 30 in the control group. Saliva was collected by the passive drool method. Intravenous blood samples were collected for glucose estimation. Exfoliated buccal mucosal cells were collected from apparently normal buccal mucosa, smeared on a dry glass slide and stained with PAS. Blood and salivary glucose were estimated by the glucose oxidase endpoint method. For glycogen estimation, the number of PAS-positive cells among fifty unfolded cells was counted. Results The results of the present study revealed a significant increase in the salivary glucose level and the number of PAS-positive buccal mucosal cells in the diabetics compared with the controls. The correlation between fasting serum glucose and fasting salivary glucose, and that between fasting serum glucose and PAS-positive cells, were statistically significant, but the correlation between staining intensity and fasting serum glucose was not. Conclusion The present study thus reveals that salivary glucose and PAS-positive cells are increased in diabetics and can be considered adjuvant diagnostic tools for diabetes mellitus. PMID:26155572

  4. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.

    PubMed

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-01-01

    This paper discusses methods for the assessment of ultrasound image quality, based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phased study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures the validity of the assessment, as it separates the influences of developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences of the developer on the result, to properly reveal the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.

  5. Determination of antioxidant power of red and white wines by a new electrochemical method and its correlation with polyphenolic content.

    PubMed

    Alonso, Angeles M; Domínguez, Cristina; Guillén, Dominico A; Barroso, Carmelo G

    2002-05-22

    A new method for measuring the antioxidant power of wine has been developed, based on the accelerated electrochemical oxidation of 2,2'-azino-bis(3-ethylbenzthiazoline-6-sulfonic acid) (ABTS). The calibration (R = 0.9922) and repeatability study (RSD = 7%) provided good statistical parameters. The method is easy and quick to apply and gives reliable results, requiring only the monitoring of time and absorbance. It has been applied to various red and white wines of different origins, and the results have been compared with those obtained by the total antioxidant status (TAS) method. Both methods reveal that the most antioxidant wines are those with the higher polyphenolic content. An HPLC study of the polyphenolic content of the same samples confirms a positive correlation between the resveratrol content of a wine and its antioxidant power.

  6. Characterizing microstructural features of biomedical samples by statistical analysis of Mueller matrix images

    NASA Astrophysics Data System (ADS)

    He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui

    2017-03-01

    As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies, such as the detection of cancerous tissues. Previous works have found that the structural information encoded in 2D Mueller matrix images can be represented by transformed parameters with more explicit relationships to certain microstructural features. In this paper, we present a statistical analysis method that transforms the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results for porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation and absorption abilities. The statistical analysis of 2D images of Mueller matrix elements may thus provide quantitative or semi-quantitative criteria for biomedical diagnosis.
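    The FDH-plus-central-moments transform described above can be sketched in a few lines: flatten a Mueller matrix element image, histogram its pixel values, and compute the central moments of the pixel distribution. The image below is synthetic Gaussian noise standing in for a real polarimetric measurement; the moment orders shown are an assumption, not the paper's exact parameter set.

```python
import numpy as np

def central_moments(image, orders=(2, 3, 4)):
    """Central moments of the pixel-value distribution of a 2D image."""
    vals = image.ravel()
    mu = vals.mean()
    return {k: float(np.mean((vals - mu) ** k)) for k in orders}

# Synthetic 64x64 "Mueller matrix element" image (Gaussian noise stand-in).
rng = np.random.default_rng(2)
element = rng.normal(0.5, 0.1, size=(64, 64))

fdh, edges = np.histogram(element, bins=32)  # the frequency distribution histogram
moments = central_moments(element)
print(sorted(moments))
```

    The second moment (variance) tracks contrast, while the third and fourth (skewness- and kurtosis-like quantities) summarize the shape of the FDH, which is what links the histogram back to microstructural features.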

  7. The community pharmacist's role in reducing CVD risk factors in Lebanon: a cross-sectional longitudinal study.

    PubMed

    Fahs, Iqbal; Hallit, Souheil; Rahal, Mohamad; Malaeb, Diana

    2018-06-13


    Objective:
    To assess the role of the pharmacist in modifying CVD risk factors among Lebanese adults in urban and rural areas.
    Materials (Subjects) and Methods:
    In a prospective survey, 865 out of 1000 participants aged ≥ 45, previously interviewed, agreed to be followed up at 1- and 2-year time points. Parameters including blood pressure, lipid profile, blood glucose, average number of risk factors, and atherosclerotic cardiovascular disease (ASCVD) risk were assessed at the beginning of the study and again after 1 and 2 years.
    Results:
    After patient education, the mean body mass index (BMI) and systolic blood pressure (SBP) decreased to a statistically significant degree during both follow-ups. The lipid profile likewise improved significantly during both follow-ups. ASCVD risk fell to around 9% at the first follow-up, with a further statistically significant improvement to around 8% during the second follow-up. The other monitored parameters also showed statistically significant improvements.
    Conclusion:
    This study showed that a plan that includes pharmacists, who regularly monitor and follow up patients, could improve CVD prevention through reduction of risk factors. ©2018 The Author(s). Published by S. Karger AG, Basel.

  8. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. 
Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
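    The fitted spectrum quoted above can be evaluated directly to see the corner-wavenumber behaviour: at m = 1/a the spectral density has fallen to one quarter of its flat (m = 0) value, and beyond the corner it rolls off roughly as m⁻⁴. The code below simply evaluates the published formula with the reported parameter values.

```python
import numpy as np

def psd(m, eps=0.05, a=3.1):
    """P(m) = 8*pi*eps^2*a^3 / (1 + a^2 m^2)^2, with m in 1/km and a in km."""
    return 8 * np.pi * eps**2 * a**3 / (1 + (a * m) ** 2) ** 2

corner = 1 / 3.1                 # corner wavenumber m = 1/a
ratio = psd(corner) / psd(0.0)   # spectral density relative to the flat part
print(ratio)                     # approximately 1 / (1 + 1)^2 = 0.25
```

    Detecting this quarter-power roll-off is what it means, operationally, to resolve the corner that higher-frequency studies missed.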

  9. Esthetic evaluation of maxillary single-tooth implants in the esthetic zone

    PubMed Central

    Cho, Hae-Lyung; Lee, Jae-Kwan; Um, Heung-Sik

    2010-01-01

    Purpose The aim of this study is to assess the influence exerted by the observer's dental specialization and compare patients' opinion with observers' opinion of the esthetics of maxillary single-tooth implants in the esthetic zone. Methods Forty-one adult patients, who were treated with a single implant in the esthetic zone, were enrolled in this study. Eight observers (2 periodontists, 2 prosthodontists, 2 orthodontists and 2 senior dental students) applied the pink esthetic score (PES)/white esthetic score (WES) to 41 implant-supported single restorations twice with an interval of 4 weeks. We used a visual analog scale (VAS) to assess the patient's satisfaction with the treatment outcome from an esthetic point of view. Results In the PES/WES, very good and moderate intraobserver agreements were noted between the first and second rating. The mean total PES/WES was 11.19 ± 3.59. The mean PES was 5.17 ± 2.29 and mean WES was 6.02 ± 1.96. In the total PES/WES, the difference between the groups was not significant. However, in the WES, the difference between the groups was significant and prosthodontists were found to have assigned poorer ratings than the other groups. Periodontists gave higher ratings than prosthodontists and senior dental students. Orthodontists were clearly more critical than the other observers. The statistical analysis revealed statistically significant correlation between patients' esthetic perception and dentists' perception of the anterior tooth. However, the correlation between the total PES/WES and the VAS score for the first premolar was not statistically significant. Conclusions The PES/WES is an objective tool in rating the esthetics of implant supported single crowns and adjacent soft tissues. Orthodontists were the most critical observers, while periodontists were more generous than other observers. 
PMID:20827328

  10. Estimation of social value of statistical life using willingness-to-pay method in Nanjing, China.

    PubMed

    Yang, Zhao; Liu, Pan; Xu, Xin

    2016-10-01

    Rational decision making regarding safety-related investment programs depends greatly on the economic valuation of traffic crashes. The primary objective of this study was to estimate the social value of statistical life in the city of Nanjing, China. A stated preference survey was conducted to investigate travelers' willingness to pay for traffic risk reduction. Face-to-face interviews were conducted at stations, shopping centers, schools, and parks in different districts in the urban area of Nanjing. The respondents were categorized into two groups: motorists and non-motorists. Both a binary logit model and a mixed logit model were developed for the two groups. The results revealed that the mixed logit model is superior to the fixed-coefficient binary logit model. The factors that significantly affect people's willingness to pay for risk reduction include income, education, gender, age, driving experience (for motorists), occupation, whether the charged fees were used to improve private vehicle equipment (for motorists), reduction in fatality rate, and change in travel cost. The Monte Carlo simulation method was used to generate the distribution of the value of statistical life (VSL). Based on the mixed logit model, the VSL had a mean value of 3,729,493 RMB ($586,610) with a standard deviation of 2,181,592 RMB ($343,142) for motorists, and a mean of 3,281,283 RMB ($505,318) with a standard deviation of 2,376,975 RMB ($366,054) for non-motorists. Using the tax system to illustrate the contribution of different income groups to social funds, the social value of statistical life was estimated; the average social value of statistical life was found to be 7,184,406 RMB ($1,130,032). Copyright © 2016 Elsevier Ltd. All rights reserved.
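    The Monte Carlo step described above can be sketched as drawing the random risk coefficient and forming the coefficient ratio that defines the VSL. All coefficient values below are hypothetical placeholders, not the study's estimates:

    ```python
    import random
    import statistics

    def simulate_vsl(mean_risk, sd_risk, beta_cost, n=100_000, seed=0):
        """Monte Carlo draws of the value of statistical life (VSL) from a
        mixed logit model: the risk coefficient varies randomly across the
        population, and each VSL draw is the risk/cost coefficient ratio."""
        rng = random.Random(seed)
        draws = [rng.gauss(mean_risk, sd_risk) / beta_cost for _ in range(n)]
        return statistics.mean(draws), statistics.stdev(draws)

    # Illustrative (hypothetical) coefficients, not the study's estimates:
    # risk coefficient ~ N(-3.7, 2.2), cost coefficient fixed at -1e-6 per RMB,
    # chosen so the resulting VSL is on the order of a few million RMB.
    mean_vsl, sd_vsl = simulate_vsl(mean_risk=-3.7, sd_risk=2.2, beta_cost=-1.0e-6)
    ```

    With enough draws the simulated mean and spread converge to the values implied by the assumed coefficient distribution; the study's reported means and standard deviations come from its fitted mixed logit coefficients instead.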

  11. Revision by means of computer-mediated peer discussions

    NASA Astrophysics Data System (ADS)

    Soong, Benson; Mercer, Neil; Er, Siew Shin

    2010-05-01

    In this article, we provide a discussion on our revision method (termed prescriptive tutoring) aimed at revealing students' misconceptions and misunderstandings by getting them to solve physics problems with an anonymous partner via the computer. It is currently being implemented and evaluated in a public secondary school in Singapore, and statistical analysis of our initial small-scale study shows that students in the experimental group significantly outperformed students in both the control and alternative intervention groups. In addition, students in the experimental group perceived that they had gained improved understanding of the physics concepts covered during the intervention, and reported that they would like to continue revising physics concepts using the intervention methods.

  12. Homeostatic signature of anabolic steroids in cattle using 1H-13C HMBC NMR metabonomics.

    PubMed

    Dumas, Marc-Emmanuel; Canlet, Cécile; Vercauteren, Joseph; André, François; Paris, Alain

    2005-01-01

    We used metabonomics to discriminate the urinary signatures of different anabolic steroid treatments in cattle with different physiological backgrounds (age, sex, and breed). (1)H-(13)C heteronuclear multiple bond correlation NMR spectroscopy and multivariate statistical methods reveal that metabolites such as trimethylamine-N-oxide, dimethylamine, hippurate, creatine, creatinine, and citrate characterize the biological fingerprint of anabolic treatment. These urinary biomarkers suggest an overall homeostatic adaptation in nitrogen and energy metabolism. From the results obtained in this study, metabonomics can now be considered a complementary method for improving doping control strategies to detect fraudulent anabolic treatment in cattle, since the oriented global metabolic response provides helpful discrimination.

  13. Teaching beyond the walls: A mixed method study of prospective elementary teacher's belief systems about science instruction

    NASA Astrophysics Data System (ADS)

    Asim, Sumreen

    This mixed method study investigated K-6 teacher candidates' beliefs about informal science instruction prior to and after their experiences in a 15-week science methods course, in comparison to a non-intervention group. The study is predicated on the literature that supports the extent to which teachers' beliefs influence their instructional practices. The intervention integrated the six strands of learning science in informal science education (NRC, 2009) and exposed candidates to out-of-school-time environments (NRC, 2010). Participants included 17 candidates in the intervention and 75 in the comparison group. All were undergraduate K-6 teacher candidates at one university enrolled in different sections of a required science methods course. All the participants completed the Beliefs about Science Teaching (BAT) survey. Reflective journals, drawings, interviews, and microteaching protocols were collected from participants in the intervention. There was no statistically significant difference in pre or post BAT scores of the two groups; however, there was a statistically significant interaction effect for the intervention group over time. Analysis of the qualitative data revealed that the intervention candidates displayed awareness of each of the six strands of learning science in informal environments and commitment to out-of-school-time learning of science. This study supports current reform efforts favoring integration of informal science instructional strategies in science methods courses of elementary teacher education programs.

  14. Screening molecular associations with lipid membranes using natural abundance 13C cross-polarization magic-angle spinning NMR and principal component analysis.

    PubMed

    Middleton, David A; Hughes, Eleri; Madine, Jillian

    2004-08-11

    We describe an NMR approach for detecting the interactions between phospholipid membranes and proteins, peptides, or small molecules. First, 1H-13C dipolar coupling profiles are obtained from hydrated lipid samples at natural isotope abundance using cross-polarization magic-angle spinning NMR methods. Principal component analysis of dipolar coupling profiles for synthetic lipid membranes in the presence of a range of biologically active additives reveals clusters that relate to different modes of interaction of the additives with the lipid bilayer. Finally, by representing profiles from multiple samples in the form of contour plots, it is possible to reveal statistically significant changes in dipolar couplings, which reflect perturbations in the lipid molecules at the membrane surface or within the hydrophobic interior.
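    The PCA step can be illustrated generically. The sketch below runs power iteration on the covariance matrix of synthetic two-feature "profiles" forming two clusters (all data are invented; real dipolar coupling profiles are high-dimensional):

    ```python
    import math
    import random

    def first_principal_component(rows, iters=200, seed=1):
        """Power iteration on the sample covariance matrix to find the
        leading principal component of mean-centered data."""
        n, d = len(rows), len(rows[0])
        means = [sum(r[j] for r in rows) / n for j in range(d)]
        centered = [[r[j] - means[j] for j in range(d)] for r in rows]
        # sample covariance matrix (d x d)
        cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
                for b in range(d)] for a in range(d)]
        rng = random.Random(seed)
        v = [rng.random() for _ in range(d)]
        for _ in range(iters):
            w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
            norm = math.sqrt(sum(x * x for x in w))
            v = [x / norm for x in w]
        return v

    # Synthetic "profiles": two clusters separated along the first feature,
    # loosely mimicking membrane samples with and without an additive.
    data = [[5.0 + 0.1 * i, 1.0] for i in range(5)] + \
           [[1.0 + 0.1 * i, 1.0] for i in range(5)]
    pc1 = first_principal_component(data)
    # All variance lies along the first feature, so pc1 points along it.
    ```

    Projecting each profile onto the leading components is what produces the cluster picture described in the abstract; library implementations (e.g. an SVD-based PCA) would be used in practice.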

  15. Beyond δ: Tailoring marked statistics to reveal modified gravity

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models which attempt to explain the accelerated expansion of the universe through large-scale modifications to General Relativity (GR), must satisfy the stringent experimental constraints of GR in the solar system. Viable candidates invoke a “screening” mechanism, that dynamically suppresses deviations in high density environments, making their overall detection challenging even for ambitious future large-scale structure surveys. We present methods to efficiently simulate the non-linear properties of such theories, and consider how a series of statistics that reweight the density field to accentuate deviations from GR can be applied to enhance the overall signal-to-noise ratio in differentiating the models from GR. Our results demonstrate that the cosmic density field can yield additional, invaluable cosmological information, beyond the simple density power spectrum, that will enable surveys to more confidently discriminate between modified gravity models and ΛCDM.
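    A sketch of the reweighting idea: in the marked-statistics literature, a density-dependent mark such as m(δ) = [(1 + δ*)/(1 + δ* + δ)]^p up-weights low-density regions, where screening is weakest. The functional form and parameter values below are illustrative assumptions, not necessarily the statistics used in this paper:

    ```python
    def mark(delta, delta_star=10.0, p=2.0):
        """Density-dependent mark that up-weights low-density regions of
        the overdensity field delta (functional form from the marked-
        statistics literature; delta_star and p are illustrative)."""
        return ((1.0 + delta_star) / (1.0 + delta_star + delta)) ** p

    # Toy overdensity values: void-like cells (delta ~ -0.8) receive larger
    # marks than cluster-like cells (delta ~ 50), accentuating the
    # unscreened regions where modified-gravity deviations survive.
    field = [-0.8, -0.5, 0.0, 2.0, 50.0]
    marks = [mark(d) for d in field]
    ```

    The marked power spectrum is then the power spectrum of the field weighted cell-by-cell with these marks, which is what boosts the signal-to-noise ratio for distinguishing screened models from ΛCDM.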

  16. Application of one-way ANOVA in completely randomized experiments

    NASA Astrophysics Data System (ADS)

    Wahid, Zaharah; Izwan Latiff, Ahmad; Ahmad, Kartini

    2017-12-01

    This paper describes an application of the one-way ANOVA statistical technique in completely randomized experiments with three replicates. The technique was applied to a single factor with four levels and multiple observations at each level. The aim of this study is to investigate the relationship between the chemical oxygen demand index and on-site location. Two different approaches are employed for the analyses: the critical-value approach and the p-value approach. The paper also presents key assumptions of the technique that must be satisfied by the data in order to obtain valid results. Pairwise comparisons by the Tukey method are also considered and discussed to determine where the significant differences among the means lie after the ANOVA has been performed. The results revealed a statistically significant relationship between the chemical oxygen demand index and on-site location.
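    The F statistic at the heart of the one-way ANOVA described above can be computed directly. A minimal sketch, using invented COD-style readings for four sites with three replicates each (the numbers are illustrative, not the paper's data):

    ```python
    def one_way_anova(groups):
        """F statistic and degrees of freedom for a one-way ANOVA:
        between-group mean square over within-group mean square."""
        k = len(groups)
        n = sum(len(g) for g in groups)
        grand = sum(sum(g) for g in groups) / n
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                        for g in groups)
        df1, df2 = k - 1, n - k
        return (ss_between / df1) / (ss_within / df2), df1, df2

    # Hypothetical COD index readings at four sites, three replicates each:
    cod = [[12.1, 12.4, 12.0], [14.8, 15.1, 14.9],
           [12.2, 12.3, 12.1], [18.0, 17.7, 18.2]]
    F, df1, df2 = one_way_anova(cod)
    # F is compared against the critical value F(0.05; 3, 8) ≈ 4.07
    ```

    In the critical-value approach, F above the critical value rejects equality of the site means; the p-value approach computes the tail probability of F(df1, df2) instead.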

  17. Bodily maps of emotions.

    PubMed

    Nummenmaa, Lauri; Glerean, Enrico; Hari, Riitta; Hietanen, Jari K

    2014-01-14

    Emotions are often felt in the body, and somatosensory feedback has been proposed to trigger conscious emotional experiences. Here we reveal maps of bodily sensations associated with different emotions using a unique topographical self-report method. In five experiments, participants (n = 701) were shown two silhouettes of bodies alongside emotional words, stories, movies, or facial expressions. They were asked to color the bodily regions whose activity they felt increasing or decreasing while viewing each stimulus. Different emotions were consistently associated with statistically separable bodily sensation maps across experiments. These maps were concordant across West European and East Asian samples. Statistical classifiers distinguished emotion-specific activation maps accurately, confirming independence of topographies across emotions. We propose that emotions are represented in the somatosensory system as culturally universal categorical somatotopic maps. Perception of these emotion-triggered bodily changes may play a key role in generating consciously felt emotions.

  18. Skeletal maturation in individuals with Down's syndrome: comparison between PGS curve, cervical vertebrae and bones of the hand and wrist.

    PubMed

    Carinhena, Glauber; Siqueira, Danilo Furquim; Sannomiya, Eduardo Kazuo

    2014-01-01

    This study was conducted with the aim of adapting the methods developed by Martins and Sakima to assess skeletal maturation by cervical vertebrae on the pubertal growth spurt (PGS) curve. It also aimed to test the reliability of and agreement between those methods and the hand-wrist radiograph method, compared two by two and all together.  The sample comprised 72 radiographs, with 36 lateral radiographs of the head and 36 hand-wrist radiographs of 36 subjects with Down's syndrome (DS), 13 female and 23 male, aged between 8 years and 6 months and 18 years and 7 months, with an average age of 13 years and 10 months.  Results revealed that adapting the methods developed by Martins and Sakima to assess skeletal maturation by cervical vertebrae on the PGS curve is practical and useful in determining the stage of growth and development of individuals. The stages of maturation evaluated by cervical vertebrae and by ossification centers observed in radiographs of the hand and wrist were considered reliable, with an excellent level of agreement among the methods of Hassel and Farman; Baccetti, Franchi and McNamara Jr; and Martins and Sakima. Additionally, results revealed agreement ranging from reasonable to good for the three methods used to assess skeletal maturation, with statistical significance.

  19. 28 CFR 22.22 - Revelation of identifiable data.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... STATISTICAL INFORMATION § 22.22 Revelation of identifiable data. (a) Except as noted in paragraph (b) of this section, research and statistical information relating to a private person may be revealed in identifiable... Act. (3) Persons or organizations for research or statistical purposes. Information may only be...

  20. 28 CFR 22.23 - Privacy certification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... STATISTICAL INFORMATION § 22.23 Privacy certification. (a) Each applicant for BJA, OJJDP, BJS, NIJ, or OJP... approval of a grant application or contract proposal which has a research or statistical project component... revealed for research or statistical purposes and that compliance with requests for information is not...

  1. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, which were then recorded accordingly. The frequency of each statistical method and research design was estimated and compared across years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of several statistical methods over the study period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.
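    A change in method usage across years, as reported above, is typically tested with a chi-square test of independence on a year-by-usage contingency table. A sketch with hypothetical counts (the per-year usage split below is invented; only the yearly article totals echo the abstract):

    ```python
    def chi_square_independence(table):
        """Pearson chi-square statistic and degrees of freedom for an
        r x c contingency table of observed counts."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        total = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, obs in enumerate(row):
                expected = row_totals[i] * col_totals[j] / total
                stat += (obs - expected) ** 2 / expected
        df = (len(table) - 1) * (len(table[0]) - 1)
        return stat, df

    # Hypothetical counts of articles using vs. not using a t-test per year
    # (columns: 2005, 2010, 2015; column totals match the abstract's 74/179/176):
    used = [20, 70, 90]
    not_used = [54, 109, 86]
    stat, df = chi_square_independence([used, not_used])
    # compare stat against the chi-square critical value 5.99 (df = 2, alpha = 0.05)
    ```

    A statistic above the critical value indicates that usage of the method is not independent of publication year, i.e. a significant change over time.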

  2. Temperature rise, sea level rise and increased radiative forcing - an application of cointegration methods

    NASA Astrophysics Data System (ADS)

    Schmith, Torben; Thejll, Peter; Johansen, Søren

    2016-04-01

    We analyse the statistical relationship between changes in global temperature, global steric sea level and radiative forcing in order to reveal causal relationships. There are, however, potential pitfalls due to the trending nature of the time series. We therefore apply a statistical method called cointegration analysis, originating from the field of econometrics, which correctly handles the analysis of series with trends and other long-range dependencies. We find a relationship between steric sea level and temperature, with temperature causally depending on the steric sea level; this can be understood as a consequence of the large heat capacity of the ocean. This result is obtained both when analyzing observed data and data from a CMIP5 historical model run. We further find that, in the data from the historical run, the steric sea level is in turn driven by the external forcing. Finally, we demonstrate that combining these two results can lead to a novel estimate of radiative forcing back in time based on observations.
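    Cointegration between two trending series is commonly checked with the two-step Engle-Granger procedure: regress one series on the other, then test the residuals for stationarity with a Dickey-Fuller regression. This generic sketch illustrates the idea on synthetic data and is not the authors' exact procedure:

    ```python
    import random

    def ols_slope_intercept(x, y):
        """Ordinary least-squares fit y = intercept + slope * x."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((a - mx) ** 2 for a in x)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        slope = sxy / sxx
        return slope, my - slope * mx

    def engle_granger_stat(x, y):
        """Two-step Engle-Granger check: regress y on x, then run a
        Dickey-Fuller regression (no constant) on the residuals.
        A strongly negative statistic suggests stationary residuals,
        i.e. that x and y are cointegrated."""
        slope, intercept = ols_slope_intercept(x, y)
        e = [b - (intercept + slope * a) for a, b in zip(x, y)]
        lagged = e[:-1]
        diffs = [e[t + 1] - e[t] for t in range(len(e) - 1)]
        rho = sum(l * d for l, d in zip(lagged, diffs)) / sum(l * l for l in lagged)
        resid = [d - rho * l for l, d in zip(lagged, diffs)]
        s2 = sum(r * r for r in resid) / (len(resid) - 1)
        se = (s2 / sum(l * l for l in lagged)) ** 0.5
        return rho / se

    # Synthetic example: x is a random walk and y = 2x + stationary noise,
    # so the pair is cointegrated by construction.
    rng = random.Random(42)
    x = [0.0]
    for _ in range(499):
        x.append(x[-1] + rng.gauss(0, 1))
    y = [2.0 * a + rng.gauss(0, 0.5) for a in x]
    stat = engle_granger_stat(x, y)
    # stat should fall well below the approximate -3.3 critical value
    ```

    Proper inference uses Engle-Granger critical values (which differ from ordinary Dickey-Fuller tables); the sketch only computes the test statistic.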

  3. Statistical Physics of Population Genetics in the Low Population Size Limit

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder

    The understanding of evolutionary processes lends itself naturally to theory and computation, and the entire field of population genetics has benefited greatly from the influx of methods from applied mathematics for decades. However, in spite of all this effort, there are a number of key dynamical models of evolution that have resisted analytical treatment. In addition, modern DNA sequencing technologies have magnified the amount of genetic data available, revealing an excess of rare genetic variants in human genomes, challenging the predictions of conventional theory. Here I will show that methods from statistical physics can be used to model the distribution of genetic variants, incorporating selection and spatial degrees of freedom. In particular, a functional path-integral formulation of the Wright-Fisher process maps exactly to the dynamics of a particle in an effective potential, beyond the mean field approximation. In the small population size limit, the dynamics are dominated by instanton-like solutions which determine the probability of fixation in short timescales. These results are directly relevant for understanding the unusual genetic variant distribution at moving frontiers of populations.

  4. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples

    PubMed Central

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-01-01

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell. DOI: http://dx.doi.org/10.7554/eLife.26580.001 PMID:28678007

  5. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396

  6. Assessing socioeconomic vulnerability to dengue fever in Cali, Colombia: statistical vs expert-based modeling

    PubMed Central

    2013-01-01

    Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population is needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. Results The statistical and expert-based modeling approaches exhibit high concordance, both globally and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western part of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals being either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. 
By decomposing identified vulnerability “hotspots” into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention strategies. The results support decision makers to allocate resources in a manner that may reduce existing susceptibilities and strengthen resilience, and thus help to reduce the burden of vector-borne diseases. PMID:23945265

  7. First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation

    DOE PAGES

    Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...

    2016-12-02

    Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits, introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific differences between the fragmentation functions obtained with the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.

  8. Interocular suppression

    NASA Astrophysics Data System (ADS)

    Tuna, Ana Rita; Almeida Neves Carrega, Filipa; Nunes, Amélia Fernandes

    2017-08-01

    The objective of this work is to quantify suppressive imbalance, based on the manipulation of ocular luminance, between a group of subjects with normal binocular vision and a group of subjects with amblyopia. The results reveal statistically significant differences in interocular dominance between the two groups, evidencing a greater suppressive imbalance in amblyopic subjects. The technique used proved to be a simple, easy-to-apply and economical method for quantifying ocular dominance. It is presented as a technique with the potential to follow up subjects with a marked dominance in one eye that makes fusion difficult.

  9. Quantitation of flavonoid constituents in citrus fruits.

    PubMed

    Kawaii, S; Tomono, Y; Katase, E; Ogawa, K; Yano, M

    1999-09-01

    Twenty-four flavonoids have been determined in 66 Citrus species and near-citrus relatives, grown in the same field and year, by means of reversed phase high-performance liquid chromatography analysis. Statistical methods have been applied to find relations among the species. The F ratios of 21 flavonoids obtained by applying ANOVA analysis are significant, indicating that a classification of the species using these variables is reasonable to pursue. Principal component analysis revealed that the distributions of Citrus species belonging to different classes were largely in accordance with Tanaka's classification system.

  10. Hidden messenger revealed in Hawking radiation: A resolution to the paradox of black hole information loss

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Cai, Qing-yu; You, Li; Zhan, Ming-sheng

    2009-05-01

    Using a standard statistical method, we discover the existence of correlations among Hawking radiations (of tunneled particles) from a black hole. The information carried by such correlations is quantified by the mutual information between sequential emissions. Through a careful counting of the entropy taken out by the emitted particles, we show that black hole radiation as tunneling is an entropy-conserving process. While information leaks out through the radiation, the total entropy is conserved. Thus, we conclude that the black hole evaporation process is unitary.

  11. [The algorithm for planning cosmonauts' timeline in flight (by the results of long-duration Mir mission)].

    PubMed

    Nechaev, A P

    2001-01-01

    Results of the investigation of the interrelation between cosmonauts' erroneous actions and work and rest schedule intensity in fourteen long-duration Mir missions are presented in the paper. The statistical significance of this dependence has been established, and its nonlinear character has been revealed. An algorithm of short-range planning of crew operations has been developed based on determination of critical crew work loads deduced from increases in erroneous actions. Together with other methods the suggested approach may be used to raise cosmonauts' performance reliability.

  12. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, which were then recorded accordingly. The frequency of each statistical method and research design was estimated and compared across years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of several statistical methods over the study period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365

  13. Novel Spectrofluorimetric Method for the Determination of Perindopril Erbumine Based on Fluorescence Quenching of Rhodamine B.

    PubMed

    Fael, Hanan; Sakur, Amir Al-Haj

    2015-11-01

    A novel, simple and specific spectrofluorimetric method was developed and validated for the determination of perindopril erbumine (PDE). The method is based on the fluorescence quenching of Rhodamine B upon adding perindopril erbumine. The quenched fluorescence was monitored at 578 nm after excitation at 500 nm. The optimization of reaction conditions such as the solvent, reagent concentration, and reaction time was investigated. Under the optimum conditions, the fluorescence quenching was linear over a concentration range of 1.0-6.0 μg/mL. The proposed method was fully validated and successfully applied to the analysis of perindopril erbumine in pure form and in tablets. Statistical comparison of the results obtained by the developed and reference methods revealed no significant differences between the methods in terms of accuracy and precision. The method was shown to be highly specific in the presence of indapamide, a diuretic that is commonly combined with perindopril erbumine. The mechanism of Rhodamine B quenching was also discussed.
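    Quenching data of this kind are often summarized with a Stern-Volmer plot, F0/F = 1 + Ksv[Q], fit by least squares. Whether the authors used this exact treatment is an assumption; the data below are invented to span the reported 1.0-6.0 μg/mL linear range:

    ```python
    def linear_fit(x, y):
        """Ordinary least-squares fit y = a + b*x; returns (a, b, r_squared)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((u - mx) ** 2 for u in x)
        sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
        b = sxy / sxx
        a = my - b * mx
        ss_res = sum((v - (a + b * u)) ** 2 for u, v in zip(x, y))
        ss_tot = sum((v - my) ** 2 for v in y)
        return a, b, 1 - ss_res / ss_tot

    # Hypothetical quenching data over the reported linear range:
    conc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]             # PDE, ug/mL
    f0_over_f = [1.12, 1.25, 1.36, 1.49, 1.61, 1.74]  # quenching ratio F0/F
    intercept, slope, r2 = linear_fit(conc, f0_over_f)
    # slope plays the role of the Stern-Volmer constant; an intercept near 1
    # and r2 near 1 indicate the linearity claimed for the calibration range.
    ```

    In practice the fitted line serves as the calibration curve: an unknown sample's F0/F ratio is inverted through it to recover concentration.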

  14. [Comparative analysis of modification of Misgav-Ladach and Pfannenstiel methods for cesarean section in the material of Fetal-Maternal Clinical Department PMMH-RI between 1994-1999].

    PubMed

    Pawłowicz, P; Wilczyński, J; Stachowiak, G

    2000-04-01

    We present a comparative analysis of our own modification of the Misgav-Ladach (mML) and Pfannenstiel methods for caesarean section in the material of the Fetal-Maternal Medicine Clinical Department PMMH-RI between 1994 and 1999. The study group consisted of 242 patients, all of whom underwent caesarean section using the Misgav-Ladach method. The control group consisted of 285 women who underwent caesarean section using the Pfannenstiel method. Several parameters were taken into account to analyse the clinical postoperative course in both groups. Statistical analysis revealed that most clinical postoperative course parameters had significantly better values in the study group, in which caesarean section was performed using the Misgav-Ladach method. The benefits of the Misgav-Ladach method, with less postoperative pain and quicker recovery, are a by-product of doing the least harm during surgery and removing every unnecessary step. The method is appealing for its simplicity, ease of execution and time-saving advantage.

  15. The impact of the learning contract on self-directed learning and satisfaction in nursing students in a clinical setting.

    PubMed

    Sajadi, Mahboobeh; Fayazi, Neda; Fournier, Andrew; Abedi, Ahmad Reza

    2017-01-01

    Background: The most important responsibilities of an education system are to create self-directed learning opportunities and to develop the skills required for taking responsibility for change. The present study aimed at determining the impact of a learning contract on the self-directed learning and satisfaction of nursing students. Methods: A total of 59 nursing students participated in this experimental study. They were divided into six 10-member groups. To limit communication among the groups, the first 3 groups were trained using conventional learning methods and the second 3 groups using the learning contract method. In the first session, a pretest was performed based on the educational objectives. At the end of the training, the students in each group completed the self-directed learning and satisfaction questionnaires. The results of descriptive and inferential statistical methods (dependent and independent t tests) were computed using SPSS. Results: There were no significant differences between the 2 groups in gender, grade point average of previous years, or interest in nursing. However, the results revealed a significant difference between the 2 groups in the total score of self-directed learning (p=0.019). Although the mean satisfaction score was higher in the intervention group, the difference was not statistically significant. Conclusion: This study suggests that the use of the learning contract method in clinical settings enhances self-directed learning among nursing students. Because this model focuses on individual differences, the researchers highly recommend the application of this new method to educators.
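    The between-group comparison above rests on the independent t test. A sketch using Welch's variant (robust to unequal variances) with hypothetical score data, not the study's:

    ```python
    import math
    import statistics

    def welch_t(a, b):
        """Welch's t statistic and approximate (Welch-Satterthwaite)
        degrees of freedom for two independent samples."""
        va, vb = statistics.variance(a), statistics.variance(b)
        na, nb = len(a), len(b)
        se2 = va / na + vb / nb
        t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(se2)
        df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
        return t, df

    # Hypothetical self-directed learning scores for the two teaching groups:
    contract = [78, 82, 85, 90, 74, 88, 81, 79]
    conventional = [70, 75, 72, 80, 68, 77, 73, 71]
    t, df = welch_t(contract, conventional)
    # t is then compared against the t distribution with df degrees of freedom
    ```

    A t value in the tail of the t(df) distribution (two-sided p below 0.05) corresponds to the kind of significant group difference the study reports for self-directed learning.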

  16. 3D volumetry comparison using 3T magnetic resonance imaging between normal and adenoma-containing pituitary glands.

    PubMed

    Roldan-Valadez, Ernesto; Garcia-Ulloa, Ana Cristina; Gonzalez-Gutierrez, Omar; Martinez-Lopez, Manuel

    2011-01-01

    Computer-assisted three-dimensional (3D) data allow for a more accurate evaluation of volumes than traditional measurements. An in vitro method comparison between geometric volume and 3D volumetry was performed to obtain reference data for pituitary volumes in normal pituitary glands (PGs) and PGs containing adenomas. Prospective, transverse, analytical study. Forty-eight subjects underwent brain magnetic resonance imaging (MRI) with 3D sequencing for computer-aided volumetry. PG phantom volumes obtained by both methods were compared. Using the better volumetric method, volumes of normal PGs and PGs with adenoma were compared. Statistical analysis used the Bland-Altman method, t-statistics, effect size and linear regression analysis. Method comparison between 3D volumetry and geometric volume revealed a lower bias and precision for 3D volumetry. A total of 27 patients exhibited normal PGs (mean age, 42.07 ± 16.17 years); length, height, width, geometric volume and 3D volumetry were greater in women than in men. A total of 21 patients exhibited adenomas (mean age 39.62 ± 10.79 years), and length, height, width, geometric volume and 3D volumetry were greater in men than in women, with significant volumetric differences. Age did not influence pituitary volumes on linear regression analysis. Results from the present study showed that 3D volumetry was more accurate than the geometric method. In addition, the upper normal limits of PGs overlapped with the lower volume limits of early-stage microadenomas.
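    The Bland-Altman comparison used above can be sketched in a few lines. The paired phantom volumes below are hypothetical, chosen only to show how the bias and 95% limits of agreement are computed:

```python
from statistics import mean, stdev

# Hypothetical paired phantom volumes (mL) measured by the two methods.
geometric = [0.52, 0.75, 1.10, 1.48, 2.03, 2.61]
volumetry = [0.50, 0.73, 1.05, 1.45, 1.98, 2.55]

diffs = [g - v for g, v in zip(geometric, volumetry)]
bias = mean(diffs)                 # systematic offset between the methods
sd = stdev(diffs)                  # precision: spread of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
```

    A method with lower bias and narrower limits of agreement, as reported for 3D volumetry here, agrees more closely with the comparison method.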

  17. Comparative Gender Performance in Business Statistics.

    ERIC Educational Resources Information Center

    Mogull, Robert G.

    1989-01-01

    Comparative performance of male and female students in introductory and intermediate statistics classes was examined for over 16 years at a state university. Gender means from 97 classes and 1,609 males and 1,085 females revealed a probabilistic--although statistically insignificant--superior performance by female students that appeared to…

  18. Use of cervical vertebral dimensions for assessment of children growth.

    PubMed

    Caldas, Maria de Paula; Ambrosano, Gláucia Maria Bovi; Haiter-Neto, Francisco

    2007-04-01

    The purpose of this study was to investigate whether skeletal maturation assessed using cephalometric radiographs could be used in a Brazilian population. The study population was selected from the files of the Oral Radiological Clinic of the Dental School of Piracicaba, Brazil, and consisted of 128 girls and 110 boys (7.0 to 15.9 years old) who had cephalometric and hand-wrist radiographs taken on the same day. Cervical vertebral bone age was evaluated using the method described by Mito and colleagues in 2002. Bone age was assessed by the Tanner-Whitehouse (TW3) method and was used as a gold standard to determine the reliability of cervical vertebral bone age. An analysis of variance and Tukey's post-hoc test were used to compare cervical vertebral bone age, bone age and chronological age at the 5% significance level. The analysis of the Brazilian female children's data showed that there was a statistically significant difference (p<0.05) between cervical vertebral bone age and chronological age and between bone age and chronological age. However, no statistically significant difference (p>0.05) was found between cervical vertebral bone age and bone age. In contrast, the analysis of the male children's data revealed a statistically significant difference (p<0.05) between cervical vertebral bone age and bone age and between cervical vertebral bone age and chronological age (p<0.05). The findings of the present study suggest that the method for objectively evaluating skeletal maturation on cephalometric radiographs by determination of vertebral bone age can be applied to Brazilian females only. The development of a new method to objectively evaluate cervical vertebral bone age in males is needed.
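    The analysis-of-variance step used above can be illustrated with a minimal one-way ANOVA F statistic. The age estimates below are invented for illustration and are not the study's data:

```python
from statistics import mean

# Hypothetical age estimates (years) for one small cohort; illustrative only.
groups = {
    "cervical_vertebral_age": [10.1, 11.3, 9.8, 12.0, 10.7],
    "TW3_bone_age":           [10.4, 11.1, 10.0, 12.2, 10.9],
    "chronological_age":      [9.5, 10.6, 9.2, 11.1, 10.0],
}

values = [v for g in groups.values() for v in g]
grand = mean(values)
k, n = len(groups), len(values)

# Partition the total variation into between-group and within-group parts.
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())
ss_within = sum((v - mean(g)) ** 2 for g in groups.values() for v in g)

# F = (between-group mean square) / (within-group mean square)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one pair of group means differs; Tukey's post-hoc test then identifies which pairs.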

  19. A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects

    PubMed Central

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936
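    The co-occurrence count at the heart of the method, the number of common neighbours of two same-type nodes in a bipartite graph, can be sketched as follows. The miRNA-protein edges are invented for illustration; SICORE additionally tests such counts against a random-graph null model to assess significance:

```python
from itertools import combinations

# Invented miRNA -> protein regulation edges of a small bipartite graph.
edges = [("miR-a", "P1"), ("miR-a", "P2"), ("miR-a", "P3"),
         ("miR-b", "P2"), ("miR-b", "P3"),
         ("miR-c", "P3"), ("miR-c", "P4")]

targets = {}
for mirna, protein in edges:
    targets.setdefault(mirna, set()).add(protein)

# Co-occurrence of two same-type nodes = size of their common neighbourhood.
cooc = {(u, v): len(targets[u] & targets[v])
        for u, v in combinations(sorted(targets), 2)}
```

    Here miR-a and miR-b co-occur on two proteins, while the other pairs share only one, which is the raw signal that a significance test would then calibrate.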

  20. Scanning electron microscopy analysis of hair index on Karachi's population for social and professional appearance enhancement.

    PubMed

    Ali, N; Zohra, R R; Qader, S A U; Mumtaz, M

    2015-06-01

    Hair texture, appearance and pigment play an important role in social and professional communication and in maintaining an overall appearance. This study was designed for the morphological assessment of hair damage caused to Karachi's population by natural factors and cosmetic treatments, using the scanning electron microscopy (SEM) technique. Hair samples under the study of synthetic factors' effects were given several cosmetic treatments (hot straightened, bleached, synthetic dyed and henna dyed), whereas samples under natural factors' effects (variation in gender, age and pigmentation) were left untreated. Morphological assessment was performed using the SEM technique. The results obtained were statistically analysed using Minitab 16 and SPSS 18 software. Scanning electron microscopy images revealed fewer cuticular scales in males than in females of the same age, although the size of the cuticular scales was found to be larger in males than in females. The mean hair index of white hair was greater than that of black hair from the same head, as white hair is comparatively newly originated. Tukey's method revealed that, among the cosmetic treatments, bleaching and synthetic henna caused the most damage to the hair. Statistical evaluation of the results obtained from SEM analysis revealed that the human scalp hair index shows morphological variation with respect to age, gender, hair pigmentation, and chemical and physical treatments. Individuals opting for cosmetic treatments can clearly visualize the extent of hair damage these may cause in the long run. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  1. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at a lower significance level (F = 3.47, p = 0.0305) than JAFROC. 
Both statistical analysis methods revealed that the same six pairs of modalities were significantly different, but the JAFROC confidence intervals were about 32% smaller than the ROC confidence intervals. This study shows that image processing has a significant impact on the detection of microcalcifications in digital mammograms. Objective measurements, such as those described here, should be used by the manufacturers to select the optimal image processing algorithm.

  2. Comparison of Two Surface Contamination Sampling Techniques Conducted for the Characterization of Two Pajarito Site Manhattan Project National Historic Park Properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Tammy Ann

    Technical Area-18 (TA-18), also known as Pajarito Site, is located on Los Alamos National Laboratory property and has historic buildings that will be included in the Manhattan Project National Historic Park. Characterization studies of metal contamination were needed in two of the four buildings that are on the historic registry in this area, a “battleship” bunker building (TA-18-0002) and the Pond cabin (TA-18-0029). However, these two buildings have been exposed to the elements, are decades old, and have porous and rough surfaces (wood and concrete). Due to these conditions, it was questioned whether standard wipe sampling would be adequate to detect surface dust metal contamination in these buildings. Thus, micro-vacuum and surface wet wipe sampling techniques were performed side-by-side at both buildings and the results were compared statistically. A two-tail paired t-test revealed that the micro-vacuum and wet wipe techniques were statistically different for both buildings. Further mathematical analysis revealed that the wet wipe technique picked up more metals from the surface than the micro-vacuum technique. Wet wipes revealed concentrations of beryllium and lead above internal housekeeping limits; however, using an yttrium normalization method with linear regression analysis between beryllium and yttrium revealed a correlation indicating that the beryllium levels were likely due to background and not operational contamination. PPE and administrative controls were implemented for National Park Service (NPS) and Department of Energy (DOE) tours as a result of this study. Overall, this study indicates that the micro-vacuum technique may not be an efficient technique to sample for metal dust contamination.
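    The two-tail paired t-test used to compare the techniques can be sketched as follows. The side-by-side metal recoveries below are hypothetical, not the study's measurements:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical side-by-side metal recoveries per sampling location.
wet_wipe     = [1.8, 2.4, 3.1, 2.0, 2.9, 3.5]
micro_vacuum = [0.6, 1.1, 1.4, 0.8, 1.2, 1.9]

# Paired test: work with the per-location differences.
diffs = [w - m for w, m in zip(wet_wipe, micro_vacuum)]
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Two-tail critical value for n - 1 = 5 degrees of freedom at alpha = 0.05.
significant = abs(t) > 2.571
```

    Pairing by location removes location-to-location variability, so a consistent difference between the two techniques produces a large t even with few samples.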

  3. Computer-based self-organized tectonic zoning: a tentative pattern recognition for Iran

    NASA Astrophysics Data System (ADS)

    Zamani, Ahmad; Hashemi, Naser

    2004-08-01

    Conventional methods of tectonic zoning are frequently characterized by two deficiencies. The first is the large uncertainty involved in tectonic zoning based on non-quantitative and subjective analysis. The second is the failure to accurately interpret large amounts of data "by eye". In order to alleviate these deficiencies, the multivariate statistical method of cluster analysis has been utilized to seek and separate zones with similar tectonic patterns and construct automated self-organized multivariate tectonic zoning maps. This analytical method of tectonic regionalization is particularly useful for showing trends in the tectonic evolution of a region that could not be discovered by any other means. To illustrate, this method has been applied to produce a general-purpose numerical tectonic zoning map of Iran. While there are some similarities between the self-organized multivariate numerical maps and the conventional maps, the cluster solution maps reveal some remarkable features that cannot be observed on the current tectonic maps. The following specific examples need to be noted: (1) The much disputed extent and rigidity of the Lut Rigid Block, described as the microplate of east Iran, is clearly revealed on the self-organized numerical maps. (2) The cluster solution maps reveal a striking similarity between this microplate and northern Central Iran, including the Great Kavir region. (3) Contrary to the conventional map, the cluster solution maps make a clear distinction between the East Iranian Ranges and the Makran Mountains. (4) Moreover, interesting similarities between the Azarbaijan region in the northwest and the Makran Mountains in the southeast, and between the Kopet Dagh Ranges in the northeast and the Zagros Folded Belt in the southwest of Iran, are revealed in the clustering process. This new approach to tectonic zoning is a starting point and is expected to be improved and refined by the collection of new data. 
The method is also a useful tool in studying neotectonics, seismotectonics, seismic zoning, and hazard estimation of the seismogenic regions.

  4. Development and optimization of a noncontact optical device for online monitoring of jaundice in human subjects.

    PubMed

    Polley, Nabarun; Saha, Srimoyee; Singh, Soumendra; Adhikari, Aniruddha; Das, Sukhen; Choudhury, Bhaskar Roy; Pal, Samir Kumar

    2015-06-01

    Jaundice is one of the notable markers of liver malfunction in our body, revealing a significant rise in the concentration of an endogenous yellow pigment bilirubin. We have described a method for measuring the optical spectrum of our conjunctiva and derived pigment concentration by using diffused reflection measurement. The method uses no prior model and is expected to work across the races (skin color) encompassing a wide range of age groups. An optical fiber-based setup capable of measuring the conjunctival absorption spectrum from 400 to 800 nm is used to monitor the level of bilirubin and is calibrated with the value measured from blood serum of the same human subject. We have also developed software in the LabVIEW platform for use in online monitoring of bilirubin levels in human subjects by nonexperts. The results demonstrate that relative absorption at 460 and 600 nm has a distinct correlation with that of the bilirubin concentration measured from blood serum. Statistical analysis revealed that our proposed method is in agreement with the conventional biochemical method. The innovative noncontact, low-cost technique is expected to have importance in monitoring jaundice in developing/underdeveloped countries, where the inexpensive diagnosis of jaundice with minimally trained manpower is obligatory.
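    Calibrating a relative-absorption measure against serum bilirubin reduces to a simple least-squares fit. The calibration pairs below are hypothetical, not the study's data:

```python
from statistics import mean

# Hypothetical calibration pairs: conjunctival A460 - A600 difference versus
# serum bilirubin (mg/dL); the real study calibrates against blood serum.
rel_abs = [0.10, 0.18, 0.25, 0.34, 0.41]
serum = [1.0, 2.1, 3.0, 4.2, 5.1]

# Ordinary least-squares line: serum ~ slope * rel_abs + intercept.
mx, my = mean(rel_abs), mean(serum)
slope = (sum((x - mx) * (y - my) for x, y in zip(rel_abs, serum))
         / sum((x - mx) ** 2 for x in rel_abs))
intercept = my - slope * mx

def predict(x):
    return slope * x + intercept
```

    Once calibrated, a noncontact spectral reading can be converted to an estimated bilirubin level with `predict`, which is the kind of mapping the LabVIEW software automates.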

  5. Development and optimization of a noncontact optical device for online monitoring of jaundice in human subjects

    NASA Astrophysics Data System (ADS)

    Polley, Nabarun; Saha, Srimoyee; Singh, Soumendra; Adhikari, Aniruddha; Das, Sukhen; Choudhury, Bhaskar Roy; Pal, Samir Kumar

    2015-06-01

    Jaundice is one of the notable markers of liver malfunction in our body, revealing a significant rise in the concentration of an endogenous yellow pigment bilirubin. We have described a method for measuring the optical spectrum of our conjunctiva and derived pigment concentration by using diffused reflection measurement. The method uses no prior model and is expected to work across the races (skin color) encompassing a wide range of age groups. An optical fiber-based setup capable of measuring the conjunctival absorption spectrum from 400 to 800 nm is used to monitor the level of bilirubin and is calibrated with the value measured from blood serum of the same human subject. We have also developed software in the LabVIEW platform for use in online monitoring of bilirubin levels in human subjects by nonexperts. The results demonstrate that relative absorption at 460 and 600 nm has a distinct correlation with that of the bilirubin concentration measured from blood serum. Statistical analysis revealed that our proposed method is in agreement with the conventional biochemical method. The innovative noncontact, low-cost technique is expected to have importance in monitoring jaundice in developing/underdeveloped countries, where the inexpensive diagnosis of jaundice with minimally trained manpower is obligatory.

  6. Pre-service elementary science teaching self-efficacy and teaching practices: A mixed-methods, dual-phase, embedded case study

    NASA Astrophysics Data System (ADS)

    Sangueza, Cheryl Ramirez

    This mixed-methods, dual-phase, embedded-case study employed Social Cognitive Theory and the construct of self-efficacy to examine the contributors to science teaching self-efficacy and the science teaching practices across different levels of efficacy in six pre-service elementary teachers during their science methods course and student teaching experiences. Data sources included the Science Teaching Efficacy Belief Instrument (STEBI-B) for pre-service teachers, questionnaires, journals, reflections, student teaching lesson observations, and lesson debriefing notes. Results from the STEBI-B show that all participants registered an increase in efficacy throughout the study. The ANOVA analysis of the STEBI-B revealed a statistically significant increase in the level of efficacy during the methods course, during student teaching, and from the beginning of the study to the end. Of interest in this study was the examination of the participants' science teaching practices across different levels of efficacy. Results of this analysis revealed how the pre-service elementary teachers in this study contextualized their experiences in learning to teach science and the influence of those experiences on their science teaching practices. A key implication involves the value of exploring how pre-service teachers interpret their learning-to-teach experiences and how their interpretations influence the development of their science teaching practices.

  7. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.

    PubMed

    Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan

    2016-07-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
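    A minimal illustration of the moment-equation idea: for a linear birth-death process the mean and variance ODEs are exact (no closure approximation is needed) and relax to the Poisson stationary values mean = variance = k/g. This toy system is not the paper's JAK/STAT model:

```python
# Moment equations for a linear birth-death process, x -> x+1 at rate k and
# x -> x-1 at rate g*x:
#   d<x>/dt  = k - g*<x>
#   dVar/dt  = k + g*<x> - 2*g*Var
# For nonlinear kinetics the variance equation would couple to higher moments,
# which is where moment closure (MA) or the system size expansion enters.
k, g = 10.0, 0.5
mu, var = 0.0, 0.0            # start from an empty system
dt = 0.001
for _ in range(40_000):       # forward Euler to t = 40, i.e. 20 relaxation times
    dmu = k - g * mu
    dvar = k + g * mu - 2.0 * g * var
    mu += dt * dmu
    var += dt * dvar
```

    Fitting such low-order moment trajectories to measured population means and variances, instead of running many stochastic simulations, is what makes the inference tractable.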

  8. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion

    PubMed Central

    Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J.; Grima, Ramon; Hasenauer, Jan

    2016-01-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity. PMID:27447730

  9. Identification of statistically independent climatic pattern in GRACE and hydrological model data over West-Africa

    NASA Astrophysics Data System (ADS)

    Kusche, J.; Forootan, E.; Eicker, A.; Hoffmann-Dobrev, H.

    2012-04-01

    West-African countries have been exposed to changes in rainfall patterns over the last decades, including a significant negative trend. This causes adverse effects on water resources, for instance reduced freshwater availability, and changes in the frequency, duration and magnitude of droughts and floods. Extracting the main patterns of water storage change in West Africa from remote sensing, and linking them to climate variability, is therefore an essential step towards understanding the hydrological aspects of the region. In this study, the higher-order statistical method of Independent Component Analysis (ICA) is employed to extract statistically independent water storage patterns from monthly Gravity Recovery And Climate Experiment (GRACE) products, from the WaterGAP Global Hydrology Model (WGHM) and from Tropical Rainfall Measuring Mission (TRMM) products over West Africa, for the period 2002-2012. Then, to reveal the influences of climatic teleconnections on the individual patterns, these results were correlated with the El Nino-Southern Oscillation (ENSO) and the Indian Ocean Dipole (IOD) indices. To study the predictability of water storage changes, advanced statistical methods were applied to the main independent Sea Surface Temperature (SST) patterns over the Atlantic and Indian Oceans for the period 2002-2012 and to the ICA results. Our results show a water storage decrease over the coastal regions of West Africa (including Sierra Leone, Liberia, Togo and Nigeria), associated with a rainfall decrease. The comparison between GRACE estimates and WGHM results indicates some inconsistencies that underline the importance of forcing data for hydrological modeling of West Africa. Keywords: West Africa; GRACE-derived water storage; ICA; ENSO; IOD

  10. Template protection and its implementation in 3D face recognition systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuebing

    2007-04-01

    As biometric recognition systems are widely applied in various application areas, security and privacy risks have recently attracted the attention of the biometric community. Template protection techniques prevent stored reference data from revealing private biometric information and enhance the security of biometric systems against attacks such as identity theft and cross matching. This paper concentrates on a template protection algorithm that merges methods from cryptography, error correction coding and biometrics. The key component of the algorithm is to convert biometric templates into binary vectors. It is shown that the binary vectors should be robust, uniformly distributed, statistically independent and collision-free so that authentication performance can be optimized and information leakage can be avoided. Depending on the statistical character of the biometric template, different approaches for transforming biometric templates into compact binary vectors are presented. The proposed methods are integrated into a 3D face recognition system and tested on the 3D facial images of the FRGC database. It is shown that the resulting binary vectors provide an authentication performance that is similar to the original 3D face templates. A high security level is achieved with reasonable false acceptance and false rejection rates of the system, based on an efficient statistical analysis. The algorithm estimates the statistical character of biometric templates from a number of biometric samples in the enrollment database. For the FRGC 3D face database, our tests show only a small difference in robustness and discriminative power between the classification results under the assumption of uniformly distributed templates and those under the assumption of Gaussian-distributed templates.
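    One simple way to obtain roughly uniformly distributed bits, one of the properties required of the binary vectors above, is to threshold each template feature at its enrollment-population median. This sketch uses synthetic templates and is not the paper's exact transformation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic real-valued templates: rows = enrolled samples, cols = features.
templates = rng.normal(size=(200, 16))

# Threshold each feature at its enrollment-population median, so each bit is
# set for about half of the population (an approximately uniform bit).
medians = np.median(templates, axis=0)
bits = (templates > medians).astype(np.uint8)
```

    Uniform, independent bits maximize the entropy of the protected template, which limits what an attacker can learn from the stored reference data.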

  11. Investigating spousal concordance of diabetes through statistical analysis and data mining

    PubMed Central

    Liu, Chiu-Shong; Lung, Chi-Hsuan; Yang, Ya-Tun; Lin, Ming-Hung

    2017-01-01

    Objective Spousal clustering of diabetes merits attention. Whether old-age vulnerability or a shared family environment determines the concordance of diabetes is also uncertain. This study investigated the spousal concordance of diabetes and compared the risk of diabetes concordance between couples and noncouples by using nationally representative data. Methods A total of 22,572 individuals identified from the 2002–2013 National Health Insurance Research Database of Taiwan constituted 5,643 couples and 5,643 noncouples through 1:1 dual propensity score matching (PSM). Factors associated with concordance in both spouses with diabetes were analyzed at the individual level. The risk of diabetes concordance between couples and noncouples was compared at the couple level. Logistic regression was the main statistical method. Statistical data were analyzed using SAS 9.4. C&RT and Apriori of data mining conducted in IBM SPSS Modeler 13 served as a supplement to statistics. Results High odds of the spousal concordance of diabetes were associated with old age, middle levels of urbanization, and high comorbidities (all P < 0.05). The dual PSM analysis revealed that the risk of diabetes concordance was significantly higher in couples (5.19%) than in noncouples (0.09%; OR = 61.743, P < 0.0001). Conclusions A high concordance rate of diabetes in couples may indicate the influences of assortative mating and shared environment. Diabetes in a spouse implicates its risk in the partner. Family-based diabetes care that emphasizes the screening of couples at risk of diabetes by using the identified risk factors is suggested in prospective clinical practice interventions. PMID:28817654
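    Reconstructing the reported couple-level comparison as a 2x2 table, with counts rounded from the published percentages, approximately reproduces the reported odds ratio:

```python
# 2x2 table reconstructed from the reported rates: 5.19% of 5,643 couples and
# 0.09% of 5,643 noncouples were concordant for diabetes. The counts are
# rounded, so the odds ratio only approximates the published OR = 61.743.
couples_concordant = round(0.0519 * 5643)        # ~293 couples
couples_discordant = 5643 - couples_concordant
noncouples_concordant = round(0.0009 * 5643)     # ~5 noncouples
noncouples_discordant = 5643 - noncouples_concordant

# Cross-product odds ratio of the 2x2 table.
odds_ratio = (couples_concordant * noncouples_discordant) / (
    couples_discordant * noncouples_concordant)
```
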

  12. Statistical bias correction method applied on CMIP5 datasets over the Indian region during the summer monsoon season for climate change applications

    NASA Astrophysics Data System (ADS)

    Prasanna, V.

    2018-01-01

    This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observation is discussed in detail. Non-linear statistical bias correction is a suitable bias correction method for climate change data because it is simple and does not add artificial uncertainty to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, varying from one model to another. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile mapping method gives results similar to the CDF (Weibull)-based quantile mapping method, and the two methods are comparable. The bias correction is applied to temperature and precipitation variables for present climate and future projected data to make use of it in a simple statistical model to understand the future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for the Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project the changes in agricultural yield over India for the RCP4.5 and RCP8.5 scenarios. 
The results revealed a better convergence of model projections in the bias-corrected data compared to the uncorrected data. The study can be extended to localized regional domains aimed at understanding changes in agricultural productivity in the future with an agro-economic or a simple statistical model. The statistical model indicated that the total food grain yield is going to increase over the Indian region in the future: the increase is approximately 50 kg/ha for the RCP4.5 scenario from 2001 until the end of 2100, and approximately 90 kg/ha for the RCP8.5 scenario from 2001 until the end of 2100. There are many studies using bias correction techniques, but this study applies the bias correction technique to future climate scenario data from CMIP5 models and applies it to crop statistics to find future crop yield changes over the Indian region.
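    The percentile-based quantile-mapping idea can be sketched in a few lines: each model value is replaced by the observed value at the same percentile of the model baseline distribution. The gamma-distributed "rainfall" below is synthetic, chosen only to give the model a wet bias:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily rainfall: the model baseline has a wet bias vs. observations.
obs = rng.gamma(2.0, 5.0, size=3000)       # pseudo-observations
hist = rng.gamma(2.0, 7.0, size=3000)      # biased model baseline
future = rng.gamma(2.0, 8.0, size=3000)    # biased model projection

def quantile_map(x, model_base, observed):
    """Replace each model value with the observed value at the same
    percentile of the model baseline distribution."""
    q = np.linspace(0.1, 99.9, 199)
    return np.interp(x, np.percentile(model_base, q), np.percentile(observed, q))

corrected_hist = quantile_map(hist, hist, obs)
corrected_future = quantile_map(future, hist, obs)
```

    Because the same baseline-to-observation mapping is applied to the projection, the corrected baseline matches the observed distribution while the model's projected change signal is preserved.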

  13. Error of the slanted edge method for measuring the modulation transfer function of imaging systems.

    PubMed

    Xie, Xufen; Fan, Hongda; Wang, Hongyuan; Wang, Zebin; Zou, Nianyu

    2018-03-01

    The slanted edge method is a basic approach for measuring the modulation transfer function (MTF) of imaging systems; however, its measurement accuracy is limited in practice. Theoretical analysis of the slanted edge MTF measurement method performed in this paper reveals that inappropriate edge angles and random noise reduce this accuracy. The error caused by edge angles is analyzed using sampling and reconstruction theory. Furthermore, an error model combining noise and edge angles is proposed. We verify the analyses and model with respect to (i) the edge angle, (ii) a statistical analysis of the measurement error, (iii) the full width at half-maximum of a point spread function, and (iv) the error model. The experimental results verify the theoretical findings. This research can serve as a reference for applications of the slanted edge MTF measurement method.
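The basic slanted-edge pipeline the paper analyzes (project pixels onto the edge normal, build an oversampled edge spread function, differentiate to the line spread function, take the FFT) can be sketched as below. This is a generic illustration of the technique, not the authors' error model; the equal-count binning and Hanning window are my own simplifying choices.

```python
import numpy as np

def slanted_edge_mtf(img, edge_angle_deg, oversample=4):
    """Minimal slanted-edge MTF sketch: project pixels onto the edge
    normal to build an oversampled edge spread function (ESF),
    differentiate it to the line spread function (LSF), window, and
    take the normalized FFT magnitude as the MTF."""
    rows, cols = img.shape
    y, x = np.mgrid[0:rows, 0:cols]
    # signed distance of each pixel from the slanted edge (along x)
    dist = x - np.tan(np.radians(edge_angle_deg)) * y
    order = np.argsort(dist, axis=None)
    vals = img.ravel()[order]
    # equal-count binning gives an oversampled ESF with no empty bins
    esf = np.array([b.mean() for b in np.array_split(vals, cols * oversample)])
    lsf = np.gradient(esf) * np.hanning(esf.size)  # window against spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]
```

The paper's point is that this estimate degrades for inappropriate edge angles and noisy images; the sketch makes it easy to reproduce that behavior on synthetic edges.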

  14. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in the medicine and health care sector. In machine learning, classification or prediction is a major field of AI. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions for the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the most famous machine learning methods are explained, and the confusion between a statistical approach and machine learning is clarified. A review of related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.

  15. Science-Technology-Society literacy in college non-majors biology: Comparing problem/case studies based learning and traditional expository methods of instruction

    NASA Astrophysics Data System (ADS)

    Peters, John S.

    This study used a multiple response model (MRM) on selected items from the Views on Science-Technology-Society (VOSTS) survey to examine science-technology-society (STS) literacy among college non-science majors taught using Problem/Case Studies Based Learning (PBL/CSBL) and traditional expository methods of instruction. An initial pilot investigation of 15 VOSTS items produced a valid and reliable scoring model which can be used to quantitatively assess student literacy on a variety of STS topics deemed important for informed civic engagement in science-related social and environmental issues. The new scoring model allows for the use of parametric inferential statistics to test hypotheses about factors influencing STS literacy. The follow-up cross-institutional study comparing teaching methods employed Hierarchical Linear Modeling (HLM) to model the efficiency and equitability of instructional methods on STS literacy. A cluster analysis was also used to compare pre- and post-course patterns of student views on the set of positions expressed within VOSTS items. HLM analysis revealed significantly higher instructional efficiency in the PBL/CSBL study group for 4 of the 35 STS attitude indices (characterization of media vs. school science; tentativeness of scientific models; cultural influences on scientific research), and more equitable effects of traditional instruction on one attitude index (interdependence of science and technology). Cluster analysis revealed generally stable patterns of pre- to post-course views across study groups, but also revealed possible teaching method effects on the relationship between the views expressed within VOSTS items with respect to (1) interdependency of science and technology; (2) anti-technology; (3) socioscientific decision-making; (4) scientific/technological solutions to environmental problems; (5) usefulness of school vs. media characterizations of science; (6) social constructivist vs. objectivist views of theories; (7) impact of cultural religious/ethical views on science; (8) tentativeness of scientific models, evidence and predictions; (9) civic control of technological developments. This analysis also revealed common relationships between student views which would not have been revealed under the original unique response model (URM) of VOSTS, as well as common viewpoint patterns that warrant further qualitative exploration.

  16. Favre-Averaged Turbulence Statistics in Variable Density Mixing of Buoyant Jets

    NASA Astrophysics Data System (ADS)

    Charonko, John; Prestridge, Kathy

    2014-11-01

    Variable density mixing of a heavy fluid jet with lower density ambient fluid in a subsonic wind tunnel was experimentally studied using Particle Image Velocimetry and Planar Laser Induced Fluorescence to simultaneously measure velocity and density. Flows involving the mixing of fluids with large density ratios are important in a range of physical problems including atmospheric and oceanic flows, industrial processes, and inertial confinement fusion. Here we focus on buoyant jets with coflow. Results from two different Atwood numbers, 0.1 (Boussinesq limit) and 0.6 (non-Boussinesq case), reveal that buoyancy is important for most of the turbulent quantities measured. Statistical characteristics of the mixing important for modeling these flows such as the PDFs of density and density gradients, turbulent kinetic energy, Favre averaged Reynolds stress, turbulent mass flux velocity, density-specific volume correlation, and density power spectra were also examined and compared with previous direct numerical simulations. Additionally, a method for directly estimating Reynolds-averaged velocity statistics on a per-pixel basis is extended to Favre-averages, yielding improved accuracy and spatial resolution as compared to traditional post-processing of velocity and density fields.
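The Favre (density-weighted) statistics listed in this record follow directly from simultaneous density and velocity samples. A minimal numpy sketch, assuming snapshot-by-pixel arrays rather than the authors' per-pixel PIV/PLIF estimator:

```python
import numpy as np

def favre_stats(rho, u):
    """Density-weighted (Favre) statistics; axis 0 is the ensemble of
    snapshots, remaining axes are pixels, so averages are per pixel."""
    rho_bar = rho.mean(axis=0)
    u_bar = u.mean(axis=0)                        # Reynolds mean
    u_tilde = (rho * u).mean(axis=0) / rho_bar    # Favre mean
    u_pp = u - u_tilde                            # Favre fluctuation
    R = (rho * u_pp**2).mean(axis=0) / rho_bar    # Favre Reynolds stress
    a = u_tilde - u_bar                           # turbulent mass-flux velocity
    return u_tilde, R, a
```

For constant density the Favre and Reynolds means coincide and the turbulent mass-flux velocity vanishes, which is a convenient sanity check; nonzero `a` is exactly the variable-density effect the Atwood-number comparison probes.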

  17. Rapid Identification of Candida Species by Using Nuclear Magnetic Resonance Spectroscopy and a Statistical Classification Strategy

    PubMed Central

    Himmelreich, Uwe; Somorjai, Ray L.; Dolenko, Brion; Lee, Ok Cha; Daniel, Heide-Marie; Murray, Ronan; Mountford, Carolyn E.; Sorrell, Tania C.

    2003-01-01

    Nuclear magnetic resonance (NMR) spectra were acquired from suspensions of clinically important yeast species of the genus Candida to characterize the relationship between metabolite profiles and species identification. Major metabolites were identified by using two-dimensional correlation NMR spectroscopy. One-dimensional proton NMR spectra were analyzed by using a staged statistical classification strategy. Analysis of NMR spectra from 442 isolates of Candida albicans, C. glabrata, C. krusei, C. parapsilosis, and C. tropicalis resulted in rapid, accurate identification when compared with conventional and DNA-based identification. Spectral regions used for the classification of the five yeast species revealed species-specific differences in relative amounts of lipids, trehalose, polyols, and other metabolites. Isolates of C. parapsilosis and C. glabrata with unusual PCR fingerprinting patterns also generated atypical NMR spectra, suggesting the possibility of intraspecies discontinuity. We conclude that NMR spectroscopy combined with a statistical classification strategy is a rapid, nondestructive, and potentially valuable method for identification and chemotaxonomic characterization that may be broadly applicable to fungi and other microorganisms. PMID:12902244

  18. Predicting Flowering Behavior and Exploring Its Genetic Determinism in an Apple Multi-family Population Based on Statistical Indices and Simplified Phenotyping.

    PubMed

    Durand, Jean-Baptiste; Allard, Alix; Guitton, Baptiste; van de Weg, Eric; Bink, Marco C A M; Costes, Evelyne

    2017-01-01

    Irregular flowering over years is commonly observed in fruit trees. The early prediction of tree behavior is highly desirable in breeding programmes. This study aims at performing such predictions, combining simplified phenotyping and statistical methods. Sequences of vegetative vs. floral annual shoots (AS) were observed along axes in trees belonging to five apple related full-sib families. Sequences were analyzed using Markovian and linear mixed models including year and site effects. Indices of flowering irregularity, periodicity and synchronicity were estimated, at tree and axis scales. They were used to predict tree behavior and detect QTL with a Bayesian pedigree-based analysis, using an integrated genetic map containing 6,849 SNPs. The combination of a Biennial Bearing Index (BBI) with an autoregressive coefficient (γg) efficiently predicted and classified the genotype behaviors, despite a few misclassifications. Four QTLs common to BBIs and γg and one QTL for synchronicity were highlighted and revealed the complex genetic architecture of the traits. Irregularity resulted from high AS synchronism, whereas regularity resulted from either asynchronous locally alternating or continual regular AS flowering. A relevant and time-saving method, based on a posteriori sampling of axes and statistical indices, is proposed, which is efficient for evaluating tree breeding values for flowering regularity and could be transferred to other species.
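The two tree-scale indices named here can be illustrated on a yield (or flowering-intensity) series. The BBI formula below is one common formulation from the alternate-bearing literature, and the lag-1 autocorrelation stands in for the autoregressive coefficient γg; neither is necessarily the exact estimator used in the study.

```python
import numpy as np

def bbi(yields):
    """Biennial Bearing Index (a common formulation): mean normalized
    year-to-year difference; 0 = perfectly regular, 1 = strict alternation."""
    y = np.asarray(yields, dtype=float)
    return np.mean(np.abs(np.diff(y)) / (y[1:] + y[:-1]))

def lag1_autocorr(yields):
    """Lag-1 autocorrelation; strongly negative values indicate
    alternate (biennial) bearing."""
    y = np.asarray(yields, dtype=float) - np.mean(yields)
    return np.sum(y[1:] * y[:-1]) / np.sum(y * y)
```

A strictly alternating series scores BBI = 1 with a strongly negative lag-1 autocorrelation, while a constant series scores BBI = 0, which is the contrast the classification exploits.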

  19. A persuasive concept of research-oriented teaching in Soil Biochemistry

    NASA Astrophysics Data System (ADS)

    Blagodatskaya, Evgenia; Kuzyakova, Irina

    2013-04-01

    One of the main problems of existing bachelor programs is the disconnection of basic and experimental education: even during practical training, the methods learned are not related to the characterization of soil field experiments and observed soil processes. We introduce a multi-level research-oriented teaching system that involves Bachelor students in four semesters of active study by integrating basic knowledge, experimental techniques, statistical approaches, and project design and realization. The novelty of the research-oriented teaching system is based (1) on the linkage of an ongoing experiment to the study of statistical methods and (2) on the self-responsibility of students for the interpretation of soil chemical and biochemical characteristics, obtained at the very beginning of their study by analysing a set of soil samples allowing full-factorial data treatment. This experimental data set is related to a specific soil stand and is used as the backbone of the teaching system, accelerating students' interest in soil studies and motivating them to apply basic knowledge from lecture courses. The multi-level system includes: 1) a basic lecture course on soil biochemistry with analysis of research questions; 2) a practical training course on laboratory analytics, where small groups of students are responsible for the analysis of soil samples related to a specific land use, forest type, or forest age; 3) a training course on biotic (e.g. respiration) vs. abiotic (e.g. temperature, moisture, fire) interactions in the same soil samples; 4) theoretical seminars where students present and make a first attempt to explain the soil characteristics of various soil stands as affected by abiotic factors (first semester); 5) a lecture and seminar course on soil statistics, where students apply newly learned statistical methods to support their conclusions and to find relationships between the soil characteristics obtained during the first semester; 6) a seminar course on project design, where students develop scientific projects to study the uncertainties revealed in soil responses to abiotic factors (second and third semesters); 7) lecture, seminar and training courses on the estimation of active microbial biomass in soil, where students realize their projects, applying new knowledge to the soils from the stands they are responsible for (fourth semester). Thus, during four semesters the students continuously combine theoretical knowledge from the lectures with their own experimental experience, compare and discuss the results of the various groups during seminars, and acquire skills in project design. The successful application of this research-oriented teaching system at the University of Göttingen allowed each student to reveal knowledge gaps at an early stage, accelerated their involvement in ongoing research projects, and motivated them to begin their own scientific careers.

  20. A Novel Analysis Of The Connection Between Indian Monsoon Rainfall And Solar Activity

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, S.; Narasimha, R.

    2005-12-01

    The existence of possible correlations between the solar cycle period as extracted from the yearly means of sunspot numbers and any periodicities that may be present in the Indian monsoon rainfall has been addressed using wavelet analysis. The wavelet transform coefficient maps of sunspot-number time series and those of the homogeneous Indian monsoon rainfall annual time series data reveal striking similarities, especially around the 11-year period. A novel method that devises statistical schemes to analyse and quantify this similarity is suggested in this paper. The wavelet transform coefficient maxima at the 11-year period for the sunspot numbers and the monsoon rainfall have each been modelled as a point process in time, and a statistical scheme for identifying a trend or dependence between the two processes has been devised. A regression analysis of parameters in these processes reveals a nearly linear trend with small but systematic deviations from the regressed line. Suitable function models for these deviations have been obtained through an unconstrained error minimisation scheme. These models provide an excellent fit to the time series of the given wavelet transform coefficient maxima obtained from actual data. Statistical significance tests on these deviations suggest with 99% confidence that the deviations are sample fluctuations obtained from normal distributions. In fact, our earlier studies (see, Bhattacharyya and Narasimha, 2005, Geophys. Res. Lett., Vol. 32, No. 5) revealed that average rainfall is higher during periods of greater solar activity for all cases, at confidence levels varying from 75% to 99%, being 95% or greater in 3 out of 7 of them. Analysis using standard wavelet techniques reveals higher power in the 8--16 y band during the higher solar activity period, in 6 of the 7 rainfall time series, at confidence levels exceeding 99.99%. 
Furthermore, a comparison between the wavelet cross spectra of solar activity with rainfall and noise (including those simulating the rainfall spectrum and probability distribution) revealed that over the two test-periods respectively of high and low solar activity, the average cross power of the solar activity index with rainfall exceeds that with the noise at z-test confidence levels exceeding 99.99% over period-bands covering the 11.6 y sunspot cycle (see, Bhattacharyya and Narasimha, SORCE 2005 14-16th September, at Durango, Colorado USA). These results provide strong evidence for connections between Indian rainfall and solar activity. The present study reveals in addition the presence of subharmonics of the solar cycle period in the monsoon rainfall time series together with information on their phase relationships.
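The core wavelet operation behind such analyses, extracting power near a target period (here, the 11-year solar cycle), can be sketched with a Morlet wavelet at a single fixed scale. This is a textbook continuous-wavelet-transform sketch, not the authors' cross-spectrum scheme; the scale-to-period conversion uses the standard Morlet relation.

```python
import numpy as np

def morlet_power(signal, period, dt=1.0, w0=6.0):
    """Wavelet power of a time series at one target period using a
    Morlet mother wavelet (CWT at a single fixed scale)."""
    # scale giving the requested Fourier period for a Morlet of frequency w0
    scale = period * (w0 + np.sqrt(2.0 + w0**2)) / (4.0 * np.pi)
    t = np.arange(-4.0 * scale, 4.0 * scale + dt, dt)
    psi = np.pi**-0.25 * np.exp(1j * w0 * t / scale - (t / scale)**2 / 2.0)
    x = np.asarray(signal, float) - np.mean(signal)
    coeffs = np.convolve(x, np.conj(psi[::-1]), mode="same") * dt / np.sqrt(scale)
    return np.abs(coeffs)**2
```

A series with an 11-year oscillation shows concentrated power at that period and essentially none at mismatched periods; comparing such power maps for sunspot and rainfall series is the starting point of the correlation analysis described above.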

  1. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches for the classification of multisource data is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods do not have a mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of the problem involving how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
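Consensus-theoretic combination with reliability weights can be illustrated with a linear opinion pool, one standard consensus rule: each source contributes class probabilities, weighted by its reliability. This is a generic sketch of the idea, not the specific estimators studied in the paper.

```python
import numpy as np

def consensus_classify(source_probs, reliabilities):
    """Linear opinion pool: pool per-source class probabilities with
    reliability weights; the consensus class maximizes the pooled
    probability. source_probs has shape (n_sources, n_samples, n_classes)."""
    w = np.asarray(reliabilities, float)
    w = w / w.sum()                                  # normalize weights
    pooled = np.tensordot(w, np.asarray(source_probs, float), axes=1)
    return pooled.argmax(axis=-1), pooled
```

With two disagreeing sources, the decision follows whichever source carries the larger reliability weight, which is exactly the behavior a reliability-weighting mechanism is meant to provide.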

  2. Pitfalls in the statistical examination and interpretation of the correspondence between physician and patient satisfaction ratings and their relevance for shared decision making research

    PubMed Central

    2011-01-01

    Background The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results Statistical measures signal large differences between patient and physician satisfaction ratings with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement augmented by bar charts of differences was able to indicate this. Trial registration ISRCTN: ISRCT71348772 PMID:21592337
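The Bland-Altman quantities the authors recommend (the bias and its 95% limits of agreement) are simple to compute; the plot then shows each pair's difference against its mean. A minimal sketch with illustrative, not study, data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired ratings: bias
    (mean difference) and 95% limits of agreement (bias +/- 1.96 SD)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Unlike a correlation coefficient, this directly exposes a systematic offset between raters even when both sets of ratings pile up near the scale ceiling, which is the pitfall the paper describes.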

  3. Spatial genetic analyses reveal cryptic population structure and migration patterns in a continuously harvested grey wolf (Canis lupus) population in north-eastern Europe.

    PubMed

    Hindrikson, Maris; Remm, Jaanus; Männil, Peep; Ozolins, Janis; Tammeleht, Egle; Saarma, Urmas

    2013-01-01

    Spatial genetics is a relatively new field in wildlife and conservation biology that is becoming an essential tool for unravelling the complexities of animal population processes, and for designing effective strategies for conservation and management. Conceptual and methodological developments in this field are therefore critical. Here we present two novel methodological approaches that further the analytical possibilities of STRUCTURE and DResD. Using these approaches we analyse structure and migrations in a grey wolf (Canis lupus) population in north-eastern Europe. We genotyped 16 microsatellite loci in 166 individuals sampled from the wolf population in Estonia and Latvia that has been under strong and continuous hunting pressure for decades. Our analysis demonstrated that this relatively small wolf population is represented by four genetic groups. We also used a novel methodological approach that uses linear interpolation to statistically test the spatial separation of genetic groups. The new method, which is capable of using program STRUCTURE output, can be applied widely in population genetics to reveal both core areas and areas of low significance for genetic groups. We also used a recently developed spatially explicit individual-based method DResD, and applied it for the first time to microsatellite data, revealing a migration corridor and barriers, and several contact zones.

  4. [Aerosol deposition and clinical performance verified with a spacer device made in Brazil]

    PubMed

    Camargos, P A; Rubim, J A; Simal, C J; Lasmar, L M

    2000-01-01

    OBJECTIVE: To assess the lung deposition pattern of radioaerosol and the clinical performance of a spacer developed and made in Brazil. METHODS: Qualitative (in a patient with cystic fibrosis) and semi-quantitative (in two healthy volunteers) assessment of pulmonary deposition of (99m)technetium was done using the Aerogama Medical oxygen-driven nebulizer system attached to the spacer and a gamma camera (Siemens, model Orbiter) connected to a microcomputer. In the next step, clinical assessment was carried out in 50 asthmatic children, aged from four months to 13 years old, with an acute attack, using conventional doses of albuterol through a metered dose inhaler attached to the spacer device. RESULTS: Qualitative assessment revealed a lung silhouette comparable with those obtained by inhalation scintigraphy, and semi-quantitative assessment revealed that 7.5% to 8.0% of the inhaled (99m)technetium reached the volunteers' lungs. Statistically significant differences (p < 0.001) were observed when comparing clinical scores at admission with those verified 20 and 40 minutes after albuterol inhalation; conversely, no significance was obtained for scores taken at 60 and 80 minutes. CONCLUSIONS: Although we used an alternative method, the scintigraphic assessment revealed an expected pattern of pulmonary deposition. Similarly, clinical performance in the treatment of an acute attack showed results comparable with those obtained with other spacer devices.

  5. Analyses of the Microbial Diversity across the Human Microbiome

    PubMed Central

    Li, Kelvin; Bihan, Monika; Yooseph, Shibu; Methé, Barbara A.

    2012-01-01

    Analysis of human body microbial diversity is fundamental to understanding community structure, biology and ecology. The National Institutes of Health Human Microbiome Project (HMP) has provided an unprecedented opportunity to examine microbial diversity within and across body habitats and individuals through pyrosequencing-based profiling of 16S rRNA gene sequences (16S) from habitats of the oral, skin, distal gut, and vaginal body regions from over 200 healthy individuals, enabling the application of statistical techniques. In this study, two approaches were applied to elucidate the nature and extent of human microbiome diversity. First, bootstrap and parametric curve fitting techniques were evaluated to estimate the maximum number of unique taxa, Smax, and taxa discovery rate for habitats across individuals. Next, our results demonstrated that the variation of diversity within low abundant taxa across habitats and individuals was not sufficiently quantified with standard ecological diversity indices. This impact from low abundant taxa motivated us to introduce a novel rank-based diversity measure, the Tail statistic (“τ”), based on the standard deviation of the rank abundance curve when made symmetric by reflection around the most abundant taxon. Due to τ’s greater sensitivity to low abundant taxa, its application to diversity estimation of taxonomic units using taxonomic dependent and independent methods revealed a greater range of values recovered between individuals versus body habitats, and different patterns of diversity within habitats. The greatest range of τ values within and across individuals was found in stool, which also exhibited the most undiscovered taxa. Oral and skin habitats revealed variable diversity patterns, while vaginal habitats were consistently the least diverse. 
Collectively, these results demonstrate the importance, and motivate the introduction, of several visualization and analysis methods tuned specifically for next-generation sequence data, further revealing that low abundant taxa serve as an important reservoir of genetic diversity in the human microbiome. PMID:22719823
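Reading the description of τ literally (the standard deviation of the rank-abundance curve reflected around the most abundant taxon, so the mean rank sits at the top taxon), it can be sketched as below. This reconstruction is an assumption based on the abstract's wording, not a verified transcription of the paper's formula.

```python
import numpy as np

def tail_stat(abundances):
    """Sketch of the Tail statistic (tau): std of the rank-abundance
    curve made symmetric by reflection around the most abundant taxon,
    i.e. sqrt(sum p_i * (i-1)^2) over ranks i of the sorted relative
    abundances."""
    p = np.sort(np.asarray(abundances, float))[::-1]  # rank-abundance curve
    p = p / p.sum()                                   # relative abundances
    ranks = np.arange(p.size)                         # 0 at the top taxon
    return np.sqrt(np.sum(p * ranks**2))
```

Under this reading, a community dominated by one taxon scores τ = 0, while mass spread into low-abundance taxa inflates τ, matching the stated sensitivity of the measure to the rare tail.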

  6. A retrospective study to reveal factors associated with postoperative shoulder imbalance in patients with adolescent idiopathic scoliosis with double thoracic curve.

    PubMed

    Lee, Choon Sung; Hwang, Chang Ju; Lim, Eic Ju; Lee, Dong-Ho; Cho, Jae Hwan

    2016-12-01

    OBJECTIVE Postoperative shoulder imbalance (PSI) is a critical consideration after corrective surgery for a double thoracic curve (Lenke Type 2); however, the radiographic factors related to PSI remain unclear. The purpose of this study was to identify the radiographic factors related to PSI after corrective surgery for adolescent idiopathic scoliosis (AIS) in patients with a double thoracic curve. METHODS This study included 80 patients with Lenke Type 2 AIS who underwent corrective surgery. Patients were grouped according to the presence [PSI(+)] or absence [PSI(-)] of shoulder imbalance at the final follow-up examination (differences of 20, 15, and 10 mm were used). Various radiographic parameters, including the Cobb angle of the proximal and middle thoracic curves (PTC and MTC), radiographic shoulder height (RSH), clavicle angle, T-1 tilt, trunk shift, and proximal and distal wedge angles (PWA and DWA), were assessed before and after surgery and compared between groups. RESULTS Overall, postoperative RSH decreased with time in the PSI(-) group but not in the PSI(+) group. Statistical analyses revealed that the preoperative Risser grade (p = 0.048), postoperative PWA (p = 0.028), and postoperative PTC/MTC ratio (p = 0.011) correlated with PSI. Presence of the adding-on phenomenon was also correlated with PSI, although this result was not statistically significant (p = 0.089). CONCLUSIONS Postoperative shoulder imbalance is common after corrective surgery for Lenke Type 2 AIS and correlates with a higher Risser grade, a larger postoperative PWA, and a higher postoperative PTC/MTC ratio. Presence of the distal adding-on phenomenon is associated with an increased PSI trend, although this result was not statistically significant. However, preoperative factors other than the Risser grade that affect the development of PSI were not identified by the study. Additional studies are required to reveal the risk factors for the development of PSI.

  7. Possible influence of solar extreme events and related geomagnetic disturbances on human cardio-vascular state: Results of collaborative Bulgarian-Azerbaijani studies

    NASA Astrophysics Data System (ADS)

    Dimitrova, S.; Mustafa, F. R.; Stoilova, I.; Babayev, E. S.; Kazimov, E. A.

    2009-02-01

    This collaborative study is based on the analysis and comparison of results of coordinated experimental investigations conducted in Bulgaria and Azerbaijan for revealing a possible influence of solar activity changes and related geomagnetic activity variations on the human cardio-vascular state. Arterial blood pressure and heart rate of 86 healthy volunteers were measured on working days during a period of comparatively high solar and geomagnetic activity (2799 measurements in autumn 2001 and spring 2002) in Sofia. Daily experimental investigations of parameters of cardio-vascular health state were performed in Azerbaijan with a permanent group of examined persons. Heart rate and electrocardiograms were digitally registered (in total 1532 records) for seven functionally healthy persons on working days and Saturdays, in the Laboratory of Heliobiology at the Medical Center INAM in Baku, from 15.07.2006 to 13.11.2007. Obtained digital recordings were subjected to medical, statistical and spectral analyses. Special attention was paid to effects of solar extreme events, particularly those of November 2001 and December 2006. The statistical method of the analysis of variance (ANOVA) and post hoc analysis were applied to check the significance of the influence of geomagnetic activity on the cardio-vascular parameters under consideration. Results revealed statistically significant increments for the mean systolic and diastolic blood pressure values of the group with geomagnetic activity increase. Arterial blood pressure values started increasing two days prior to geomagnetic storms and kept their high values up to two days after the storms. Heart rate reaction was ambiguous and not significant for healthy persons examined (for both groups) under conditions with geomagnetic activity changes. 
It is concluded that heart rate for healthy persons at middle latitudes can be considered as a more stable physiological parameter which is not so sensitive to environmental changes while the dynamics of arterial blood pressure reveals a compensatory reaction of the human organism for adaptation.
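The ANOVA step used in this study can be illustrated with a minimal one-way F statistic. The implementation below is a generic sketch, and the blood-pressure-style numbers in the usage are purely illustrative, not the study's measurements.

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square."""
    groups = [np.asarray(g, float) for g in groups]
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

The F value is compared against the critical value of the F distribution with (k-1, n-k) degrees of freedom; post hoc tests then localize which activity levels differ, as in the study's post hoc analysis.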

  8. A Spiking Neural Network Methodology and System for Learning and Comparative Analysis of EEG Data From Healthy Versus Addiction Treated Versus Addiction Not Treated Subjects.

    PubMed

    Doborjeh, Maryam Gholami; Wang, Grace Y; Kasabov, Nikola K; Kydd, Robert; Russell, Bruce

    2016-09-01

    This paper introduces a method utilizing spiking neural networks (SNN) for learning, classification, and comparative analysis of brain data. As a case study, the method was applied to electroencephalography (EEG) data collected during a GO/NOGO cognitive task performed by untreated opiate addicts, those undergoing methadone maintenance treatment (MMT) for opiate dependence, and a healthy control group. The method is based on an SNN architecture called NeuCube, trained on spatiotemporal EEG data. NeuCube was used to classify EEG data across subject groups and across GO versus NOGO trials, but also facilitated a deeper comparative analysis of the dynamic brain processes. This analysis results in a better understanding of human brain functioning across subject groups when performing a cognitive task. In terms of the EEG data classification, a NeuCube model obtained better results (maximum obtained accuracy: 90.91%) when compared with traditional statistical and artificial intelligence methods (maximum obtained accuracy: 50.55%). More importantly, new information about the effects of MMT on cognitive brain functions is revealed through the analysis of the SNN model connectivity and its dynamics. This paper presents a new method for EEG data modeling and reveals new knowledge on brain functions associated with mental activity, which differs from the brain activity observed in a resting state of the same subjects.

  9. RankExplorer: Visualization of Ranking Changes in Large Time Series Data.

    PubMed

    Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin

    2012-12-01

    For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.

  10. Comparison of parameters of spinal curves in the sagittal plane measured by photogrammetry and inclinometry.

    PubMed

    Walicka-Cupryś, Katarzyna; Drzał-Grabiec, Justyna; Mrozkowiak, Mirosław

    2013-10-31

    BACKGROUND. The photogrammetric method and inclinometer-based measurements are commonly employed to assess the anteroposterior curvatures of the spine, both in clinical trials and for screening purposes. The aim of the study was to compare the parameters used to characterise the anteroposterior spinal curvatures as measured by photogrammetry and inclinometry. MATERIAL AND METHODS. The study enrolled 341 subjects: 169 girls and 172 boys, aged 4 to 9 years, from kindergartens and primary schools in Rzeszów. The anteroposterior spinal curvatures were examined by photogrammetry and with a mechanical inclinometer. RESULTS. There were significant differences in the α angle between the inclinometric and photogrammetric assessments in Student's t test (p=0.017) and the Fisher-Snedecor test (p=0.0001), with similar differences in the β angle (Student's t test p=0.0001, Fisher-Snedecor test p=0.007). For the γ angle, significant differences were revealed with Student's t test (p=0.0001), but not with the Fisher-Snedecor test (p=0.22). CONCLUSIONS. 1. Measurements of the inclination of particular segments of the spine obtained with the photogrammetric method and the inclinometric method in the same study group revealed statistically significant differences. 2. The results of measurements obtained by photogrammetry and inclinometry are not comparable. 3. Further research on agreement between measurements of the anteroposterior spinal curvatures obtained using the available measurement equipment is recommended.

  11. A Method to Search for Correlations of Ultra-high Energy Cosmic-Ray Masses with the Large-scale Structures in the Local Galaxy Density Field

    NASA Astrophysics Data System (ADS)

    Ivanov, A. A.

    2013-02-01

    One of the main goals of investigations using present and future giant extensive air shower (EAS) arrays is the mass composition of ultra-high energy cosmic rays (UHECRs). A new approach to the problem is presented, combining the analysis of arrival directions with the statistical test of the paired EAS samples. One of the ideas of the method is to search for possible correlations between UHECR masses and their separate sources; for instance, if there are two sources in different areas of the celestial sphere injecting different nuclei, but the fluxes are comparable so that arrival directions are isotropic, then the aim is to reveal a difference in the mass composition of cosmic-ray fluxes. The method is based on a non-parametric statistical test—the Wilcoxon signed-rank routine—which does not depend on the populations fitting any parameterized distributions. Two particular algorithms are proposed: first, using measurements of the depth of the EAS maximum position in the atmosphere; and second, relying on the age variance of air showers initiated by different primary particles. The formulated method is applied to the Yakutsk array data, in order to demonstrate the possibility of searching for a difference in average mass composition between the two UHECR sets, arriving particularly from the supergalactic plane and a complementary region.
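    The non-parametric machinery named above, the Wilcoxon signed-rank test, can be sketched in pure Python. The paired depth-of-shower-maximum values below are invented for illustration and are not Yakutsk array data.

    ```python
    # Minimal Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples.
    def wilcoxon_w(paired):
        # Differences, dropping exact zeros as in the standard procedure.
        diffs = [a - b for a, b in paired if a != b]
        # Rank absolute differences, averaging ranks over ties.
        ordered = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
        ranks = [0.0] * len(diffs)
        i = 0
        while i < len(ordered):
            j = i
            while j + 1 < len(ordered) and \
                    abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
                j += 1
            avg = (i + j) / 2 + 1          # average of 1-based ranks i+1..j+1
            for k in range(i, j + 1):
                ranks[ordered[k]] = avg
            i = j + 1
        w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
        w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
        return min(w_plus, w_minus)

    # Hypothetical paired depths of shower maximum (g/cm^2) from two sky regions:
    pairs = [(760, 740), (705, 710), (780, 750), (690, 700), (730, 725)]
    print(wilcoxon_w(pairs))   # small W suggests a systematic difference
    ```

    In practice one would compare W against tabulated critical values (or a normal approximation for large samples) to decide significance.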

  12. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

    The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been explosive growth in the use of more advanced statistical methods. The absence of statistical methods, or the use of only simple methods, has been virtually eliminated.

  13. A comparative study of traditional lecture methods and interactive lecture methods in introductory geology courses for non-science majors at the college level

    NASA Astrophysics Data System (ADS)

    Hundley, Stacey A.

    In recent years there has been a national call for reform in undergraduate science education. The goal of this reform movement is to improve undergraduate student learning, with an emphasis on developing more effective teaching practices. Introductory science courses at the college level are generally taught using a traditional lecture format, but recent studies have shown that incorporating active learning strategies within the traditional lecture classroom has positive effects on student outcomes. This study focuses on incorporating interactive teaching methods into the traditional lecture classroom to enhance student learning for non-science majors enrolled in introductory geology courses at a private university. Students' experiences and instructional preferences regarding introductory geology courses were identified from survey data analysis, and this information was used to develop an interactive lecture introductory geology course for non-science majors. Student outcomes were then examined in introductory geology courses based on two teaching methods: interactive lecture and traditional lecture. There were no statistically significant differences between the groups in student outcomes; incorporating interactive lecture methods did not statistically improve student outcomes compared to traditional lecture teaching. However, the survey responses revealed that students prefer introductory geology courses taught with lecture and instructor-led discussions, and that they prefer to work independently or in small groups. The results of this study are useful to individuals who teach introductory geology courses, and more generally introductory science courses for non-science majors, at the college level.

  14. Reading color barcodes using visual snakes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaub, Hanspeter

    2004-05-01

    Statistical pressure snakes are used to track a mono-color target in an unstructured environment using a video camera. The report discusses an algorithm to extract a bar code signal that is embedded within the target. The target is assumed to be rectangular in shape, with the bar code printed in a slightly different saturation and value in HSV color space. Thus, the visual snake, which primarily weighs hue tracking errors, will not be deterred by the presence of the color bar codes in the target. The bar code is generated with the standard 3-of-9 method. Using this method, the numeric bar codes reveal whether the target is right-side-up or upside-down.

  15. Revealing degree distribution of bursting neuron networks.

    PubMed

    Shen, Yu; Hou, Zhonghuai; Xin, Houwen

    2010-03-01

    We present a method to infer the degree distribution of a bursting neuron network from its dynamics. Burst synchronization (BS) of coupled Morris-Lecar neurons has been studied under the weak coupling condition. In the BS state, all the neurons start and end bursting almost simultaneously, while the spikes inside the burst are incoherent among the neurons. Interestingly, we find that the spike amplitude of a given neuron shows an excellent linear relationship with its degree, which makes it possible to estimate the degree distribution of the network by simple statistics of the spike amplitudes. We demonstrate the validity of this scheme on scale-free as well as small-world networks. The underlying mechanism of such a method is also briefly discussed.
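    The inference step described above, exploiting an approximately linear amplitude-degree relationship, can be sketched as a simple line fit followed by inversion. All numbers below are invented for illustration; they are not from the Morris-Lecar simulations.

    ```python
    # Ordinary least-squares fit of a line y = slope*x + intercept.
    def fit_line(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    # Calibration: (degree, observed spike amplitude) for a few known nodes.
    degrees = [2, 4, 6, 8]
    amps    = [1.1, 2.0, 3.1, 4.0]
    slope, intercept = fit_line(degrees, amps)

    def estimate_degree(amplitude):
        # Invert the fitted line to estimate an unknown node's degree.
        return round((amplitude - intercept) / slope)

    print(estimate_degree(2.55))
    ```

    Applying `estimate_degree` to every node's spike amplitude and histogramming the results yields the estimated degree distribution of the network.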

  16. Revealing representational content with pattern-information fMRI--an introductory guide.

    PubMed

    Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus

    2009-03-01

    Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.

  17. Mechanical properties of silicate glasses exposed to a low-Earth orbit

    NASA Technical Reports Server (NTRS)

    Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.

    1992-01-01

    The effects of a 5.8-year exposure to the low-Earth orbit environment on the mechanical properties of commercial optical fused silica, low-iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM F394 piston-on-3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. Weibull analysis as well as a standard statistical evaluation was conducted. The Weibull analysis revealed no differences between the control samples and the two exposed sample sets. We thus concluded that the radiation components of the Earth orbital environment did not degrade the mechanical strength of the samples examined, within the limits of experimental error. The upper bound of strength degradation for meteorite-impacted samples, based upon statistical analysis and observation, was 50 percent.
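    A two-parameter Weibull strength analysis like the one described can be sketched via the usual linearization ln(-ln(1-F)) = m·ln(σ) - m·ln(σ0) with median-rank probabilities. The strength values below are invented, not LDEF data.

    ```python
    import math

    # Invented fracture strengths (MPa), sorted ascending.
    strengths = sorted([82.0, 95.0, 101.0, 110.0, 118.0, 126.0, 133.0, 145.0])
    n = len(strengths)

    # Linearized Weibull plot coordinates with median-rank F = (i+0.5)/n.
    xs = [math.log(s) for s in strengths]
    ys = [math.log(-math.log(1 - (i + 0.5) / n)) for i in range(n)]

    # Least-squares slope gives the Weibull modulus m; the intercept gives
    # the characteristic strength sigma0 via b = -m*ln(sigma0).
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    sigma0 = math.exp(mx - my / m)
    print(round(m, 2), round(sigma0, 1))
    ```

    Comparing the fitted (m, σ0) pairs of control and exposed sample sets, with confidence bounds, is what supports a "no difference" conclusion like the one above.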

  18. A combined pre-clinical meta-analysis and randomized confirmatory trial approach to improve data validity for therapeutic target validation.

    PubMed

    Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W

    2015-08-27

    Biomedical research suffers from dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets have been unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and a lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or achieve a translationally relevant infarct size reduction. Thus, stringent statistical thresholds, reporting of negative data, and an MA-pRCT approach can ensure biomedical data validity and overcome risks of bias.

  19. Correlation Between University Students' Kinematic Achievement and Learning Styles

    NASA Astrophysics Data System (ADS)

    Çirkinoǧlu, A. G.; Dem&ircidot, N.

    2007-04-01

    In the literature, research on kinematics has revealed that students have many difficulties connecting graphs and physics, and other studies have shown that the method used in the classroom affects students' further learning. In this study, the correlation between university students' kinematics achievement and learning style is investigated. For this purpose, a Kinematics Achievement Test and a Learning Style Inventory were administered to 573 students enrolled in General Physics 1 courses at Balikesir University in the fall semester of 2005-2006. The Kinematics Achievement Test, consisting of 12 multiple-choice and 6 open-ended questions, was developed by the researchers to assess students' understanding, interpreting, and drawing of graphs. The Learning Style Inventory, a 24-item test covering visual, auditory, and kinesthetic learning styles, was developed and used by Barsch. The data obtained in this study were analyzed with the necessary statistical calculations (t-test, correlation, ANOVA, etc.) using the SPSS statistical program. Based on the research findings, tentative recommendations are made.

  20. [Dental practitioners in Israel: past, present and future].

    PubMed

    Mann, J; Vered, Y; Zini, A

    2010-04-01

    Since 1980, various studies dealing with dental manpower issues have been published in Israel, utilizing methods such as the manpower-to-population ratio. The dental literature pointed out that dentistry in Israel has an oversupply of dentists and that its manpower-to-population ratio, 1:770, is one of the highest in the world. All studies were based on information provided by the Ministry of Health, which showed that Israel has over 9500 dentists. The Israel Central Bureau of Statistics figures showed a much smaller number: 5700 active dentists. Strict examination of the data behind this enormous gap between the two sources of information revealed that the Bureau of Statistics information is reliable; hence, the real manpower-to-population ratio in Israel in 2008 was 1:1271. Prediction of manpower is extremely important, and baseline information is crucial for future evaluations.

  1. Multi-element fingerprinting as a tool in origin authentication of four east China marine species.

    PubMed

    Guo, Lipan; Gong, Like; Yu, Yanlei; Zhang, Hong

    2013-12-01

    The contents of 25 elements in 4 types of commercial marine species from the East China Sea were determined by inductively coupled plasma mass spectrometry and atomic absorption spectrometry. The elemental composition was used to differentiate marine species according to geographical origin by multivariate statistical analysis. The results showed that principal component analysis could distinguish samples from different areas and reveal the elements which played the most important role in origin diversity. The models established by partial least squares discriminant analysis (PLS-DA) and by probabilistic neural network (PNN) could both precisely predict the origin of the marine species. Further study indicated that PLS-DA and PNN were efficacious in regional discrimination. The models from these 2 statistical methods, with accuracies of 97.92% and 100%, respectively, could both distinguish samples from different areas without the need for species differentiation. © 2013 Institute of Food Technologists®
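    The origin-classification workflow above can be illustrated with a much simpler stand-in: a nearest-centroid classifier on element-concentration vectors. This is not the paper's PLS-DA/PNN pipeline, and the concentrations and area labels below are invented.

    ```python
    # Training data: invented element profiles (e.g. [Cd, Zn, Cu] in mg/kg)
    # grouped by hypothetical geographic origin.
    train = {
        "area_A": [[1.2, 30.0, 5.1], [1.0, 28.0, 5.4], [1.3, 31.0, 4.9]],
        "area_B": [[0.4, 55.0, 2.0], [0.5, 52.0, 2.2], [0.3, 57.0, 1.9]],
    }

    # Mean element profile (centroid) per origin.
    centroids = {
        origin: [sum(col) / len(col) for col in zip(*samples)]
        for origin, samples in train.items()
    }

    def classify(profile):
        # Assign the origin whose centroid is nearest in squared distance.
        def dist2(c):
            return sum((a - b) ** 2 for a, b in zip(profile, c))
        return min(centroids, key=lambda o: dist2(centroids[o]))

    print(classify([1.1, 29.5, 5.0]))
    ```

    A real analysis would standardize each element (their scales differ by orders of magnitude) and use cross-validated PLS-DA or PNN models, as the paper does.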

  2. Standardised method of determining vibratory perception thresholds for diagnosis and screening in neurological investigation.

    PubMed Central

    Goldberg, J M; Lindblom, U

    1979-01-01

    Vibration threshold determinations were made by means of an electromagnetic vibrator at three sites (carpal, tibial, and tarsal), which were primarily selected for examining patients with polyneuropathy. Because of the vast variation demonstrated for both vibrator output and tissue damping, the thresholds were expressed in terms of the amplitude of stimulator movement measured by means of an accelerometer, instead of the applied voltage, which is commonly used. Statistical analysis revealed a higher power of discrimination for amplitude measurements at all three stimulus sites. Digital read-out gave the best statistical result and was also most practical. Reference values obtained from 110 healthy males, 10 to 74 years of age, were highly correlated with age for both upper and lower extremities. The variance of the vibration perception threshold was less than that of the disappearance threshold, and determination of the perception threshold alone may be sufficient in most cases. PMID:501379

  3. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
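    The "classic fail-safe N" mentioned above is Rosenthal's formula, which asks how many unpublished null-result studies would be needed to drag a combined result below significance. The per-study z values below are invented; they are not the six studies in this meta-analysis.

    ```python
    import math

    # Rosenthal's fail-safe N: ((sum of z)^2 / z_crit^2) - k,
    # with z_crit = 1.645 for one-tailed alpha = .05.
    def fail_safe_n(z_values, z_crit=1.645):
        k = len(z_values)
        return (sum(z_values) ** 2) / (z_crit ** 2) - k

    # Hypothetical z scores from k = 6 studies:
    zs = [2.1, 2.8, 1.9, 3.2, 2.5, 2.0]
    print(math.floor(fail_safe_n(zs)))
    ```

    A fail-safe N far exceeding Rosenthal's tolerance threshold of 5k + 10 (as with the 290 studies reported above for 6 included studies) argues against publication bias explaining the result.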

  4. Effect of autogenic relaxation on depression among menopausal women in rural areas of Thiruvallur District (Tamil Nadu).

    PubMed

    Sujithra, S

    2014-01-01

    An experimental study was conducted among 60 menopausal women who met the inclusion criteria, 30 each in the experimental and control groups. Simple random sampling by the lottery method was used to select the sample. Menopausal women were identified in both groups, and their level of depression was assessed using the Cornell Dysthymia Rating Scale. Autogenic relaxation was practiced by the women in the experimental group for four weeks. The findings revealed that after the autogenic relaxation intervention, 23 women (76.7%) in the experimental group had only mild depression. The effect of the intervention in the experimental group was statistically significant at the p < 0.05 level, and there was a statistically significant association between the effectiveness of autogenic relaxation on depression and the type of family in the post-experimental group at the p < 0.05 level.

  5. Bodily maps of emotions

    PubMed Central

    Nummenmaa, Lauri; Glerean, Enrico; Hari, Riitta; Hietanen, Jari K.

    2014-01-01

    Emotions are often felt in the body, and somatosensory feedback has been proposed to trigger conscious emotional experiences. Here we reveal maps of bodily sensations associated with different emotions using a unique topographical self-report method. In five experiments, participants (n = 701) were shown two silhouettes of bodies alongside emotional words, stories, movies, or facial expressions. They were asked to color the bodily regions whose activity they felt increasing or decreasing while viewing each stimulus. Different emotions were consistently associated with statistically separable bodily sensation maps across experiments. These maps were concordant across West European and East Asian samples. Statistical classifiers distinguished emotion-specific activation maps accurately, confirming independence of topographies across emotions. We propose that emotions are represented in the somatosensory system as culturally universal categorical somatotopic maps. Perception of these emotion-triggered bodily changes may play a key role in generating consciously felt emotions. PMID:24379370

  6. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as the latest market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time-series into forecasting processes. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen’s (1996), Yu’s (2005), Cheng’s (2006) and Chen’s (2007), are used as comparison models. In addition, to compare with a conventional statistical method, the method of least squares is utilized to estimate auto-regressive models for the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which only factor fuzzy logical relationships into forecasting processes. In the empirical study, the traditional statistical method and the proposed model both reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
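    The least-squares auto-regressive baseline mentioned above can be sketched as an AR(1) fit; the price series below is synthetic, not TAIEX or HSI data.

    ```python
    # Fit x[t] = c + phi * x[t-1] by ordinary least squares.
    def fit_ar1(series):
        xs, ys = series[:-1], series[1:]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        phi = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
              sum((x - mx) ** 2 for x in xs)
        c = my - phi * mx
        return c, phi

    # Invented daily closing prices:
    prices = [100.0, 102.0, 101.0, 104.0, 106.0, 105.0, 108.0]
    c, phi = fit_ar1(prices)
    next_forecast = c + phi * prices[-1]
    print(round(next_forecast, 2))
    ```

    Higher-order AR models used for such baselines follow the same normal-equations idea with more lagged regressors.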

  7. Accelerating simulation for the multiple-point statistics algorithm using vector quantization

    NASA Astrophysics Data System (ADS)

    Zuo, Chen; Pan, Zhibin; Liang, Hao

    2018-03-01

    Multiple-point statistics (MPS) is a prominent algorithm to simulate categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerated MPS simulation using vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproduction, and spatial uncertainty. Further demonstrations consist of a 2D four-facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that the proposed method is also capable of handling multifacies, nonstationary, and 3D simulations based on 2D TIs.
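    The core vector-quantization idea, replacing each stored pattern with the index of its nearest codeword so the database shrinks, can be shown with a toy example. The codebook and patterns below are invented and much simpler than the tree-structured VQ used in VQ-MPS.

    ```python
    # Invented codebook of 2x2 binary facies patterns, flattened to tuples.
    codebook = [
        (0, 0, 0, 0),   # codeword 0: all background facies
        (1, 1, 1, 1),   # codeword 1: all channel facies
        (0, 1, 0, 1),   # codeword 2: alternating pattern
    ]

    def quantize(pattern):
        # Map a pattern to the index of its nearest codeword (squared distance).
        def dist2(cw):
            return sum((p - c) ** 2 for p, c in zip(pattern, cw))
        return min(range(len(codebook)), key=lambda i: dist2(codebook[i]))

    # Patterns extracted from a training image (invented); the database would
    # store only these small integer indices instead of full patterns.
    patterns = [(0, 0, 0, 0), (1, 1, 1, 0), (0, 1, 0, 1)]
    print([quantize(p) for p in patterns])
    ```

    A tree-structured codebook replaces the linear scan in `quantize` with a logarithmic-depth descent, which is where the reported speedup comes from.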

  8. Comparison of Microleakage under Rebonded Stainless Steel Orthodontic Brackets Using Two Methods of Adhesive Removal: Sandblast and Laser

    PubMed Central

    Tudehzaeim, Mohamad Hossein; Yassaei, Soghra; Taherimoghadam, Shohreh

    2015-01-01

    Objectives: Debonding is a common occurrence in orthodontic treatment, and a considerable number of orthodontists prefer to rebond detached brackets for economic reasons. The aim of this study was to compare the microleakage beneath rebonded stainless steel brackets using two methods of adhesive removal, namely sandblasting and laser. Materials and Methods: Sixty human premolar teeth were randomly divided into three groups. After bonding of the brackets, group 1 served as the control group. Brackets in groups 2 and 3 were debonded, and adhesive removal from the bracket bases was done by means of sandblasting and Er:YAG laser, respectively. After rebonding, teeth in each group were stained with 2% methylene blue for 24 hours, sectioned, and examined under a stereomicroscope. Marginal microleakage at the adhesive-enamel and bracket-adhesive interfaces in the occlusal and gingival margins was determined. Statistical analysis was done using the Kruskal-Wallis test. Results: Comparison of the microleakage scores among the three groups revealed no statistically significant difference (P > 0.05). At the enamel-adhesive interface, the gingival margins in all groups showed higher microleakage, while at the adhesive-bracket interface, the occlusal margin exhibited greater microleakage. Conclusion: Er:YAG laser irradiation and sandblasting for adhesive removal from debonded brackets yielded clinically acceptable microleakage scores. PMID:26056521

  9. [Review of research design and statistical methods in Chinese Journal of Cardiology].

    PubMed

    Zhang, Li-jun; Yu, Jin-ming

    2009-07-01

    To evaluate the research design and the use of statistical methods in the Chinese Journal of Cardiology, we reviewed the research design and statistical methods in all of the original papers published in the journal from December 2007 to November 2008. The most frequently used research designs were cross-sectional design (34%), prospective design (21%) and experimental design (25%). Of all the articles, 49 (25%) used wrong statistical methods, 29 (15%) lacked some sort of statistical analysis, and 23 (12%) had inconsistencies in the description of methods. There were significant differences between different statistical methods (P < 0.001). The rate of correct use of multifactor analysis was low, and repeated-measures data were not analyzed with repeated-measures methods. Many problems exist in the Chinese Journal of Cardiology; better research design and correct use of statistical methods are still needed, and more strict review by statisticians and epidemiologists is also required to improve the quality of the literature.

  10. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    ERIC Educational Resources Information Center

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  11. Development of a Research Methods and Statistics Concept Inventory

    ERIC Educational Resources Information Center

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  12. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  13. 76 FR 16199 - Hours of Service of Railroad Employees; Substantive Regulations for Train Employees Providing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... statistically significant relationship is evaluated by way of the correlation coefficient (r) with statistical... . The analysis revealed a significant high correlation between reduced predicted crew effectiveness (as...

  14. New Predictive Parameters of Bell’s Palsy: Neutrophil to Lymphocyte Ratio and Platelet to Lymphocyte Ratio

    PubMed Central

    Atan, Doğan; İkincioğulları, Aykut; Köseoğlu, Sabri; Özcan, Kürşat Murat; Çetin, Mehmet Ali; Ensari, Serdar; Dere, Hüseyin

    2015-01-01

    Background: Bell’s palsy is the most frequent cause of unilateral facial paralysis, and inflammation is thought to play an important role in its pathogenesis. Aims: The neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) are simple and inexpensive tests which are indicative of inflammation and can be calculated by all physicians. The aim of this study was to reveal correlations of Bell’s palsy and the degree of paralysis with NLR and PLR. Study Design: Case-control study. Methods: The retrospective study was performed between January 2010 and December 2013. Ninety-nine patients diagnosed with Bell’s palsy were included in the Bell’s palsy group, and ninety-nine healthy individuals with the same demographic characteristics were included in the control group. NLR and PLR were calculated for both groups. Results: The mean NLR was 4.37 in the Bell’s palsy group and 1.89 in the control group, a statistically significant difference (p<0.001). The mean PLR was 137.5 in the Bell’s palsy group and 113.75 in the control group, a statistically significant difference (p=0.008). No statistically significant relation was detected between the degree of facial paralysis and NLR or PLR. Conclusion: The NLR and the PLR were significantly higher in patients with Bell’s palsy. This is the first study to reveal a relation between Bell’s palsy and PLR. NLR and PLR can be used as auxiliary parameters in the diagnosis of Bell’s palsy. PMID:26167340

  15. Assessment of the influence of vegetarian diet on the occurrence of erosive and abrasive cavities in hard tooth tissues.

    PubMed

    Herman, Katarzyna; Czajczyńska-Waszkiewicz, Agnieszka; Kowalczyk-Zając, Małgorzata; Dobrzyński, Maciej

    2011-11-25

    The aim of the study was to determine the potential relation between a vegetarian diet and tooth erosion and abrasion. The examination included 46 vegetarians and the same number of controls. Clinical research was carried out to detect the presence of abrasive and erosive changes and the level of hygiene in the oral cavity. The questionnaire survey concerned dietary and hygienic habits. Statistical analysis of the data was conducted with the Chi-square test and the Mann-Whitney U test. The relations between following a vegetarian diet and the occurrence of non-carious cavities were tested with logistic regression models. Tooth erosion was present in 39.1% of vegetarians and 23.9% of controls, while abrasion appeared in 26.1% and 10.9%, respectively; the differences were statistically insignificant. The distribution of the changes was similar in both groups. Among vegetarians, significantly more frequent consumption of sour products (predominantly raw vegetables, fruit, and tomatoes) was observed. The level of oral hygiene and hygienic habits were similar in both groups. The logistic regression analysis did not reveal any relations between following a vegetarian diet and the occurrence of tooth erosion and abrasion. The results did not reveal any direct influence of a vegetarian diet on the occurrence of erosive and abrasive changes. However, in the vegetarian group, more frequent consumption of some sour products and more common use of the horizontal brushing method were observed, with a slightly higher occurrence of non-carious cavities. Further research is required to obtain unambiguous conclusions.

  16. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    NASA Astrophysics Data System (ADS)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

    Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. 
Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
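
    The variational problem described in the abstract can be written compactly. The following is a hedged reconstruction in generic notation (not necessarily the authors'): maximize the information entropy subject to normalization, a given mean total energy, and the first stationarity constraint, which yields a generalized canonical density.

```latex
\max_{p}\; S[p] = -\int p(\mathbf{x})\,\ln p(\mathbf{x})\,d\mathbf{x}
\quad\text{s.t.}\quad
\int p\,d\mathbf{x} = 1,\qquad
\langle E\rangle = E_0,\qquad
\langle \dot{E}\rangle = 0
\;\;\Longrightarrow\;\;
p(\mathbf{x}) \propto \exp\!\bigl[-\lambda_1 E(\mathbf{x}) - \lambda_2 \dot{E}(\mathbf{x})\bigr]
```

Here $\lambda_1$ and $\lambda_2$ are Lagrange multipliers fixed by the energy and stationarity constraints; adding the second stationarity constraint would append a $-\lambda_3 \ddot{E}$ term in the exponent.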

  17. Hematological change parameters in patients with pressure ulcer at long-term care hospital

    PubMed Central

    Neiva, Giselle Protta; Carnevalli, Julia Romualdo; Cataldi, Rodrigo Lessa; Furtado, Denise Mendes; Fabri, Rodrigo Luiz; Silva, Pâmela Souza

    2014-01-01

    Objective To assess factors associated with the development of pressure ulcers, and to compare the effectiveness of pharmacological treatments. Methods The factors associated with the development of pressure ulcers were compared between lesion-carrying patients (n=14) and non-carriers (n=16). Lesion-carrying patients were treated with 1% silver sulfadiazine or 0.6 IU/g collagenase and were observed for 8 weeks. Collected data were analyzed, with p<0.05 considered statistically significant. Results The prevalence of pressure ulcers was about 6%. Comparison of the carrier and non-carrier groups revealed no statistically significant difference in occurrence with respect to age, sex, skin color, mobility, or the use of diapers. However, levels of hemoglobin, hematocrit, and red blood cells were statistically different between groups, being lower in lesion-carrying patients. No significant difference in lesion area was found between patients treated with collagenase or silver sulfadiazine, although both groups showed an overall reduction in lesion area over the treatment course. Conclusion Hematologic parameters showed a statistically significant difference between the two groups. Regarding the treatment of ulcers, no difference in lesion area was found between the groups treated with collagenase and silver sulfadiazine. PMID:25295450

  18. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

    The U-statistic method is one of the most important structural methods for separating anomaly from background. It considers the locations of samples, carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, to apply the U-statistic method under three-dimensional (3D) conditions, the U-statistic is computed on the grades of two ideal test examples, taking the sample Z values (elevation) into account. To our knowledge, this is the first time the method has been applied in 3D. To evaluate the performance of the 3D U-statistic method and to compare it with a non-structural method, threshold assessment based on the median and standard deviation (the MSD method) is applied to the same two test examples. Results show that the samples flagged as anomalous by the U-statistic method are more regular and less dispersed than those flagged by the MSD method, so that denser areas of anomalous samples can be delineated as promising zones. Moreover, at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n×s criterion. Finally, a 3D model of the two test examples, separating anomaly from background using the 3D U-statistic method, is provided. The source code of a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
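
    The non-structural baseline mentioned above, thresholding at the mean plus a multiple of the standard deviation, can be sketched in a few lines. The grade values below are invented for illustration; the paper's actual implementation is in MATLAB.

```python
# Mean-plus-n-standard-deviations anomaly thresholding (the x̄ + n·s
# criterion the 3D U-statistic is compared against). Grades are synthetic.
from statistics import mean, stdev

grades = [1.0, 1.2, 0.9, 1.1, 1.3, 0.8, 1.0, 5.5, 6.2, 1.1]  # two spikes
n = 1  # number of standard deviations above the mean
threshold = mean(grades) + n * stdev(grades)
anomalies = [g for g in grades if g > threshold]
print(anomalies)
```

Note that this criterion ignores sample locations entirely, which is exactly the limitation the structural U-statistic method addresses.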

  19. Time-Frequency Cross Mutual Information Analysis of the Brain Functional Networks Underlying Multiclass Motor Imagery.

    PubMed

    Gong, Anmin; Liu, Jianping; Chen, Si; Fu, Yunfa

    2018-01-01

    To study the physiologic mechanism of the brain during different motor imagery (MI) tasks, the authors employed a method of brain-network modeling based on time-frequency cross mutual information obtained from 4-class (left hand, right hand, feet, and tongue) MI tasks recorded as brain-computer interface (BCI) electroencephalography data. The authors explored the brain network revealed by these MI tasks using statistical analysis and the analysis of topologic characteristics, and observed significant differences in the reaction level, reaction time, and activated target during the 4-class MI tasks. There was a great difference in the reaction level between the execution and resting states during different tasks: the reaction level of the left-hand MI task was the greatest, followed by that of the right-hand, feet, and tongue MI tasks. The reaction time required to perform the tasks also differed: during the left-hand and right-hand MI tasks, the brain networks of subjects reacted promptly and strongly, but there was a delay during the feet and tongue MI tasks. Statistical analysis and the analysis of network topology revealed the target regions of the brain network during different MI processes. In conclusion, our findings suggest a new way to explain the neural mechanism behind MI.

  20. Noise-gating to Clean Astrophysical Image Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeForest, C. E.

    I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.

  1. Comparison of effectiveness of Calendula officinalis extract gel with lycopene gel for treatment of tobacco-induced homogeneous leukoplakia: A randomized clinical trial

    PubMed Central

    Singh, Manisha; Bagewadi, Anjana

    2017-01-01

    Aim: The aim of the study is to assess the efficacy of Calendula officinalis gel as a cost-effective treatment modality in comparison with lycopene gel in the treatment of leukoplakia. Materials and Methods: The study comprised sixty patients with clinically diagnosed and histopathologically confirmed homogeneous leukoplakia, divided into Group I and Group II with thirty patients each. Group I patients were dispensed C. officinalis extract gel, whereas Group II patients were given lycopene gel. The therapy was instituted for 1 month to assess the change in the size of the lesion between baseline and posttreatment. Results: The results revealed a statistically significant difference in both Group I and Group II when the pre- and post-treatment results were compared within the same group. The mean reduction in size before and after treatment was 2.0 ± 1.0 cm for Group I and 1.57 ± 0.87 cm for Group II. The intergroup comparison of the reduction in lesion size did not reveal statistically significant results. Conclusion: C. officinalis extract gel can be effectively used as an alternative to the conventional treatment modality. PMID:28929051

  2. Cross-sectional and longitudinal evaluation of liver volume and total liver fat burden in adults with nonalcoholic steatohepatitis

    PubMed Central

    Tang, An; Chen, Joshua; Le, Thuy-Anh; Changchien, Christopher; Hamilton, Gavin; Middleton, Michael S.; Loomba, Rohit; Sirlin, Claude B.

    2014-01-01

    Purpose To explore the cross-sectional and longitudinal relationships between fractional liver fat content, liver volume, and total liver fat burden. Methods In 43 adults with non-alcoholic steatohepatitis participating in a clinical trial, liver volume was estimated by segmentation of magnitude-based low-flip-angle multiecho GRE images. The liver mean proton density fat fraction (PDFF) was calculated. The total liver fat index (TLFI) was estimated as the product of liver mean PDFF and liver volume. Linear regression analyses were performed. Results Cross-sectional analyses revealed statistically significant relationships between TLFI and liver mean PDFF (R2 = 0.740 baseline/0.791 follow-up, P < 0.001 baseline/P < 0.001 follow-up), and between TLFI and liver volume (R2 = 0.352/0.452, P < 0.001/< 0.001). Longitudinal analyses revealed statistically significant relationships between liver volume change and liver mean PDFF change (R2 = 0.556, P < 0.001), between TLFI change and liver mean PDFF change (R2 = 0.920, P < 0.001), and between TLFI change and liver volume change (R2 = 0.735, P < 0.001). Conclusion Liver segmentation in combination with MRI-based PDFF estimation may be used to monitor liver volume, liver mean PDFF, and TLFI in a clinical trial. PMID:25015398
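
    The central quantity above, the total liver fat index, is simply the product of mean PDFF and liver volume; the reported relationships are coefficients of determination from linear regression. A minimal sketch with invented numbers (not the trial's data):

```python
# Total liver fat index (TLFI) = mean PDFF x liver volume, and R^2 from a
# simple Pearson correlation, sketching the cross-sectional analysis.
# All values below are synthetic.

def pearson_r2(xs, ys):
    """Coefficient of determination for a simple linear fit of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy ** 2 / (sxx * syy)

pdff   = [5.0, 12.0, 8.0, 20.0, 15.0]    # mean PDFF, percent
volume = [1500, 1800, 1600, 2100, 1900]  # liver volume, mL
tlfi   = [p * v / 100 for p, v in zip(pdff, volume)]  # fat volume, mL

r2 = pearson_r2(pdff, tlfi)
print(round(r2, 3))
```

Because TLFI is built from PDFF, a high R² between them is expected by construction, which is worth keeping in mind when reading the reported cross-sectional values.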

  3. Noise-gating to Clean Astrophysical Image Data

    NASA Astrophysics Data System (ADS)

    DeForest, C. E.

    2017-04-01

    I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.
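
    The published method is locally adaptive; the following is a deliberately simplified, global sketch of the Fourier-domain gating idea on a 1D signal, with an arbitrary amplitude threshold, only to illustrate separating coherent structure from incoherent noise.

```python
# Crude global "noise gate": keep only Fourier components well above the
# noise floor, zero the rest. The real algorithm gates local neighborhoods
# with a statistically derived threshold; this is an illustrative toy.
import numpy as np

rng = np.random.default_rng(0)
n = 256
t = np.arange(n) / n
clean = 10.0 * np.sin(2 * np.pi * 5 * t)   # coherent structure
noisy = clean + rng.normal(0.0, 1.0, n)    # additive noise

spec = np.fft.rfft(noisy)
gate = np.abs(spec) > 0.5 * np.abs(spec).max()  # crude amplitude gate
denoised = np.fft.irfft(spec * gate, n)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_gated = np.mean((denoised - clean) ** 2)
assert mse_gated < mse_noisy  # gating removes most incoherent power
```

In the real method the threshold comes from the statistics of local image neighborhoods rather than a fraction of the global maximum, which is what lets it preserve faint but coherent features.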

  4. First Images from the PIONIER/VLTI optical interferometry imaging survey of Herbig Ae/Be stars

    NASA Astrophysics Data System (ADS)

    Kluska, Jacques; Malbet, Fabien; Berger, Jean-Philippe; Benisty, Myriam; Lazareff, Bernard; Le Bouquin, Jean-Baptiste; Baron, Fabien; Dominik, Carsten; Isella, Andrea; Juhasz, Attila; Kraus, Stefan; Lachaume, Régis; Ménard, François; Millan-Gabet, Rafael; Monnier, John; Pinte, Christophe; Thi, Wing-Fai; Thiébaut, Eric; Zins, Gérard

    2013-07-01

    The morphology of the close environment of Herbig stars is being revealed step by step and appears to be quite complex. Many physical phenomena may interplay: dust sublimation causing a puffed-up inner rim, a dusty halo, a dusty wind, or an inner gaseous component. To investigate these regions more deeply, obtaining images at the scale of the first astronomical unit is crucial. This has become possible with near-infrared instruments on the VLTI. We are carrying out the first Large Program survey of HAeBe stars, with statistics on the geometry of these objects at the first-astronomical-unit scale and the first images of the very close environment of some of them. We have developed a new numerical method specific to young stellar objects that removes the stellar component and reconstructs an image of the environment only. To do so, we use the differences in spectral behaviour between the star and its environment. The images reveal the environment without pollution from the star and allow us to derive the best fit for the flux ratio and the spectral slope between the two components (stellar and environmental). We present the results of the survey with some statistics and the first images of Herbig stars made by PIONIER on the VLTI.

  5. Non-Markovian spin-resolved counting statistics and an anomalous relation between autocorrelations and cross correlations in a three-terminal quantum dot

    NASA Astrophysics Data System (ADS)

    Luo, JunYan; Yan, Yiying; Huang, Yixiao; Yu, Li; He, Xiao-Ling; Jiao, HuJun

    2017-01-01

    We investigate the noise correlations of spin and charge currents through an electron spin resonance (ESR)-pumped quantum dot, which is tunnel coupled to three electrodes maintained at an equivalent chemical potential. A recursive scheme is employed with inclusion of the spin degrees of freedom to account for the spin-resolved counting statistics in the presence of non-Markovian effects due to coupling with a dissipative heat bath. For symmetric spin-up and spin-down tunneling rates, an ESR-induced spin flip mechanism generates a pure spin current without an accompanying net charge current. The stochastic tunneling of spin carriers, however, produces universal shot noises of both charge and spin currents, revealing the effective charge and spin units of quasiparticles in transport. In the case of very asymmetric tunneling rates for opposite spins, an anomalous relationship between noise autocorrelations and cross correlations is revealed, where super-Poissonian autocorrelation is observed in spite of a negative cross correlation. Remarkably, with strong dissipation strength, non-Markovian memory effects give rise to a positive cross correlation of the charge current in the absence of a super-Poissonian autocorrelation. These unique noise features may offer essential methods for exploiting internal spin dynamics and various quasiparticle tunneling processes in mesoscopic transport.

  6. Diagnostics of psychophysiological states and motivation in elite athletes.

    PubMed

    Korobeynikov, G; Mazmanian, K; Korobeynikova, L; Jagiello, W

    2011-01-01

    Concepts explored in our study concerned the identification of various types of motivation and their connection to psychophysiological states in elite judo and Greco-Roman wrestlers. We tried to determine how these different types of motivation interact to shape the psychophysiological state of qualified wrestlers. Neuropsychological evaluation methods included simple (SRT) and choice (CRT) reaction-time tests, HRV measurements, and psychological questionnaires; the obtained data were explored with methods of statistical analysis. The data show that different combinations of levels of motivation to achieve success and motivation to avoid failure provoke different psychophysiological states. The experiment revealed that the combination of high levels of both motivation to achieve success and motivation to avoid failure provides a better psychophysiological state in elite wrestlers compared with groups having other combinations of motivational variables. It also revealed that motivation to avoid failure had formed as a personality trait that compensates for the excessive tension caused by a high level of achievement motivation and regulates the psychophysiological state. This can be viewed as an effect of training in athletes (Tab. 3, Fig. 1, Ref. 38).

  7. Load-embedded inertial measurement unit reveals lifting performance.

    PubMed

    Tammana, Aditya; McKay, Cody; Cain, Stephen M; Davidson, Steven P; Vitali, Rachel V; Ojeda, Lauro; Stirling, Leia; Perkins, Noel C

    2018-07-01

    Manual lifting of loads arises in many occupations as well as in activities of daily living. Prior studies explore lifting biomechanics and conditions implicated in lifting-induced injuries through laboratory-based experimental methods. This study introduces a new measurement method using load-embedded inertial measurement units (IMUs) to evaluate lifting tasks in varied environments outside of the laboratory. An example vertical load lifting task is considered that is included in an outdoor obstacle course. The IMU data, in the form of the load acceleration and angular velocity, are used to estimate load vertical velocity and three lifting performance metrics: the lifting time (speed), power, and motion smoothness. Large qualitative differences in these parameters distinguish exemplar high- and low-performance trials. These differences are further supported by subsequent statistical analyses of twenty-three trials (115 lift/lower cycles in total) from fourteen healthy participants. Results reveal that lifting time is strongly correlated with lifting power (as expected) but also correlated with motion smoothness. Thus, participants who lift rapidly do so with significantly greater power, using motions that minimize jerk. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Mapping anatomical correlations across cerebral cortex (MACACC) using cortical thickness from MRI.

    PubMed

    Lerch, Jason P; Worsley, Keith; Shaw, W Philip; Greenstein, Deanna K; Lenroot, Rhoshel K; Giedd, Jay; Evans, Alan C

    2006-07-01

    We introduce MACACC-Mapping Anatomical Correlations Across Cerebral Cortex-to study correlated changes within and across different cortical networks. The principal topic of investigation is whether the thickness of one area of the cortex changes in a statistically correlated fashion with changes in thickness of other cortical regions. We further extend these methods by introducing techniques to test whether different population groupings exhibit significantly varying MACACC patterns. The methods are described in detail and applied to a normal childhood development population (n = 292), and show that association cortices have the highest correlation strengths. Taking Brodmann Area (BA) 44 as a seed region revealed MACACC patterns strikingly similar to tractography maps obtained from diffusion tensor imaging. Furthermore, the MACACC map of BA 44 changed with age, older subjects featuring tighter correlations with BA 44 in the anterior portions of the superior temporal gyri. Lastly, IQ-dependent MACACC differences were investigated, revealing steeper correlations between BA 44 and multiple frontal and parietal regions for the higher IQ group, most significantly (t = 4.0) in the anterior cingulate.
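
    The core of the seed-based analysis described above is a cross-subject correlation: the seed region's thickness is correlated with every other region's thickness across subjects. A minimal sketch with synthetic data (5 subjects, a seed, and two hypothetical regions; not the study's measurements):

```python
# Seed-based correlation mapping in the spirit of MACACC: correlate the
# cortical thickness of a seed region with other regions across subjects.
# All thickness values are synthetic.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# cortical thickness (mm) per subject
seed     = [2.5, 2.7, 2.4, 2.9, 2.6]  # a BA44-like seed region
region_a = [2.6, 2.8, 2.5, 3.0, 2.7]  # tracks the seed closely
region_b = [3.1, 2.4, 2.9, 2.5, 3.0]  # varies against the seed

corr_map = {name: pearson_r(seed, vals)
            for name, vals in [("A", region_a), ("B", region_b)]}
print(corr_map)
```

In the actual method this correlation is computed at every cortical vertex and thresholded with appropriate multiple-comparison corrections, producing the whole-cortex MACACC maps.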

  9. Effects of moist- and dry-heat cooking on the meat quality, microstructure and sensory characteristics of native chicken meat.

    PubMed

    Chumngoen, Wanwisa; Chen, Chih-Feng; Tan, Fa-Jui

    2018-01-01

    This study investigates the effects of moist-heat (water-cooking; WC) and dry-heat (oven-cooking; OC) cooking on the quality, microstructure and sensory characteristics of native chicken breast meat. The results revealed that OC meat required a significantly longer cooking time and had higher cooking loss and shear force values and lower L* values. Protein solubility decreased with both cooking methods; however, no statistically significant difference was observed between WC and OC meats, whereas collagen solubility and the myofibrillar fragmentation index (MFI) increased after cooking, and WC meat exhibited higher collagen solubility and MFI (P < 0.05). The fiber diameter and sarcomere length decreased substantially after cooking, and fibril shrinkage was noticeable in OC meat (P < 0.05). Descriptive sensory analysis revealed that WC meat exhibited significantly higher moisture release and lower initial hardness, chewdown hardness and residual loose particles. A darker color and enhanced chickeny flavor were observed for OC meat. Based on the unique sensory and physicochemical characteristics in demand, producers should employ appropriate cooking methods to optimize native chicken meat quality. © 2017 Japanese Society of Animal Science.

  10. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. 
Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876
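
    The pooled percentages above come from a meta-analysis of proportions. A minimal fixed-effect sketch, weighting by sample size with a normal-approximation confidence interval, is shown below; real meta-analyses typically use random-effects models and transformed proportions, and the study counts here are invented.

```python
# Sample-size-weighted pooling of proportions across studies, with a
# normal-approximation 95% CI. A simplification of meta-analytic practice;
# the per-study (events, n) pairs are invented for illustration.

def pooled_proportion(studies):
    """studies: list of (events, sample_size) tuples."""
    events = sum(e for e, n in studies)
    total = sum(n for e, n in studies)
    p = events / total
    se = (p * (1 - p) / total) ** 0.5
    return p, (p - 1.96 * se, p + 1.96 * se)

# e.g. three hypothetical surveys of statistical-theory cognition
studies = [(147, 200), (441, 600), (73, 100)]
p, ci = pooled_proportion(studies)
print(round(p, 3), [round(x, 3) for x in ci])
```

This treats all studies as draws from one common proportion; heterogeneity between undergraduates, graduates, and staff is exactly why the paper reports sub-group estimates instead.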

  11. Multiscale study for stochastic characterization of shale samples

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman; Javadpour, Farzam; Sahimi, Muhammad; Piri, Mohammad

    2016-03-01

    Characterization of shale reservoirs, which are typically of low permeability, is very difficult because of the presence of multiscale structures. While three-dimensional (3D) imaging can be an ultimate solution for revealing important complexities of such reservoirs, acquiring such images is costly and time consuming. On the other hand, high-quality 2D images, which are widely available, also reveal useful information about shales' pore connectivity and size. Most of the current modeling methods that are based on 2D images use limited and insufficient extracted information. One remedy to the shortcoming is direct use of qualitative images, a concept that we introduce in this paper. We demonstrate that higher-order statistics (as opposed to the traditional two-point statistics, such as variograms) are necessary for developing an accurate model of shales, and describe an efficient method for using 2D images that is capable of utilizing qualitative and physical information within an image and generating stochastic realizations of shales. We then further refine the model by describing and utilizing several techniques, including an iterative framework, for removing some possible artifacts and better pattern reproduction. Next, we introduce a new histogram-matching algorithm that accounts for concealed nanostructures in shale samples. We also present two new multiresolution and multiscale approaches for dealing with distinct pore structures that are common in shale reservoirs. In the multiresolution method, the original high-quality image is upscaled in a pyramid-like manner in order to achieve more accurate global and long-range structures. The multiscale approach integrates two images, each containing diverse pore networks - the nano- and microscale pores - using a high-resolution image representing small-scale pores and, at the same time, reconstructing large pores using a low-quality image. Eventually, the results are integrated to generate a 3D model. 
The methods are tested on two shale samples for which full 3D samples are available. The quantitative accuracy of the models is demonstrated by computing their morphological and flow properties and comparing them with those of the actual 3D images. The success of the method hinges upon the use of very different low- and high-resolution images.

  12. Spatial analyses for nonoverlapping objects with size variations and their application to coral communities.

    PubMed

    Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko

    2014-07-01

    Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. 
The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
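
    With colonies approximated by disks as in the demonstrations above, the minimum (edge-to-edge) distance that replaces the centre-to-centre distance is simple to compute. A minimal sketch; the coordinates and radii are invented.

```python
# Minimum (edge-to-edge) distance between two disks, the pairwise quantity
# underlying the new statistics, versus the centre-to-centre distance used
# by conventional point process statistics.
from math import hypot

def min_distance(c1, r1, c2, r2):
    """Edge-to-edge distance between two disks (0 if they touch/overlap)."""
    centre_dist = hypot(c1[0] - c2[0], c1[1] - c2[1])
    return max(0.0, centre_dist - r1 - r2)

# two colonies 5 cm apart centre-to-centre, radii 1 cm and 2 cm
print(min_distance((0, 0), 1.0, (3, 4), 2.0))  # edge gap of 2.0 cm
```

Subtracting the radii is what removes the colony-size contamination that, per the abstract, inflates the aggregation scales reported by centre-based statistics.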

  13. Measuring post-secondary stem majors' engagement in sustainability: The creation, assessment, and validation of an instrument for sustainability curricula evaluation

    NASA Astrophysics Data System (ADS)

    Little, David L., II

    Ongoing changes in values, pedagogy, and curriculum concerning sustainability education necessitate that strong curricular elements be identified in sustainability education. However, quantitative research in sustainability education is largely undeveloped or relies on outdated instruments. In part, this is because no widespread quantitative instrument for measuring related educational outcomes has been developed for the field, though such development is pivotal for future efforts in sustainability education related to STEM majors. This research study details the creation, evaluation, and validation of an instrument -- the STEM Sustainability Engagement Instrument (STEMSEI) -- designed to measure sustainability engagement in post-secondary STEM majors. The study was conducted in three phases, using qualitative methods in phase 1, a concurrent mixed methods design in phase 2, and a sequential mixed methods design in phase 3. The STEMSEI was able to successfully predict statistically significant differences in the sample (n = 1017) that were predicted by prior research in environmental education. The STEMSEI also revealed statistically significant differences between STEM majors' sustainability engagement with a large effect size (0.203 ≤ η² ≤ 0.211). As hypothesized, statistically significant differences were found on the environmental scales across gender and present religion. With respect to gender, self-perceived measures of emotional engagement with environmental sustainability were higher in females, while males had higher measures in cognitive engagement with respect to knowing information related to environmental sustainability. With respect to present religion, self-perceived measures of general engagement and emotional engagement in environmental sustainability were higher for non-Christians than for Christians. On the economic scales, statistically significant differences were found across gender.
Specifically, measures of males' self-perceived cognitive engagement in knowing information related to economic sustainability were greater than those of females. Future research should establish the generalizability of these results and further test the validity of the STEMSEI.
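
The eta-squared effect size reported above can be computed as the between-group sum of squares over the total sum of squares in a one-way layout. The sketch below is a hedged illustration only; the group values are invented, not STEMSEI data.

```python
# Minimal sketch: eta-squared effect size for a one-way (ANOVA-style) design.
# The three groups below are made-up illustration values, not study data.

def eta_squared(groups):
    """SS_between / SS_total for a list of groups of numeric scores."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    ss_total = sum((x - grand_mean) ** 2 for x in all_vals)
    ss_between = sum(
        len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total

groups = [[3.1, 3.4, 2.9], [4.0, 4.2, 3.8], [2.2, 2.5, 2.1]]
print(round(eta_squared(groups), 3))
```

Values near 0 indicate the grouping explains little variance; values such as the .20 range reported above are conventionally considered large.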

  14. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  15. Clinical MR-mammography: are computer-assisted methods superior to visual or manual measurements for curve type analysis? A systematic approach.

    PubMed

    Baltzer, Pascal Andreas Thomas; Freiberg, Christian; Beger, Sebastian; Vag, Tibor; Dietzel, Matthias; Herzog, Aimee B; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A

    2009-09-01

    Enhancement characteristics after administration of a contrast agent are regarded as a major criterion for differential diagnosis in magnetic resonance mammography (MRM). However, no consensus exists about the best measurement method to assess contrast enhancement kinetics. This systematic investigation was performed to compare visual estimation with manual region of interest (ROI) and computer-aided diagnosis (CAD) analysis for time curve measurements in MRM. A total of 329 patients undergoing surgery after MRM (1.5 T) were analyzed prospectively. Dynamic data were measured using visual estimation, ROI, and CAD methods, and classified depending on initial signal increase and delayed enhancement. Pathology revealed 469 lesions (279 malignant, 190 benign). Kappa agreement between the methods ranged from 0.78 to 0.81. Diagnostic accuracies of 74.4% (visual), 75.7% (ROI), and 76.6% (CAD) were found, with no statistically significant differences. According to our results, curve type measurements are useful as a diagnostic criterion in breast lesions irrespective of the method used.

  16. Degree-strength correlation reveals anomalous trading behavior.

    PubMed

    Sun, Xiao-Qian; Shen, Hua-Wei; Cheng, Xue-Qi; Wang, Zhao-Yang

    2012-01-01

    Manipulation is an important issue for both developed and emerging stock markets. Many efforts have been made to detect manipulation in stock markets. However, it is still an open problem to identify the fraudulent traders, especially when they collude with each other. In this paper, we focus on the problem of identifying the anomalous traders using the transaction data of eight manipulated stocks and forty-four non-manipulated stocks during a one-year period. By analyzing the trading networks of stocks, we find that the trading networks of manipulated stocks exhibit significantly higher degree-strength correlation than the trading networks of non-manipulated stocks and the randomized trading networks. We further propose a method to detect anomalous traders of manipulated stocks based on statistical significance analysis of degree-strength correlation. Experimental results demonstrate that our method is effective at distinguishing the manipulated stocks from non-manipulated ones. Our method outperforms the traditional weight-threshold method at identifying the anomalous traders in manipulated stocks. More importantly, our method is difficult for colluding traders to fool.

  17. Measurement of Postmortem Pupil Size: A New Method with Excellent Reliability and Its Application to Pupil Changes in the Early Postmortem Period.

    PubMed

    Fleischer, Luise; Sehner, Susanne; Gehl, Axel; Riemer, Martin; Raupach, Tobias; Anders, Sven

    2017-05-01

    Measurement of postmortem pupil width is a potential component of death time estimation. However, no standardized measurement method has been described. We analyzed a total of 71 digital images for pupil-iris ratio using the software ImageJ. Images were analyzed three times by four different examiners. In addition, serial images from 10 cases were taken between 2 and 50 h postmortem to detect spontaneous pupil changes. Intra- and inter-rater reliability of the method was excellent (ICC > 0.95). The method is observer independent and yields consistent results, and images can be digitally stored and re-evaluated. The method seems well suited for forensic and scientific purposes. While statistical analysis of spontaneous pupil changes revealed a significant polynomial of quartic degree for postmortem time (p = 0.001), an obvious pattern was not detected. These results do not indicate suitability of spontaneous pupil changes for forensic death time estimation, as formerly suggested. © 2016 American Academy of Forensic Sciences.

  18. Agreement between self-reported data on medicine use and prescription records vary according to method of analysis and therapeutic group.

    PubMed

    Nielsen, Merete Willemoes; Søndergaard, Birthe; Kjøller, Mette; Hansen, Ebba Holme

    2008-09-01

    This study compared national self-reported data on medicine use and national prescription records at the individual level. Data from the nationally representative Danish health survey conducted in 2000 (n=16,688) were linked at the individual level to national prescription records covering 1999-2000. Kappa statistics and 95% confidence intervals were calculated. Applying the legend time method to medicine groups used mainly on a chronic basis revealed good to very good agreement between the two data sources, whereas medicines used as needed showed fair to moderate agreement. When a fixed-time window was applied for analysis, agreement was unchanged for medicines used mainly on a chronic basis, whereas agreement increased somewhat compared to the legend time method when analyzing medicines used as needed. Agreement between national self-reported data and national prescription records differed according to method of analysis and therapeutic group. A fixed-time window is an appropriate method of analysis for most therapeutic groups.
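
The kappa statistic used above to grade agreement between self-reports and prescription records can be computed from a 2x2 agreement table. The sketch below is a hedged illustration with invented counts, not the study's data.

```python
# Hedged sketch: Cohen's kappa for agreement between two binary
# classifications (e.g., "user" / "non-user" of a medicine group).
def cohens_kappa(a, b, c, d):
    """2x2 table: a = both yes, b = yes/no, c = no/yes, d = both no."""
    n = a + b + c + d
    p_obs = (a + d) / n                          # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)        # chance agreement on "yes"
    p_no = ((c + d) / n) * ((b + d) / n)         # chance agreement on "no"
    p_exp = p_yes + p_no
    return (p_obs - p_exp) / (1 - p_exp)

# Invented counts: 40 agree-yes, 45 agree-no, 15 disagreements.
print(round(cohens_kappa(40, 5, 10, 45), 3))
```

By common rules of thumb, kappa of 0.61-0.80 is "good" and above 0.80 "very good" agreement, the bands referred to in the abstract.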

  19. RapidRIP quantifies the intracellular metabolome of 7 industrial strains of E. coli.

    PubMed

    McCloskey, Douglas; Xu, Julia; Schrübbers, Lars; Christensen, Hanne B; Herrgård, Markus J

    2018-04-25

    Fast metabolite quantification methods are required for high-throughput screening of microbial strains obtained by combinatorial or evolutionary engineering approaches. In this study, a rapid RIP-LC-MS/MS (RapidRIP) method for high-throughput quantitative metabolomics was developed and validated that was capable of quantifying 102 metabolites from central, amino acid, energy, nucleotide, and cofactor metabolism in less than 5 minutes. The method was shown to have sensitivity and resolving capability comparable to a full-length RIP-LC-MS/MS method (FullRIP). The RapidRIP method was used to quantify the metabolome of seven industrial strains of E. coli, revealing significant differences in glycolytic, pentose phosphate, TCA cycle, amino acid, and energy and cofactor metabolites. These differences translated to statistically and biologically significant differences in the thermodynamics of biochemical reactions between strains that could have implications when choosing a host for bioprocessing. Copyright © 2018. Published by Elsevier Inc.

  20. Quality of reporting statistics in two Indian pharmacology journals

    PubMed Central

    Jaykaran; Yadav, Preeti

    2011-01-01

    Objective: To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the websites of the journals (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of descriptive and inferential statistics. Descriptive statistics was evaluated on the basis of the reporting of the method of description and central tendencies. Inferential statistics was evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CIs) around the percentages. Results: Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7–83.3%) articles. The most common reason for this was the use of mean ± SEM in place of “mean (SD)” or “mean ± SD.” The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles, most commonly the use of a two-group test for three or more groups. Conclusion: Articles published in two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766

  1. [Mathematical modeling for conditionality of cardiovascular disease by housing conditions].

    PubMed

    Meshkov, N A

    2014-01-01

    The influence of living conditions (housing area per capita and availability of household water supply, sewerage, and central heating) on the morbidity of cardiovascular diseases in child and adult populations was studied. Regression analysis established that the morbidity rate decreases significantly as housing area increases; the constructed models are statistically significant (p = 0.01 and p = 0.02, respectively). A relationship was also revealed between the morbidity rate of cardiovascular diseases in children and adults and the availability of central heating (p = 0.02 and p = 0.009).

  2. Serum levels of organochlorine pesticides in the general population of Thessaly, Greece, determined by HS-SPME GC-MS method.

    PubMed

    Koureas, Michalis; Karagkouni, Foteini; Rakitskii, Valerii; Hadjichristodoulou, Christos; Tsatsakis, Aristidis; Tsakalof, Andreas

    2016-07-01

    In this study, exposure levels of organochlorine pesticides (OCs) were determined in general population residing in Larissa, central Greece. Serum samples from 103 volunteers were analyzed by optimized headspace solid-phase microextraction gas chromatography-mass spectrometry, to detect and quantify OC levels. The most frequently detected analytes were p,p'-DDE (frequency 99%, median:1.25ng/ml) and Hexachlorobenzene (HCB) (frequency 69%, median: 0.13ng/ml). Statistical analysis revealed a significant relationship of p,p'-DDE and HCB levels with age. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Prediction of Muscle Performance During Dynamic Repetitive Exercise

    NASA Technical Reports Server (NTRS)

    Byerly, D. L.; Byerly, K. A.; Sognier, M. A.; Squires, W. G.

    2002-01-01

    A method for predicting human muscle performance was developed. Eight test subjects performed a repetitive dynamic exercise to failure using a Lordex spinal machine. Electromyography (EMG) data was collected from the erector spinae. Evaluation of the EMG data using a 5th order Autoregressive (AR) model and statistical regression analysis revealed that an AR parameter, the mean average magnitude of AR poles, can predict performance to failure as early as the second repetition of the exercise. Potential applications to the space program include evaluating on-orbit countermeasure effectiveness, maximizing post-flight recovery, and future real-time monitoring capability during Extravehicular Activity.
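
The AR-pole statistic above can be sketched in code. This is a hedged illustration, not the study's implementation: it fits a second-order AR model by the Yule-Walker equations (the study used fifth order, whose poles require a numerical root finder) and reports the mean pole magnitude for a toy oscillatory signal standing in for an EMG trace.

```python
# Hedged sketch: mean magnitude of AR poles for a toy signal, using an
# AR(2) fit so the poles have a closed form (the study used AR(5)).
import cmath
import math

def autocorr(x, lag):
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def ar2_mean_pole_magnitude(x):
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    # Yule-Walker for AR(2): a1 = r1(1 - r2)/(1 - r1^2), a2 = (r2 - r1^2)/(1 - r1^2)
    a1 = r1 * (1 - r2) / (1 - r1 ** 2)
    a2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    # poles are the roots of z^2 - a1*z - a2 = 0
    disc = cmath.sqrt(a1 ** 2 + 4 * a2)
    p1, p2 = (a1 + disc) / 2, (a1 - disc) / 2
    return (abs(p1) + abs(p2)) / 2

# Toy oscillatory signal standing in for a rectified EMG trace.
sig = [math.sin(0.3 * t) + 0.1 * math.cos(1.7 * t) for t in range(200)]
print(round(ar2_mean_pole_magnitude(sig), 3))
```

In the study's framing, a parameter of this kind tracked over successive repetitions is what allowed failure to be predicted from the second repetition onward.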

  4. An empirical analysis of the distribution of the duration of overshoots in a stationary gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Parrish, R. S.; Carter, M. C.

    1974-01-01

    This analysis utilizes computer simulation and statistical estimation. Realizations of stationary gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
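
The simulation step above can be sketched as follows. This is a hedged illustration with invented parameter values, not the study's code: a stationary Gaussian AR(1) process with a chosen autocorrelation parameter is generated, and the durations of overshoots above a crossing level are tabulated.

```python
# Hedged sketch: simulate a stationary Gaussian AR(1) process and collect
# the durations (in steps) of excursions above a crossing level.
import math
import random

random.seed(1)
phi, level, n = 0.9, 0.5, 50_000     # autocorrelation parameter, crossing level
innov_sd = math.sqrt(1 - phi ** 2)   # keeps the marginal variance at 1

x, durations, run = 0.0, [], 0
for _ in range(n):
    x = phi * x + random.gauss(0.0, innov_sd)
    if x > level:
        run += 1                     # still inside an overshoot
    elif run:
        durations.append(run)        # overshoot just ended; record its length
        run = 0

mean_dur = sum(durations) / len(durations)
print(len(durations), round(mean_dur, 2))
```

The empirical distribution of `durations` is the quantity whose parameters the study estimated as functions of the autocorrelation parameter and crossing level.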

  5. The prior statistics of object colors.

    PubMed

    Koenderink, Jan J

    2010-02-01

    The prior statistics of object colors is of much interest because extensive statistical investigations of reflectance spectra reveal highly non-uniform structure in color space common to several very different databases. This common structure is due to the visual system rather than to the statistics of environmental structure. Analysis involves an investigation of the proper sample space of spectral reflectance factors and of the statistical consequences of the projection of spectral reflectances on the color solid. Even in the case of reflectance statistics that are translationally invariant with respect to the wavelength dimension, the statistics of object colors is highly non-uniform. The qualitative nature of this non-uniformity is due to trichromacy.

  6. Pattern of structural brain changes in social anxiety disorder after cognitive behavioral group therapy: a longitudinal multimodal MRI study.

    PubMed

    Steiger, V R; Brühl, A B; Weidt, S; Delsignore, A; Rufer, M; Jäncke, L; Herwig, U; Hänggi, J

    2017-08-01

    Social anxiety disorder (SAD) is characterized by fears of social and performance situations. Cognitive behavioral group therapy (CBGT) generally has positive effects on symptoms, distress and avoidance in SAD. Prior studies found increased cortical volumes and decreased fractional anisotropy (FA) in SAD compared with healthy controls (HCs). Thirty-three participants diagnosed with SAD attended a 10-week CBGT program and were scanned before and after therapy. We applied three neuroimaging methods, surface-based morphometry, diffusion tensor imaging and network-based statistics, each with specific longitudinal processing protocols, to investigate CBGT-induced structural brain alterations of the gray and white matter (WM). Surface-based morphometry revealed a significant cortical volume reduction (pre- to post-treatment) in the left inferior parietal cortex, as well as a positive partial correlation between treatment success (indexed by reductions in the Liebowitz Social Anxiety Scale) and reductions in cortical volume in bilateral dorsomedial prefrontal cortex. Diffusion tensor imaging analysis revealed a significant increase in FA in the bilateral uncinate fasciculus and right inferior longitudinal fasciculus. Network-based statistics revealed a significant increase of structural connectivity in a frontolimbic network. No partial correlations with treatment success were found in the WM analyses. For what we believe is the first time, we present a distinctive pattern of longitudinal structural brain changes after CBGT measured with three established magnetic resonance imaging analysis techniques. Our findings are in line with previous cross-sectional, unimodal SAD studies and extend them by highlighting anatomical brain alterations that point toward the level of HCs in parallel with a reduction in SAD symptomatology.

  7. Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.

    PubMed

    Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J

    2015-01-01

    The design of QSAR/QSPR models is a challenging problem in which the selection of the most relevant descriptors constitutes a key step. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, leaving chemical knowledge out of the analysis. As a result, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. An approach for integrating domain experts' knowledge in the selection process is therefore needed to increase confidence in the final set of descriptors. This paper proposes a software tool, Visual and Interactive DEscriptor ANalysis (VIDEAN), that combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of data, aided by statistical tools and metrics based on information theory. Coordinated visual representations capture different relationships and interactions among descriptors, target properties and candidate subsets of descriptors. The capabilities of the proposed software were assessed through different scenarios, which reveal how an expert can use the tool to choose one subset of descriptors from a group of candidate subsets, modify existing descriptor subsets, or even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy and high statistical performance in a visual exploratory way. 
It is therefore possible to conclude that the resulting tool allows the integration of a chemist's expertise in the descriptor selection process with low cognitive effort, in contrast with the alternative of an ad hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.

  8. Probing the exchange statistics of one-dimensional anyon models

    NASA Astrophysics Data System (ADS)

    Greschner, Sebastian; Cardarelli, Lorenzo; Santos, Luis

    2018-05-01

    We propose feasible scenarios for revealing the modified exchange statistics in one-dimensional anyon models in optical lattices based on an extension of the multicolor lattice-depth modulation scheme introduced in [Phys. Rev. A 94, 023615 (2016), 10.1103/PhysRevA.94.023615]. We show that the fast modulation of a two-component fermionic lattice gas in the presence of a magnetic field gradient, in combination with additional resonant microwave fields, allows for the quantum simulation of hardcore anyon models with periodic boundary conditions. Such a semisynthetic ring setup allows for realizing an interferometric arrangement sensitive to the anyonic statistics. Moreover, we also show that simple expansion experiments may reveal the formation of anomalously bound pairs resulting from the anyonic exchange.

  9. Diet misreporting can be corrected: confirmation of the association between energy intake and fat-free mass in adolescents.

    PubMed

    Vainik, Uku; Konstabel, Kenn; Lätt, Evelin; Mäestu, Jarek; Purge, Priit; Jürimäe, Jaak

    2016-10-01

    Subjective energy intake (sEI) is often misreported, providing unreliable estimates of energy consumed. Therefore, relating sEI data to health outcomes is difficult. Recently, Börnhorst et al. compared various methods to correct sEI-based energy intake estimates. They criticised approaches that categorise participants as under-reporters, plausible reporters and over-reporters based on the sEI:total energy expenditure (TEE) ratio, and thereafter use these categories as statistical covariates or exclusion criteria. Instead, they recommended using external predictors of sEI misreporting as statistical covariates. We sought to confirm and extend these findings. Using a sample of 190 adolescent boys (mean age=14), we demonstrated that dual-energy X-ray absorptiometry-measured fat-free mass is strongly associated with objective energy intake data (onsite weighted breakfast), but the association with sEI (previous 3-d dietary interview) is weak. Comparing sEI with TEE revealed that sEI was mostly under-reported (74 %). Interestingly, statistically controlling for dietary reporting groups or restricting samples to plausible reporters created a stronger-than-expected association between fat-free mass and sEI. However, the association was an artifact caused by selection bias - that is, data re-sampling and simulations showed that these methods overestimated the effect size because fat-free mass was related to sEI both directly and indirectly via TEE. A more realistic association between sEI and fat-free mass was obtained when the model included common predictors of misreporting (e.g. BMI, restraint). To conclude, restricting sEI data only to plausible reporters can cause selection bias and inflated associations in later analyses. Therefore, we further support statistically correcting sEI data in nutritional analyses. The script for running simulations is provided.

  10. Exercise reduces depressive symptoms in adults with arthritis: Evidential value.

    PubMed

    Kelley, George A; Kelley, Kristi S

    2016-07-12

    To determine whether evidential value exists that exercise reduces depression in adults with arthritis and other rheumatic conditions. Utilizing data derived from a prior meta-analysis of 29 randomized controlled trials comprising 2449 participants (1470 exercise, 979 control) with fibromyalgia, osteoarthritis, rheumatoid arthritis or systemic lupus erythematosus, a new method, P-curve, was utilized to assess evidentiary worth as well as to dismiss the possibility of selective reporting of statistically significant results regarding exercise and depression in adults with arthritis and other rheumatic conditions. Using the method of Stouffer, Z-scores were calculated to examine selective-reporting bias. An alpha (P) value < 0.05 was deemed statistically significant. In addition, the average power of the tests included in P-curve, adjusted for publication bias, was calculated. Fifteen of 29 studies (51.7%) with exercise and depression results were statistically significant (P < 0.05), while none of the results were statistically significant with respect to exercise increasing depression in adults with arthritis and other rheumatic conditions. Right-skew to dismiss selective reporting was identified (Z = -5.28, P < 0.0001). In addition, the included studies did not lack evidential value (Z = 2.39, P = 0.99), nor did they lack evidential value and were P-hacked (Z = 5.28, P > 0.99). The relative frequencies of P-values were 66.7% at 0.01, 6.7% each at 0.02 and 0.03, 13.3% at 0.04 and 6.7% at 0.05. The average power of the tests included in P-curve, corrected for publication bias, was 69%. Diagnostic plot results revealed that the observed power estimate was a better fit than the alternatives. These evidential value results provide additional support that exercise reduces depression in adults with arthritis and other rheumatic conditions.
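
The method of Stouffer referenced above can be sketched as follows. This is a hedged illustration with invented p-values, not the study's data: each one-sided p-value is converted to a Z-score via the inverse normal CDF, and the scores are summed and scaled by the square root of their count.

```python
# Hedged sketch: Stouffer's method for combining one-sided p-values.
from math import sqrt
from statistics import NormalDist

def stouffer_z(pvalues):
    nd = NormalDist()
    zs = [nd.inv_cdf(1 - p) for p in pvalues]  # one-sided p -> Z-score
    return sum(zs) / sqrt(len(zs))

ps = [0.01, 0.003, 0.04, 0.02, 0.005]  # invented study-level p-values
z = stouffer_z(ps)
p_combined = 1 - NormalDist().cdf(z)
print(round(z, 2), p_combined < 0.0001)
```

A large combined Z (equivalently, a tiny combined p) is the kind of right-skew evidence the P-curve analysis uses to argue against selective reporting.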

  11. Divergent Trends in Abortion and Birth Control Practices in Belarus, Russia and Ukraine

    PubMed Central

    Denisov, Boris P.; Sakevich, Victoria I.; Jasilioniene, Aiva

    2012-01-01

    Context The last decade witnessed growing differences in abortion dynamics in Belarus, Russia, and Ukraine despite demographic, social, and historical similarities of these nations. This paper investigates changes in birth control practices in the three countries and searches for an explanation of the diverging trends in abortion. Methods Official abortion and contraceptive use statistics, provided by national statistical agencies, were analysed. Respective laws and other legal documents were examined and compared between the three countries. To disclose inter-country differences in prevalence of the modern methods of contraception and its association with major demographic and social factors, an analysis of data from national sample surveys was performed, including binary logistic regression. Results The growing gap in abortion rate in Belarus, Russia, and Ukraine is a genuine phenomenon, not a statistical artefact. The examination of abortion and prevalence of contraception based on official statistics and three national sample surveys did not reveal any unambiguous factors that could explain differences in abortion dynamics in Belarus, Russia, and Ukraine. However, it is very likely that the cause of the inter-country discrepancies lies in contraceptive behavior itself, in adequacies of contraceptive knowledge and practices. Additionally, large differences in government policies, which are very important in shaping contraceptive practices of the population, were detected. Conclusion Since the end of the 1990s, the Russian government switched to archaic ideology in the area of reproductive health and family planning and neglects evidence-based arguments. Such an extreme turn in the governmental position is not observed in Belarus or Ukraine. This is an important factor contributing to the slowdown in the decrease of abortion rates in Russia. PMID:23349656

  12. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as such a measure. We also introduce a new teaching method in the elementary statistics class: in contrast to the traditional elementary statistics course, we use a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
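
Simulation-based inference of the kind described above can be sketched with a two-sample permutation test (an illustrative example with invented data, not course material): the group labels are repeatedly shuffled, and the observed difference in means is compared with the shuffled distribution instead of a t-distribution.

```python
# Hedged sketch: a two-sample permutation test of a difference in means,
# the simulation-based replacement for a classical two-sample t-test.
import random

def perm_test(a, b, n_perm=10_000, seed=0):
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled, na = a + b, len(a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)          # break any real group structure
        diff = sum(pooled[:na]) / na - sum(pooled[na:]) / len(b)
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_perm             # two-sided Monte Carlo p-value

a = [5.1, 4.9, 5.6, 5.8, 5.3, 5.5]   # invented scores, group A
b = [4.2, 4.4, 4.0, 4.6, 4.3, 4.1]   # invented scores, group B
print(perm_test(a, b))
```

The p-value is simply the fraction of shuffles producing a difference at least as extreme as the observed one, which makes the logic of hypothesis testing concrete for students.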

  13. Comparison of U.S. Environmental Protection Agency and U.S. Composting Council microbial detection methods in finished compost and regrowth potential of Salmonella spp. and Escherichia coli O157:H7 in finished compost.

    PubMed

    Reynnells, Russell; Ingram, David T; Roberts, Cheryl; Stonebraker, Richard; Handy, Eric T; Felton, Gary; Vinyard, Bryan T; Millner, Patricia D; Sharma, Manan

    2014-07-01

    Bacterial pathogens may survive and regrow in finished compost due to incomplete thermal inactivation during, or recontamination after, composting. Twenty-nine finished composts were obtained from 19 U.S. states and were separated into three broad feedstock categories: biosolids (n=10), manure (n=4), and yard waste (n=15). Three replicates of each compost were inoculated with ≈ 1-2 log CFU/g of nonpathogenic Escherichia coli, Salmonella spp., and E. coli O157:H7. The U.S. Environmental Protection Agency's (EPA) protocols and the U.S. Composting Council's (USCC) Test Methods for the Examination of Composting and Compost (TMECC) were compared to determine which method recovered higher percentages of inoculated E. coli (representing fecal coliforms) and Salmonella spp. from 400-g samples of finished composts. Populations of Salmonella spp. and E. coli O157:H7 were determined over 3 days of storage at 25°C and compared to physicochemical parameters to predict their respective regrowth potentials. EPA Method 1680 recovered significantly (p=0.0003) more inoculated E. coli (68.7%) than TMECC 07.01 (48.1%) because the EPA method uses more compost in the initial homogenate, larger transfer dilutions, and a larger most-probable-number scheme than TMECC 07.01. The recoveries of inoculated Salmonella spp. by EPA Method 1682 (89.1%) and TMECC 07.02 (72.4%) did not differ significantly (p=0.44), which may be explained by the nonselective pre-enrichment step used in both methods. No physicochemical parameter (C:N, moisture content, total organic carbon) was able to serve as a sole predictor of regrowth of Salmonella spp. or E. coli O157:H7 in finished compost. However, statistical analysis revealed that the C:N ratio, total organic carbon, and moisture content all contributed to pathogen regrowth potential in finished composts. 
It is recommended that the USCC modify TMECC protocols to test larger amounts of compost in the initial homogenate to facilitate greater recovery of target organisms.

  14. The seasonal occupancy and diel behaviour of Antarctic sperm whales revealed by acoustic monitoring.

    PubMed

    Miller, Brian S; Miller, Elanor J

    2018-04-03

    The seasonal occupancy and diel behaviour of sperm whales (Physeter macrocephalus) was investigated using data from long-term acoustic recorders deployed off east Antarctica. An automated method for investigating acoustic presence of sperm whales was developed, characterised, and applied to multi-year acoustic datasets at three locations. Instead of focusing on the acoustic properties of detected clicks, the method relied solely on the inter-click-interval (ICI) for determining presence within an hour-long recording. Parameters for our classifier were informed by knowledge of typical vocal behaviour of sperm whales. Sperm whales were detected predominantly from Dec-Feb, occasionally in Nov, Mar, Apr, and May, but never in the Austral winter or early spring months. Ice cover was found to have a statistically significant negative effect on sperm whale presence. In ice-free months sperm whales were detected more often during daylight hours and were seldom detected at night, and this effect was also statistically significant. Seasonal presence at the three east Antarctic recording sites were in accord with what has been inferred from 20th century whale catches off western Antarctica and from stomach contents of whales caught off South Africa.
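
The inter-click-interval idea above can be sketched as a toy classifier. This is a hedged illustration, not the authors' detector: the cadence band and minimum click count below are invented assumptions standing in for the parameters the authors derived from known sperm whale vocal behaviour.

```python
# Hedged toy sketch: flag an hour of detected click times as sperm-whale-like
# if the median inter-click interval (ICI) falls in a regular-cadence band.
# The band (0.5-2.0 s) and min_clicks threshold are illustrative assumptions.
from statistics import median

def sperm_whale_present(click_times_s, lo=0.5, hi=2.0, min_clicks=20):
    if len(click_times_s) < min_clicks:
        return False
    icis = [b - a for a, b in zip(click_times_s, click_times_s[1:])]
    return lo <= median(icis) <= hi

regular = [i * 1.1 for i in range(60)]   # steady ~1.1 s cadence
sparse = [i * 7.0 for i in range(30)]    # too slow for regular clicking
print(sperm_whale_present(regular), sperm_whale_present(sparse))
```

Relying on the ICI rather than on the acoustic properties of individual clicks is what makes this style of detector robust to variable recording conditions.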

  15. Assessment of pleiotropic transcriptome perturbations in Arabidopsis engineered for indirect insect defence.

    PubMed

    Houshyani, Benyamin; van der Krol, Alexander R; Bino, Raoul J; Bouwmeester, Harro J

    2014-06-19

    Molecular characterization is an essential step of risk/safety assessment of genetically modified (GM) crops. Holistic approaches for molecular characterization using omics platforms can be used to confirm the intended impact of the genetic engineering, but can also reveal the unintended changes at the omics level as a first assessment of potential risks. The potential of omics platforms for risk assessment of GM crops has rarely been used for this purpose because of the lack of a consensus reference and statistical methods to judge the significance or importance of the pleiotropic changes in GM plants. Here we propose a meta data analysis approach to the analysis of GM plants, by measuring the transcriptome distance to untransformed wild-types. In the statistical analysis of the transcriptome distance between GM and wild-type plants, values are compared with naturally occurring transcriptome distances in non-GM counterparts obtained from a database. Using this approach we show that the pleiotropic effect of genes involved in indirect insect defence traits is substantially equivalent to the variation in gene expression occurring naturally in Arabidopsis. Transcriptome distance is a useful screening method to obtain insight in the pleiotropic effects of genetic modification.

  16. The Essential Genome of Escherichia coli K-12

    PubMed Central

    2018-01-01

    ABSTRACT Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. PMID:29463657

  17. Low order models for uncertainty quantification in acoustic propagation problems

    NASA Astrophysics Data System (ADS)

    Millet, Christophe

    2016-11-01

    Long-range sound propagation problems are characterized by both a large number of length scales and a large number of normal modes. In the atmosphere, these modes are confined within waveguides, causing the sound to propagate through multiple paths to the receiver. For uncertain atmospheres, the modes are described as random variables. Concise mathematical models and analysis reveal fundamental limitations in classical projection techniques, due to different manifestations of the fact that modes that carry small variance can have important effects on the large-variance modes. In the present study, we propose a systematic strategy for obtaining statistically accurate low-order models. The normal modes are sorted in order of decreasing Sobol indices using asymptotic expansions, and the relevant modes are extracted using a modified iterative Krylov-based method. The statistics of acoustic signals are computed by decomposing the original pulse into a truncated sum of modal pulses that can be described by a stationary phase method. As the low-order acoustic model preserves the overall structure of waveforms under perturbations of the atmosphere, it can be applied to uncertainty quantification. The result of this study is a new algorithm which applies to the entire phase space of acoustic fields.

  18. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis

    PubMed Central

    Tabelow, Karsten; König, Reinhard; Polzehl, Jörg

    2016-01-01

    Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
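The conventional moving-window estimator whose bias the authors analyse can be sketched as follows; the window size, step-change data, and random seed are illustrative, not taken from the study.

```python
import numpy as np

def moving_window_curve(correct, window=20):
    """Conventional learning-curve estimate: proportion of correct responses
    in a moving window, which tacitly assumes constant performance within
    the window."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(correct, dtype=float), kernel, mode="valid")

# Abrupt step in true performance from 0.2 to 0.9 at trial 100
rng = np.random.default_rng(0)
p_true = np.r_[np.full(100, 0.2), np.full(100, 0.9)]
correct = rng.random(200) < p_true
curve = moving_window_curve(correct, window=20)
# Windows straddling the step average the two regimes, smearing the
# abrupt transition over ~20 trials.
print(curve[:50].mean(), curve[90], curve[-30:].mean())
```

The window straddling the step returns a value between the two true performance levels, which is exactly the systematic error that motivates trial-by-trial estimation.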

  19. In-situ structural integrity evaluation for high-power pulsed spallation neutron source - Effects of cavitation damage on structural vibration

    NASA Astrophysics Data System (ADS)

    Wan, Tao; Naoe, Takashi; Futakawa, Masatoshi

    2016-01-01

    A double-wall structure mercury target will be installed at the high-power pulsed spallation neutron source in the Japan Proton Accelerator Research Complex (J-PARC). Cavitation damage on the inner wall is an important factor governing the lifetime of the target vessel. To monitor the structural integrity of the target vessel, displacement velocity at a point on the outer surface of the target vessel is measured using a laser Doppler vibrometer (LDV). The measured signals can be used for evaluating the damage inside the target vessel caused by cyclic loading and cavitation bubble collapse due to pulsed-beam-induced pressure waves. Wavelet differential analysis (WDA) was applied to reveal the effects of the damage on vibrational cycling. To reduce the impact on the WDA results of noise superimposed on the vibration signals, two statistical methods, analysis of variance (ANOVA) and analysis of covariance (ANCOVA), were applied. Results from laboratory experiments, numerical simulation results with random noise added, and target vessel field data were analyzed by the WDA and the statistical methods. The analyses demonstrated that the established in-situ diagnostic technique can be used to effectively evaluate the structural response of the target vessel.

  20. Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain

    PubMed Central

    Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young

    2010-01-01

    Background Statistical analysis is essential in regard to obtaining objective reliability for medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention to improve the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%) followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). The errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only in applying statistical procedures but also in the reviewing process to improve the value of the article. PMID:20552071

  1. DISQOVER the Landcover - R based tools for quantitative vegetation reconstruction

    NASA Astrophysics Data System (ADS)

    Theuerkauf, Martin; Couwenberg, John; Kuparinen, Anna; Liebscher, Volkmar

    2016-04-01

    Quantitative methods have gained increasing attention in the field of vegetation reconstruction over the past decade. The DISQOVER package implements key tools in the R programming environment for statistical computing. This implementation has three main goals: 1) provide a user-friendly, transparent, and open implementation of the methods; 2) provide full flexibility in all parameters (including the underlying pollen dispersal model); and 3) provide a sandbox for testing the sensitivity of the methods. We illustrate the possibilities of the package with tests of the REVEALS model and of the extended downscaling approach (EDA). REVEALS (Sugita 2007) is designed to translate pollen data from large lakes into regional vegetation composition. We applied REVEALSinR to pollen data from Lake Tiefer See (NE Germany) and validated the results with historic landcover data. The results clearly show that REVEALS is sensitive to the underlying pollen dispersal model; REVEALS performs best when applied with the state-of-the-art Lagrangian stochastic dispersal model. REVEALS applications with the conventional Gauss model can produce realistic results, but only if unrealistic pollen productivity estimates are used. The EDA (Theuerkauf et al. 2014) employs pollen data from many sites across a landscape to explore whether species distributions in the past were related to known stable patterns in the landscape, e.g. the distribution of soil types. The approach had so far only been implemented in simple settings with few taxa. Tests with EDAinR show that it produces sharp results in complex settings with many taxa as well. The DISQOVER package is open source software, available from disqover.uni-greifswald.de. This website can be used as a platform to discuss and improve quantitative methods in vegetation reconstruction. To introduce the tool we plan a short course in autumn of this year.
This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution (ICLEA; www.iclea.de) of the Helmholtz Association (Grant Number VH-VI-415) and is supported by Helmholtz infrastructure of the Terrestrial Environmental Observatory (TERENO) North-eastern Germany.

  2. Co-evolutionary Analysis of Domains in Interacting Proteins Reveals Insights into Domain–Domain Interactions Mediating Protein–Protein Interactions

    PubMed Central

    Jothi, Raja; Cherukuri, Praveen F.; Tasneem, Asba; Przytycka, Teresa M.

    2006-01-01

    Recent advances in functional genomics have helped generate large-scale high-throughput protein interaction data. Such networks, though extremely valuable towards molecular level understanding of cells, do not provide any direct information about the regions (domains) in the proteins that mediate the interaction. Here, we performed co-evolutionary analysis of domains in interacting proteins in order to understand the degree of co-evolution of interacting and non-interacting domains. Using a combination of sequence and structural analysis, we analyzed protein–protein interactions in F1-ATPase, Sec23p/Sec24p, DNA-directed RNA polymerase and nuclear pore complexes, and found that interacting domain pair(s) for a given interaction exhibits higher level of co-evolution than the noninteracting domain pairs. Motivated by this finding, we developed a computational method to test the generality of the observed trend, and to predict large-scale domain–domain interactions. Given a protein–protein interaction, the proposed method predicts the domain pair(s) that is most likely to mediate the protein interaction. We applied this method on the yeast interactome to predict domain–domain interactions, and used known domain–domain interactions found in PDB crystal structures to validate our predictions. Our results show that the prediction accuracy of the proposed method is statistically significant. Comparison of our prediction results with those from two other methods reveals that only a fraction of predictions are shared by all the three methods, indicating that the proposed method can detect known interactions missed by other methods. We believe that the proposed method can be used with other methods to help identify previously unrecognized domain–domain interactions on a genome scale, and could potentially help reduce the search space for identifying interaction sites. PMID:16949097

  3. Simultaneous Microwave Extraction and Separation of Volatile and Non-Volatile Organic Compounds of Boldo Leaves. From Lab to Industrial Scale

    PubMed Central

    Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid

    2014-01-01

    Microwave extraction and separation has been used to increase the concentration of the extract compared to the conventional method with the same solid/liquid ratio, reducing extraction time while simultaneously separating volatile organic compounds (VOC) from non-volatile organic compounds (NVOC) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOC from the plant at laboratory scale. The results from the statistical analysis revealed that the optimized conditions were: microwave power 200 W, extraction time 56 min, and solid/liquid ratio of 7.5% of plants in water. The lab-scale optimized microwave method was compared to conventional distillation and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio is kept in order to upscale from lab to pilot plant. PMID:24776762

  4. Sequence of eruptive events in the Vesuvio area recorded in shallow-water Ionian Sea sediments

    NASA Astrophysics Data System (ADS)

    Taricco, C.; Alessio, S.; Vivaldo, G.

    2008-01-01

    The dating of the cores we drilled from the Gallipoli terrace in the Gulf of Taranto (Ionian Sea), previously obtained by tephroanalysis, is checked by applying a method to objectively recognize volcanic events. This automatic statistical procedure allows pulse-like features in a series to be identified and the confidence level at which significant peaks are detected to be evaluated quantitatively. We applied it to the 2000-year-long pyroxene series of the GT89-3 core, on which the dating is based. The method confirms the dating previously performed, detecting at a high confidence level the peaks originally used, and indicates a few possible undocumented eruptions. Moreover, a spectral analysis, focused on the long-term variability of the pyroxene series and performed with several advanced methods, reveals that the volcanic pulses are superimposed on a millennial trend and a 400-year oscillation.
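A simplified stand-in for such objective pulse detection is a robust-threshold test; the median/MAD background model and the z = 5 cutoff here are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def detect_pulses(series, z=5.0):
    """Flag pulse-like peaks exceeding z robust standard deviations above the
    background; a simplified stand-in for the paper's objective procedure."""
    x = np.asarray(series, dtype=float)
    med = np.median(x)
    sigma = np.median(np.abs(x - med)) * 1.4826  # MAD-based robust sigma
    return np.flatnonzero(x > med + z * sigma)

rng = np.random.default_rng(3)
series = rng.normal(10.0, 1.0, 500)  # background pyroxene-like counts
series[[120, 340]] += 15.0           # two injected "eruption" pulses
print(detect_pulses(series))         # flags the two injected events
```

Using the median and MAD rather than mean and standard deviation keeps the background estimate from being inflated by the very pulses one is trying to detect.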

  5. Structure, optical and phonon properties of bulk and nanocrystalline Al2-xScx(WO4)3 solid solutions doped with Cr3+

    NASA Astrophysics Data System (ADS)

    Mączka, M.; Hermanowicz, K.; Pietraszko, A.; Yordanova, A.; Koseva, I.

    2014-01-01

    Pure and Cr3+-doped nanosized Al2-xScx(WO4)3 solid solutions were prepared by a co-precipitation method, and Al2-xScx(WO4)3 single crystals were grown by a high-temperature flux method. The obtained samples were characterized by X-ray, Raman, IR, absorption and luminescence methods. Single-crystal X-ray diffraction showed that AlSc(WO4)3 is orthorhombic at room temperature with space group Pnca and that the trivalent cations are statistically distributed. Raman and IR studies showed that Al2-xScx(WO4)3 solid solutions exhibit "two-mode" behavior. They also showed that the vibrational properties of the nanosized samples are only weakly modified in comparison with the bulk materials. The luminescence and absorption spectra revealed that chromium ions occupy two sites of weak and strong crystal field strength.

  6. The transmission of fluctuation among price indices based on Granger causality network

    NASA Astrophysics Data System (ADS)

    Sun, Qingru; Gao, Xiangyun; Wen, Shaobo; Chen, Zhihua; Hao, Xiaoqing

    2018-09-01

    In this paper, we provide a statistical physics method for analyzing the transmission of fluctuations by constructing a Granger causality network among price indices (PIGCN) from a systematic perspective, combining complex network theory with the Granger causality method. In the economic system there are numerous price indices, whose relationships are extremely complicated. Thus, time series data for 6 types of price indices of China, comprising 113 kinds of sub price indices, are selected as an example for the empirical study. Through analysis of the structure of the PIGCN, we identify important price indices with high transmission range, high intermediation capacity, and high cohesion, as well as the fluctuation transmission paths of price indices. Furthermore, dynamic relationships among price indices are revealed. Based on these results, we provide several policy implications for monitoring the diffusion of risk from price fluctuations. Our method can also be used to study the price indices of other countries, as it is generally applicable.
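The pairwise test underlying such a network can be sketched with plain numpy as a nested-OLS Granger F-test; the lag order and simulated series are illustrative, and the paper's network construction (lag selection, significance thresholds, 113 indices as nodes) is considerably more involved.

```python
import numpy as np

def lagmat(s, lag):
    """Columns s[t-1], ..., s[t-lag] for t = lag, ..., len(s)-1."""
    return np.column_stack([s[lag - k - 1:len(s) - k - 1] for k in range(lag)])

def granger_f(x, y, lag=2):
    """F-statistic for 'x Granger-causes y': compare a restricted OLS model
    (lags of y only) with a full model that adds lags of x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    Y = y[lag:]
    Zr = np.column_stack([np.ones(len(Y)), lagmat(y, lag)])
    Zf = np.column_stack([Zr, lagmat(x, lag)])
    rss = lambda Z: float(np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2))
    num = (rss(Zr) - rss(Zf)) / lag
    den = rss(Zf) / (len(Y) - Zf.shape[1])
    return num / den

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()  # y is driven by lagged x
print(granger_f(x, y) > granger_f(y, x))  # the x -> y direction dominates
```

Repeating this test over all ordered pairs of index series and drawing an edge where the F-statistic is significant yields the directed causality network analyzed in the paper.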

  7. Influence of Manufacturing Methods of Implant-Supported Crowns on External and Internal Marginal Fit: A Micro-CT Analysis.

    PubMed

    Moris, Izabela C M; Monteiro, Silas Borges; Martins, Raíssa; Ribeiro, Ricardo Faria; Gomes, Erica A

    2018-01-01

    To evaluate the influence of different manufacturing methods of single implant-supported metallic crowns on the internal and external marginal fit through computed microtomography. Forty external hexagon implants were divided into 4 groups (n = 8) according to the manufacturing method: GC, conventional casting; GI, induction casting; GP, plasma casting; and GCAD, CAD/CAM machining. The crowns were attached to the implants with an insertion torque of 30 N·cm. The external (vertical and horizontal) marginal fit and internal fit were assessed through computed microtomography. Internal and external marginal fit data (μm) were submitted to one-way ANOVA and Tukey's test (α = .05). Qualitative evaluation of the images was conducted using micro-CT. The statistical analysis revealed no significant difference between the groups for vertical misfit (P = 0.721). There was no significant difference (P > 0.05) in internal and horizontal marginal misfit among the groups GC, GI, and GP, but a difference was found for the group GCAD (P ≤ 0.05). Qualitative analysis revealed that most of the samples of the cast groups exhibited crown underextension, while the group GCAD showed overextension. The manufacturing method of the crowns influenced the accuracy of marginal fit between the prosthesis and implant. The best results were found for the crowns fabricated through CAD/CAM machining.

  8. Bootstrap Methods: A Very Leisurely Look.

    ERIC Educational Resources Information Center

    Hinkle, Dennis E.; Winstead, Wayland H.

    The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
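The paper illustrates the bootstrap with a SAS routine; the same idea can be sketched in Python (the sample data and replicate count below are arbitrary):

```python
import numpy as np

def bootstrap_se(data, stat=np.mean, n_boot=2000, seed=0):
    """Bootstrap standard error: resample with replacement, recompute the
    statistic, and take the standard deviation of the replicates."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    reps = [stat(rng.choice(data, size=data.size, replace=True))
            for _ in range(n_boot)]
    return float(np.std(reps, ddof=1))

sample = np.array([2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7])
se_boot = bootstrap_se(sample)
se_analytic = sample.std(ddof=1) / np.sqrt(sample.size)  # known for the mean
print(abs(se_boot - se_analytic) < 0.1)  # bootstrap tracks the analytic SE
```

For the sample mean the bootstrap estimate can be checked against the closed-form standard error; the method's value is that the same resampling loop works unchanged for statistics with no closed form, such as discriminant or factor coefficients.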

  9. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalski, D; Huq, M; Bednarz, G

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same is true for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with breathing period and embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
    Dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear-based methodology, which reflects respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.

  10. The application of artificial intelligence to microarray data: identification of a novel gene signature to identify bladder cancer progression.

    PubMed

    Catto, James W F; Abbod, Maysam F; Wild, Peter J; Linkens, Derek A; Pilarsky, Christian; Rehman, Ishtiaq; Rosario, Derek J; Denzinger, Stefan; Burger, Maximilian; Stoehr, Robert; Knuechel, Ruth; Hartmann, Arndt; Hamdy, Freddie C

    2010-03-01

    New methods for identifying bladder cancer (BCa) progression are required. Gene expression microarrays can reveal insights into disease biology and identify novel biomarkers. However, these experiments produce large datasets that are difficult to interpret. Our objective was to develop a novel method of microarray analysis combining two forms of artificial intelligence (AI), neurofuzzy modelling (NFM) and artificial neural networks (ANN), and to validate it in a BCa cohort. We used AI and statistical analyses to identify progression-related genes in a microarray dataset (n=66 tumours, n=2800 genes). The AI-selected genes were then investigated in a second cohort (n=262 tumours) using immunohistochemistry. We compared the accuracy of AI and statistical approaches to identify tumour progression. AI identified 11 progression-associated genes (odds ratio [OR]: 0.70; 95% confidence interval [CI], 0.56-0.87; p=0.0004), and these were more discriminating than genes chosen using statistical analyses (OR: 1.24; 95% CI, 0.96-1.60; p=0.09). The expression of six AI-selected genes (LIG3, FAS, KRT18, ICAM1, DSG2, and BRCA2) was determined using commercial antibodies and successfully identified tumour progression (concordance index: 0.66; log-rank test: p=0.01). AI-selected genes were more discriminating than pathologic criteria at determining progression (Cox multivariate analysis: p=0.01). Limitations include the use of statistical correlation to identify 200 genes for AI analysis and the fact that we did not compare regression-identified genes with immunohistochemistry. AI and statistical analyses use different techniques of inference to determine gene-phenotype associations and identify distinct prognostic gene signatures that are equally valid. We have identified a prognostic gene signature whose members reflect a variety of carcinogenic pathways and that could identify progression in non-muscle-invasive BCa. 2009 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  11. A wind proxy based on migrating dunes at the Baltic coast: statistical analysis of the link between wind conditions and sand movement

    NASA Astrophysics Data System (ADS)

    Bierstedt, Svenja E.; Hünicke, Birgit; Zorita, Eduardo; Ludwig, Juliane

    2017-07-01

    We statistically analyse the relationship between the structure of migrating dunes in the southern Baltic and the driving wind conditions over the past 26 years, with the long-term aim of using migrating dunes as a proxy for past wind conditions at interannual resolution. The present analysis is based on the dune record derived from geo-radar measurements by Ludwig et al. (2017). The dune system is located at the Baltic Sea coast of Poland and is migrating from west to east along the coast. The dunes present layers with different thicknesses that can be assigned to absolute dates at interannual timescales and put in relation to seasonal wind conditions. To statistically analyse this record and calibrate it as a wind proxy, we used a gridded regional meteorological reanalysis data set (coastDat2) covering recent decades. The identified link between the dune annual layers and wind conditions was additionally supported by the co-variability between dune layers and observed sea level variations in the southern Baltic Sea. We include precipitation and temperature in our analysis, in addition to wind, to learn more about the dependency between these three atmospheric factors and their common influence on the dune system. We set up a statistical linear model based on the correlation between the frequency of days with specific wind conditions in a given season and dune migration velocities derived for that season. To some extent, the dune records can be seen as analogous to tree-ring width records, and hence we use a proxy validation method usually applied in dendrochronology, cross-validation with the leave-one-out method, when the observational record is short. The correlations between the wind record from the reanalysis and the wind record derived from the dune structure are in the range of 0.28 to 0.63, yielding statistical validation skill similar to that of dendroclimatological records.
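Leave-one-out cross-validation of such a linear proxy calibration can be sketched as follows; the wind and dune-migration numbers are synthetic stand-ins for the coastDat2 reanalysis and the geo-radar dune record.

```python
import numpy as np

def loo_cv_correlation(x, y):
    """Leave-one-out validation of a linear calibration: refit on n-1 points,
    predict the held-out point, then correlate predictions with observations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    preds = np.empty_like(y)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        preds[i] = slope * x[i] + intercept
    return float(np.corrcoef(preds, y)[0, 1])

rng = np.random.default_rng(7)
wind = rng.uniform(10, 40, 26)             # strong-wind days per season (26 yr)
dune = 0.3 * wind + rng.normal(0, 2, 26)   # dune migration velocity proxy
print(loo_cv_correlation(wind, dune) > 0.4)
```

Because each prediction is made for a year the model never saw, the resulting correlation is an honest skill measure for a short record, which is why the same scheme is standard in dendroclimatology.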

  12. Prevalence of Tuberculosis among Veterans, Military Personnel and their Families in East Azerbaijan Province Violators of the last 15 Years.

    PubMed

    Azad Aminjan, Maboud; Moaddab, Seyyed Reza; Hosseini Ravandi, Mohammad; Kazemi Haki, Behzad

    2015-10-01

    Tuberculosis is now the second largest killer of adults worldwide after HIV. Because garrisons are often located in hazardous zones, soldiers and army personnel are considered a high-risk group; we therefore decided to determine the prevalence of tuberculosis in this population. This was a cross-sectional descriptive study of the prevalence of pulmonary tuberculosis in soldiers and military personnel over the last 15 years at the Tuberculosis and Lung Disease Research Center of Tabriz University of Medical Sciences. The statistical population consisted of all the soldiers and military personnel. Detection in this study was based on microscopic examination following Ziehl-Neelsen staining and on culture on Löwenstein-Jensen medium. Descriptive statistics were used for statistical analysis, and p values less than 0.05 were considered significant. A review of records at this center from 1988 to 2013 identified 72 military-associated cases of tuberculosis: 30 women and 42 men, comprising 14 soldiers, 29 family members, and 29 military personnel. A significant correlation was found between TB rates among military personnel and their families. Although national statistics indicate a decline of tuberculosis in recent years, the results of our study show that TB is still a serious disease, and military personnel and their families presenting with its first symptoms should be diagnosed as soon as possible.

  13. Coping strategies as mediators and moderators between stress and quality of life among parents of children with autistic disorder.

    PubMed

    Dardas, Latefa A; Ahmad, Muayyad M

    2015-02-01

    The purpose of this cross-sectional study was to examine coping strategies as mediators and moderators between stress and quality of life (QoL) among parents of children with autistic disorder. The convenience sample of the study consisted of 184 parents of children with autistic disorder. Advanced statistical methods for analyses of mediator and moderator effects of coping strategies were used. The results revealed that 'accepting responsibility' was the only mediator strategy in the relationship between stress and QoL. The results also revealed that only 'seeking social support' and 'escape avoidance' were moderator strategies in the relationship between stress and QoL. This study is perhaps the first to investigate the mediating and moderating effects of coping on QoL of parents of children with autistic disorder. Recommendations for practice and future research are presented. © 2013 John Wiley & Sons, Ltd.

  14. Systematic review of practice guideline dissemination and implementation strategies for healthcare teams and team-based practice.

    PubMed

    Medves, Jennifer; Godfrey, Christina; Turner, Carly; Paterson, Margo; Harrison, Margaret; MacKenzie, Lindsay; Durando, Paola

    2010-06-01

    To synthesize the literature relevant to guideline dissemination and implementation strategies for healthcare teams and team-based practice. A systematic approach utilising Joanna Briggs Institute methods was taken. Two reviewers screened all articles and, where there was disagreement, a third reviewer determined inclusion. The initial search revealed 12,083 articles, of which 88 met the inclusion criteria. Ten dissemination and implementation strategies were identified, with distribution of educational materials the most common. Studies were assessed for patient or practitioner outcomes and changes in practice, knowledge and economic outcomes. A descriptive analysis revealed that multiple approaches using teams of healthcare providers were reported to have statistically significant results in knowledge, practice and/or outcomes for 72.7% of the studies. Team-based care using locally adapted practice guidelines can positively affect patient and provider outcomes. © 2010 The Authors. Journal Compilation © Blackwell Publishing Asia Pty Ltd.

  15. Pink Ribbons and Red Dresses: A Mixed Methods Content Analysis of Media Coverage of Breast Cancer and Heart Disease.

    PubMed

    Champion, Claudine; Berry, Tanya R; Kingsley, Bethan; Spence, John C

    2016-10-01

    This research examined media coverage of breast cancer (n = 145) and heart disease and stroke (n = 39) news articles, videos, advertisements, and images in a local Canadian context through quantitative and thematic content analyses. Quantitative analysis revealed significant differences between coverage of the diseases in placement, survivors as a source of information, health agency, human interest stories, citation of a research study, the inclusion of risk statistics, discussion of preventative behaviors, and tone used. The thematic analysis revealed themes that characterized a "typical" breast cancer survivor and indicated that "good" citizens and businesses should help the cause of breast cancer. Themes for heart disease and stroke articulated individual responsibility and the ways fundraising reinforced femininity and privilege. Findings provide insight on how these diseases are framed in local Canadian media, which might impact an individual's understanding of the disease.

  16. Headspace screening: A novel approach for fast quality assessment of the essential oil from culinary sage.

    PubMed

    Cvetkovikj, Ivana; Stefkov, Gjoshe; Acevska, Jelena; Karapandzova, Marija; Dimitrovska, Aneta; Kulevanova, Svetlana

    2016-07-01

    Quality assessment of essential oil (EO) from culinary sage (Salvia officinalis L., Lamiaceae) is limited by the long pharmacopoeial procedure. The aim of this study was to employ headspace (HS) sampling in the quality assessment of sage EO. Different populations (30) of culinary sage were assessed using GC/FID/MS analysis of the hydrodistilled EO (pharmacopoeial method) and HS sampling directly from leaves. Compound profiles from both procedures were evaluated according to ISO 9909 and GDC standards for sage EO quality, revealing compliance for only 10 populations. Factors to convert HS values, for the target ISO and GDC components, into theoretical EO values were calculated. Statistical analysis revealed a significant relationship between HS and EO values for seven target components. Consequently, HS sampling could be used as a complementary extraction technique for rapid screening in quality assessment of sage EOs. Copyright © 2016 Elsevier Ltd. All rights reserved.
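
    The conversion step described above can be sketched numerically. This is a hypothetical illustration, not the authors' procedure: the paired headspace (HS) and essential-oil (EO) percentages below are invented, and the conversion factor is estimated as a least-squares slope through the origin.

```python
# Hedged sketch: estimating a factor that converts headspace (HS)
# percentages into theoretical essential-oil (EO) percentages for one
# target component. All numbers are invented for illustration.
def conversion_factor(hs_values, eo_values):
    """Least-squares slope through the origin: sum(h*e) / sum(h*h)."""
    num = sum(h * e for h, e in zip(hs_values, eo_values))
    den = sum(h * h for h in hs_values)
    return num / den

# Hypothetical paired measurements for one component (e.g. a thujone):
hs = [12.0, 15.5, 9.8, 14.2, 11.1]   # % in headspace profile (made up)
eo = [24.5, 30.9, 20.1, 28.8, 22.0]  # % in hydrodistilled EO (made up)

factor = conversion_factor(hs, eo)
predicted_eo = factor * 13.0  # theoretical EO value for a new HS reading
```

    A factor like this would be fitted per component; a significant HS-EO relationship (as the abstract reports for seven components) is what justifies using the screening value in place of the full distillation.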

  17. Confronting Compassion Fatigue: Assessment and Intervention in Inpatient Oncology.

    PubMed

    Zajac, Lisa M; Moran, Katherine J; Groh, Carla J

    2017-08-01

    A notable variation among patient satisfaction scores with nursing care was identified. Examination of contributing factors revealed significant negative correlations between the unit death rate and surviving patients' satisfaction scores, and compassion fatigue (CF) was hypothesized to be a major contributing factor. The objective was to address CF in RNs and oncology care associates (assistive personnel) by developing an intervention providing bereavement support to staff after patient deaths. A mixed-methods sequential design was used. Instruments included the Professional Quality of Life scale and Press Ganey survey results. Univariate descriptive statistics, frequencies, an independent t test, and an analysis of covariance were used for data analysis. The preintervention results revealed average compassion satisfaction and secondary traumatic stress scores and low burnout scores. No significant difference was noted between pre- and postintervention CF scores, although patients' perception of nurses' skills improved significantly in the second quarter of 2015.

  18. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practice is gaining importance among software practitioners and researchers due to the rise of computer crime in the software industry. It has become one of the determinant factors for producing high-quality software. Even though its importance is recognized, its current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software produced.

  19. Socioscientific Argumentation: The effects of content knowledge and morality

    NASA Astrophysics Data System (ADS)

    Sadler, Troy D.; Donnelly, Lisa A.

    2006-10-01

    Broad support exists within the science education community for the incorporation of socioscientific issues (SSI) and argumentation in the science curriculum. This study investigates how content knowledge and morality contribute to the quality of SSI argumentation among high school students. We employed a mixed-methods approach: 56 participants completed tests of content knowledge and moral reasoning as well as interviews, related to SSI topics, which were scored based on a rubric for argumentation quality. Multiple regression analyses revealed no statistically significant relationships among content knowledge, moral reasoning, and argumentation quality. Qualitative analyses of the interview transcripts supported the quantitative results in that participants very infrequently revealed patterns of content knowledge application. However, most of the participants did perceive the SSI as moral problems. We propose a “Threshold Model of Knowledge Transfer” to account for the relationship between content knowledge and argumentation quality. Implications for science education are discussed.

  20. A High-Throughput Approach for Identification of Nontuberculous Mycobacteria in Drinking Water Reveals Relationship between Water Age and Mycobacterium avium

    PubMed Central

    Haig, Sarah-Jane; Kotlarz, Nadine; LiPuma, John J.

    2018-01-01

    Nontuberculous mycobacteria (NTM) frequently detected in drinking water (DW) include species associated with human infections, as well as species rarely linked to disease. Methods for improved recovery of NTM DNA and high-throughput identification of NTM are needed for risk assessment of NTM infection through DW exposure. In this study, different methods of recovering bacterial DNA from DW were compared, revealing that a phenol-chloroform DNA extraction method yielded two to four times as much total DNA and eight times as much NTM DNA as two commercial DNA extraction kits. This method, combined with high-throughput, single-molecule real-time sequencing of NTM rpoB genes, allowed the identification of NTM to the species, subspecies, and (in some cases) strain levels. This approach was applied to DW samples collected from 15 households serviced by a chloraminated distribution system, with homes located in areas representing short (<24 h) and long (>24 h) distribution system residence times. Multivariate statistical analysis revealed that greater water age (i.e., combined distribution system residence time and home plumbing stagnation time) was associated with a greater relative abundance of Mycobacterium avium subsp. avium, one of the most prevalent NTM causing infections in humans. DW from homes closer to the treatment plant (with a shorter water age) contained more diverse NTM species, including Mycobacterium abscessus and Mycobacterium chelonae. Overall, our approach allows NTM identification to the species and subspecies levels and can be used in future studies to assess the risk of waterborne infection by providing insight into the similarity between environmental and infection-associated NTM. PMID:29440575

  1. A comment on "Novel scavenger removal trials increase wind turbine-caused avian fatality estimates"

    USGS Publications Warehouse

    Huso, Manuela M.P.; Erickson, Wallace P.

    2013-01-01

    In a recent paper, Smallwood et al. (2010) conducted a study to compare their “novel” approach to conducting carcass removal trials with what they term the “conventional” approach and to evaluate the effects of the different methods on estimated avian fatality at a wind power facility in California. A quick glance at Table 3, which succinctly summarizes their results and provides estimated fatality rates and 80% confidence intervals calculated using the 2 methods, reveals a surprising result. The confidence intervals of all of their novel estimates and most of the conventional estimates extend below 0. These results imply that wind turbines may have the capacity to create live birds. But a more likely interpretation is that a serious error occurred in the calculation of the average fatality rate, its standard error, or both. Further evaluation of their methods reveals that the scientific basis for concluding that “many estimates of scavenger removal rates prior to [their] study were likely biased low due to scavenger swamping” and that “previously reported estimates of avian fatality rates … should be adjusted upwards” was not evident in their analysis and results. Their comparison to conventional approaches was not applicable, their statistical models were questionable, and the conclusions they drew were unsupported.

  2. Microbiological Assessment of Moringa Oleifera Extracts and Its Incorporation in Novel Dental Remedies against Some Oral Pathogens

    PubMed Central

    Elgamily, Hanaa; Moussa, Amani; Elboraey, Asmaa; EL-Sayed, Hoda; Al-Moghazy, Marwa; Abdalla, Aboelfetoh

    2016-01-01

    AIM: To assess the antibacterial and antifungal potential of different parts of the Moringa oleifera plant, using different extraction methods, in an attempt to formulate natural dental remedies from this plant. MATERIAL AND METHODS: Three solvent extracts (ethanol, acetone, and ethyl acetate) of different parts of the Egyptian Moringa tree were prepared and tested against the oral pathogens Staphylococcus aureus, Streptococcus mutans, and Candida albicans using the disc diffusion method; the plant extract was also incorporated into an experimental toothpaste and mouthwash, and these two dental remedies were assessed against the same microbial strains. Statistical analysis was performed using a one-way ANOVA test to compare inhibition zone diameters, together with a t-test. RESULTS: Ethanol extracts, as well as leaf extracts, demonstrated the highest significant mean inhibition zone values (P ≤ 0.05) against Staphylococcus aureus and Streptococcus mutans growth. However, all extracts revealed no inhibition zone against Candida albicans. For the dental remedies, the experimental toothpaste exhibited higher mean inhibition than the mouthwash against Staphylococcus aureus and Streptococcus mutans, and only the toothpaste revealed an antifungal effect against Candida albicans. CONCLUSION: The different extracts of different parts of Moringa showed an antibacterial effect against Staphylococcus aureus and Streptococcus mutans growth. The novel toothpaste of ethanolic leaf extract has potential antimicrobial and antifungal effects against all selected strains. PMID:28028395

  3. Consortium for Mathematics in the Geosciences (CMG++): Promoting the application of mathematics, statistics, and computational sciences to the geosciences

    NASA Astrophysics Data System (ADS)

    Mead, J.; Wright, G. B.

    2013-12-01

    The collection of massive amounts of high-quality data from new and greatly improved observing technologies and from large-scale numerical simulations is drastically improving our understanding and modeling of the earth system. However, these datasets are also revealing important knowledge gaps and limitations of our current conceptual models for explaining key aspects of these new observations. These limitations are impeding progress on questions that have both fundamental scientific and societal significance, including climate and weather, natural disaster mitigation, earthquake and volcano dynamics, earth structure and geodynamics, resource exploration, and planetary evolution. New conceptual approaches and numerical methods for characterizing and simulating these systems are needed - methods that can handle processes that vary through a myriad of scales in heterogeneous, complex environments. Additionally, as certain aspects of these systems may be observable only indirectly or not at all, new statistical methods are also needed. This type of research will demand integrating the expertise of geoscientists with that of mathematicians, statisticians, and computer scientists. If the past is any indicator, this interdisciplinary research will no doubt lead to advances in all these fields in addition to vital improvements in our ability to predict the behavior of the planetary environment. The Consortium for Mathematics in the Geosciences (CMG++) arose from two scientific workshops held at Northwestern and Princeton in 2011 and 2012 with participants from mathematics, statistics, geoscience, and computational science. The mission of CMG++ is to accelerate the interaction between people in these disciplines through the promotion of both collaborative research and interdisciplinary education. We will discuss current activities, describe how people can get involved, and solicit input from the broader AGU community.

  4. Fighting bias with statistics: Detecting gender differences in responses to items on a preschool science assessment

    NASA Astrophysics Data System (ADS)

    Greenberg, Ariela Caren

    Differential item functioning (DIF) and differential distractor functioning (DDF) are methods used to screen for item bias (Camilli & Shepard, 1994; Penfield, 2008). Using an applied empirical example, this mixed-methods study examined the congruency and relationship of DIF and DDF methods in screening multiple-choice items. Data for Study I were drawn from item responses of 271 female and 236 male low-income children on a preschool science assessment. Item analyses employed a common statistical approach, the Mantel-Haenszel log-odds ratio (MH-LOR), to detect DIF in dichotomously scored items (Holland & Thayer, 1988), and extended the approach to identify DDF (Penfield, 2008). Findings demonstrated that using MH-LOR to detect DIF and DDF supported the theoretical relationship that the magnitude and form of DIF are dependent on the DDF effects, and demonstrated the advantages of studying DIF and DDF together in multiple-choice items. A total of 4 items with DIF and DDF and 5 items with only DDF were detected. Study II incorporated an item content review, an important but often overlooked and under-published step of DIF and DDF studies (Camilli & Shepard, 1994). Interviews with 25 female and 22 male low-income preschool children and an expert review helped to interpret the DIF and DDF results, and determined that a content review of studied items can reveal reasons for potential item bias that are often congruent with the statistical results. Patterns emerged and are discussed in detail. The quantitative and qualitative analyses were conducted in an applied framework of examining the validity of the preschool science assessment scores for evaluating science programs serving low-income children; however, the techniques can be generalized for use with measures across various disciplines of research.
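
    The Mantel-Haenszel log-odds ratio mentioned above can be sketched as follows. This is a minimal illustration, not the study's analysis: the per-stratum counts are invented, and strata stand in for matched ability groups (e.g. total-score levels).

```python
import math

# Hedged sketch of the Mantel-Haenszel log-odds ratio (MH-LOR) used to
# flag DIF in a dichotomously scored item. All counts are hypothetical.
def mh_log_odds_ratio(strata):
    """strata: list of (ref_correct, ref_wrong, focal_correct, focal_wrong)
    tuples, one per ability stratum. Returns ln of the MH common odds ratio."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        num += a * d / n   # reference-correct * focal-wrong
        den += b * c / n   # reference-wrong * focal-correct
    return math.log(num / den)

# Three hypothetical ability strata for one item (reference vs focal group):
strata = [(40, 10, 30, 20), (35, 15, 28, 22), (20, 5, 15, 10)]
lor = mh_log_odds_ratio(strata)
# |MH-LOR| near 0 suggests negligible DIF; a large |MH-LOR| flags the item.
```

    In practice the statistic is paired with a standard error and significance test, and (as in the study) extended to distractor-level counts to examine DDF.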

  5. Effects of Storytelling-Based Education in the Prevention of Drug Abuse among Adolescents in Iran Based on a Readiness to Addiction Index

    PubMed Central

    Moghadam, Mahdieh Poodineh; Sari, Mahdieh; Balouchi, Abbas; Moghadam, Khadijeh

    2016-01-01

    Introduction One of the most effective strategies in the prevention of addiction is increasing awareness among young people of the tendency to take drugs and of their physical, mental, and social side effects. Storytelling is effective for increasing happiness and resilience. This study uses storytelling, a common and popular method, to increase awareness among adolescents. Aim To examine the effect of storytelling-based education on the prevention of drug abuse, based on a readiness to addiction index. Materials and Methods This quasi-experimental study was conducted on 136 high school students (grade one), selected by a cluster sampling procedure, from May 2014 to February 2015 in Zabol, Iran. The instrument for gathering data was a readiness to addiction questionnaire comprising 41 items, each scored in a Likert format. The data gathered were analysed using SPSS version 21 with descriptive and inferential statistics. Results The results revealed that the mean of the readiness to addiction index in the case group fell from 75.66±19.99 to 69.57±21.83 (paired t-test; p = 0.02); in the control group the same index changed from 103.01±21.88 to 93.98±27.70 (paired t-test, p = 0.775). That is, the index decreased for both groups, but the reduction was statistically significant only for the case group (p = 0.02). Conclusion This suggests that the narrative method is effective in reducing adolescents' readiness to addiction. Storytelling is an effective way to raise awareness among young people about addiction and its detrimental impacts on health. Therefore, such a technique can be taken into consideration in teaching principles of prevention. PMID:28050403
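
    The paired t-test used for the pre/post comparison above can be sketched in a few lines. This is a generic illustration, not the study's SPSS analysis: the scores are invented, and only the t statistic is computed (it would be compared against a t distribution with n-1 degrees of freedom).

```python
import math

# Hedged sketch: paired t statistic for pre- vs post-intervention scores.
# Hypothetical "readiness to addiction" scores, not data from the study.
def paired_t(pre, post):
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean / se   # compare with t distribution, df = n - 1

pre  = [80.0, 75.0, 90.0, 70.0, 85.0, 78.0]  # hypothetical pre scores
post = [72.0, 70.0, 85.0, 66.0, 80.0, 74.0]  # hypothetical post scores
t = paired_t(pre, post)   # positive t: index dropped after the intervention
```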

  6. The Fluoride Content of Yerba Mate Depending on the Country of Origin and the Conditions of the Infusion.

    PubMed

    Łukomska, A; Jakubczyk, K; Maciejewska, D; Baranowska-Bosiacka, I; Janda, K; Goschorska, M; Chlubek, D; Bosiacka, B; Gutowska, I

    2015-10-01

    There are many reports of the positive effect of yerba mate on the human body. Elemental composition analysis of yerba mate has revealed the presence of many microelements and macroelements, but there are no literature data on the effect of the method of preparing the yerba mate infusion on the amount of released fluoride and thus on the amount of this element supplied to the human body. Therefore, we prepared infusions of yerba mate from different countries in the traditional ways (cold and hot) and determined the fluoride content of the samples using a potentiometric method. Hot infusions resulted in statistically significant (p = 0.03) increases in the amount of fluoride released from the dried material into the water, compared with brewing with water at room temperature. Successive refills of hot water also released fluoride, although less than the infusion with water at room temperature (at the third refill, the difference was statistically significant at p = 0.003). With an increase in the number of hot water refills, the amount of fluoride released from the sample portion significantly decreased. Similar results were recorded when analyzing samples by country of origin: the amount of fluoride released into the water differed statistically significantly depending on the country of origin. The most fluoride was determined in the infusions of yerba mate from Argentina and the least in infusions from Paraguay.

  7. The Checkered History of American Psychiatric Epidemiology

    PubMed Central

    Horwitz, Allan V; Grob, Gerald N

    2011-01-01

    Context American psychiatry has been fascinated with statistics ever since the specialty was created in the early nineteenth century. Initially, psychiatrists hoped that statistics would reveal the benefits of institutional care. Nevertheless, their fascination with statistics was far removed from the growing importance of epidemiology generally. The impetus to create an epidemiology of mental disorders came from the emerging social sciences, whose members were concerned with developing a scientific understanding of individual and social behavior and applying it to a series of pressing social problems. Beginning in the 1920s, the interest of psychiatric epidemiologists shifted to the ways that social environments contributed to the development of mental disorders. This emphasis dramatically changed after 1980 when the policy focus of psychiatric epidemiology became the early identification and prevention of mental illness in individuals. Methods This article reviews the major developments in psychiatric epidemiology over the past century and a half. Findings The lack of an adequate classification system for mental illness has precluded the field of psychiatric epidemiology from providing causal understandings that could contribute to more adequate policies to remediate psychiatric disorders. Because of this gap, the policy influence of psychiatric epidemiology has stemmed more from institutional and ideological concerns than from knowledge about the causes of mental disorders. Conclusion Most of the problems that have bedeviled psychiatric epidemiology since its inception remain unresolved. In particular, until epidemiologists develop adequate methods to measure mental illnesses in community populations, the policy contributions of this field will not be fully realized. PMID:22188350

  8. Tsallis non-extensive statistical mechanics in the ionospheric detrended total electron content during quiet and storm periods

    NASA Astrophysics Data System (ADS)

    Ogunsua, B. O.; Laoye, J. A.

    2018-05-01

    In this paper, Tsallis non-extensive q-statistics in ionospheric dynamics was investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations, considering both geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analyzed within non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that for all the analyzed data sets a Tsallis Gaussian probability distribution (q-Gaussian) with q > 1 was obtained. There was no distinct difference in pattern between the q values for quiet and storm periods; however, the value of q varies with geophysical conditions and possibly with local dynamics at the two stations. Also observed were the asymmetric pattern of the q-Gaussian and a highly significant correlation between the q-index values obtained at the two GPS receiver stations for the storm periods compared with the quiet periods. This variation can mostly be attributed to the varying mechanisms resulting in the self-reorganization of the system dynamics during storm periods. The results show the existence of long-range correlation for both quiet and storm periods at the two stations.
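
    The q-Gaussian form behind the abstract above can be sketched briefly. This is a hedged illustration of the (unnormalized) Tsallis q-Gaussian only, not the paper's fitting procedure; the parameter values are arbitrary.

```python
import math

# Hedged sketch: the q-exponential e_q and the (unnormalized) q-Gaussian
# kernel e_q(-b*x^2) of Tsallis statistics. As q -> 1 this reduces to the
# ordinary Gaussian kernel exp(-b*x^2); q > 1 gives the heavier tails
# reported for the detrended TEC fluctuations.
def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, b=1.0):
    return q_exp(-b * x * x, q)

# Heavier tail for q = 1.5 than for the Gaussian limit (q = 1) at |x| = 3:
tail_q = q_gaussian(3.0, 1.5)   # power-law-like tail
tail_g = q_gaussian(3.0, 1.0)   # exp(-9), far smaller
```

    Fitting q in practice means normalizing this density and estimating (q, b) from the histogram of detrended fluctuations, e.g. by least squares or maximum likelihood.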

  9. Using decision trees to understand structure in missing data

    PubMed Central

    Tierney, Nicholas J; Harden, Fiona A; Harden, Maurice J; Mengersen, Kerrie L

    2015-01-01

    Objectives Demonstrate the application of decision trees—classification and regression trees (CARTs), and their cousins, boosted regression trees (BRTs)—to understand structure in missing data. Setting Data taken from employees at 3 different industrial sites in Australia. Participants 7915 observations were included. Materials and methods The approach was evaluated using an occupational health data set comprising results of questionnaires, medical tests and environmental monitoring. Statistical methods included standard statistical tests and the ‘rpart’ and ‘gbm’ packages for CART and BRT analyses, respectively, from the statistical software ‘R’. A simulation study was conducted to explore the capability of decision tree models in describing data with missingness artificially introduced. Results CART and BRT models were effective in highlighting a missingness structure in the data, related to the type of data (medical or environmental), the site in which it was collected, the number of visits, and the presence of extreme values. The simulation study revealed that CART models were able to identify variables and values responsible for inducing missingness. There was greater variation in variable importance for unstructured as compared to structured missingness. Discussion Both CART and BRT models were effective in describing structural missingness in data. CART models may be preferred over BRT models for exploratory analysis of missing data, and selecting variables important for predicting missingness. BRT models can show how values of other variables influence missingness, which may prove useful for researchers. Conclusions Researchers are encouraged to use CART and BRT models to explore and understand missing data. PMID:26124509
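
    The core idea above - a decision tree predicting a missingness indicator - can be sketched with a single CART-style split. This is a toy illustration, not the study's `rpart`/`gbm` analysis: the data are invented (a 'site' variable drives missingness), and only the first split on Gini impurity is found.

```python
# Hedged sketch: find the single variable/value split that best predicts
# a 0/1 missingness indicator, using Gini impurity decrease, as a CART
# would at its root. Data and variable names are invented.
def gini(labels):
    p = sum(labels) / len(labels)
    return 1.0 - p * p - (1.0 - p) ** 2

def best_split(rows, target):
    """rows: list of dicts; target: name of the 0/1 missingness indicator."""
    labels = [r[target] for r in rows]
    base = gini(labels)
    best = (None, None, 0.0)   # (variable, value, impurity decrease)
    for var in rows[0]:
        if var == target:
            continue
        for val in {r[var] for r in rows}:
            left = [r[target] for r in rows if r[var] == val]
            right = [r[target] for r in rows if r[var] != val]
            if not left or not right:
                continue
            w = len(left) / len(rows)
            gain = base - w * gini(left) - (1 - w) * gini(right)
            if gain > best[2]:
                best = (var, val, gain)
    return best

# Toy data: 'medical' vs 'environmental' records at two sites; results are
# missing (miss=1) only at site B, so 'site' should be the chosen split.
rows = [{"site": s, "type": t, "miss": m}
        for s, t, m in [("A", "med", 0), ("A", "env", 0), ("A", "med", 0),
                        ("B", "med", 1), ("B", "env", 1), ("B", "med", 1),
                        ("A", "env", 0), ("B", "env", 1)]]
split_var, split_val, split_gain = best_split(rows, "miss")
```

    A full CART recurses on each side of the split; the point here is only that the variable "responsible for inducing missingness" surfaces at the root.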

  10. Surgery for disc-associated wobbler syndrome in the dog--an examination of the controversy.

    PubMed

    Jeffery, N D; McKee, W M

    2001-12-01

    Controversy surrounds treatment of disc-associated 'wobbler' syndrome in the dog, centring on the choice of method of surgical decompression used. In this review, details of previously published case series are summarised and critically examined in an attempt to compare success rates and complications of different types of surgery. Unequivocally accurate comparisons were difficult because of differences in methods of case recording between series. Short-term success rates were high (approximately 80 per cent), but there was a high rate of recurrence (around 20 per cent) after any surgical treatment, suggesting the possibility that the syndrome should be considered a multifocal disease of the caudal cervical region. Statistical analysis revealed no significant differences in success rates between the various reported decompressive surgical techniques.

  11. Revealing plant cryptotypes: defining meaningful phenotypes among infinite traits.

    PubMed

    Chitwood, Daniel H; Topp, Christopher N

    2015-04-01

    The plant phenotype is infinite. Plants vary morphologically and molecularly over developmental time, in response to the environment, and genetically. Exhaustive phenotyping remains not only out of reach, but is also the limiting factor to interpreting the wealth of genetic information currently available. Although phenotyping methods are always improving, an impasse remains: even if we could measure the entirety of phenotype, how would we interpret it? We propose the concept of cryptotype to describe latent, multivariate phenotypes that maximize the separation of a priori classes. Whether the infinite points comprising a leaf outline or shape descriptors defining root architecture, statistical methods to discern the quantitative essence of an organism will be required as we approach measuring the totality of phenotype. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Comparison of urine specific gravity values from total-solids refractometry and reagent strip method.

    PubMed

    Chatasingh, S; Tapaneya-Olarn, W

    1989-01-01

    A comparison of the specific gravity values of 561 urine samples measured with a TS meter and with a reagent strip was made. The data were divided into two groups: group 1, urine samples containing less than 2+ protein, and group 2, urine samples containing 2+ protein or more. The results revealed that the specific gravity values from the two methods were statistically different in both groups (p < 0.01), but they were correlated at r = 0.84 (p < 0.001) and r = 0.73 (p < 0.001) in groups 1 and 2, respectively. It was concluded that the reagent strip is suitable for use as a screening test, but it should not be relied on when precise measurement is necessary.
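
    The correlation coefficient reported above is Pearson's r, which can be sketched as follows. This is a generic illustration with invented specific-gravity readings, not the study's data.

```python
import math

# Hedged sketch: Pearson's r between two measurement methods.
# Hypothetical specific-gravity readings, not the 561-sample data set.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

refract = [1.010, 1.015, 1.020, 1.025, 1.030]  # hypothetical TS-meter values
strip   = [1.010, 1.015, 1.015, 1.025, 1.025]  # hypothetical strip values
r = pearson_r(refract, strip)
# High r can coexist with a systematic offset, which is why the abstract
# finds the methods correlated yet statistically different.
```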

  13. Free Energy Minimization by Simulated Annealing with Applications to Lithospheric Slabs and Mantle Plumes

    NASA Astrophysics Data System (ADS)

    Bina, C. R.

    An optimization algorithm based upon the method of simulated annealing is of utility in calculating equilibrium phase assemblages as functions of pressure, temperature, and chemical composition. Operating by analogy to the statistical mechanics of the chemical system, it is applicable both to problems of strict chemical equilibrium and to problems involving metastability. The method reproduces known phase diagrams and illustrates the expected thermal deflection of phase transitions in thermal models of subducting lithospheric slabs and buoyant mantle plumes. It reveals temperature-induced changes in phase transition sharpness and the stability of Fe-rich γ phase within an α+γ field in cold slab thermal models, and it suggests that transitions such as the possible breakdown of silicate perovskite to mixed oxides can amplify velocity anomalies.
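
    The simulated-annealing idea can be sketched on a toy problem. This is a minimal, hedged illustration: the one-dimensional double-well "energy" below is invented and does not attempt to represent Gibbs energy minimization over real phase assemblages.

```python
import math, random

# Hedged sketch: simulated annealing with a Metropolis acceptance rule
# on a toy double-well energy. Minima lie near x = -1 and x = +1, with
# the tilt term 0.3*x making the left well the global minimum.
def energy(x):
    return (x * x - 1.0) ** 2 + 0.3 * x

def anneal(start, steps=20000, t0=2.0, seed=42):
    rng = random.Random(seed)
    x, e = start, energy(start)
    x_best, e_best = x, e
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-6      # linear cooling schedule
        cand = x + rng.uniform(-0.2, 0.2)       # small random move
        de = energy(cand) - e
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if de < 0 or rng.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < e_best:
                x_best, e_best = x, e
    return x_best, e_best

x_min, e_min = anneal(start=3.0)   # should settle near one of the wells
```

    Early high temperatures let the walker cross the barrier between wells (the analogue of escaping metastable assemblages); the slow cooling then freezes it into a low-energy state.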

  14. An examination of the challenges influencing science instruction in Florida elementary classrooms

    NASA Astrophysics Data System (ADS)

    North, Stephanie Gwinn

    It has been shown that the mechanical properties of thin films tend to differ from those of their bulk counterparts. Specifically, bulge and microtensile testing of thin films used in MEMS has revealed that these films demonstrate an inverse relationship between thickness and strength. A film dimension is not a material property, but it evidently does affect the mechanical performance of materials at very small thicknesses. One hypothetical explanation for this phenomenon is that as the thickness of the film decreases, it is statistically less likely that imperfections exist in the material. It would require a very small thickness (or volume) to limit imperfections in a material, which is why this phenomenon is seen in films with thicknesses on the order of 100 nm to a few microns. Another hypothesized explanation is that the surface tension that exists in bulk material also exists in thin films but has a greater impact at such a small scale. The goal of this research is to identify a theoretical prediction of the strength of thin films based on microstructural properties such as grain size and film thickness. This would minimize the need for expensive and complicated tests such as the bulge and microtensile tests. In this research, data were collected from the bulge and microtensile testing of copper, aluminum, gold, and polysilicon free-standing thin films. Statistical testing of these data revealed a definitive inverse relationship between thickness and strength, as well as between grain size and strength, as expected. However, due to the lack of a standardized method for either test, there were significant variations in the data. This research compares and analyzes the methods used by other researchers to develop a suggested set of instructions for standardized bulge and microtensile tests. The most important parameters to be controlled in each test were found to be strain rate, temperature, film deposition method, film length, and strain measurement.

  15. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. This report describes the development of validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those used for probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
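
    The basic observed statistic above - the proportion of replicates identified - can be sketched together with an interval estimate. This is a hedged illustration, not the report's procedure: the counts are invented, and the Wilson score interval shown here is one common choice for a binomial proportion, not necessarily the one the authors adopt.

```python
import math

# Hedged sketch: probability of identification (POI) as the proportion of
# replicates identified, with a Wilson 95% confidence interval. The counts
# below are hypothetical, not from the cited validation studies.
def poi_wilson(identified, replicates, z=1.96):
    p = identified / replicates
    denom = 1.0 + z * z / replicates
    centre = p + z * z / (2 * replicates)
    half = z * math.sqrt(p * (1 - p) / replicates
                         + z * z / (4 * replicates ** 2))
    return p, (centre - half) / denom, (centre + half) / denom

# Hypothetical: target material identified in 29 of 30 replicates.
p, lo, hi = poi_wilson(29, 30)
# A performance requirement might demand, e.g., that lo exceed 0.90 for
# target material (and that hi stay low for nontarget material).
```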

  16. Assessment of uncertainties of the models used in thermal-hydraulic computer codes

    NASA Astrophysics Data System (ADS)

    Gricay, A. S.; Migrov, Yu. A.

    2015-09-01

    The article deals with the problem of determining the statistical characteristics of variable parameters (the variation range and distribution law) when analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular in the closing correlations of the loop thermal hydraulics block, is shown. Such a method should feature a minimal degree of subjectivism and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in that range, provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated using, as an example, the problem of estimating the uncertainty of a parameter in the model describing the transition to post-burnout heat transfer that is used in the thermal-hydraulic computer code KORSAR. The study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in that range with a Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes, and in some cases its application can achieve a smaller degree of conservatism than expert estimates of the uncertainties pertinent to the model parameters used in computer codes.

  17. Medial tibial stress syndrome: evidence-based prevention.

    PubMed

    Craig, Debbie I

    2008-01-01

    Thacker SB, Gilchrist J, Stroup DF, Kimsey CD. The prevention of shin splints in sports: a systematic review of literature. Med Sci Sports Exerc. 2002;34(1):32-40. Among physically active individuals, which medial tibial stress syndrome (MTSS) prevention methods are most effective to decrease injury rates? Studies were identified by searching MEDLINE (1966-2000), Current Contents (1996-2000), Biomedical Collection (1993-1999), and Dissertation Abstracts. Reference lists of identified studies were searched manually until no further studies were identified. Experts in the field were contacted, including first authors of randomized controlled trials addressing prevention of MTSS. The Cochrane Collaboration (early stage of Cochrane Database of Systematic Reviews) was contacted. Inclusion criteria included randomized controlled trials or clinical trials comparing different MTSS prevention methods with control groups. Excluded were studies that did not provide primary research data or that addressed treatment and rehabilitation rather than prevention of incident MTSS. A total of 199 citations were identified. Of these, 4 studies compared prevention methods for MTSS. Three reviewers independently scored the 4 studies. Reviewers were blinded to the authors' names and affiliations but not the results. Each study was evaluated independently for methodologic quality using a 100-point checklist. Final scores were averages of the 3 reviewers' scores. Prevention methods studied were shock-absorbent insoles, foam heel pads, Achilles tendon stretching, footwear, and graduated running programs. No statistically significant results were noted for any of the prevention methods. Median quality scores ranged from 29 to 47, revealing flaws in design, control for bias, and statistical methods. No current evidence supports any single prevention method for MTSS. The most promising outcomes support the use of shock-absorbing insoles. 
Well-designed and controlled trials are critically needed to decrease the incidence of this common injury.

  18. Statistical methods for nuclear material management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, W. M.; Bennett, C. A.

    1988-12-01

    This book is intended as a reference manual of statistical methodology for nuclear material management practitioners. It describes statistical methods currently or potentially important in nuclear material management, explains the choice of methods for specific applications, and provides examples of practical applications to nuclear material management problems. Together with the accompanying training manual, which contains fully worked out problems keyed to each chapter, this book can also be used as a textbook for courses in statistical methods for nuclear material management. It should provide increased understanding and guidance to help improve the application of statistical methods to nuclear material management problems.

  19. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and the Statistical Analysis System (SAS) software are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  20. Quality of reporting statistics in two Indian pharmacology journals.

    PubMed

    Jaykaran; Yadav, Preeti

    2011-04-01

    To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' websites (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of descriptive and inferential statistics. Descriptive statistics was evaluated on the basis of the reporting of the method of description and of central tendencies. Inferential statistics was evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7-83.3%) articles, most commonly the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles, most commonly the use of a two-group test for three or more groups. Articles published in two Indian pharmacology journals are not devoid of statistical errors.
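
    The "two-group test for three or more groups" error is not merely stylistic: assuming independent comparisons at α = 0.05, the familywise false-positive rate grows quickly with the number of unadjusted pairwise tests, as this sketch shows (the independence assumption is a simplification).

```python
from math import comb

def familywise_alpha(k_groups, alpha=0.05):
    """Chance of at least one false positive when every pair of k groups
    is compared with an unadjusted two-group test (assuming independence)."""
    m = comb(k_groups, 2)          # number of pairwise comparisons
    return 1 - (1 - alpha) ** m

for k in (2, 3, 4, 5):
    print(f"{k} groups: familywise alpha ~ {familywise_alpha(k):.3f}")
```

    With three groups the effective error rate already rises to about 0.14, which is why an omnibus test such as one-way ANOVA (or a multiplicity correction) is the appropriate choice.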

  1. Comparison of the effect of lecture and blended teaching methods on students’ learning and satisfaction

    PubMed Central

    SADEGHI, ROYA; SEDAGHAT, MOHAMMAD MEHDI; SHA AHMADI, FARAMARZ

    2014-01-01

    Introduction: Blended learning, a new approach in educational planning, is defined as the application of more than one method, strategy, technique or medium in education. Today, owing to the development of Internet infrastructure and the access most students have to it, the Internet can be utilized along with traditional and conventional training methods. The aim of this study was to compare students' learning and satisfaction under a combination of lecture and e-learning with the conventional lecture method. Methods: This quasi-experimental study was conducted among sophomore students of the Public Health School, Tehran University of Medical Science, in 2012-2013. Four classes of the school were randomly selected and divided into two groups. Education in two classes (45 students) was delivered by the lecture method, and in the other two classes (48 students) by a blended method combining e-learning and lectures. The students' knowledge about tuberculosis in the two groups was measured using a pre- and post-test, administered by sending self-reported electronic questionnaires to the students' email addresses through Google Document software. At the end of the educational program, students' satisfaction with and comments on the two methods were also collected by questionnaire. Statistical analyses including descriptive methods, paired t-test, independent t-test and ANOVA were carried out in SPSS 14, and p≤0.05 was considered statistically significant. Results: The mean pre-test scores of the lecture and blended groups were 13.18±1.37 and 13.35±1.36, respectively; the difference between the two groups was not statistically significant (p=0.535). Knowledge scores increased in both groups after training, and the mean and standard deviation of knowledge scores of the lecture and blended groups were 16.51±0.69 and 16.18±1.06, respectively.
The difference between the post-test scores of the two groups was not statistically significant (p=0.112). Students' satisfaction with the blended learning method was higher than with the lecture method. Conclusion: The results revealed that the blended method is effective in increasing students' learning, and e-learning can be used to teach some courses and may also be economical. Since the majority of students at the country's universities of medical sciences have Internet access and an email address, e-learning could serve as a supplement to traditional teaching methods or sometimes as an alternative educational method, as it increases students' knowledge, satisfaction and attention. PMID:25512938

  2. First molecular genotyping of insensitive acetylcholinesterase associated with malathion resistance in Culex quinquefasciatus Say populations in Malaysia.

    PubMed

    Low, Van Lun; Chen, Chee Dhang; Lim, Phaik Eem; Lee, Han Lim; Lim, Yvonne Ai Lian; Tan, Tiong Kai; Sofian-Azirun, Mohd

    2013-12-01

    Given that there is limited available information on the insensitive acetylcholinesterase in insect species in Malaysia, the present study aims to detect the presence of G119S mutation in the acetylcholinesterase gene of Culex quinquefasciatus from 14 residential areas across 13 states and a federal territory in Malaysia. The ace-1 sequence and PCR-RFLP test revealed the presence of glycine-serine ace-1 mutation in the wild populations of Cx. quinquefasciatus. Both direct sequencing and PCR-RFLP methods demonstrated similar results and revealed the presence of a heterozygous genotype at a very low frequency (18 out of 140 individuals), while a homozygous resistant genotype was not detected across any study site in Malaysia. In addition, statistical analysis also revealed that malathion resistance is associated with the frequency of ace-1(R) in Cx. quinquefasciatus populations. This study has demonstrated the first field-evolved instance of G119S mutation in Malaysian populations. Molecular identification of insensitive acetylcholinesterase provides significant insights into the evolution and adaptation of the Malaysian Cx. quinquefasciatus populations. © 2013 Society of Chemical Industry.

  3. Feasibility of integrating mental health and noncommunicable disease risk factor screening in periodical medical examination of employees in industries: An exploratory initiative

    PubMed Central

    Sukumar, Gautham Melur; Kupatira, Kowshik; Gururaj, G.

    2015-01-01

    Background: Noncommunicable diseases (NCDs), psychological and substance use disorders, and stress-related issues are poorly understood in Indian industrial settings. Systems for screening and early identification of these conditions have not been integrated into workplaces, nor is there strong regulatory backing for them. Aim: To explore the feasibility of integrating mental health and select NCD risk factor screening with the periodical medical examination of employees, and to identify the proportion of employees with select NCD risk factors and symptoms suggestive of mental health problems. Settings and Design: Around 10% of the employees (n=706) of a leading motor industry in Bangalore participated in this cross-sectional voluntary screening program. Materials and Methods: The screening was conducted as part of the annual medical examination. A mixed method of self-report and interviewer-administered techniques was adopted. Statistical Analysis: Descriptive statistical methods (proportions, median, mean, and standard deviation (SD)) and the Chi-square test of significance. Results and Conclusions: Screening revealed tobacco use (18%), alcohol use (57%), perceived work stress (10%), and obesity (3%). Nearly 23% screened positive for psychological distress. The assessment took 1–5 min per employee. These initial attempts indicate that it is feasible to integrate screening for mental health, substance use, and NCD risk factors into periodic medical examinations using a combination of self-report and interviewer-administered methods, though further detailed assessments are necessary for confirmation. PMID:26023267

  4. Evaluation of validity of Tanaka-Johnston analysis in Mumbai school children.

    PubMed

    Hambire, Chaitali Umesh; Sujan, Sunanda

    2015-01-01

    Estimation of the mesiodistal dimensions of the unerupted canines and premolars in the early mixed dentition is a necessary diagnostic aid in space management. The Tanaka-Johnston analysis was developed for North American children, and anthropological studies reveal that tooth size varies among ethnicities. The present study was performed to evaluate the validity of the Tanaka-Johnston method of mixed dentition arch analysis in Mumbai school children: (1) to determine the correlation between the sum of the mesiodistal widths of the permanent mandibular incisors and the combined mesiodistal widths of the permanent mandibular and maxillary canines and premolars in Mumbai school children, and (2) to examine the applicability of the Tanaka-Johnston method of prediction. Dental casts of the maxillary and mandibular arches of 300 children (147 boys and 153 girls, aged 12-15 years) with permanent dentitions were fabricated. The mesiodistal crown dimensions of the teeth were measured with a dial caliper. The Tanaka-Johnston method of mixed dentition arch analysis was performed for the study population, and statistical analysis was done. Descriptive statistics including the mean, standard deviation, range, and standard error were calculated and tabulated. When the Tanaka-Johnston equation was applied to the data for Mumbai school children, it was observed to slightly overestimate tooth size. Conclusions: (1) There was a positive correlation between the width of the mandibular incisors and the mandibular and maxillary canines and premolars. (2) The Tanaka-Johnston prediction method was not accurate for a sample of Mumbai school children.
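
    For reference, the Tanaka-Johnston prediction adds a constant to half the summed mesiodistal widths of the four permanent mandibular incisors (10.5 mm for the mandibular arch, 11.0 mm for the maxillary arch). The sketch below encodes that rule; the 23.0 mm input is purely illustrative.

```python
def tanaka_johnston(sum_mand_incisors_mm):
    """Predicted combined mesiodistal width (mm) of the unerupted canine
    and two premolars in one quadrant, for each arch."""
    half = sum_mand_incisors_mm / 2.0
    return {"mandibular": half + 10.5, "maxillary": half + 11.0}

# Illustrative input: four mandibular incisors summing to 23.0 mm.
prediction = tanaka_johnston(23.0)
print(prediction)  # {'mandibular': 22.0, 'maxillary': 22.5}
```

    Validity studies such as this one compare these predicted widths with the actually measured canine-premolar segments in the local population; a systematic offset (here, overestimation) suggests the constants need regional recalibration.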

  5. [Factors conditioning primary care services utilization. Empirical evidence and methodological inconsistencies].

    PubMed

    Sáez, M

    2003-01-01

    In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs; the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to health care professionals and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox: from an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure, yet when data are analyzed for individuals, income is not related to primary care utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results, among them design problems, measurement errors, misspecification, and misleading statistical methods. Possible solutions include quasi-experiments and the use of large administrative databases and primary data sources (design problems); differentiation between types of utilization and between units of analysis other than consultations, together with correction of measurement errors in the explanatory variables (measurement errors); consideration of relevant explanatory variables (misspecification); and the use of multilevel models (statistical methods).

  6. Personality, Driving Behavior and Mental Disorders Factors as Predictors of Road Traffic Accidents Based on Logistic Regression

    PubMed Central

    Alavi, Seyyed Salman; Mohammadi, Mohammad Reza; Souri, Hamid; Mohammadi Kalhori, Soroush; Jannatifard, Fereshteh; Sepahbodi, Ghazal

    2017-01-01

    Background: The aim of this study was to evaluate the effect of variables such as personality traits, driving behavior and mental illness on road traffic accidents among drivers with and without road crashes. Methods: In this cohort study, 800 bus and truck drivers were recruited. Participants were selected among drivers who were referred to Imam Sajjad Hospital (Tehran, Iran) during 2013-2015. The Manchester driving behavior questionnaire (MDBQ), the big five personality test (NEO personality inventory) and a semi-structured interview (schizophrenia and affective disorders scale) were used. After two years, we surveyed all accidents due to human factors that involved the recruited drivers. The data were analyzed using the SPSS software with descriptive statistics, t-tests, and multiple logistic regression analysis. P values less than 0.05 were considered statistically significant. Results: After controlling for demographic and other relevant variables, the findings revealed significant differences between the two groups of drivers that were and were not involved in road accidents. Depression and anxiety increased the odds ratio (OR) of road accidents by 2.4- and 2.7-fold, respectively (P=0.04, P=0.004), and neuroticism alone increased the odds of road accidents by 1.1-fold (P=0.009); other personality factors did not have a significant effect in the model. Conclusion: The results revealed that some mental disorders affect the incidence of road collisions. Considering the importance and sensitivity of driving behavior, it is necessary to evaluate the multiple psychological factors influencing drivers before and after they receive or renew their driver's license. PMID:28293047
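
    The reported odds ratios follow from exponentiating the fitted logistic-regression coefficients, OR = exp(β). The coefficients below are hypothetical values chosen only so that exp(β) reproduces the ORs quoted in the abstract; the study's actual fitted model is not reproduced here.

```python
from math import exp

# Hypothetical log-odds coefficients, chosen so that exp(beta) matches
# the odds ratios reported in the abstract (2.4, 2.7, 1.1).
coefficients = {"depression": 0.875, "anxiety": 0.993, "neuroticism": 0.095}

# A one-unit increase in a predictor multiplies the odds by exp(beta).
odds_ratios = {name: exp(beta) for name, beta in coefficients.items()}
for name, or_value in odds_ratios.items():
    print(f"{name}: OR = {or_value:.1f}")
```

    Reading the output: an OR of 2.4 means the odds of an accident for drivers with depression are 2.4 times the odds for those without, all else in the model held fixed.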

  7. Local yield stress statistics in model amorphous solids

    NASA Astrophysics Data System (ADS)

    Barbot, Armand; Lerbinger, Matthias; Hernandez-Garcia, Anier; García-García, Reinaldo; Falk, Michael L.; Vandembroucq, Damien; Patinet, Sylvain

    2018-03-01

    We develop and extend a method presented by Patinet, Vandembroucq, and Falk [Phys. Rev. Lett. 117, 045501 (2016), 10.1103/PhysRevLett.117.045501] to compute the local yield stresses at the atomic scale in model two-dimensional Lennard-Jones glasses produced via differing quench protocols. This technique allows us to sample the plastic rearrangements in a nonperturbative manner for different loading directions on a well-controlled length scale. Plastic activity upon shearing correlates strongly with the locations of low yield stresses in the quenched states. This correlation is higher in more structurally relaxed systems. The distribution of local yield stresses is also shown to strongly depend on the quench protocol: the more relaxed the glass, the higher the local plastic thresholds. Analysis of the magnitude of local plastic relaxations reveals that stress drops follow exponential distributions, justifying the hypothesis of an average characteristic amplitude often conjectured in mesoscopic or continuum models. The amplitude of the local plastic rearrangements increases on average with the yield stress, regardless of the system preparation. The local yield stress varies with the shear orientation tested and strongly correlates with the plastic rearrangement locations when the system is sheared correspondingly. It is thus argued that plastic rearrangements are the consequence of shear transformation zones encoded in the glass structure that possess weak slip planes along different orientations. Finally, we justify the length scale employed in this work and extract the yield threshold statistics as a function of the size of the probing zones. This method makes it possible to derive physically grounded models of plasticity for amorphous materials by directly revealing the relevant details of the shear transformation zones that mediate this process.

  8. Hidden treasures in "ancient" microarrays: gene-expression portrays biology and potential resistance pathways of major lung cancer subtypes and normal tissue.

    PubMed

    Kerkentzes, Konstantinos; Lagani, Vincenzo; Tsamardinos, Ioannis; Vyberg, Mogens; Røe, Oluf Dimitri

    2014-01-01

    Novel statistical methods and increasingly more accurate gene annotations can transform "old" biological data into a renewed source of knowledge with potential clinical relevance. Here, we provide an in silico proof-of-concept by extracting novel information from a high-quality mRNA expression dataset, originally published in 2001, using state-of-the-art bioinformatics approaches. The dataset consists of histologically defined cases of lung adenocarcinoma (AD), squamous (SQ) cell carcinoma, small-cell lung cancer, carcinoid, metastasis (breast and colon AD), and normal lung specimens (203 samples in total). A battery of statistical tests was used for identifying differential gene expressions, diagnostic and prognostic genes, enriched gene ontologies, and signaling pathways. Our results showed that gene expressions faithfully recapitulate immunohistochemical subtype markers, as chromogranin A in carcinoids, cytokeratin 5, p63 in SQ, and TTF1 in non-squamous types. Moreover, biological information with putative clinical relevance was revealed as potentially novel diagnostic genes for each subtype with specificity 93-100% (AUC = 0.93-1.00). Cancer subtypes were characterized by (a) differential expression of treatment target genes as TYMS, HER2, and HER3 and (b) overrepresentation of treatment-related pathways like cell cycle, DNA repair, and ERBB pathways. The vascular smooth muscle contraction, leukocyte trans-endothelial migration, and actin cytoskeleton pathways were overexpressed in normal tissue. Reanalysis of this public dataset displayed the known biological features of lung cancer subtypes and revealed novel pathways of potentially clinical importance. The findings also support our hypothesis that even old omics data of high quality can be a source of significant biological information when appropriate bioinformatics methods are used.

  9. An investigation on the population structure of mixed infections of Mycobacterium tuberculosis in Inner Mongolia, China.

    PubMed

    Wang, Xiaoying; Liu, Haican; Wei, Jianhao; Wu, Xiaocui; Yu, Qin; Zhao, Xiuqin; Lyu, Jianxin; Lou, Yongliang; Wan, Kanglin

    2015-12-01

    Mixed infections of Mycobacterium tuberculosis strains have attracted more attention due to their increasing frequencies worldwide, especially in areas of high tuberculosis (TB) prevalence. In this study, we assessed the rates of mixed infections in a setting with high TB prevalence in the Inner Mongolia Autonomous Region of China. A total of 384 M. tuberculosis isolates from the local TB hospital were subjected to the mycobacterial interspersed repetitive unit-variable number tandem repeat (MIRU-VNTR) typing method. Single clones of the strains with mixed infections were separated by subculturing them on Löwenstein-Jensen medium. Of these 384 isolates, twelve strains (3.13%) were identified as mixed infections by MIRU-VNTR. Statistical analysis indicated that demographic characteristics and drug susceptibility profiles showed no statistically significant association with mixed infection. We further subcultured the mixed infection strains and selected 30 clones from the subculture of each mixed infection. Genotyping data revealed that eight (8/12, 66.7%) strains with mixed infections had converted into single infections through subculture. A higher growth rate was associated with an increasing proportion of the variant subpopulation through subculture. In conclusion, using the MIRU-VNTR method, we demonstrate that the prevalence of mixed infections in Inner Mongolia is low. Additionally, our findings reveal that subculture changes the population structure of mixed infections: the subpopulation with the higher growth rate shows better fitness and reaches a higher proportion in the population structure after subculture. This study highlights that the use of clinical specimens, rather than subcultured isolates, is preferred for estimating the prevalence of mixed infections in specific regions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Geographically weighted regression as a generalized Wombling to detect barriers to gene flow.

    PubMed

    Diniz-Filho, José Alexandre Felizola; Soares, Thannya Nascimento; de Campos Telles, Mariana Pires

    2016-08-01

    Barriers to gene flow play an important role in structuring populations, especially in human-modified landscapes, and several methods have been proposed to detect such barriers. However, most applications of these methods require a relatively large number of individuals or populations distributed in space, connected by vertices from Delaunay or Gabriel networks. Here we show, using both simulated and empirical data, a new application of geographically weighted regression (GWR) to detect such barriers, modeling the genetic variation as a "local" linear function of geographic coordinates (latitude and longitude). In GWR, standard regression statistics, such as R(2) and slopes, are estimated for each sampling unit and thus can be mapped. Peaks in these local statistics are then expected close to the barriers if genetic discontinuities exist, capturing a higher rate of population differentiation among neighboring populations. Isolation-by-distance simulations on a longitudinally warped lattice revealed that higher local slopes from GWR coincide with the barrier detected with the Monmonier algorithm. Even with a relatively small effect of the barrier, the power of local GWR in detecting the east-west barriers was higher than 95%. We also analyzed empirical data on genetic differentiation among tree populations of Dipteryx alata and Eugenia dysenterica from the Brazilian Cerrado. GWR was applied to the principal coordinate of the pairwise FST matrix based on microsatellite loci. In both simulated and empirical data, the GWR results were consistent with discontinuities detected by the Monmonier algorithm, as well as with previous explanations for the spatial patterns of genetic differentiation of the two species. Our analyses reveal how this new application of GWR can be viewed as a generalized Wombling in continuous space and can be a useful approach to detect barriers and discontinuities to gene flow.
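
    The core GWR idea — a least-squares fit recomputed at each sampling point with Gaussian kernel weights centred on that point — can be sketched in one dimension. The transect, the step "barrier" at x = 0.5, and the bandwidth below are all hypothetical toy choices, not the paper's setup; local slopes peak near the discontinuity, mirroring how the method flags barriers.

```python
from math import exp

def local_slopes(xs, ys, bandwidth=0.1):
    """GWR-style local slope at each sampling point: weighted least
    squares of y on x with a Gaussian kernel centred on that point."""
    slopes = []
    for x0 in xs:
        w = [exp(-((x - x0) / bandwidth) ** 2) for x in xs]
        sw = sum(w)
        xb = sum(wi * x for wi, x in zip(w, xs)) / sw   # weighted means
        yb = sum(wi * y for wi, y in zip(w, ys)) / sw
        num = sum(wi * (x - xb) * (y - yb) for wi, x, y in zip(w, xs, ys))
        den = sum(wi * (x - xb) ** 2 for wi, x in zip(w, xs))
        slopes.append(num / den)
    return slopes

# Toy transect: gentle isolation-by-distance trend plus a genetic
# discontinuity (barrier) halfway along it.
xs = [i / 40 for i in range(41)]
ys = [0.1 * x + (0.5 if x > 0.5 else 0.0) for x in xs]
slopes = local_slopes(xs, ys)
peak_x = xs[slopes.index(max(slopes))]
print(f"local slope peaks at x = {peak_x:.3f}")  # near the barrier at 0.5
```

    Mapping these local slopes (here over one coordinate, in the paper over latitude and longitude) is what turns GWR into a generalized Wombling: the barrier shows up as a ridge of high local differentiation rates.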

  11. Clustering of 3D-Structure Similarity Based Network of Secondary Metabolites Reveals Their Relationships with Biological Activities.

    PubMed

    Ohtana, Yuki; Abdullah, Azian Azamimi; Altaf-Ul-Amin, Md; Huang, Ming; Ono, Naoaki; Sato, Tetsuo; Sugiura, Tadao; Horai, Hisayuki; Nakamura, Yukiko; Morita Hirai, Aki; Lange, Klaus W; Kibinge, Nelson K; Katsuragi, Tetsuo; Shirai, Tsuyoshi; Kanaya, Shigehiko

    2014-12-01

    Developing database systems connecting diverse species based on omics is the most important theme in big data biology. To attain this purpose, we have developed KNApSAcK Family Databases, which are utilized in a number of researches in metabolomics. In the present study, we have developed a network-based approach to analyze relationships between 3D structure and biological activity of metabolites consisting of four steps as follows: construction of a network of metabolites based on structural similarity (Step 1), classification of metabolites into structure groups (Step 2), assessment of statistically significant relations between structure groups and biological activities (Step 3), and 2-dimensional clustering of the constructed data matrix based on statistically significant relations between structure groups and biological activities (Step 4). Applying this method to a data set consisting of 2072 secondary metabolites and 140 biological activities reported in KNApSAcK Metabolite Activity DB, we obtained 983 statistically significant structure group-biological activity pairs. As a whole, we systematically analyzed the relationship between 3D-chemical structures of metabolites and biological activities. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Quasi-Monochromatic Visual Environments and the Resting Point of Accommodation

    DTIC Science & Technology

    1988-01-01

    accommodation. No statistically significant differences were revealed to support the possibility of color mediated differential regression to resting...discussed with respect to the general findings of the total sample as well as the specific behavior of individual participants. The summarized statistics ...remaining ten varied considerably with respect to the averaged trends reported in the above descriptive statistics as well as with respect to precision

  13. Cardiotoxicity of Freon among refrigeration services workers: comparative cross-sectional study

    PubMed Central

    2009-01-01

    Background Freon includes a number of gaseous, colorless chlorofluorocarbons. Although freon is generally considered to be a fluorocarbon of relatively low toxicity; significantly detrimental effects may occur upon over exposure. The purpose of the present study is to investigate whether occupational exposure to fluorocarbons can induce arterial hypertension, myocardial ischemia, cardiac arrhythmias, elevated levels of plasma lipids and renal dysfunction. Methods This comparative cross-sectional study was conducted at the cardiology clinic of the Suez Canal Authority Hospital (Egypt). The study included 23 apparently healthy male workers at the refrigeration services workshop who were exposed to fluorocarbons (FC 12 and FC 22) and 23 likewise apparently healthy male workers (unexposed), the control group. All the participants were interviewed using a pre-composed questionnaire and were subjected to a clinical examination and relevant laboratory investigations. Results There were no significant statistical differences between the groups studied regarding symptoms suggesting arterial hypertension and renal affection, although a significantly higher percentage of the studied refrigeration services workers had symptoms of arrhythmias. None of the workers had symptoms suggesting coronary artery disease. Clinical examination revealed that the refrigeration services workers had a significantly higher mean pulse rate compared to the controls, though no significant statistical differences were found in arterial blood pressure measurements between the two study groups. Exercise stress testing of the workers studied revealed normal heart reaction to the increased need for oxygen, while sinus tachycardia was detected in all the participants. The results of Holter monitoring revealed significant differences within subject and group regarding the number of abnormal beats detected throughout the day of monitoring (p < 0.001). 
There were no significant within-subject or between-group differences in the average heart rate during the monitoring period. Most laboratory investigations revealed no statistically significant differences between the groups in lipid profile markers, serum electrolyte levels and glomerular lesion markers, except for cholesterol and urinary β2-microglobulin (a tubular lesion marker) levels, which were significantly elevated in freon-exposed workers. Conclusions Unprotected occupational exposure to chlorofluorocarbons can induce cardiotoxicity in the form of cardiac arrhythmias. The role of chlorofluorocarbons in inducing arterial hypertension and coronary artery disease is unclear, although the significantly elevated serum cholesterol and urinary β2-microglobulin levels raise a concern. PMID:19594908

  14. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS's Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
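
    The described procedure (rank data over moving windows, form the Mann-Whitney U statistic, then normalize to Z) can be sketched as follows. This is a simplified illustration with hypothetical data and no tie correction, and it omits the Monte Carlo normalization step mentioned in the abstract; it is not the manuscript's implementation:

```python
import math

def mann_whitney_z(x, y):
    # U statistic for x vs. y: count pairs with xi > yj (0.5 for ties),
    # then apply the large-sample normal approximation (no tie correction).
    n1, n2 = len(x), len(y)
    u = sum(1.0 if xi > yj else 0.5 if xi == yj else 0.0
            for xi in x for yj in y)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mu) / sigma

def running_mw_z(series, w):
    # Slide a trailing and a leading window of width w along the series
    # and compare them; a large |Z| flags a shift in the distribution.
    return [mann_whitney_z(series[i - w:i], series[i:i + w])
            for i in range(w, len(series) - w + 1)]

# Hypothetical series with an upward shift midway: Z dips sharply there.
data = [1, 2, 1, 3, 2, 1, 2, 8, 9, 10, 9, 8, 10, 9]
z = running_mw_z(data, 5)
```

    A negative Z here indicates that the trailing window ranks below the leading one, i.e. an upward shift at that point in the series.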

  15. Guidelines 13 and 14—Prediction uncertainty

    USGS Publications Warehouse

    Hill, Mary C.; Tiedeman, Claire

    2005-01-01

    An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.
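
    As a toy illustration of the Monte Carlo side of Guideline 14, the sketch below propagates a calibrated parameter's standard error into an approximate prediction interval. The model, parameter values and stress are hypothetical, not taken from the guidelines:

```python
import random
import statistics

random.seed(0)

# Hypothetical linear model: prediction = parameter * stress, where the
# calibrated parameter is known only up to a standard error.
param_mean, param_se = 2.0, 0.3   # illustrative calibration results
stress = 5.0                      # illustrative predictive stress

# Monte Carlo: sample the parameter, propagate each draw to a prediction.
draws = [random.gauss(param_mean, param_se) * stress for _ in range(20000)]

mean_pred = statistics.mean(draws)
cuts = statistics.quantiles(draws, n=40)   # cut points at 2.5%, 5%, ..., 97.5%
interval = (cuts[0], cuts[-1])             # approximate 95% prediction interval
```

    For a real model the same loop would wrap a full forward simulation, and correlated parameter uncertainty would be sampled jointly rather than independently.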

  16. Metabolomic analysis based on 1H-nuclear magnetic resonance spectroscopy metabolic profiles in tuberculous, malignant and transudative pleural effusion

    PubMed Central

    Wang, Cheng; Peng, Jingjin; Kuang, Yanling; Zhang, Jiaqiang; Dai, Luming

    2017-01-01

    Pleural effusion is a common clinical manifestation with various causes. Current diagnostic and therapeutic methods have numerous limitations. By analyzing dynamic changes in low molecular weight catabolites, metabolomics has been widely applied to various types of disease and has provided platforms for discovering novel biomarkers. However, to the best of our knowledge, there are few studies on the metabolic profiling of pleural effusion. In the current study, 58 pleural effusion samples were collected, among which 20 were malignant pleural effusions, 20 were tuberculous pleural effusions and 18 were transudative pleural effusions. Small-molecule metabolite spectra were obtained using 1H nuclear magnetic resonance technology, and pattern-recognition multivariable statistical analysis was used to screen out differential metabolites. One-way analysis of variance, the Student-Newman-Keuls test and the Kruskal-Wallis test were adopted for statistical analysis. Over 400 metabolites were identified in the untargeted metabolomic analysis, and 26 metabolites differed significantly among tuberculous, malignant and transudative pleural effusions. These metabolites were predominantly involved in the metabolic pathways of amino acid metabolism, glycometabolism and lipid metabolism. Statistical analysis revealed that eight metabolites contributed to the distinction between the three groups: tuberculous, malignant and transudative pleural effusion. In the current study, the feasibility of identifying small-molecule biochemical profiles in different types of pleural effusion was investigated to reveal novel biological insights into the underlying mechanisms. 
The results provide specific insights into the biology of tuberculous, malignant and transudative pleural effusion and may offer novel strategies for the diagnosis and therapy of associated diseases, including tuberculosis, advanced lung cancer and congestive heart failure. PMID:28627685
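
    A minimal pure-Python sketch of the Kruskal-Wallis H statistic used in comparisons like the one above (midranks for ties, no tie correction of the variance; the metabolite values are hypothetical):

```python
def midranks(values):
    # Assign average ranks to tied values (1-based ranks).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_wallis_h(groups):
    # H = 12 / (N (N+1)) * sum(R_g^2 / n_g) - 3 (N+1),
    # where R_g is the rank sum of group g over the pooled sample.
    pooled = [v for g in groups for v in g]
    r = midranks(pooled)
    n = len(pooled)
    h, idx = 0.0, 0
    for g in groups:
        h += sum(r[idx:idx + len(g)]) ** 2 / len(g)
        idx += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Hypothetical metabolite levels in tuberculous, malignant and transudative
# effusions; the groups are fully separated, so H is large.
h = kruskal_wallis_h([[1.2, 1.9, 2.4], [3.1, 3.8, 4.4], [5.2, 5.9, 6.5]])
```

    In practice H would be referred to a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain a p-value; that step is omitted here.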

  17. Separating Putative Pathogens from Background Contamination with Principal Orthogonal Decomposition: Evidence for Leptospira in the Ugandan Neonatal Septisome

    PubMed Central

    Schiff, Steven J.; Kiwanuka, Julius; Riggio, Gina; Nguyen, Lan; Mu, Kevin; Sproul, Emily; Bazira, Joel; Mwanga-Amumpaire, Juliet; Tumusiime, Dickson; Nyesigire, Eunice; Lwanga, Nkangi; Bogale, Kaleb T.; Kapur, Vivek; Broach, James R.; Morton, Sarah U.; Warf, Benjamin C.; Poss, Mary

    2016-01-01

    Neonatal sepsis (NS) is responsible for over 1 million yearly deaths worldwide. In the developing world, NS is often treated without an identified microbial pathogen. Amplicon sequencing of the bacterial 16S rRNA gene can be used to identify organisms that are difficult to detect by routine microbiological methods. However, contaminating bacteria are ubiquitous in both hospital settings and research reagents and must be accounted for to make effective use of these data. In this study, we sequenced the bacterial 16S rRNA gene obtained from blood and cerebrospinal fluid (CSF) of 80 neonates presenting with NS to the Mbarara Regional Hospital in Uganda. Assuming that patterns of background contamination would be independent of pathogenic microorganism DNA, we applied a novel quantitative approach using principal orthogonal decomposition to separate background contamination from potential pathogens in sequencing data. We designed our quantitative approach contrasting blood, CSF, and control specimens and employed a variety of statistical random matrix bootstrap hypotheses to estimate statistical significance. These analyses demonstrate that Leptospira appears present in some infants presenting within 48 h of birth, indicative of infection in utero, and up to 28 days of age, suggesting environmental exposure. This organism cannot be cultured in routine bacteriological settings and is enzootic in the cattle that often live in close proximity to the rural peoples of western Uganda. Our findings demonstrate that statistical approaches to remove background organisms common in 16S sequence data can reveal putative pathogens in small volume biological samples from newborns. This computational analysis thus reveals an important medical finding that has the potential to alter therapy and prevention efforts in a critically ill population. PMID:27379237
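
    The separation idea can be caricatured with a small principal-component sketch: if background contamination dominates the shared variance, the leading component captures it, leaving candidate pathogens in the residual. This is a toy illustration with made-up abundance data and plain power iteration, not the authors' principal orthogonal decomposition pipeline or bootstrap testing:

```python
def first_pc(data, iters=200):
    # Leading principal component of mean-centered rows via power iteration.
    d = len(data[0])
    means = [sum(row[j] for row in data) / len(data) for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    v = [1.0] * d
    for _ in range(iters):
        xv = [sum(r[j] * v[j] for j in range(d)) for r in x]                 # X v
        w = [sum(x[i][j] * xv[i] for i in range(len(x))) for j in range(d)]  # X^T X v
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Made-up 16S read counts for three taxa: taxa 0 and 1 co-vary with a
# contamination level; taxon 2 (the putative pathogen) spikes in one patient.
contamination = [1, 3, 5, 2, 4, 6]
pathogen = [0, 0, 0, 0, 0, 2]
samples = [[2 * c, c, p] for c, p in zip(contamination, pathogen)]

v = first_pc(samples)
```

    The leading component loads almost entirely on the co-varying contaminant taxa, so projecting it out of each sample leaves the pathogen signal in the residual.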

  18. [Decreased retraction of blood clots in patients with venous thromboembolic complications].

    PubMed

    Bredikhin, R A; Peshkova, A D; Maliasev, D V; Batrakova, M V; Le Min, J; Panasiuk, M V; Fatkhullina, L S; Ignat'ev, I M; Khaĭrullin, R N; Litvinov, R I

    Haemostatic disorders play an important role in the pathogenesis of acute venous thrombosis. One of the least studied reactions of blood coagulation and thrombogenesis is spontaneous contraction of blood clots, which takes place by means of the contractile apparatus of activated blood platelets adhered to fibrin fibres. The work was aimed at studying the parameters of contraction of blood clots formed in vitro in the blood of 41 patients with acute venous thromboses, as compared with the same parameters in apparently healthy donors. We used a new instrumental method making it possible to determine the time from initiation to the beginning of contraction, as well as the degree and velocity of clot contraction. It was revealed that in patients with venous thrombosis the ability of clots to shrink was significantly reduced as compared with the control. We detected a statistically significant retardation and reduction of blood clot contraction in patients with venous thrombosis complicated by pulmonary artery thromboembolism as compared with contraction in patients with isolated deep vein thrombosis, which may be important for early diagnosis and determination of the risk of thromboembolism. In addition, we revealed a statistically significant retardation of contraction in patients with proximal thrombosis as compared with contraction in patients with distal thrombosis, with similar values of the degree of contraction. Contraction was statistically significantly reduced in acute thrombosis (less than 21 days), whereas in subacute thrombosis (more than 21 days) the parameters of contraction were closer to normal values. The obtained findings suggest that reduction of blood clot contraction may be a new, hitherto unstudied pathogenetic mechanism deteriorating the course and outcome of venous thrombosis. 
The clinical significance of contraction and its impairments, as well as the diagnostic and prognostic value of the laboratory test for blood clot contraction would merit further study.

  19. Contrast-Enhanced Ultrasonography in Differential Diagnosis of Benign and Malignant Ovarian Tumors

    PubMed Central

    Qiao, Jing-Jing; Yu, Jing; Yu, Zhe; Li, Na; Song, Chen; Li, Man

    2015-01-01

    Objective To evaluate the accuracy of contrast-enhanced ultrasonography (CEUS) in differential diagnosis of benign and malignant ovarian tumors. Methods The scientific literature databases PubMed, Cochrane Library and CNKI were comprehensively searched for studies relevant to the use of CEUS technique for differential diagnosis of benign and malignant ovarian cancer. Pooled summary statistics for specificity (Spe), sensitivity (Sen), positive and negative likelihood ratios (LR+/LR−), and diagnostic odds ratio (DOR) and their 95%CIs were calculated. Software for statistical analysis included STATA version 12.0 (Stata Corp, College Station, TX, USA) and Meta-Disc version 1.4 (Universidad Complutense, Madrid, Spain). Results Following a stringent selection process, seven high quality clinical trials were found suitable for inclusion in the present meta-analysis. The 7 studies contained a combined total of 375 ovarian cancer patients (198 malignant and 177 benign). Statistical analysis revealed that CEUS was associated with the following performance measures in differential diagnosis of ovarian tumors: pooled Sen was 0.96 (95%CI = 0.92∼0.98); the summary Spe was 0.91 (95%CI = 0.86∼0.94); the pooled LR+ was 10.63 (95%CI = 6.59∼17.17); the pooled LR− was 0.04 (95%CI = 0.02∼0.09); and the pooled DOR was 241.04 (95% CI = 92.61∼627.37). The area under the SROC curve was 0.98 (95% CI = 0.20∼1.00). Lastly, publication bias was not detected (t = −0.52, P = 0.626) in the meta-analysis. Conclusions Our results revealed the high clinical value of CEUS in differential diagnosis of benign and malignant ovarian tumors. Further, CEUS may also prove to be useful in differential diagnosis at early stages of this disease. PMID:25764442
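
    For intuition, summary statistics of the kind reported above can be computed from per-study 2x2 counts. The sketch below uses simple pooling of made-up counts; these are not the seven included studies, and this is not the SROC/meta-analytic model used by Meta-DiSc:

```python
# Hypothetical per-study 2x2 counts: (TP, FP, FN, TN). These are illustrative
# numbers, not the counts of the seven studies in the meta-analysis.
studies = [(28, 3, 1, 24), (25, 2, 2, 30), (31, 4, 1, 27)]

tp = sum(s[0] for s in studies)
fp = sum(s[1] for s in studies)
fn = sum(s[2] for s in studies)
tn = sum(s[3] for s in studies)

sen = tp / (tp + fn)          # pooled sensitivity
spe = tn / (tn + fp)          # pooled specificity
lr_pos = sen / (1 - spe)      # positive likelihood ratio
lr_neg = (1 - sen) / spe      # negative likelihood ratio
dor = lr_pos / lr_neg         # diagnostic odds ratio = (TP*TN)/(FP*FN)
```

    Naive pooling ignores between-study heterogeneity; random-effects or bivariate models are preferred when study results vary, but the identities among Sen, Spe, LR+, LR- and DOR shown here are the same.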

  20. Effect of denture cleaning on abrasion resistance and surface topography of polymerized CAD CAM acrylic resin denture base

    PubMed Central

    Shinawi, Lana Ahmed

    2017-01-01

    Background The application of computer-aided design/computer-aided manufacturing (CAD CAM) technology in the fabrication of complete dentures offers numerous advantages, as it provides optimum fit and eliminates polymerization shrinkage of the acrylic base. Additionally, the porosity and surface roughness of CAD CAM resins are lower than those of conventionally processed resins, which decreases the adhesion of bacteria to the denture base; bacterial adhesion is associated with many conditions, including halitosis and aspiration pneumonia in elderly denture wearers. Aim To evaluate the influence of tooth brushing with dentifrices on CAD CAM resin blocks in terms of abrasion resistance, surface roughness and scanning electron photomicrography. Methods This experimental study was carried out at the Faculty of Dentistry of King Abdulaziz University during 2016. A total of 40 rectangular polymerized CAD CAM resin samples were subjected to 40,000 and 60,000 brushing strokes under a 200-gram vertical load, simulating three years of tooth brushing using a commercially available denture cleaning dentifrice. Data were analyzed with SPSS version 20, using descriptive statistics and ANOVA. Results The ANOVA test revealed a statistically significant weight loss of the CAD CAM acrylic resin denture base specimens following 40,000 and 60,000 brushing strokes, as well as a statistically significant change (p=0.05) in surface roughness following brushing. SEM baseline imaging of the CAD CAM resin samples revealed a relatively smooth, homogenous surface, but following 40,000 and 60,000 brushing strokes, imaging displayed small scratches on the surface. Conclusion CAD CAM resin initially displayed a homogenous surface with low surface roughness that was significantly affected after simulating three years of manual brushing; despite the significant weight loss, the findings are within clinically acceptable limits. PMID:28713496

  1. Comparison of the effect of lecture and blended teaching methods on students' learning and satisfaction.

    PubMed

    Sadeghi, Roya; Sedaghat, Mohammad Mehdi; Sha Ahmadi, Faramarz

    2014-10-01

    Blended learning, a new approach in educational planning, is defined as applying more than one method, strategy, technique or medium in education. Today, owing to the development of Internet infrastructure and students' widespread access to it, the Internet can be utilized along with traditional and conventional training methods. The aim of this study was to compare students' learning and satisfaction under a combination of lecture and e-learning versus conventional lecture methods. This quasi-experimental study was conducted among sophomore students of the Public Health School, Tehran University of Medical Sciences, in 2012-2013. Four classes of the school were randomly selected and divided into two groups. Two classes (45 students) were taught by the lecture method, and the other two classes (48 students) by a blended method combining e-learning and lectures. The students' knowledge about tuberculosis in the two groups was measured using a pre- and post-test, administered by sending self-reported electronic questionnaires to the students' email addresses through Google Document software. At the end of the educational program, students' satisfaction with and comments on the two methods were also collected by questionnaire. Statistical analyses, including descriptive methods, the paired t-test, the independent t-test and ANOVA, were performed in SPSS 14, and p≤0.05 was considered significant. The mean pre-test scores of the lecture and blended groups were 13.18±1.37 and 13.35±1.36, respectively; the difference between the pre-test scores of the two groups was not statistically significant (p=0.535). Knowledge scores increased in both groups after training, and the mean and standard deviation of knowledge scores of the lecture and blended groups were 16.51±0.69 and 16.18±1.06, respectively. The difference between the post-test scores of the two groups was not statistically significant (p=0.112). 
Students' satisfaction with the blended learning method was higher than with the lecture method. The results revealed that the blended method is effective in increasing students' learning. E-learning can be used to teach some courses and may also be advantageous from an economic perspective. Since the majority of students at the country's universities of medical sciences have Internet access and an email address, e-learning could be used as a supplement to traditional teaching methods, or sometimes as an alternative educational method, because this method of teaching increases students' knowledge, satisfaction and attention.
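
    The between-group comparison described above can be sketched with a pooled-variance independent t statistic. The scores below are made up, and in practice the p-value would be obtained from the t distribution with n1+n2-2 degrees of freedom (omitted here):

```python
import math

def independent_t(a, b):
    # Pooled-variance two-sample t statistic (a simplified sketch of the
    # independent t-test used to compare the two groups' scores).
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical post-test scores for a lecture group and a blended group.
lecture = [16.0, 17.0, 16.5, 15.5, 17.5, 16.0]
blended = [16.5, 15.5, 16.0, 17.0, 15.0, 16.0]
t = independent_t(lecture, blended)
```

    A paired t-test (used for each group's pre/post comparison) would instead apply a one-sample t statistic to the within-student score differences.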

  2. Communicating the risks of fetal alcohol spectrum disorder: effects of message framing and exemplification.

    PubMed

    Yu, Nan; Ahern, Lee A; Connolly-Ahern, Colleen; Shen, Fuyuan

    2010-12-01

    Health messages can be either informative or descriptive, and can emphasize either potential losses or gains. This study, guided by message framing theory and exemplification theory, specifically investigated the combined effects of messages with loss-gain frames mixed with statistics or exemplar appeals. The findings revealed a series of main effects and interactions for loss-gain frames and statistics-exemplar appeals on fetal alcohol spectrum disorder (FASD) prevention intention, intention to know more, perceived severity, perceived fear, perceived external efficacy, and perceived internal efficacy. The gain-statistics appeal showed an advantage in promoting perceived efficacy toward FASD, while the loss-exemplar appeal revealed an advantage in increasing prevention intention, perceived severity, and perceived fear toward FASD. Limitations and implications for future research are discussed.

  3. Is Statistical Learning Constrained by Lower Level Perceptual Organization?

    PubMed Central

    Emberson, Lauren L.; Liu, Ran; Zevin, Jason D.

    2013-01-01

    In order for statistical information to aid in complex developmental processes such as language acquisition, learning from higher-order statistics (e.g. across successive syllables in a speech stream to support segmentation) must be possible while perceptual abilities (e.g. speech categorization) are still developing. The current study examines how perceptual organization interacts with statistical learning. Adult participants were presented with multiple exemplars from novel, complex sound categories designed to reflect some of the spectral complexity and variability of speech. These categories were organized into sequential pairs and presented such that higher-order statistics, defined based on sound categories, could support stream segmentation. Perceptual similarity judgments and multi-dimensional scaling revealed that participants only perceived three perceptual clusters of sounds and thus did not distinguish the four experimenter-defined categories, creating a tension between lower level perceptual organization and higher-order statistical information. We examined whether the resulting pattern of learning is more consistent with statistical learning being “bottom-up,” constrained by the lower levels of organization, or “top-down,” such that higher-order statistical information of the stimulus stream takes priority over the perceptual organization, and perhaps influences perceptual organization. We consistently find evidence that learning is constrained by perceptual organization. Moreover, participants generalize their learning to novel sounds that occupy a similar perceptual space, suggesting that statistical learning occurs based on regions of or clusters in perceptual space. Overall, these results reveal a constraint on learning of sound sequences, such that statistical information is determined based on lower level organization. These findings have important implications for the role of statistical learning in language acquisition. PMID:23618755

  4. Glycogen phosphorylase as a target for type 2 diabetes: synthetic, biochemical, structural and computational evaluation of novel N-acyl-N´-(β-D-glucopyranosyl) urea inhibitors.

    PubMed

    Kantsadi, Anastassia L; Parmenopoulou, Vanessa; Bakalov, Dimitar N; Snelgrove, Laura; Stravodimos, George A; Chatzileontiadou, Demetra S M; Manta, Stella; Panagiotopoulou, Angeliki; Hayes, Joseph M; Komiotis, Dimitri; Leonidas, Demetres D

    2015-01-01

    Glycogen phosphorylase (GP), a validated target for the development of anti-hyperglycaemic agents, has been targeted for the design of novel glycopyranosylamine inhibitors. Exploiting the two most potent inhibitors from our previous study of N-acyl-β-D-glucopyranosylamines (Parmenopoulou et al., Bioorg. Med. Chem. 2014, 22, 4810), we have extended the linking group to -NHCONHCO- between the glucose moiety and the aliphatic/aromatic substituent in the GP catalytic site β-cavity. The N-acyl-N´-(β-D-glucopyranosyl) urea inhibitors were synthesized and their efficiency assessed by biochemical methods, revealing inhibition constant values of 4.95 µM and 2.53 µM. Crystal structures of GP in complex with these inhibitors were determined and analyzed, providing data for further structure based design efforts. A novel Linear Response - Molecular Mechanics Coulomb Surface Area (LR-MM-CBSA) method has been developed which relates predicted and experimental binding free energies for a training set of N-acyl-N´-(β-D-glucopyranosyl) urea ligands with a correlation coefficient R(2) of 0.89 and leave-one-out cross-validation (LOO-cv) Q(2) statistic of 0.79. The method has significant applications to direct future lead optimization studies, where ligand entropy loss on binding is revealed as a key factor to be considered. ADMET property predictions revealed that apart from potential permeability issues, the synthesized N-acyl-N´-(β-D-glucopyranosyl) urea inhibitors have drug-like potential without any toxicity warnings.
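
    The LOO-cv Q(2) statistic mentioned above can be computed generically: refit the model with each ligand left out, predict the held-out value, and compare the predictive residual sum of squares (PRESS) against the total sum of squares. The sketch below does this for a simple one-descriptor least-squares model with invented binding energies; it is not the authors' LR-MM-CBSA model:

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def q2_loo(xs, ys):
    # Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_total.
    press = 0.0
    for i in range(len(xs)):
        xt = xs[:i] + xs[i + 1:]
        yt = ys[:i] + ys[i + 1:]
        a, b = fit_line(xt, yt)
        press += (ys[i] - (a * xs[i] + b)) ** 2
    my = sum(ys) / len(ys)
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - press / ss_tot

# Invented predicted vs. experimental binding free energies (kcal/mol).
x = [-8.1, -7.4, -6.9, -6.2, -5.8, -5.1]
y = [-8.0, -7.6, -6.7, -6.4, -5.5, -5.2]
q2 = q2_loo(x, y)
```

    Q(2) is always at most R(2) for the same data, since each prediction is made without the point being predicted; a large gap between the two warns of overfitting.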

  5. Individual Fit Testing of Hearing Protection Devices Based on Microphone in Real Ear.

    PubMed

    Biabani, Azam; Aliabadi, Mohsen; Golmohammadi, Rostam; Farhadian, Maryam

    2017-12-01

    Labeled noise reduction (NR) data presented by manufacturers are considered one of the main challenges for occupational experts in employing hearing protection devices (HPDs). This study aimed to determine the actual NR data of typical HPDs using an objective fit testing method with a microphone in real ear (MIRE). Five commercially available earmuff protectors were investigated in 30 workers exposed to a reference noise source according to the standard method, ISO 11904-1. The personal attenuation rating (PAR) of the earmuffs was measured based on the MIRE method using a noise dosimeter (SVANTEK, model SV 102). The results showed that mean PARs of the earmuffs ranged from 49% to 86% of the nominal NR rating. The PAR values of the earmuffs differed statistically when typical eyewear was worn (p < 0.05); it was revealed that typical safety eyewear can reduce the mean PAR value by approximately 2.5 dB. The results also showed that measurements based on the MIRE method had low variability: the variability in NR values between individuals, within individuals, and within earmuffs was not statistically significant (p > 0.05). This study could provide local individual fit data. Ergonomic aspects of the earmuffs and users' differing levels of experience and awareness can be considered the main factors affecting individual fit relative to the laboratory conditions under which the labeled NR data are acquired. Based on the obtained fit testing results, the field application of MIRE can be employed for complementary studies in real workstations while workers perform their regular duties.

  6. Authorship attribution based on Life-Like Network Automata.

    PubMed

    Machicao, Jeaneth; Corrêa, Edilson A; Miranda, Gisele H B; Amancio, Diego R; Bruno, Odemir M

    2018-01-01

    Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple, yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks, considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed an outperformance over structural analysis relying only on topological measurements, such as clustering coefficient, betweenness and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks.

  7. Lifting the 'violence veil': examining working conditions in long-term care facilities using iterative mixed methods.

    PubMed

    Daly, Tamara; Banerjee, Albert; Armstrong, Pat; Armstrong, Hugh; Szebehely, Marta

    2011-06-01

    We conducted a mixed-methods study (the focus of this article) to understand how workers in long-term care facilities experienced their working conditions. We surveyed unionized care workers in Ontario (n = 917); we also surveyed workers in three Canadian provinces (n = 948) and four Scandinavian countries (n = 1,625). In post-survey focus groups, we presented respondents with survey questions and descriptive statistical findings, and asked them: "Does this reflect your experience?" Workers reported time pressures and frequent experiences of physical violence and unwanted sexual attention, as we explain. We discuss how iteratively mixing qualitative and quantitative methods to triangulate survey and focus group results led to expected data convergence and to unexpected data divergence that revealed a normalized culture of structural violence in long-term care facilities. We discuss how the finding of structural violence emerged, and the deeper meaning, context, and insights resulting from our combined methods.

  8. Entropy of Ultrasound-Contrast-Agent Velocity Fields for Angiogenesis Imaging in Prostate Cancer.

    PubMed

    van Sloun, Ruud J G; Demi, Libertario; Postema, Arnoud W; Jmch De La Rosette, Jean; Wijkstra, Hessel; Mischi, Massimo

    2017-03-01

    Prostate cancer care can benefit from accurate and cost-efficient imaging modalities that are able to reveal prognostic indicators for cancer. Angiogenesis is known to play a central role in the growth of tumors towards a metastatic or a lethal phenotype. With the aim of localizing angiogenic activity in a non-invasive manner, Dynamic Contrast Enhanced Ultrasound (DCE-US) has been widely used. Usually, the passage of ultrasound contrast agents through the organ of interest is analyzed for the assessment of tissue perfusion. However, the heterogeneous nature of blood flow in angiogenic vasculature hampers the diagnostic effectiveness of perfusion parameters. In this regard, quantification of the heterogeneity of flow may provide a relevant additional feature for localizing angiogenesis. Statistics based on flow magnitude as well as its orientation can be exploited for this purpose. In this paper, we estimate the microbubble velocity fields from a standard bolus injection and provide a first statistical characterization by performing a spatial entropy analysis. By testing the method on 24 patients with biopsy-proven prostate cancer, we show that the proposed method can be applied effectively to clinically acquired DCE-US data. The method permits estimation of the in-plane flow vector fields and their local intricacy, and yields promising results (receiver-operating-characteristic curve area of 0.85) for the detection of prostate cancer.
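
    One simple way to quantify flow heterogeneity of the kind described is the Shannon entropy of binned velocity orientations: coherent flow concentrates in one bin (low entropy), chaotic flow spreads across bins (high entropy). The sketch below is an illustrative stand-in with made-up vectors, not the paper's spatial entropy analysis:

```python
import math

def orientation_entropy(vectors, n_bins=8):
    # Shannon entropy (bits) of the distribution of flow directions.
    counts = [0] * n_bins
    for vx, vy in vectors:
        angle = math.atan2(vy, vx) % (2 * math.pi)
        b = min(int(angle / (2 * math.pi) * n_bins), n_bins - 1)
        counts[b] += 1
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

# Coherent flow (vectors nearly aligned) vs. chaotic flow (all directions);
# angiogenic vasculature would sit closer to the chaotic end.
coherent = [(1.0, 0.05 * i) for i in range(8)]
chaotic = [(math.cos(2 * math.pi * k / 8 + 0.1),
            math.sin(2 * math.pi * k / 8 + 0.1)) for k in range(8)]

e_coherent = orientation_entropy(coherent)
e_chaotic = orientation_entropy(chaotic)
```

    In a spatial analysis this entropy would be computed per local window of the velocity field, yielding an intricacy map rather than a single number.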

  9. Mechanistic analysis of challenge-response experiments.

    PubMed

    Shotwell, M S; Drake, K J; Sidorov, V Y; Wikswo, J P

    2013-09-01

    We present an application of mechanistic modeling and nonlinear longitudinal regression in the context of biomedical response-to-challenge experiments, a field where these methods are underutilized. In this type of experiment, a system is studied by imposing an experimental challenge, and then observing its response. The combination of mechanistic modeling and nonlinear longitudinal regression has brought new insight, and revealed an unexpected opportunity for optimal design. Specifically, the mechanistic aspect of our approach enables the optimal design of experimental challenge characteristics (e.g., intensity, duration). This article lays some groundwork for this approach. We consider a series of experiments wherein an isolated rabbit heart is challenged with intermittent anoxia. The heart responds to the challenge onset, and recovers when the challenge ends. The mean response is modeled by a system of differential equations that describe a candidate mechanism for cardiac response to anoxia challenge. The cardiac system behaves more variably when challenged than when at rest. Hence, observations arising from this experiment exhibit complex heteroscedasticity and sharp changes in central tendency. We present evidence that an asymptotic statistical inference strategy may fail to adequately account for statistical uncertainty. Two alternative methods are critiqued qualitatively (i.e., for utility in the current context), and quantitatively using an innovative Monte-Carlo method. We conclude with a discussion of the exciting opportunities in optimal design of response-to-challenge experiments. © 2013, The International Biometric Society.

  10. WE-AB-209-12: Quasi Constrained Multi-Criteria Optimization for Automated Radiation Therapy Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, W.T.; Siebers, J.V.

    Purpose: To introduce quasi-constrained Multi-Criteria Optimization (qcMCO) for unsupervised radiation therapy optimization which generates alternative patient-specific plans emphasizing dosimetric tradeoffs and conformance to clinical constraints for multiple delivery techniques. Methods: For N Organs At Risk (OARs) and M delivery techniques, qcMCO generates M(N+1) alternative treatment plans per patient. Objective weight variations for OARs and targets are used to generate alternative qcMCO plans. For 30 locally advanced lung cancer patients, qcMCO plans were generated for dosimetric tradeoffs to four OARs: each lung, heart, and esophagus (N=4) and 4 delivery techniques (simple 4-field arrangements, 9-field coplanar IMRT, 27-field non-coplanar IMRT, and non-coplanar Arc IMRT). Quasi-constrained objectives included target prescription isodose to 95% (PTV-D95), maximum PTV dose (PTV-Dmax) < 110% of prescription, and spinal cord Dmax < 45 Gy. The algorithm's ability to meet these constraints while simultaneously revealing dosimetric tradeoffs was investigated. Statistically significant dosimetric tradeoffs were defined such that the coefficient of determination between dosimetric indices which varied by at least 5 Gy between different plans was >0.8. Results: The qcMCO plans varied mean dose by >5 Gy to ipsilateral lung for 24/30 patients, contralateral lung for 29/30 patients, esophagus for 29/30 patients, and heart for 19/30 patients. In the 600 plans computed without human interaction, average PTV-D95=67.4±3.3 Gy, PTV-Dmax=79.2±5.3 Gy, and spinal cord Dmax was >45 Gy in 93 plans (>50 Gy in 2/600 plans). Statistically significant dosimetric tradeoffs were evident in 19/30 plans, including multiple tradeoffs of at least 5 Gy between multiple OARs in 7/30 cases. The most common statistically significant tradeoff was increasing PTV-Dmax to reduce OAR dose (15/30 patients). 
Conclusion: The qcMCO method can conform to quasi-constrained objectives while revealing significant variations in OAR doses, including mean dose reductions >5 Gy. Clinical implementation will facilitate patient-specific decision making based on achievable dosimetry, as opposed to accept/reject models based on population-derived objectives.

  11. Making Sense of 'Big Data' in Provenance Studies

    NASA Astrophysics Data System (ADS)

    Vermeesch, P.

    2014-12-01

    Huge online databases can be 'mined' to reveal previously hidden trends and relationships in society. One could argue that sedimentary geology has entered a similar era of 'Big Data', as modern provenance studies routinely apply multiple proxies to dozens of samples. Just like the Internet, sedimentary geology now requires specialised statistical tools to interpret such large datasets. These can be organised on three levels of progressively higher order:

    A single sample: The most effective way to reveal the provenance information contained in a representative sample of detrital zircon U-Pb ages is probability density estimators such as histograms and kernel density estimates. The widely popular 'probability density plots' implemented in IsoPlot and AgeDisplay compound analytical uncertainty with geological scatter and are therefore invalid.

    Several samples: Multi-panel diagrams comprising many detrital age distributions or compositional pie charts quickly become unwieldy and uninterpretable. For example, if there are N samples in a study, then the number of pairwise comparisons between samples increases quadratically as N(N-1)/2. This is simply too much information for the human eye to process. To solve this problem, it is necessary to (a) express the 'distance' between two samples as a simple scalar and (b) combine all N(N-1)/2 such values in a single two-dimensional 'map', grouping similar samples and pulling apart dissimilar ones. This can be achieved using simple statistics-based dissimilarity measures and a standard statistical method called Multidimensional Scaling (MDS).

    Several methods: Suppose that we use four provenance proxies: bulk petrography, chemistry, heavy minerals and detrital geochronology. This will result in four MDS maps, each of which will likely show slightly different trends and patterns. To deal with such cases, it may be useful to use a related technique called 'three-way multidimensional scaling'. 
This results in two graphical outputs: an MDS map, and a map of 'weights' showing to what extent the different provenance proxies influence the horizontal and vertical axes of the MDS map. Thus, detrital data can inform the user not only about the provenance of sediments, but also about the causal relationships between mineralogy, geochronology and chemistry.
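The two-step recipe above (scalar dissimilarities, then a two-dimensional map) can be sketched in Python. This is an illustrative example on synthetic age distributions, using the Kolmogorov-Smirnov statistic as one possible dissimilarity measure and a classical (Torgerson) MDS embedding; it is not the author's implementation:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# Hypothetical detrital zircon U-Pb age samples (Ma) from four sediments:
# two share a young source, two share an old source.
samples = {
    "A": rng.normal(300, 30, 100),
    "B": rng.normal(310, 30, 100),
    "C": rng.normal(1000, 80, 100),
    "D": rng.normal(1020, 80, 100),
}
names = list(samples)
n = len(names)

# (a) Scalar dissimilarity: Kolmogorov-Smirnov distance for each of the
# N(N-1)/2 sample pairs, stored in a symmetric matrix.
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = ks_2samp(samples[names[i]], samples[names[j]]).statistic

# (b) Classical (Torgerson) MDS: double-centre the squared distances and
# take the two leading eigenvectors as map coordinates.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
w, v = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:2]
coords = v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

for name, xy in zip(names, coords):
    print(name, np.round(xy, 3))
```

On the resulting map, A plots near B and C near D, with the two clusters well separated — exactly the 'grouping similar, pulling apart dissimilar' behaviour described above.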

  12. Improved methods for distribution loss evaluation. Volume 1: analytic and evaluative techniques. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flinn, D.G.; Hall, S.; Morris, J.

This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution should be taken in using typical values (load factor, etc.) to ensure that all factors are referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests identified the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.

  13. Objective measurement of bread crumb texture

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Coles, Graeme D.

    1995-01-01

Evaluation of bread crumb texture plays an important role in judging bread quality. This paper discusses the application of image analysis methods to the objective measurement of the visual texture of bread crumb. The application of Fast Fourier Transform and mathematical morphology methods has been discussed by the authors in their previous work, and a commercial bread texture measurement system has been developed. Based on the nature of bread crumb texture, we compare the advantages and disadvantages of these two methods and of a third method based on features derived directly from statistics of edge density in local windows of the bread image. The analysis of the various methods and experimental results provides insight into the characteristics of the bread texture image and the interconnections between texture measurement algorithms. The usefulness of general stochastic process modelling of texture is thus revealed; it leads to more reliable and accurate evaluation of bread crumb texture. During the development of these methods, we also gained useful insights into how subjective judges form opinions about bread visual texture. These are discussed here.
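The edge-density feature mentioned above can be illustrated with a minimal sketch on a synthetic image; the gradient threshold and window size here are arbitrary illustrative choices, not those of the commercial system:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical grayscale bread crumb image (0-255), 64 x 64 pixels.
img = rng.integers(0, 256, (64, 64)).astype(float)

# Edge map from simple gradient magnitude thresholding.
gy, gx = np.gradient(img)
edges = np.hypot(gx, gy) > 60.0

# Edge density in non-overlapping 8 x 8 local windows: the fraction of
# edge pixels per window, a simple local texture feature.
w = 8
density = edges.reshape(64 // w, w, 64 // w, w).swapaxes(1, 2).mean(axis=(2, 3))

# Summary statistics of the local densities characterise the crumb:
# a finer crumb gives a higher mean density, a more uniform crumb a
# lower variance across windows.
print(round(density.mean(), 3), round(density.std(), 3))
```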

  14. Statistical lamb wave localization based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Harley, Joel B.

    2018-04-01

Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of these images typically represents an estimated damage location. Yet, it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
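The extreme-value idea can be sketched as follows: fit a Gumbel distribution to the maxima of noise-only images, then derive a constant false alarm rate threshold for declaring a pixel significant. The Rayleigh noise model and all numbers below are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical noise-only localization images: for each image, keep only
# its maximum pixel value, the statistic extreme value theory describes.
noise_maxima = rng.rayleigh(1.0, (500, 256)).max(axis=1)

# Fit a Gumbel distribution to the maxima by the method of moments.
beta = np.sqrt(6) * noise_maxima.std() / np.pi
mu = noise_maxima.mean() - 0.5772 * beta  # Euler-Mascheroni constant

# Constant false alarm rate threshold: P(max > t) = alpha under the fit.
alpha = 0.01
threshold = mu - beta * np.log(-np.log(1 - alpha))

# A pixel exceeding this threshold is a statistically significant
# damage indication at the chosen false alarm rate.
print(round(threshold, 3))
```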

  15. Evaluation of PDA Technical Report No 33. Statistical Testing Recommendations for a Rapid Microbiological Method Case Study.

    PubMed

    Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David

    2015-01-01

New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report No. 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being evaluated for water bioburden testing. The results presented demonstrate that the statistical methods described in the PDA Technical Report No. 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation of equivalence to the standard method. The rapid microbiological method was generally able to pass the requirements of PDA Technical Report No. 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same conclusions of similarity or difference as the standard method. © PDA, Inc. 2015.
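As one concrete illustration of equivalence testing between a rapid and a standard method, here is a minimal two one-sided tests (TOST) sketch on hypothetical paired log10 counts. The 0.5-log equivalence margin and all data are assumed for illustration; this is not necessarily the exact procedure specified in Technical Report No. 33:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical paired log10 CFU counts: rapid vs. standard method on the
# same water samples (not the case study's data set).
standard = rng.normal(2.00, 0.10, 30)
rapid = standard + rng.normal(0.02, 0.05, 30)
diff = rapid - standard

# TOST: the methods are declared equivalent if the mean difference lies
# within +/- 0.5 log10 (an assumed equivalence margin).
margin = 0.5
se = diff.std(ddof=1) / np.sqrt(len(diff))
t_low = (diff.mean() + margin) / se
t_high = (diff.mean() - margin) / se
dfree = len(diff) - 1
p_low = stats.t.sf(t_low, dfree)    # H0: mean diff <= -margin
p_high = stats.t.cdf(t_high, dfree) # H0: mean diff >= +margin
p_tost = max(p_low, p_high)
print("equivalent at 5% level:", p_tost < 0.05)
```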

  16. Recent statistical methods for orientation data

    NASA Technical Reports Server (NTRS)

    Batschelet, E.

    1972-01-01

The application of statistical methods in the study of animal orientation and navigation is discussed. The methods employed are limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.

  17. Statistical methods and errors in family medicine articles between 2010 and 2014-Suez Canal University, Egypt: A cross-sectional study

    PubMed Central

    Nour-Eldein, Hebatallah

    2016-01-01

Background: Given the limited statistical knowledge of most physicians, it is not uncommon to find statistical errors in research articles. Objectives: To determine the statistical methods used and to assess the statistical errors in family medicine (FM) research articles published between 2010 and 2014. Methods: This was a cross-sectional study. All 66 FM research articles published over 5 years by FM authors affiliated with Suez Canal University were screened by the researcher between May and August 2015. Types and frequencies of statistical methods were reviewed in all 66 FM articles. All 60 articles with identified inferential statistics were examined for statistical errors and deficiencies. A comprehensive 58-item checklist based on statistical guidelines was used to evaluate the statistical quality of the FM articles. Results: Inferential methods were recorded in 62/66 (93.9%) of the FM articles, and advanced analyses were used in 29/66 (43.9%). Contingency tables (38/66, 57.6%), regression (logistic, linear; 26/66, 39.4%), and t-tests (17/66, 25.8%) were the most commonly used inferential tests. Within the 60 FM articles with identified inferential statistics, the errors and deficiencies were: no prior sample size calculation, 19/60 (31.7%); application of wrong statistical tests, 17/60 (28.3%); incomplete documentation of statistics, 59/60 (98.3%); reporting a P value without the test statistic, 32/60 (53.3%); no reporting of confidence intervals with effect size measures, 12/60 (20.0%); use of mean (standard deviation) to describe ordinal/nonnormal data, 8/60 (13.3%); and errors of interpretation, mainly conclusions unsupported by the study data, 5/60 (8.3%). Conclusion: Inferential statistics were used in the majority of FM articles. Data analysis and the reporting of statistics are areas for improvement in FM research articles. PMID:27453839

  18. Salt preference: age and sex related variability.

    PubMed

    Verma, Punam; Mittal, Sunita; Ghildiyal, Archana; Chaudhary, Lalita; Mahajan, K K

    2007-01-01

Salt preference was assessed in 60 adults aged 18-21 yr (30 males and 30 females) and in 60 children aged 7-12 yr (30 boys and 30 girls). Subjects rated their preference on a Likert scale for popcorn of five salt concentrations (0M, 1M, 2M, 3M and +3M). Statistical analysis using two-way ANOVA revealed a statistically significant effect of age and sex on salt preference (F4,100 = 15.027, P < 0.01), and one-way ANOVA revealed a statistically significant sex difference in the salt preference of adults (F4,50 = 16.26, P < 0.01) but no statistically significant sex difference in the salt preference of children (F4,50 = 4.08, P > 0.05). Dietary experiences during development and greater physical activity may be responsible for the higher salt preference in children, while the absence of sex variability in children supports a role for sex hormones in the salt preference of males and females.
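A one-way ANOVA of the kind reported above can be run in a few lines with scipy; the ratings below are invented for illustration and are not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical Likert preference ratings (1-5) for one salt concentration,
# 30 subjects per group; numbers are illustrative only.
adult_men = rng.integers(2, 5, 30)    # values 2-4
adult_women = rng.integers(1, 4, 30)  # values 1-3
boys = rng.integers(3, 6, 30)         # values 3-5
girls = rng.integers(3, 6, 30)        # values 3-5

# One-way ANOVA across the four age/sex groups.
f_stat, p_val = stats.f_oneway(adult_men, adult_women, boys, girls)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
```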

  19. Comparative performance evaluation of automated segmentation methods of hippocampus from magnetic resonance images of temporal lobe epilepsy patients

    PubMed Central

    Hosseini, Mohammad-Parsa; Nazem-Zadeh, Mohammad-Reza; Pompili, Dario; Jafari-Khouzani, Kourosh; Elisevich, Kost; Soltanian-Zadeh, Hamid

    2016-01-01

Purpose: Segmentation of the hippocampus from magnetic resonance (MR) images is a key task in the evaluation of mesial temporal lobe epilepsy (mTLE) patients. Several automated algorithms have been proposed, although manual segmentation remains the benchmark. Choosing a reliable algorithm is problematic since structural definition pertaining to multiple edges, missing and fuzzy boundaries, and shape changes varies among mTLE subjects. Lack of statistical references and guidance for quantifying the reliability and reproducibility of automated techniques has further detracted from automated approaches. The purpose of this study was to develop a systematic and statistical approach using a large dataset for the evaluation of automated methods and to establish a method that would achieve results better approximating those attained by manual tracing in the epileptogenic hippocampus. Methods: A template database of 195 (81 males, 114 females; age range 32–67 yr, mean 49.16 yr) MR images of mTLE patients was used in this study. Hippocampal segmentation was accomplished manually, by two well-known tools (FreeSurfer and HAMMER), and by two previously published methods developed at the authors' institution [Automatic brain structure segmentation (ABSS) and LocalInfo]. To establish which method performed better for mTLE cases, several voxel-based, distance-based, and volume-based performance metrics were considered. Statistical validations of the results using automated techniques were compared with the results of benchmark manual segmentation. Extracted metrics were analyzed to find the method that provided results most similar to the benchmark. Results: Among the four automated methods, ABSS generated the most accurate results. 
For this method, the Dice coefficient was 5.13%, 14.10%, and 16.67% higher, Hausdorff distance was 22.65%, 86.73%, and 69.58% lower, precision was 4.94%, −4.94%, and 12.35% higher, and the root mean square (RMS) error was 19.05%, 61.90%, and 65.08% lower than for LocalInfo, FreeSurfer, and HAMMER, respectively. The Bland–Altman similarity analysis revealed a low bias for the ABSS and LocalInfo techniques compared to the others. Conclusions: The ABSS method for automated hippocampal segmentation outperformed the other methods, best approximating what could be achieved by manual tracing. This study also shows that four categories of input data can cause automated segmentation methods to fail: incomplete studies, artifacts, low signal-to-noise ratio, and inhomogeneity. Different scanner platforms and pulse sequences were considered as means by which to improve reliability of the automated methods. Other modifications were specially devised to enhance a particular method assessed in this study. PMID:26745947
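Two of the performance metrics named above, the Dice coefficient and the Hausdorff distance, can be computed as follows for a pair of toy segmentation masks (illustrative only, not the study's data):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Two hypothetical binary segmentation masks (True = hippocampus voxel),
# one shifted by one voxel relative to the other.
auto = np.zeros((20, 20), dtype=bool)
manual = np.zeros((20, 20), dtype=bool)
auto[5:15, 5:15] = True
manual[6:16, 6:16] = True

# Dice coefficient: voxel-overlap agreement between the two masks.
dice = 2 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())

# Symmetric Hausdorff distance between the two sets of mask coordinates.
pa, pm = np.argwhere(auto), np.argwhere(manual)
hausdorff = max(directed_hausdorff(pa, pm)[0], directed_hausdorff(pm, pa)[0])

print(f"Dice = {dice:.3f}, Hausdorff = {hausdorff:.3f}")
```

For these masks the overlap is 81 of 100 voxels each, giving Dice = 0.81, and the worst-case boundary mismatch is the diagonal voxel offset, sqrt(2).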

  20. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry-driven proteomics experiments are frequently performed with few biological or technical replicates, due to sample scarcity, duty-cycle or sensitivity constraints, or the limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes at the peptide level, for example in phosphoproteomics experiments. In order to assess the extent of this problem and its implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets.
This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
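A minimal sketch of the rank products idea adapted to missing values, on simulated data: each replicate is ranked among its observed values only, and the geometric mean of the available (normalised) ranks is taken per feature. This illustrates the general approach, not the authors' R implementation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical log fold-changes: 100 features x 3 replicates; the first
# 5 features are truly up-regulated, and ~20% of values are missing.
data = rng.normal(0, 1, (100, 3))
data[:5] += 3
mask = rng.random(data.shape) < 0.2
mask[mask.all(axis=1), 0] = False  # keep at least one observation per feature
data[mask] = np.nan

# Rank each replicate among its observed values only (rank 1 = largest),
# normalised by the number of observed values in that replicate.
ranks = np.full(data.shape, np.nan)
for j in range(data.shape[1]):
    obs = ~np.isnan(data[:, j])
    order = np.argsort(-data[obs, j])
    r = np.empty(order.size)
    r[order] = np.arange(1, order.size + 1)
    ranks[obs, j] = r / order.size

# Rank product per feature: geometric mean of the available ranks.
# Truly changing features get rank products close to zero.
rp = np.exp(np.nanmean(np.log(ranks), axis=1))

print("top-ranked features:", np.argsort(rp)[:5])
```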

  1. Homogenising time series: Beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2010-09-01

For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool of quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics closely approximate those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: (a) Can statistically detected change-points be accepted only with the confirmation of metadata information? (b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? (c) Is it good to limit the spatial comparison of a candidate series to up to five other series in its neighbourhood? Empirical results, from the COST benchmark and other experiments, show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities look like part of the climatic variability, so the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. 
The developers and users of homogenisation methods have to bear in mind that the eventual purpose of homogenisation is not to find change-points, but to obtain observed time series whose statistical properties characterise well the climate change and climate variability.
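A single change-point of the kind homogenisation methods search for can be detected with the classic standard normal homogeneity test (SNHT) statistic; the series below is synthetic, with one artificial break inserted for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical candidate-minus-reference temperature series with one
# artificial break of +0.8 inserted at position 60.
n = 100
q = rng.normal(0, 0.3, n)
q[60:] += 0.8

# SNHT statistic: standardise the series, then for each candidate break
# position k compare the mean before and after k.
z = (q - q.mean()) / q.std(ddof=1)
T = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
              for k in range(1, n)])
k_hat = int(np.argmax(T)) + 1  # most likely break position

print("detected break at", k_hat, "with T =", round(T.max(), 1))
```

A large maximum of T (compared against tabulated critical values) indicates a statistically detected change-point at k_hat.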

  2. A comparison of two microscale laboratory reporting methods in a secondary chemistry classroom

    NASA Astrophysics Data System (ADS)

    Martinez, Lance Michael

This study attempted to determine whether there was a difference between the laboratory achievement of students who used a modified reporting method and those who used traditional laboratory reporting. The study also determined the relationships between laboratory performance scores and the independent variables: score on the Group Assessment of Logical Thinking (GALT) test, chronological age in months, gender, and ethnicity for each of the treatment groups. The study was conducted using 113 high school students enrolled in first-year general chemistry classes at Pueblo South High School in Colorado. The research design used was the quasi-experimental Nonequivalent Control Group Design. The statistical treatment consisted of Multiple Regression Analysis and the Analysis of Covariance. Based on the GALT, students in the two groups were generally in the concrete and transitional stages of the Piagetian cognitive levels. The findings of the study revealed that the traditional and modified methods of laboratory reporting did not have any effect on the laboratory performance outcomes of the subjects. However, the students who used the traditional method of reporting showed higher laboratory performance scores when evaluation was conducted using the New Standards rubric recommended by the state. Multiple Regression Analysis revealed a significant relationship between the criterion variable, student laboratory performance outcome of individuals who employed traditional laboratory reporting methods, and the composite set of predictor variables. In contrast, there was no significant relationship between the criterion variable, student laboratory performance outcome of individuals who employed modified laboratory reporting methods, and the composite set of predictor variables.

  3. Method validation and determinations of levofloxacin, metronidazole and sulfamethoxazole in an aqueous pharmaceutical, urine and blood plasma samples using quantitative nuclear magnetic resonance spectrometry.

    PubMed

    Salem, Alaa A; Mossa, Hussein A

    2012-01-15

A selective, rapid and accurate quantitative proton nuclear magnetic resonance (qHNMR) method for the determination of levofloxacin, metronidazole benzoate and sulfamethoxazole in aqueous solutions was developed and validated. The method was successfully applied to the determination of the drugs and their admixtures in pharmaceutical, urine and plasma samples. Maleic acid and sodium malate were used as internal standards. The effect of temperature on spectral measurements was evaluated. Linear dynamic ranges of 0.50-68.00, 0.13-11.30 and 0.24-21.00 mg per 0.60 mL solution were obtained for levofloxacin, metronidazole benzoate and sulfamethoxazole, respectively. Average recoveries in the range of 96.00-104.20% ± (0.17-2.91) were obtained for the drugs in pure, pharmaceutical, plasma and urine samples. Inter- and intra-day analyses gave average recoveries in the ranges 96.10-98.40% ± (1.68-2.81) and 96.00-104.20% ± (0.17-2.91), respectively. Instrumental detection limits of ≤0.03 mg per 0.6 mL were obtained for the three drugs. The developed method demonstrated high performance characteristics for analyzing the investigated drugs and their admixtures. A Student t-test at the 95% confidence level revealed insignificant bias between the real and measured contents of the investigated drugs and their admixtures in pure, pharmaceutical, urine and plasma samples. Application of the statistical F-test revealed insignificant differences in precision between the developed method and arbitrarily selected reference methods. Copyright © 2011 Elsevier B.V. All rights reserved.
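The F-test comparison of precisions mentioned above amounts to a ratio of sample variances; here is a minimal sketch with hypothetical replicate recoveries, not the paper's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical replicate recoveries (%) of one drug by the qHNMR method
# and by a reference method (illustrative numbers only).
qhnmr = rng.normal(100, 1.5, 10)
reference = rng.normal(100, 1.8, 10)

# Two-sided F-test for equality of variances (i.e. of method precisions).
f_stat = np.var(qhnmr, ddof=1) / np.var(reference, ddof=1)
dfn = dfd = len(qhnmr) - 1
p_val = 2 * min(stats.f.cdf(f_stat, dfn, dfd), stats.f.sf(f_stat, dfn, dfd))
print(f"F = {f_stat:.2f}, p = {p_val:.3f}")
```

A p value above the chosen significance level indicates no detectable difference in precision between the two methods.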

  4. Does bad inference drive out good?

    PubMed

    Marozzi, Marco

    2015-07-01

The (mis)use of statistics in practice is widely debated, and a field where the debate is particularly active is medicine. Many scholars emphasize that a large proportion of published medical research contains statistical errors. It has been noted that top-class journals like Nature Medicine and The New England Journal of Medicine publish a considerable proportion of papers that contain statistical errors and poorly document the application of statistical methods. This paper joins the debate on the (mis)use of statistics in the medical literature. Even though the validation process of a statistical result may be quite elusive, a careful assessment of underlying assumptions is central in medicine as well as in other fields where a statistical method is applied. Unfortunately, a careful assessment of underlying assumptions is missing in many papers, including those published in top-class journals. In this paper, it is shown that nonparametric methods are good alternatives to parametric methods when the assumptions for the latter are not satisfied. A key step toward solving the problem of the misuse of statistics in the medical literature is for all journals to have their own statisticians review the statistical method/analysis section of each submitted paper. © 2015 Wiley Publishing Asia Pty Ltd.

  5. Effectiveness of planned teaching program on knowledge regarding Alzheimer's disease among the family members of elderly in a selected urban community at Mangalore

    PubMed Central

    Rodrigues, Lavina; Mathias, Thereza

    2016-01-01

Background: Alzheimer's disease is one of the debilitating chronic diseases among older persons. It is an irreversible condition that leads to progressive deterioration of cognitive, intellectual, physical, and psychosocial functions. The study aimed to assess the knowledge of family members of the elderly regarding Alzheimer's disease in a selected urban community at Mangalore. Materials and Methods: A preexperimental research design of one-group pretest and posttest with an evaluative approach was adopted for the study. A total of 50 family members of the elderly who met the inclusion criteria were selected through a purposive sampling technique. The researcher developed a planned teaching program on Alzheimer's disease, and a structured knowledge questionnaire on Alzheimer's disease was used to collect the data. Results: Descriptive and inferential statistics were used to analyze the data. Analysis revealed that the mean posttest knowledge score (20.78 ± 3.31) was higher than the mean pretest knowledge score (12.90 ± 2.43). The significance of the difference between pretest and posttest was tested using a paired t-test and was found to be highly significant (t = 40.85, P < 0.05). The majority of the variables showed no significant association between pretest and posttest knowledge scores and demographic variables. Conclusion: The findings revealed that the planned teaching program is an effective strategy for improving the knowledge of the subjects. PMID:26985104
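A paired t-test of the kind used for the pretest/posttest comparison above can be reproduced in a few lines; the scores below are invented for illustration and are not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical pretest/posttest knowledge scores for ten subjects
# (same subjects measured twice, so a paired test is appropriate).
pretest = np.array([12, 14, 10, 13, 15, 11, 12, 14, 13, 12])
posttest = np.array([20, 22, 18, 21, 24, 19, 20, 23, 21, 20])

# Paired t-test on the within-subject differences.
t_stat, p_val = stats.ttest_rel(posttest, pretest)
print(f"t = {t_stat:.2f}, p = {p_val:.2e}")
```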

  6. Natural variability in bovine milk oligosaccharides from Danish Jersey and Holstein-Friesian breeds

    PubMed Central

    Sundekilde, Ulrik K; Barile, Daniela; Meyrand, Mickael; Poulsen, Nina A; Larsen, Lotte B; Lebrilla, Carlito B.; Bruce, German J.; Bertram, Hanne C

    2012-01-01

Free oligosaccharides are key components of human milk and play multiple roles in the health of the neonate, by stimulating growth of selected beneficial bacteria in the gut, participating in development of the brain and exerting anti-pathogenic activity. However, the concentration of oligosaccharides is low in mature bovine milk, normally used for infant formula, compared with both human colostrum and mature human milk. Characterization of bovine milk oligosaccharides in different breeds is crucial for the identification of viable sources for oligosaccharide purification. An improved source of oligosaccharides can lead to infant formula with improved oligosaccharide functionality. In the present study we analyzed milk oligosaccharides by high-performance liquid chromatography chip quadrupole time-of-flight mass spectrometry and performed a detailed data analysis using both univariate and multivariate methods. Both statistical approaches revealed several differences in oligosaccharide profiles between milk samples from the two Danish breeds: Jersey and Holstein-Friesian. Jersey milk contained higher relative amounts of both sialylated and the more complex neutral fucosylated oligosaccharides, while Holstein-Friesian milk had a higher abundance of smaller and simpler neutral oligosaccharides. The statistical analyses revealed that Jersey milk contains significantly higher levels of fucosylated oligosaccharides than Holstein-Friesian milk. Jersey milk also possesses oligosaccharides with a higher degree of complexity and more functional residues (fucose and sialic acid), suggesting it may offer advantages in terms of a wider array of bioactivities. PMID:22632419

  7. Consequences of nursing procedures measurement on job satisfaction

    PubMed Central

    Khademol-hoseyni, Seyyed Mohammad; Nouri, Jamileh Mokhtari; Khoshnevis, Mohammad Ali; Ebadi, Abbas

    2013-01-01

Background: Job satisfaction among nurses has consequences for the quality of nursing care and for organizational commitment. Nursing procedure measurement (NPM) is one of the essential parts of a performance-oriented system. This research was performed in order to determine the rate of job satisfaction in selected wards of Baqiyatallah (a.s.) Hospital before and after NPM. Materials and Methods: An interventional study with an evaluative approach was designed, in which job satisfaction was measured before and after NPM over 2 months in selected wards using a census sampling procedure. The questionnaire contained two major parts: demographic data and questions regarding job satisfaction, salary, and fringe benefits. Data were analyzed with SPSS version 13. Results: Statistical evaluation did not reveal a significant difference between demographic data and satisfaction and/or dissatisfaction of nurses (before and after NPM). Following NPM, the rate of dissatisfaction with salary and benefits decreased by up to 5% and the rate of satisfaction increased by about 1.5%; however, the statistical tests did not reveal a significant difference. Subsequent to NPM, the rating of job value increased (P = 0.019), whereas the rating of job comfort decreased (P = 0.033) significantly. Conclusions: Measuring procedures do not affect the job satisfaction of ward staff or their satisfaction with salary and benefits. It is therefore suggested that satisfaction be measured after nurses' salaries and benefits have been adjusted based on NPM. PMID:23983741

  8. Diabetes Mellitus and Latent Tuberculosis Infection: A Systematic Review and Metaanalysis

    PubMed Central

    Lee, Meng-Rui; Huang, Ya-Ping; Kuo, Yu-Ting; Luo, Chen-Hao; Shih, Yun-Ju; Shu, Chin-Chung; Wang, Jann-Yuan; Ko, Jen-Chung; Yu, Chong-Jen

    2017-01-01

    Abstract Background. Despite the well-documented association between diabetes and active tuberculosis, evidence of the association between diabetes and latent tuberculosis infection (LTBI) remains limited and inconsistent. Methods. We included observational studies that applied either the tuberculin skin test or the interferon gamma release assay for diagnosis of LTBI and that provided an adjusted effect estimate for the association between diabetes and LTBI. We searched PubMed and EMBASE through 31 January 2016. The risk of bias of included studies was assessed using a quality assessment tool modified from the Newcastle-Ottawa scale. Results. Thirteen studies (1 cohort study and 12 cross-sectional studies) were included, involving 38,263 participants. The cohort study revealed an increased but nonsignificant risk of LTBI among diabetics (risk ratio, 4.40; 95% confidence interval [CI], 0.50–38.55). For the cross-sectional studies, the pooled odds ratio from the random-effects model was 1.18 (95% CI, 1.06–1.30), with small statistical heterogeneity across studies (I2 = 3.5%). The risk of bias assessment revealed several methodological issues, but the overall direction of the biases would reduce the positive causal association between diabetes and LTBI. Conclusions. Diabetes was associated with a small but statistically significant risk for LTBI. Findings from this review could be used to inform future cost-effectiveness analyses of the impact of LTBI screening programs among diabetics. PMID:27986673
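The random-effects pooling behind such a meta-analysis can be sketched with the DerSimonian-Laird estimator; the odds ratios and confidence intervals below are hypothetical, not the thirteen included studies:

```python
import numpy as np

# Hypothetical adjusted odds ratios (with 95% CIs) for diabetes and LTBI
# from five cross-sectional studies; values are illustrative only.
or_est = np.array([1.10, 1.25, 0.95, 1.30, 1.15])
ci_low = np.array([0.90, 1.00, 0.70, 1.05, 0.92])
ci_high = np.array([1.35, 1.56, 1.29, 1.61, 1.44])

# Work on the log scale; recover each study's variance from its CI width.
y = np.log(or_est)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1 / se ** 2

# DerSimonian-Laird random-effects model: estimate the between-study
# variance tau2 from Cochran's Q, then pool with adjusted weights.
y_fixed = (w * y).sum() / w.sum()
Q = (w * (y - y_fixed) ** 2).sum()
dfree = len(y) - 1
tau2 = max(0.0, (Q - dfree) / (w.sum() - (w ** 2).sum() / w.sum()))
w_star = 1 / (se ** 2 + tau2)
y_pooled = (w_star * y).sum() / w_star.sum()
se_pooled = 1 / np.sqrt(w_star.sum())
pooled_or = np.exp(y_pooled)
i2 = max(0.0, (Q - dfree) / Q) * 100  # heterogeneity statistic, in percent

print("pooled OR:", round(pooled_or, 2), " I2 (%):", round(i2, 1))
```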

  9. The Relationship between Psychological Wellbeing and Body Image in Pregnant Women.

    PubMed

    Fahami, Fariba; Amini-Abchuyeh, Maryam; Aghaei, Asghar

    2018-01-01

The aim of the present study was to determine the association between body image and psychological wellbeing during pregnancy. This descriptive correlational study was conducted on 320 pregnant women who were referred to health centers in Isfahan, Iran, during 2016 and met the inclusion criteria. They were selected by nonprobability convenience sampling. Data were gathered using standard psychological wellbeing and body image satisfaction questionnaires, and were analyzed using Statistical Package for the Social Sciences software by descriptive and inferential statistical methods. The results showed that the mean (SD) score of psychological wellbeing among participants was 77.50 (10.10) and their mean (SD) score of satisfaction with body image was 89.30 (14.60). Moreover, the results revealed a positive and significant relationship between the scores of psychological wellbeing and body image satisfaction (r = 0.354, p < 0.001). The results of regression analysis showed that the two variables of self-acceptance (t = 5.6, p < 0.001) and personal growth (t = 2.06, p = 0.04) can predict body image in pregnant women. The findings revealed a significant positive relationship between body image satisfaction and psychological wellbeing. Therefore, training a positive attitude toward body image or increasing knowledge of psychological wellbeing can create a positive cycle between these variables and thus make pregnancy more enjoyable and acceptable.

  10. Evapotranspiration variability and its association with vegetation dynamics in the Nile Basin, 2002–2011

    USGS Publications Warehouse

    Alemu, Henok; Senay, Gabriel B.; Kaptue, Armel T.; Kovalskyy, Valeriy

    2014-01-01

    Evapotranspiration (ET) is a vital component in land-atmosphere interactions. In drylands, over 90% of annual rainfall evaporates. The Nile Basin in Africa is about 42% dryland, in a region experiencing rapid population growth and development. The relationship of ET with climate, vegetation, and land cover in the basin during 2002–2011 is analyzed using thermal-based Simplified Surface Energy Balance Operational (SSEBop) ET, Normalized Difference Vegetation Index (NDVI)-based MODIS Terrestrial (MOD16) ET, MODIS-derived NDVI as a proxy for vegetation productivity, and rainfall from the Tropical Rainfall Measuring Mission (TRMM). Interannual variability and trends are analyzed using established statistical methods. Analysis based on thermal-based ET revealed that >50% of the study area exhibited negative ET anomalies in 7 years (2009, driest), while >60% exhibited positive ET anomalies in 3 years (2007, wettest). NDVI-based monthly ET correlated more strongly with vegetation (r > 0.77) than thermal-based ET did (0.52 < r < 0.73) at p < 0.001. Climate-zone averaged thermal-based ET anomalies positively correlated (r = 0.6, p < 0.05) with rainfall in 4 of the 9 investigated climate zones. Thermal-based and NDVI-based ET estimates revealed minor discrepancies over rainfed croplands (60 mm/yr higher for thermal-based ET), but a significant divergence over wetlands (440 mm/yr higher for thermal-based ET). Only 5% of the study area exhibited statistically significant trends in ET.
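    The anomaly analysis described here rests on a simple idea: express each year's ET as a departure from the long-term mean, scaled by the long-term variability. A minimal sketch with hypothetical annual basin-mean ET values (not the SSEBop results):

    ```python
    from statistics import mean, stdev

    def standardized_anomalies(series):
        """z-score anomalies: (value - long-term mean) / long-term std."""
        m, s = mean(series), stdev(series)
        return [(v - m) / s for v in series]

    # Hypothetical annual basin-mean ET (mm/yr), for illustration only.
    annual_et = [620, 640, 610, 655, 600, 665, 690, 615, 580, 630]
    anoms = standardized_anomalies(annual_et)
    driest = anoms.index(min(anoms))   # year with the most negative anomaly
    wettest = anoms.index(max(anoms))  # year with the most positive anomaly
    ```

    Applied per pixel, the sign of each year's anomaly is what allows statements such as ">50% of the study area exhibited negative ET anomalies" in a given year.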

  11. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  12. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  13. Permeability evaluation after decay removal in primary teeth with current caries-excavation techniques.

    PubMed

    Shabzendedar, Mahbobeh; Moosavi, Horieh; Talbi, Maryam; Sharifi, Marjan

    2011-11-01

    The goal of the study was to evaluate the effect of caries removal by three different methods on the permeability of class II composite resin restorations in primary molar teeth. Forty-five recently extracted primary molars were randomly assigned to three groups, one per caries-removal method: group 1, mechanical; group 2, caries detector dye; and group 3, Carisolv (n = 15 each). Class II cavities in all groups were then restored with an adhesive (Opti Bond Solo Plus), applied according to the manufacturer's instructions, and a posterior composite (Herculite XRV), placed incrementally. After 24 hours the samples were thermocycled in water for 500 cycles between 5 and 55°C with a dwell time of 30 sec. Permeability was assessed by the fluid filtration method. The data were analyzed using the ANOVA test, and the study groups were compared with Tukey's test for statistically significant differences at a 5% significance level. The highest (0.80) and lowest (0.37) mean permeability values were observed in groups 2 and 3, respectively. A significant difference was revealed among the tested groups (p = 0.045). The comparison of the Carisolv and caries detector dye groups indicated a statistically significant difference (p = 0.037). There was no significant difference between either the Carisolv or the caries detector dye group and the conventional group. Using the chemomechanical and staining methods for caries removal had no more detrimental effect on permeability than the conventional technique. However, caries detector dye for caries removal could be more harmful than the chemomechanical method. None of the current caries-excavation techniques could eliminate permeability in class II composite resin restorations. Furthermore, staining methods do not have an adverse effect on sealing ability in comparison to the conventional technique.
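    The one-way ANOVA used to compare the three groups boils down to a ratio of between-group to within-group variance. A minimal sketch with hypothetical permeability readings (not the study's measurements); a full analysis would also run a post hoc test such as Tukey's HSD on the pairwise means:

    ```python
    def anova_f(groups):
        """One-way ANOVA F statistic and its degrees of freedom."""
        k = len(groups)
        n = sum(len(g) for g in groups)
        grand = sum(sum(g) for g in groups) / n
        # Sum of squares between groups (group means vs grand mean).
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                         for g in groups)
        # Sum of squares within groups (values vs their group mean).
        ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                        for g in groups)
        df_b, df_w = k - 1, n - k
        return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

    # Hypothetical permeability readings per group, for illustration only.
    mechanical = [0.55, 0.60, 0.58]
    dye        = [0.78, 0.82, 0.80]
    carisolv   = [0.36, 0.40, 0.38]
    f_stat, df_b, df_w = anova_f([mechanical, dye, carisolv])
    ```

    The resulting F statistic is compared against an F distribution with (df_b, df_w) degrees of freedom to obtain the p value reported in the record.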

  14. 78 FR 70059 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... (as opposed to quantitative statistical methods). In consultation with research experts, we have... qualitative interviews (as opposed to quantitative statistical methods). In consultation with research experts... utilization of qualitative interviews (as opposed to quantitative statistical methods). In consultation with...

  15. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that are applied to the management, control, and improvement of processes, for the purpose of analyzing technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. To carry out the analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  16. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  17. Comparative performance evaluation of automated segmentation methods of hippocampus from magnetic resonance images of temporal lobe epilepsy patients.

    PubMed

    Hosseini, Mohammad-Parsa; Nazem-Zadeh, Mohammad-Reza; Pompili, Dario; Jafari-Khouzani, Kourosh; Elisevich, Kost; Soltanian-Zadeh, Hamid

    2016-01-01

    Segmentation of the hippocampus from magnetic resonance (MR) images is a key task in the evaluation of mesial temporal lobe epilepsy (mTLE) patients. Several automated algorithms have been proposed, although manual segmentation remains the benchmark. Choosing a reliable algorithm is problematic since structural definition pertaining to multiple edges, missing and fuzzy boundaries, and shape changes varies among mTLE subjects. Lack of statistical references and guidance for quantifying the reliability and reproducibility of automated techniques has further detracted from automated approaches. The purpose of this study was to develop a systematic and statistical approach using a large dataset for the evaluation of automated methods and establish a method that would achieve results better approximating those attained by manual tracing in the epileptogenic hippocampus. A template database of 195 (81 males, 114 females; age range 32-67 yr, mean 49.16 yr) MR images of mTLE patients was used in this study. Hippocampal segmentation was accomplished manually and by two well-known tools (FreeSurfer and hammer) and two previously published methods developed at the authors' institution [Automatic brain structure segmentation (ABSS) and LocalInfo]. To establish which method performed better for mTLE cases, several voxel-based, distance-based, and volume-based performance metrics were considered. Statistical validations of the results using automated techniques were compared with the results of benchmark manual segmentation. Extracted metrics were analyzed to find the method that provided a result more similar to the benchmark. Among the four automated methods, ABSS generated the most accurate results.
For this method, the Dice coefficient was 5.13%, 14.10%, and 16.67% higher, Hausdorff was 22.65%, 86.73%, and 69.58% lower, precision was 4.94%, -4.94%, and 12.35% higher, and the root mean square (RMS) was 19.05%, 61.90%, and 65.08% lower than LocalInfo, FreeSurfer, and hammer, respectively. The Bland-Altman similarity analysis revealed a low bias for the ABSS and LocalInfo techniques compared to the others. The ABSS method for automated hippocampal segmentation outperformed other methods, best approximating what could be achieved by manual tracing. This study also shows that four categories of input data can cause automated segmentation methods to fail. They include incomplete studies, artifact, low signal-to-noise ratio, and inhomogeneity. Different scanner platforms and pulse sequences were considered as means by which to improve reliability of the automated methods. Other modifications were specially devised to enhance a particular method assessed in this study.
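    Two of the metrics this record compares, the Dice coefficient and the Hausdorff distance, have compact definitions over the segmented voxel sets. A minimal sketch on toy 2D "masks" standing in for 3D hippocampal segmentations (the coordinates are invented for illustration):

    ```python
    import math

    def dice(a, b):
        """Dice overlap between two voxel sets: 2|A∩B| / (|A| + |B|)."""
        return 2 * len(a & b) / (len(a) + len(b))

    def hausdorff(a, b):
        """Symmetric Hausdorff distance between two point sets:
        the largest distance from any point in one set to the
        nearest point in the other."""
        def directed(p, q):
            return max(min(math.dist(u, v) for v in q) for u in p)
        return max(directed(a, b), directed(b, a))

    # Toy voxel sets: manual tracing vs an automated segmentation.
    manual = {(0, 0), (0, 1), (1, 0), (1, 1)}
    auto   = {(0, 0), (0, 1), (1, 1), (2, 1)}
    d = dice(manual, auto)       # 2*3 / (4+4) = 0.75
    h = hausdorff(manual, auto)  # 1.0: each stray voxel is 1 unit away
    ```

    Higher Dice and lower Hausdorff both indicate closer agreement with the manual benchmark, which is why ABSS's higher Dice and lower Hausdorff values mark it as the best-performing method here.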

  18. Antarctic Temperature Extremes from MODIS Land Surface Temperatures: New Processing Methods Reveal Data Quality Puzzles

    NASA Astrophysics Data System (ADS)

    Grant, G.; Gallaher, D. W.

    2017-12-01

    New methods for processing massive remotely sensed datasets are used to evaluate Antarctic land surface temperature (LST) extremes. Data from the MODIS/Terra sensor (Collection 6) provide a twice-daily look at Antarctic LSTs over a 17 year period, at a higher spatiotemporal resolution than past studies. Using a data condensation process that creates databases of anomalous values, our pipeline creates statistical images of Antarctic LSTs. In general, the results find few significant trends in extremes; however, they do reveal a puzzling picture of inconsistent cloud detection and possible systemic errors, perhaps due to viewing geometry. Cloud discrimination shows a distinct jump in clear-sky detections starting in 2011, and LSTs around the South Pole exhibit a circular cooling pattern, which may also be related to cloud contamination. Possible root causes are discussed. Ongoing investigations seek to determine whether the results are a natural phenomenon or, as seems likely, the results of sensor degradation or processing artefacts. If the unusual LST patterns or cloud detection discontinuities are natural, they point to new, interesting processes on the Antarctic continent. If the data artefacts are artificial, MODIS LST users should be alerted to the potential issues.

  19. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  20. Revealing biological information using data structuring and automated learning.

    PubMed

    Mohorianu, Irina; Moulton, Vincent

    2010-11-01

    The intermediary steps between a biological hypothesis, concretized in the input data, and meaningful results, validated using biological experiments, commonly employ bioinformatics tools. Starting with storage of the data and ending with a statistical analysis of the significance of the results, every step in a bioinformatics analysis has been intensively studied and the resulting methods and models patented. This review summarizes the bioinformatics patents that have been developed mainly for the study of genes, and points out the universal applicability of bioinformatics methods to other related studies such as RNA interference. More specifically, we overview the steps undertaken in the majority of bioinformatics analyses, highlighting, for each, various approaches that have been developed to reveal details from different perspectives. First we consider data warehousing, the first task that has to be performed efficiently, optimizing the structure of the database, in order to facilitate both the subsequent steps and the retrieval of information. Next, we review data mining, which occupies the central part of most bioinformatics analyses, presenting patents concerning differential expression, unsupervised and supervised learning. Last, we discuss how networks of interactions of genes or other players in the cell may be created, which help draw biological conclusions and have been described in several patents.
