Sample records for traditional statistical analysis

  1. Application of Statistics in Engineering Technology Programs

    ERIC Educational Resources Information Center

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  2. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, constitutes a complex component system. Applying mathematical statistics methods to research on the compatibility of TCM formulae is of great significance for promoting the modernization of TCM and for improving clinical efficacy and formula optimization. As a tool for quantitative analysis, data inference, and exploration of the inherent rules of substances, mathematical statistics can reveal the working mechanisms of formula compatibility both qualitatively and quantitatively. By reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes in chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicine.

  3. Numerical Analysis of Stochastic Dynamical Systems in the Medium-Frequency Range

    DTIC Science & Technology

    2003-02-01

    frequency vibration analysis such as the statistical energy analysis (SEA), the traditional modal analysis (well-suited for high and low frequency)...that the first few structural normal modes primarily constitute the total response. In the higher frequency range, the statistical energy analysis (SEA)...

  4. Trial Sequential Analysis in systematic reviews with meta-analysis.

    PubMed

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-03-06

    Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D²) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance.
Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated that the Trial Sequential Analysis provides better control of type I errors and of type II errors than the traditional naïve meta-analysis. Trial Sequential Analysis represents analysis of meta-analytic data, with transparent assumptions, and better control of type I and type II errors than the traditional meta-analysis using naïve unadjusted confidence intervals.
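
The diversity-adjusted required information size described above has a simple numerical core. The sketch below is an illustration only, not the authors' Trial Sequential Analysis software: it computes a conventional sample size for comparing two proportions and inflates it by 1/(1 − D²); the event risks, α, β, and D² values used are hypothetical.

```python
from statistics import NormalDist

def required_information_size(p_control, rrr, alpha=0.05, beta=0.10, diversity=0.0):
    """Diversity-adjusted required information size (total participants).

    A conventional two-group sample size for comparing proportions is
    inflated by 1 / (1 - D^2), where D^2 is the diversity measure.
    """
    p_exp = p_control * (1 - rrr)              # anticipated risk in the experimental group
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_b = NormalDist().inv_cdf(1 - beta)       # power = 1 - beta
    # per-group sample size for comparing two independent proportions
    n_per_group = ((z_a + z_b) ** 2 *
                   (p_control * (1 - p_control) + p_exp * (1 - p_exp)) /
                   (p_control - p_exp) ** 2)
    total = 2 * n_per_group
    return total / (1 - diversity)             # diversity adjustment

# Hypothetical values: 20% control risk, 20% relative risk reduction, D^2 = 0.25
unadjusted = required_information_size(0.20, 0.20, diversity=0.0)
adjusted = required_information_size(0.20, 0.20, diversity=0.25)
print(round(unadjusted), round(adjusted))  # adjusted size is 1/(1 - 0.25) ≈ 1.33x larger
```

The adjustment shows concretely why a heterogeneous meta-analysis needs more participants than a single trial powered for the same effect.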

  5. Comparative analysis of profitability of honey production using traditional and box hives.

    PubMed

    Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J

    2017-07-01

    Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analysis of the profitability and productivity of box and traditional hives is not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our study results clearly showed the importance of adopting box hives for better productivity in the beekeeping subsector.
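
The Cobb-Douglas production function referred to above is usually estimated as a log-linear regression, ln Y = ln A + Σ βᵢ ln xᵢ, where the βᵢ are input elasticities. A minimal sketch with synthetic data (not the study's survey data), assuming just two inputs:

```python
import math, random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination; fine for a handful of regressors."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]  # augmented matrix
    for col in range(k):                            # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    beta = [0.0] * k                                # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (A[r][k] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic honey-output data: Y = 2.0 * labor^0.5 * feed^0.3 * noise
random.seed(1)
rows, ys = [], []
for _ in range(200):
    labor, feed = random.uniform(1, 10), random.uniform(1, 10)
    y = 2.0 * labor ** 0.5 * feed ** 0.3 * math.exp(random.gauss(0, 0.05))
    rows.append([1.0, math.log(labor), math.log(feed)])  # intercept column = ln A
    ys.append(math.log(y))

b0, b_labor, b_feed = ols(rows, ys)
print(round(b_labor, 2), round(b_feed, 2))  # elasticities near the true 0.5 and 0.3
```

Fitting in logs is what makes the significance tests on each input (labor, feed, medication) straightforward in studies like this one.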

  6. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and the EM phenomena occurring in different materials are also presented. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations, taken from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model established accounts for scaling through the traditional Black equation and four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry.
    Furthermore, the important concepts of "scripting automation", used to integrate the diversified EDA tools in this research work, are described in detail with several examples, and my completed scripts are included in the appendix for reference. The thesis is organized to give readers a thorough understanding of the research work: from the automation of EDA tools to statistical data generation, from the nature of EM to the construction of the statistical model, and through the comparisons between the traditional and statistical EM analysis approaches.

  7. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…

  8. Analysis of thrips distribution: application of spatial statistics and Kriging

    Treesearch

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
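
Kriging predictions rest on a model of spatial correlation, typically fitted to an empirical semivariogram, which is exactly the structure that the independence assumption of traditional analysis ignores. A minimal sketch of the semivariogram step, using synthetic values rather than the Vermont thrips counts:

```python
import math, random

def empirical_semivariogram(points, values, bin_width, n_bins):
    """gamma(h) = average of 0.5*(z_i - z_j)^2 over point pairs whose
    separation distance falls in each lag bin."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (x1, y1), (x2, y2) = points[i], points[j]
            h = math.hypot(x1 - x2, y1 - y2)
            b = int(h // bin_width)
            if b < n_bins:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Synthetic field: a smooth east-west trend plus noise, so nearby
# samples are more alike than distant ones (spatial correlation)
random.seed(7)
pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(150)]
vals = [0.1 * x + random.gauss(0, 1) for x, _ in pts]

gamma = empirical_semivariogram(pts, vals, bin_width=10, n_bins=5)
print([round(g, 2) for g in gamma])  # semivariance rises with lag distance
```

A rising semivariogram like this one is the signal that counts taken close together in space are not independent, and that a kriging model is warranted.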

  9. [Analysis on difference of richness of traditional Chinese medicine resources in Chongqing based on grid technology].

    PubMed

    Zhang, Xiao-Bo; Qu, Xian-You; Li, Meng; Wang, Hui; Jing, Zhi-Xian; Liu, Xiang; Zhang, Zhi-Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    After the national and local medicine resources census is completed, a large amount of data on Chinese medicine resources and their distribution will be compiled. Species richness between regions is a valid indicator for objectively reflecting inter-regional Chinese medicine resources. Because county areas differ greatly in size, assessing the richness of traditional Chinese medicine resources with the county as the statistical unit biases the regional abundance statistics. Statistical methods based on a regular grid can reduce the differences in apparent resource richness that are caused by unequal statistical-unit sizes. Taking Chongqing as an example and based on the existing survey data, the differences in the richness of traditional Chinese medicine resources at different grid scales were compared and analyzed. The results showed that a 30 km grid can be selected, at which scale the richness of Chinese medicine resources in Chongqing better reflects the objective situation of resource richness. Copyright© by the Chinese Pharmaceutical Association.
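
The grid-based statistic described above reduces to binning occurrence records into square cells and counting distinct species per cell. A toy sketch (the coordinates and species records below are hypothetical, not the census data):

```python
from collections import defaultdict

def grid_richness(records, cell_size_km):
    """Count distinct species per square grid cell.
    records: iterable of (x_km, y_km, species) occurrence points."""
    cells = defaultdict(set)
    for x, y, species in records:
        key = (int(x // cell_size_km), int(y // cell_size_km))
        cells[key].add(species)
    return {key: len(s) for key, s in cells.items()}

# Hypothetical occurrences: coordinates in km, species chosen for illustration
records = [
    (12, 44, "Coptis chinensis"),
    (14, 41, "Magnolia officinalis"),
    (18, 47, "Coptis chinensis"),
    (55, 60, "Eucommia ulmoides"),
    (58, 61, "Eucommia ulmoides"),
]
print(grid_richness(records, cell_size_km=30))
# A coarser grid merges cells and changes the apparent richness pattern:
print(grid_richness(records, cell_size_km=60))
```

Running the same records through different cell sizes is the essence of the scale comparison the study performs before settling on the 30 km grid.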

  10. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.

  11. Objective research of auscultation signals in Traditional Chinese Medicine based on wavelet packet energy and support vector machine.

    PubMed

    Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei

    2010-01-01

    This study aims at utilising the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm for objective analysis and quantitative research on auscultation in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into finer frequency bands. Then statistical analysis was performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Furthermore, pattern recognition with the SVM was used to distinguish the statistical feature values of mixed sample groups. Finally, the experimental results showed that the classification accuracies were at a high level.

  12. Consumer-driven definition of traditional food products and innovation in traditional foods. A qualitative cross-cultural study.

    PubMed

    Guerrero, Luis; Guàrdia, Maria Dolors; Xicola, Joan; Verbeke, Wim; Vanhonacker, Filiep; Zakowska-Biemans, Sylwia; Sajdakowska, Marta; Sulmont-Rossé, Claire; Issanchou, Sylvie; Contel, Michele; Scalvedi, M Luisa; Granli, Britt Signe; Hersleth, Margrethe

    2009-04-01

    Traditional food products (TFP) are an important part of European culture, identity, and heritage. In order to maintain and expand the market share of TFP, further improvement in safety, health, or convenience is needed by means of different innovations. The aim of this study was to obtain a consumer-driven definition for the concept of TFP and innovation and to compare these across six European countries (Belgium, France, Italy, Norway, Poland and Spain) by means of semantic and textual statistical analyses. Twelve focus groups were performed, two per country, under similar conditions. The transcriptions obtained were submitted to an ordinary semantic analysis and to a textual statistical analysis using the software ALCESTE. Four main dimensions were identified for the concept of TFP: habit-natural, origin-locality, processing-elaboration and sensory properties. Five dimensions emerged around the concept of innovation: novelty-change, variety, processing-technology, origin-ethnicity and convenience. TFP were similarly perceived in the countries analysed, while some differences were detected for the concept of innovation. Semantic and statistical analyses of the focus groups led to similar results for both concepts. In some cases and according to the consumers' point of view the application of innovations may damage the traditional character of TFP.

  13. Getting the big picture in community science: methods that capture context.

    PubMed

    Luke, Douglas A

    2005-06-01

    Community science has a rich tradition of using theories and research designs that are consistent with its core value of contextualism. However, a survey of empirical articles published in the American Journal of Community Psychology shows that community scientists utilize a narrow range of statistical tools that are not well suited to assess contextual data. Multilevel modeling, geographic information systems (GIS), social network analysis, and cluster analysis are recommended as useful tools to address contextual questions in community science. An argument for increased methodological consilience is presented, where community scientists are encouraged to adopt statistical methodology that is capable of modeling a greater proportion of the data than is typical with traditional methods.

  14. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  15. A PLSPM-Based Test Statistic for Detecting Gene-Gene Co-Association in Genome-Wide Association Study with Case-Control Design

    PubMed Central

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    For genome-wide association data analysis, two genes in any pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects due not only to the traditional interaction under a nearly independent condition but also to the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods. PMID:23620809

  16. A PLSPM-based test statistic for detecting gene-gene co-association in genome-wide association study with case-control design.

    PubMed

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    For genome-wide association data analysis, two genes in any pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects due not only to the traditional interaction under a nearly independent condition but also to the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods.

  17. Clinical Trials With Large Numbers of Variables: Important Advantages of Canonical Analysis.

    PubMed

    Cleophas, Ton J

    2016-01-01

    Canonical analysis assesses the combined effects of a set of predictor variables on a set of outcome variables, but it is little used in clinical trials despite the omnipresence of multiple variables. The aim of this study was to assess the performance of canonical analysis as compared with traditional multivariate methods using multivariate analysis of covariance (MANCOVA). As an example, a simulated data file with 12 gene expression levels and 4 drug efficacy scores was used. The correlation coefficient between the 12 predictor and 4 outcome variables was 0.87 (P = 0.0001), meaning that 76% of the variability in the outcome variables was explained by the 12 covariates. Repeated testing after the removal of 5 unimportant predictor variables and 1 outcome variable produced virtually the same overall result. The MANCOVA identified identical unimportant variables, but it was unable to provide overall statistics. (1) Canonical analysis is remarkable because it can handle many more variables than traditional multivariate methods such as MANCOVA can. (2) At the same time, it accounts for the relative importance of the separate variables, their interactions and differences in units. (3) Canonical analysis provides overall statistics of the effects of sets of variables, whereas traditional multivariate methods only provide the statistics of the separate variables. (4) Unlike other methods for combining the effects of multiple variables, such as factor analysis/partial least squares, canonical analysis is scientifically entirely rigorous. (5) Limitations include that it is less flexible than factor analysis/partial least squares, because only 2 sets of variables are used and because multiple solutions instead of one are offered. We hope that this article will stimulate clinical investigators to start using this remarkable method.

  18. Separate-channel analysis of two-channel microarrays: recovering inter-spot information.

    PubMed

    Smyth, Gordon K; Altman, Naomi S

    2013-05-26

    Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics whose null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses.
The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
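
The M-value/A-value transformation at the centre of this reformulation is compact: for red and green intensities R and G at a spot, M = log₂(R/G) and A = ½ log₂(RG), an invertible rotation of the two log-intensities. A minimal sketch with made-up intensities:

```python
import math

def ma_transform(red, green):
    """Map two-channel intensities to (M, A) per spot:
    M = log2(R/G)        -- the within-spot log-ratio
    A = 0.5 * log2(R*G)  -- the average log-expression of the spot
    This is an invertible rotation of (log2 R, log2 G)."""
    m = [math.log2(r / g) for r, g in zip(red, green)]
    a = [0.5 * math.log2(r * g) for r, g in zip(red, green)]
    return m, a

def invert_ma(m, a):
    """Recover the channel log-intensities: log2 R = A + M/2, log2 G = A - M/2."""
    log_r = [ai + mi / 2 for mi, ai in zip(m, a)]
    log_g = [ai - mi / 2 for mi, ai in zip(m, a)]
    return log_r, log_g

red = [1024.0, 256.0, 4096.0]    # made-up red-channel intensities
green = [512.0, 512.0, 4096.0]   # made-up green-channel intensities
m, a = ma_transform(red, green)
print(m)  # [1.0, -1.0, 0.0]
```

Because the rotation is invertible, no information is lost; a log-ratio-only analysis simply discards the A column, which is the point the article makes.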

  19. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

    To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) for it to be kept as the examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the crucial parameter allowing comparison. While individual item performance analysis is worthwhile as a secondary analysis, drawing final conclusions from it is more difficult; performance parameters need to be related to one another, as shown by IRT analysis.
    Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. The introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
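
The KR-20 coefficient reported above has a closed form: KR-20 = k/(k−1) · (1 − Σ pᵢqᵢ/σ²), with k items, item difficulty pᵢ, qᵢ = 1 − pᵢ, and σ² the variance of total scores. A minimal sketch with made-up responses (not EBOD data):

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) items.
    responses: list of candidate rows, each a list of 0/1 item scores."""
    n = len(responses)
    k = len(responses[0])
    # item difficulties p_i and the summed item variances p_i * q_i
    p = [sum(row[i] for row in responses) / n for i in range(k)]
    item_var = sum(pi * (1 - pi) for pi in p)
    # population variance of candidates' total scores
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n
    total_var = sum((t - mean) ** 2 for t in totals) / n
    return k / (k - 1) * (1 - item_var / total_var)

# Made-up 6-candidate, 4-item example with consistent candidate ordering
responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(round(kr20(responses), 3))  # ≈ 0.779
```

The coefficient grows when total-score variance dominates the summed item variances, i.e. when items rank candidates consistently, which is what the negative-marking change improved for EBOD.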

  20. [Artificial neural networks for decision making in urologic oncology].

    PubMed

    Remzi, M; Djavan, B

    2007-06-01

    This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology and points out the differences between Artificial Intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and their advantages over current statistical analysis.

  1. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.

  2. [Application of chemometrics in composition-activity relationship research of traditional Chinese medicine].

    PubMed

    Han, Sheng-Nan

    2014-07-01

    Chemometrics is a new branch of chemistry that is widely applied across analytical chemistry. It uses theories and methods from mathematics, statistics, computer science and related disciplines to optimize the chemical measurement process and to extract the maximum chemical and other information about material systems from chemical measurement data. In recent years, traditional Chinese medicine has attracted widespread attention. In traditional Chinese medicine research, a key problem has been how to interpret the relationship between the various chemical components and the efficacy of a medicine, and this seriously restricts the modernization of Chinese medicine. Because chemometrics brings multivariate analysis methods into chemical research, it has been applied as an effective research tool in composition-activity relationship research on Chinese medicine. This article reviews applications of chemometrics methods in composition-activity relationship research in recent years. The applications of multivariate statistical analysis methods (such as regression analysis, correlation analysis, principal component analysis, etc.) and artificial neural networks (such as the back-propagation neural network, the radial basis function neural network, the support vector machine, etc.) are summarized, including their fundamental principles, research content, and advantages and disadvantages. Finally, the main existing problems are discussed and prospects for future research are proposed.

  3. Dysphagia management: an analysis of patient outcomes using VitalStim therapy compared to traditional swallow therapy.

    PubMed

    Kiger, Mary; Brown, Catherine S; Watkins, Lynn

    2006-10-01

    This study compares the outcomes of VitalStim therapy to those of traditional swallowing therapy for deglutition disorders. Twenty-two patients had an initial and a follow-up videofluoroscopic swallowing study or fiberoptic endoscopic evaluation of swallowing and were divided into an experimental group that received VitalStim treatments and a control group that received traditional swallowing therapy. Outcomes were analyzed for changes in oral and pharyngeal phase dysphagia severity, dietary consistency restrictions, and progression from nonoral to oral intake. Results of chi-square (χ²) analysis showed no statistically significant difference in outcomes between the experimental and control groups.
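
The χ² test used for this comparison is easy to sketch for a 2×2 outcome table; with one degree of freedom the p-value reduces to erfc(√(χ²/2)). The counts below are hypothetical, not the study's data:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test (no continuity correction) for a 2x2 table
    [[a, b], [c, d]]; returns (statistic, p-value) with df = 1."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1 the chi-square survival function reduces to erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical 2x2 table: improved vs not improved, for two therapy groups
stat, p = chi2_2x2(8, 3, 6, 5)  # made-up counts for illustration
print(round(stat, 3), round(p, 3))
# here p > 0.05, i.e. no statistically significant group difference
```

With samples this small a non-significant χ² mainly reflects low power, one reason studies of this size report "no difference" cautiously.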

  4. Seeing Prehistory through New Lenses: Using Geophysical and Statistical Analysis to Identify Fresh Perspectives of a 15th Century Mandan Occupation

    NASA Astrophysics Data System (ADS)

    Mitchum, Amber Marie

    Great Plains prehistoric research has evolved over the course of a century, with many sites like Huff Village (32MO11) in North Dakota recently coming back to the forefront of discussion through new technological applications. Through a majority of its studies and excavations, Huff Village appeared to endure as the final stage in the Middle Missouri tradition. Long thought to reflect only the systematically placed long-rectangular structure types of its Middle Missouri predecessors, recent magnetic gradiometry and topographic mapping data revealed circular structure types that deviated from long-held traditions, highlighting new associations with Coalescent groups. A compact system for food storage was also discovered, with more than 1,500 storage pits visible inside and outside of all structures delineated. Archaeological applications of these new technologies have provided a near-complete picture of this 15th century Mandan expression, allowing new questions to be raised about its previous taxonomic placement. Using a combination of GIS and statistical analysis, an attempt is made to examine quantitatively whether it truly represented the Terminal Middle Missouri variant, or whether Huff diverged in new directions. Statistical analysis disagrees with previous conclusions that a patterned layout of structures existed, with significant clustering among structures shown through point pattern analysis and Ripley’s K function. Clustering of external storage pits also resulted from similar analysis, highlighting a connection between external storage features and the structures they surrounded. A combination of documented defensive features, a much higher estimation of the caloric support for the population present, and a short occupation leads us to believe that a significant transition was occurring that incorporated attributes of both the Middle Missouri tradition and the Coalescent tradition.
With more refined taxonomies currently developing, it is hoped that these data will help in the effort to develop future classifications that represent this complex period in prehistory.
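
    The point pattern methods named above can be made concrete with a small sketch. Below is a minimal, pure-Python estimator of Ripley's K function without edge correction; the point coordinates are simulated, not the Huff Village survey data. Clustering is suggested when the estimate K̂(r) exceeds the value πr² expected under complete spatial randomness.

    ```python
    import math
    import random

    def ripley_k(points, r, area):
        """Naive Ripley's K estimator (no edge correction):
        K(r) = area * (pairs within distance r) / (n * (n - 1))."""
        n = len(points)
        pairs = 0
        for i in range(n):
            for j in range(n):
                if i != j:
                    dx = points[i][0] - points[j][0]
                    dy = points[i][1] - points[j][1]
                    if math.hypot(dx, dy) <= r:
                        pairs += 1
        return area * pairs / (n * (n - 1))

    # Simulated structure locations in a unit study area.
    random.seed(1)
    pts = [(random.random(), random.random()) for _ in range(200)]
    r = 0.1
    k_hat = ripley_k(pts, r, area=1.0)
    k_csr = math.pi * r * r  # expectation under complete spatial randomness
    print(round(k_hat, 4), round(k_csr, 4))
    ```

    For uniformly scattered points the two values are close; a K̂(r) well above πr² at some radius is the signature of clustering at that scale.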

  5. AUPress: A Comparison of an Open Access University Press with Traditional Presses

    ERIC Educational Resources Information Center

    McGreal, Rory; Chen, Nian-Shing

    2011-01-01

    This study is a comparison of AUPress with three other traditional (non-open access) Canadian university presses. The analysis is based on the rankings that are correlated with book sales on Amazon.com and Amazon.ca. Statistical methods include the sampling of the sales ranking of randomly selected books from each press. The results of one-way…

  6. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    NASA Astrophysics Data System (ADS)

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  7. Multi-resolutional shape features via non-Euclidean wavelets: Applications to statistical analysis of cortical thickness

    PubMed Central

    Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.

    2014-01-01

    Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060

  8. Wastewater-Based Epidemiology of Stimulant Drugs: Functional Data Analysis Compared to Traditional Statistical Methods.

    PubMed

    Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo

    2015-01-01

    Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, though it gave concordant results. Geographical location was the main predictor for the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week that is not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
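
    For densely sampled curves such as these daily loads, FPCA reduces to a PCA of the discretized data. The sketch below simulates weekly load curves for 42 cities as an overall level plus a weekend bump and extracts the FPCs by eigendecomposition; the data, magnitudes and seed are made-up assumptions, not the study's measurements or pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated weekly drug-load curves: 42 cities x 7 daily samples,
    # built as a city-specific level plus a city-specific weekend bump.
    days = np.arange(7)
    weekend = (days >= 5).astype(float)
    level = rng.normal(100, 20, size=(42, 1))
    bump = rng.normal(30, 10, size=(42, 1))
    curves = level + bump * weekend + rng.normal(0, 5, size=(42, 7))

    # FPCA on a dense grid: center the curves, eigendecompose the covariance.
    mean_curve = curves.mean(axis=0)
    centered = curves - mean_curve
    cov = centered.T @ centered / (len(curves) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending order
    order = np.argsort(eigvals)[::-1]           # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    scores = centered @ eigvecs          # per-city scores on each FPC
    explained = eigvals / eigvals.sum()  # fraction of variance per component
    print(explained[:3].round(3))
    ```

    As in the study, the per-city scores on the leading components can then serve as outcome variables in ordinary multiple regression.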

  9. Barriers to Eating Traditional Foods Vary by Age Group in Ecuador With Biodiversity Loss as a Key Issue.

    PubMed

    Penafiel, Daniela; Termote, Celine; Lachat, Carl; Espinel, Ramon; Kolsteren, Patrick; Van Damme, Patrick

    2016-04-01

    To document the perceptions of indigenous peoples for the sustainable management of natural resources against malnutrition. Initially 4 and then 12 interviews were conducted with 4 different age groups. Eight rural villages in Guasaganda, central Ecuador, were studied in 2011-2012. A total of 75 people (22 children, 18 adolescents, 20 adults, and 15 elders). Benefits, severity, susceptibility, barriers, cues to action, and self-efficacy of eating traditional foods. Qualitative content analysis was completed using NVivo software. Initial analysis was inductive, followed by a content analysis directed by the Health Belief Model. Coding was completed independently by 2 researchers and kappa statistics (κ ≥ 0.65) were used to evaluate agreement. Healthy perceptions toward traditional foods existed and differed by age. Local young people ate traditional foods for their health benefits and good taste; adults cultivated traditional foods that had an economic benefit. Traditional knowledge used for consumption and cultivation of traditional foods was present but needs to be disseminated. Nutrition education in schools is needed that supports traditional knowledge in younger groups and prevents dietary changes toward unhealthy eating. Increased production of traditional food is needed to address current economic realities. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
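
    The inter-coder agreement check reported above (kappa >= 0.65) can be illustrated with Cohen's kappa for two raters. This is a generic sketch with hypothetical Health Belief Model codes, not the study's NVivo coding.

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: chance-corrected agreement between two raters
        who coded the same items."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a = Counter(rater_a)
        freq_b = Counter(rater_b)
        expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned independently by two researchers.
    a = ["benefit", "barrier", "benefit", "cue", "benefit", "barrier"]
    b = ["benefit", "barrier", "barrier", "cue", "benefit", "barrier"]
    print(round(cohens_kappa(a, b), 2))  # → 0.74
    ```

    Values above roughly 0.6 are conventionally read as substantial agreement, which is the spirit of the study's 0.65 threshold.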

  10. Traditional Practices of Mothers in the Postpartum Period: Evidence from Turkey.

    PubMed

    Altuntuğ, Kamile; Anık, Yeşim; Ege, Emel

    2018-03-01

    In various cultures, the postpartum period is a sensitive time, and various traditional practices are applied to protect the health of the mother and the baby. The aim of this study was to determine traditional practices of mother care in the postpartum period in Konya City, Turkey. The research was a descriptive, cross-sectional study carried out among 291 women in the first 8 weeks of the postpartum period who visited family health centers from June 1 to December 1, 2015. The data were collected using questionnaires. Statistical analysis of the data was done with SPSS version 22.0. Descriptive statistics were used to analyze the data. Based on the results, 84.5% of women applied a traditional mother care practice during the postpartum period. The most popular were practices for increasing breast milk (97.9%), preventing incubus "albasması" (81.8%), getting rid of incubus (74.9%), and preventing postpartum bleeding (14.1%). The findings of the study show that traditional practices toward mother care in the period after birth are common. In order to provide better health services, it is important for health professionals to understand the traditional beliefs and practices of the individuals, families, and society that they serve.

  11. Blended Learning Versus Traditional Lecture in Introductory Nursing Pathophysiology Courses.

    PubMed

    Blissitt, Andrea Marie

    2016-04-01

    Currently, many undergraduate nursing courses use blended-learning course formats with success; however, little evidence exists that supports the use of blended formats in introductory pathophysiology courses. The purpose of this study was to compare the scores on pre- and posttests and course satisfaction between traditional and blended course formats in an introductory nursing pathophysiology course. This study used a quantitative, quasi-experimental, nonrandomized control group, pretest-posttest design. Analysis of covariance compared pre- and posttest scores, and a t test for independent samples compared students' reported course satisfaction of the traditional and blended course formats. Results indicated that the differences in posttest scores were not statistically significant between groups. Students in the traditional group reported statistically significantly higher satisfaction ratings than students in the blended group. The results of this study support the need for further research of using blended learning in introductory pathophysiology courses in undergraduate baccalaureate nursing programs. Further investigation into how satisfaction is affected by course formats is needed. Copyright 2016, SLACK Incorporated.

  12. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed, and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
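
    As an illustration of the kind of analysis of variance described, here is a bare-bones one-way F statistic in pure Python. The operator groups and water-level values are hypothetical, not USGS data; a large F relative to the F distribution's critical value would suggest a real operator effect.

    ```python
    def one_way_anova_f(groups):
        """F statistic for a one-way ANOVA:
        between-group mean square / within-group mean square."""
        all_vals = [x for g in groups for x in g]
        n, k = len(all_vals), len(groups)
        grand = sum(all_vals) / n
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
        df_between, df_within = k - 1, n - k
        return (ss_between / df_between) / (ss_within / df_within)

    # Hypothetical water-level measurements (feet below land surface),
    # grouped by the operator who took them.
    operators = [
        [101.2, 100.8, 101.0, 101.4],
        [103.1, 102.9, 103.4, 103.0],
        [101.1, 101.3, 100.9, 101.2],
    ]
    print(round(one_way_anova_f(operators), 1))
    ```

    Here the second operator's readings sit well above the others relative to the within-group scatter, so the F statistic is large.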

  13. Compositional data analysis for physical activity, sedentary time and sleep research.

    PubMed

    Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy

    2017-01-01

    The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
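
    The isometric log-ratio transformation at the heart of this approach can be sketched directly. The pivot-balance basis and the 3-part day below are illustrative assumptions, not the study's exact coordinate choice; the point is that a D-part composition maps to D-1 unconstrained coordinates, which sidesteps the perfect multicollinearity of 24-h data.

    ```python
    import math

    def ilr(composition):
        """Isometric log-ratio (ilr) coordinates of a D-part composition,
        using a pivot/balance basis: each coordinate contrasts one part
        against the geometric mean of the remaining parts."""
        d = len(composition)
        coords = []
        for i in range(d - 1):
            rest = composition[i + 1:]
            gmean = math.exp(sum(math.log(x) for x in rest) / len(rest))
            scale = math.sqrt(len(rest) / (len(rest) + 1))
            coords.append(scale * math.log(composition[i] / gmean))
        return coords

    # A child's 24-h day as a 3-part composition: sleep, sedentary, active (hours).
    day = [10, 9, 5]
    total = sum(day)
    parts = [x / total for x in day]  # closure to proportions
    z = ilr(parts)
    print([round(c, 3) for c in z])  # 2 unconstrained real coordinates
    ```

    The two ilr coordinates can now enter an ordinary multiple linear regression as predictors of adiposity, with no collinearity problem.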

  14. A General Approach to Causal Mediation Analysis

    ERIC Educational Resources Information Center

    Imai, Kosuke; Keele, Luke; Tingley, Dustin

    2010-01-01

    Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…

  15. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status is tested through an invasive lab test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. The quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked the highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV positive and negative subjects suggested that HPV positive patients usually have smaller and simpler tumors.

  16. Alternative and traditional assessments: Their comparative impact on students' attitudes and science learning outcomes. An exploratory study

    NASA Astrophysics Data System (ADS)

    Century, Daisy Nelson

    This exploratory study focused on alternative and traditional assessments and their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by the use of a traditional paper-and-pencil test? What type of science learning stemming from the instruction can best be assessed by the use of alternative assessment? What are the differences in the types of learning outcomes that can be assessed by the use of paper-and-pencil tests and alternative assessment tests? Is there a difference in students' attitudes towards learning science when assessment of outcomes is by alternative assessment means compared to traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized. However, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols, resulting in a narrative summary style of report. These outcomes were combined in order to produce conclusions. This study revealed that the traditional method yielded more concrete cognitive content learning than did the alternative assessment. The alternative assessment yielded more psychomotor, cooperative learning and critical thinking skills. In both the alternative and the traditional methods, students' attitudes toward science were positive. There were no significant differences favoring either group. The quantitative findings of no statistically significant differences suggest that, at a minimum, there is no loss in the use of alternative assessment methods, in this instance, performance testing.
Adding the results from the qualitative analysis to this suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.

  17. The l z ( p ) * Person-Fit Statistic in an Unfolding Model Context.

    PubMed

    Tendeiro, Jorge N

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied in combination with dominance response models almost exclusively. In this article, a popular log likelihood-based parametric person-fit statistic under the framework of the generalized graded unfolding model is used. Results from a simulation study indicate that the person-fit statistic performed relatively well in detecting midpoint response style patterns and not so well in detecting extreme response style patterns.

  18. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Meta‐analysis using individual participant data: one‐stage and two‐stage approaches, and why they may differ

    PubMed Central

    Ensor, Joie; Riley, Richard D.

    2016-01-01

    Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915

  20. Crash analysis, statistics & information notebook 2008

    DOT National Transportation Integrated Search

    2008-01-01

    Traditionally crash data is often presented as single fact sheets highlighting a single factor such as Vehicle Type or Road Type. This document will try to show how the risk factors interrelate to produce a crash. Complete detailed analys...

  1. Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine

    PubMed Central

    Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit

    2017-01-01

    In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. This review will also focus on three important elements of systems biology analysis of TTM: analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM will also be discussed. PMID:28769804

  2. Undergraduate medical student's perceptions on traditional and problem based curricula: pilot study.

    PubMed

    Meo, Sultan Ayoub

    2014-07-01

    To evaluate and compare students' perceptions about teaching and learning, knowledge and skills, outcomes of course materials and their satisfaction in traditional lecture-based versus problem-based learning curricula in two different medical schools. The comparative cross-sectional questionnaire-based study was conducted in the Department of Physiology, College of Medicine, King Saud University, Riyadh, Saudi Arabia, from July 2009 to January 2011. Two different undergraduate medical schools were selected; one followed the traditional curriculum, while the other followed the problem-based learning curriculum. Two equal groups of first year medical students were selected. They were taught respiratory physiology and the lung function lab according to their curriculum for a period of two weeks. At the completion of the study period, a five-point Likert scale was used to assess students' perceptions of satisfaction, academic environment, teaching and learning, knowledge and skills, and outcomes of course materials regarding the effectiveness of problem-based learning compared to traditional methods. SPSS 19 was used for statistical analysis. Students following the problem-based learning curriculum obtained marginally higher perception scores (24.10 +/- 3.63) than those following the traditional curriculum (22.67 +/- 3.74). However, the difference in perceptions did not achieve statistical significance. Students following the problem-based learning curriculum have more positive perceptions of teaching and learning, knowledge and skills, outcomes of their course materials and satisfaction than students at the traditional style of medical school. However, the difference between the two groups was not statistically significant.
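
    The between-group comparison of mean perception scores rests on a two-sample t statistic. The sketch below uses Welch's unequal-variance form with made-up Likert totals, since the paper's raw scores and exact test variant are not given here.

    ```python
    import math

    def welch_t(a, b):
        """Welch's t statistic for two independent samples,
        without assuming equal variances."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        return (ma - mb) / math.sqrt(va / na + vb / nb)

    # Hypothetical perception totals for the two student groups.
    pbl = [26, 22, 25, 28, 21, 24, 27, 20]
    traditional = [23, 21, 25, 19, 24, 22, 26, 20]
    t = welch_t(pbl, traditional)
    print(round(t, 2))  # a small |t| is consistent with a non-significant difference
    ```

    Comparing |t| against the critical value of the t distribution at the appropriate degrees of freedom then gives the significance decision reported in such studies.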

  3. Vocational Preparation for Women: A Critical Analysis.

    ERIC Educational Resources Information Center

    Steiger, JoAnn

    In this analysis of vocational preparation for women material is presented to substantiate the claim that women are joining the labor force in increasing numbers and their career opportunities are expanding, but that the educational system has failed to respond. Statistical data is cited showing that women have traditionally been employed in just…

  4. Soy-enhanced lunch acceptance by preschoolers.

    PubMed

    Endres, Jeannette; Barter, Sharon; Theodora, Perseli; Welch, Patricia

    2003-03-01

    To evaluate acceptance of soy-enhanced compared with traditional menus by preschool children. Soy-enhanced foods were substituted on a traditional cycle menu, and the amount eaten, energy, and nutrient values for traditional and soy-enhanced lunches were compared. A traditional three-week cycle menu, using the Child and Adult Care Food Program (CACFP) meal pattern guidelines, was used to develop a comparable soy-enhanced menu. Traditional and soy-enhanced lunches were randomly assigned to respective days. Foods were portioned onto individual plates using standardized measuring utensils. Individual plate waste techniques were used to collect food waste. Subjects/setting: Participants were preschool children, three to six years of age and of white and Hispanic origin, attending a part-day Head Start program. Statistical analyses performed: Analysis of covariance was used to adjust lunch and food intakes for differences in average amounts of foods served. The Nutrient Data System was used to calculate energy and nutrient content of lunches. Analysis of variance was used to calculate differences in amounts eaten, energy values, and nutrient values of traditional and soy-enhanced lunches and foods. Data analyses were performed with the Statistical Analysis Software (version 8.0, 1999, SAS Institute, Cary, NC). Soy-enhanced foods were successfully substituted for 23 traditional foods included in the cycle menus. Soy-enhanced foods tended to be higher in energy, protein, and iron. Traditional lunches tended to be higher in fat, saturated fat, and vitamin A. Consumption was significantly less for energy, protein, fiber, and iron from foods eaten from traditional compared with soy-enhanced lunch menus. Applications/conclusions: Acceptance of soy-enhanced lunches was shown because there were no significant differences in the average amount eaten (grams per meal) between traditional and soy-enhanced lunches.
Preschool programs can substitute soy-enhanced for traditional foods, which will add variety to the diet without sacrificing taste, energy, or nutrient value. The fat and energy content of the lunches was higher than recommended, and soy-enhanced foods were not always lower in fat. There is a need for the food industry and foodservice personnel to address the energy and fat content of all foods served in lunches to preschool children because a few extra calories added to the daily intakes can contribute to weight gain.

  5. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  6. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  7. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  8. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analyses of clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis, while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  9. Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data

    NASA Astrophysics Data System (ADS)

    Reno, B. L.; Brown, M.; Piccoli, P. M.

    2007-12-01

Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties, and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to overinterpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. This method therefore takes into account the full range of data, and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have only a minor effect on the uncertainty.
To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviations proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions of all populations, which cannot be accounted for with traditional statistical tools. These three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain and 601±8 (2σ) Ma for the rim domain. Although the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
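The resistant approach described above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' code: it implements the interquartile 'box' overlap test and a normalized median absolute deviation (the 1.4826 scale factor assumes approximately normal data); the tanh estimator of Kelsey et al. is not reproduced.

```python
import statistics

def population_summary(dates):
    """Summarize a population of dates with resistant statistics:
    median age, interquartile range, and a normalized-MAD spread
    (1.4826 * MAD approximates one standard deviation for normal data)."""
    s = sorted(dates)
    q = statistics.quantiles(s, n=4)  # [Q1, median, Q3]
    med = statistics.median(s)
    mad = statistics.median(abs(x - med) for x in s)
    return {"median": med, "iqr": (q[0], q[2]), "nmad": 1.4826 * mad}

def boxes_overlap(a, b):
    """Two populations are treated as statistically distinct when their
    interquartile 'boxes' do not overlap."""
    (a1, a3), (b1, b3) = a["iqr"], b["iqr"]
    return a1 <= b3 and b1 <= a3
```

For the monazite example, non-overlapping boxes for the core, intermediate, and rim populations would justify treating them as statistically distinct events.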

  10. Statistical analysis for understanding and predicting battery degradations in real-life electric vehicle use

    NASA Astrophysics Data System (ADS)

    Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine

    2014-01-01

This paper describes the statistical analysis of data parameters recorded during electric vehicle use to characterize battery ageing. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during experiments, enabling identification of the main ageing factors. Detailed statistical dependency explorations then identify the factors responsible for battery ageing phenomena, and predictive battery ageing models are built from this approach. The results demonstrate and quantify the relationship between measured variables and observed battery ageing, and enable accurate battery ageing diagnosis through the predictive models.

  11. Eliminating traditional reference services in an academic health sciences library: a case study

    PubMed Central

    Schulte, Stephanie J

    2011-01-01

Question: How were traditional librarian reference desk services successfully eliminated at one health sciences library? Setting: The analysis was done at an academic health sciences library at a major research university. Method: A gap analysis was performed, evaluating changes in the first eleven months through analysis of reference transaction and instructional session data. Main Results: Substantial increases were seen in the overall number of specialized reference transactions and in those conducted by librarians lasting more than thirty minutes. The overall number of reference transactions increased after implementing the new model. Several new small-scale instructional initiatives began, though perhaps not directly related to the new model. Conclusion: Traditional reference desk services were eliminated at one academic health sciences library without negative impact on reference and instructional statistics. Removing the obligation to staff reference desk hours in the physical library eliminated one significant barrier to a more proactive liaison program. PMID:22022221

  12. Suggestions for presenting the results of data analyses

    USGS Publications Warehouse

    Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.

    2001-01-01

We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.

  13. Multilevel Latent Class Analysis: An Application of Adolescent Smoking Typologies with Individual and Contextual Predictors

    ERIC Educational Resources Information Center

    Henry, Kimberly L.; Muthen, Bengt

    2010-01-01

    Latent class analysis (LCA) is a statistical method used to identify subtypes of related cases using a set of categorical or continuous observed variables. Traditional LCA assumes that observations are independent. However, multilevel data structures are common in social and behavioral research and alternative strategies are needed. In this…

  14. Statistics for Time-Series Spatial Data: Applying Survival Analysis to Study Land-Use Change

    ERIC Educational Resources Information Center

    Wang, Ninghua Nathan

    2013-01-01

    Traditional spatial analysis and data mining methods fall short of extracting temporal information from data. This inability makes their use difficult to study changes and the associated mechanisms of many geographic phenomena of interest, for example, land-use. On the other hand, the growing availability of land-change data over multiple time…

  15. The effect of berberine on insulin resistance in women with polycystic ovary syndrome: detailed statistical analysis plan (SAP) for a multicenter randomized controlled trial.

    PubMed

    Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping

    2016-10-21

    Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in corresponding tables, listings, and graphs were outlined. The SAP provided more detailed information than trial protocol on data management and statistical analysis methods. Any post hoc analyses could be identified via referring to this SAP, and the possible selection bias and performance bias will be reduced in the trial. This study is registered at ClinicalTrials.gov, NCT01138930 , registered on 7 June 2010.

  16. Use of Western Medicine and Traditional Korean Medicine for Joint Disorders: A Retrospective Comparative Analysis Based on Korean Nationwide Insurance Data

    PubMed Central

    2017-01-01

This study aimed to compare the usage of Western medicine and traditional Korean medicine for treating joint disorders in Korea. Claims data from all medical institutions with billing statements filed to the Health Insurance Review and Assessment Service (HIRA) from 2011 to 2014 for the four most frequent joint disorders were used for the analysis. Data from a total of 1,100,018 patients who received medical services from 2011 to 2014 were analyzed. Descriptive statistics are presented by type of care and hospital type. All statistical analyses were performed using IBM SPSS for Windows version 21. Of the 1,100,018 patients with joint disorders, 456,642 (41.5%) were males and 643,376 (58.5%) were females. Per diem costs of hospitalization in Western medicine clinics and traditional Korean medicine clinics were approximately 160,000 KRW and 50,000 KRW, respectively. Among costs associated with Western medicine, physiotherapy accounted for the largest proportion (28.78%). Among costs associated with traditional Korean medicine, procedural and treatment costs accounted for more than 70%, followed by doctors' fees (21.54%). There were distinct differences in patterns of medical care use and cost of joint disorders at the national level in Korea. This study is expected to contribute to management decisions for musculoskeletal disease involving joint disorders. PMID:29456569

  17. [Effect and regulation of drying on quality of traditional Chinese medicine pills].

    PubMed

    Qi, Ya-Ru; Li, Yuan-Hui; Han, Li; Wu, Zhen-Feng; Yue, Peng-Fei; Wang, Xue-Cheng; Xiong, Yao-Kun; Yang, Ming

    2017-06-01

The drying quality of traditional Chinese medicine pills is a focus of current pill research, because drying has a crucial effect on efficacy and on the development of dosage forms. Through literature research and statistical analysis, this paper reviews current problems in the drying of traditional Chinese medicine pills. Centered on the quality evaluation system for these pills, it analyzes the characteristics of common drying equipment and processes and their effects on pill quality, discusses the problems in drying equipment, processes, and quality, and puts forward corresponding strategies, hoping to provide new ideas and methods for improving the quality and quality standards of traditional Chinese medicine pills. Copyright© by the Chinese Pharmaceutical Association.

  18. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.

  19. Impact of e-learning on nurses' and student nurses knowledge, skills, and satisfaction: a systematic review and meta-analysis.

    PubMed

    Lahti, Mari; Hätönen, Heli; Välimäki, Maritta

    2014-01-01

To review the impact of e-learning on nurses' and nursing students' knowledge, skills and satisfaction related to e-learning. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) to assess the impact of e-learning on nurses' and nursing students' knowledge, skills and satisfaction. Electronic databases including MEDLINE (1948-2010), CINAHL (1981-2010), PsycINFO (1967-2010) and ERIC (1966-2010) were searched in May 2010 and again in December 2010. All RCTs evaluating the effectiveness of e-learning compared with traditional learning methods among nurses were included. Data were extracted on the purpose of the trial, sample, measurements used, index test results and reference standard. An extraction tool developed for Cochrane reviews was used, and the methodological quality of eligible trials was assessed. Eleven trials, including a total of 2491 nurses and student nurses, were eligible for inclusion in the analysis. First, the random-effects estimate for four studies showed some improvement in knowledge associated with e-learning compared to traditional techniques. However, the difference was not statistically significant (p=0.39, MD 0.44, 95% CI -0.57 to 1.46). Second, one study reported a slight impact of e-learning on skills, but the difference was not statistically significant either (p=0.13, MD 0.03, 95% CI -0.09 to 0.69). Third, no results on nurses' or student nurses' satisfaction could be reported, as the statistical data from three possible studies were not available. Overall, there was no statistical difference between e-learning and traditional learning groups in nurses' or student nurses' knowledge, skills and satisfaction. E-learning can, however, offer an alternative method of education. In future, more studies following the CONSORT and QUOROM statements are needed to evaluate the effects of these interventions.
Copyright © 2013 Elsevier Ltd. All rights reserved.
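The random-effects pooling behind results like "MD 0.44, 95% CI -0.57 to 1.46" can be sketched as follows. This is a generic DerSimonian-Laird estimator, assumed here for illustration; it is not necessarily the exact model the authors used.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study mean differences under a DerSimonian-Laird
    random-effects model; returns the pooled MD and its 95% CI."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with the between-study variance added in.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

When the heterogeneity estimate tau² is zero, the result coincides with the fixed-effect (inverse-variance) pooled estimate.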

  20. A new strategy for statistical analysis-based fingerprint establishment: Application to quality assessment of Semen sojae praeparatum.

    PubMed

    Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting

    2018-08-30

Semen sojae praeparatum, with homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective fingerprint analysis, coupled with chemometrics methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum; 21 chemicals were selected and characterized by liquid chromatography-ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy of "statistical analysis-based fingerprint establishment", which should be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
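The similarity analysis (SA) step can be illustrated with the common cosine (congruence) measure between two fingerprint peak-area vectors. The abstract does not specify the authors' exact similarity metric, so this formulation is an assumption.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two chromatographic fingerprints, computed as the
    cosine of the angle between their peak-area vectors (1.0 means the
    fingerprints have identical relative peak proportions)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)
```

Samples whose fingerprints fall below a chosen similarity threshold against a reference fingerprint would be flagged for the subsequent PCA/OPLS-DA screening.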

  1. Transportation safety data and analysis : Volume 1, Analyzing the effectiveness of safety measures using Bayesian methods.

    DOT National Transportation Integrated Search

    2010-12-01

    Recent research suggests that traditional safety evaluation methods may be inadequate in accurately determining the effectiveness of roadway safety measures. In recent years, advanced statistical methods are being utilized in traffic safety studies t...

  2. Use of recurrence plots in the analysis of pupil diameter dynamics in narcoleptics

    NASA Astrophysics Data System (ADS)

    Keegan, Andrew P.; Zbilut, J. P.; Merritt, S. L.; Mercer, P. J.

    1993-11-01

Recurrence plots were used to evaluate pupil dynamics of subjects with narcolepsy. Preliminary data indicate that this nonlinear method of analysis may be more useful in revealing underlying deterministic differences than traditional methods such as FFT and counting statistics.

  3. A network meta-analysis on the effects of information technology application on preoperative knowledge of patients.

    PubMed

    Lai, Yi-Horng

    2015-01-01

The application of information technology to health education plans in Taiwan has a long history. The purpose of this study is to explore the relationship between information technology application in health education and patients' preoperative knowledge by synthesizing existing research comparing the effectiveness of information technology applications and traditional instruction in health education plans. In spite of claims regarding the potential benefits of using information technology in health education plans, results of previous studies were conflicting. This study examines the effectiveness of information technology using network meta-analysis, a statistical analysis of separate but similar studies that tests the pooled data for statistical significance. The information technology applications in health education discussed in this study include interactive technology therapy (person-computer), group interactive technology therapy (person-person), multimedia technology therapy and video therapy. The results show that group interactive technology therapy is the most effective, followed by interactive technology therapy, and that all four information technology therapies are superior to the traditional health education plan (leaflet therapy).

  4. Impact of Integrated Science and English Language Arts Literacy Supplemental Instructional Intervention on Science Academic Achievement of Elementary Students

    NASA Astrophysics Data System (ADS)

    Marks, Jamar Terry

    The purpose of this quasi-experimental, nonequivalent pretest-posttest control group design study was to determine if any differences existed in upper elementary school students' science academic achievement when instructed using an 8-week integrated science and English language arts literacy supplemental instructional intervention in conjunction with traditional science classroom instruction as compared to when instructed using solely traditional science classroom instruction. The targeted sample population consisted of fourth-grade students enrolled in a public elementary school located in the southeastern region of the United States. The convenience sample size consisted of 115 fourth-grade students enrolled in science classes. The pretest and posttest academic achievement data collected consisted of the science segment from the Spring 2015, and Spring 2016 state standardized assessments. Pretest and posttest academic achievement data were analyzed using an ANCOVA statistical procedure to test for differences, and the researcher reported the results of the statistical analysis. The results of the study show no significant difference in science academic achievement between treatment and control groups. An interpretation of the results and recommendations for future research were provided by the researcher upon completion of the statistical analysis.

  5. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
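As a minimal illustration of metamodeling, the sketch below fits a one-variable quadratic response surface to samples of an "expensive" analysis code by least squares. It is a toy stand-in for the response surface, kriging, and neural network techniques the paper reviews, not an implementation of any of them.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x**2 to sampled (x, y) pairs,
    solving the 3x3 normal equations by Gaussian elimination."""
    # Power sums for the normal-equation matrix and right-hand side.
    s = [sum(x ** k for x in xs) for k in range(5)]
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = t[:]
    # Forward elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    # Back substitution.
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]
```

Once fitted, evaluating the polynomial is orders of magnitude cheaper than re-running the analysis code, and the surrogate can be handed to an optimizer or used to link analyses across domains.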

  6. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
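A minimal example of a "traditional" uncertainty bound on reliability from pass/fail counts is the Wilson score interval, sketched below. This is neither the RADAR method nor any of the paper's exact procedures, just a common baseline.

```python
import math

def wilson_interval(passes, n, z=1.96):
    """Wilson score confidence interval (default 95%) for a pass
    probability estimated from `passes` successes out of `n` trials."""
    p = passes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half
```

Unlike the naive normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly even with few trials or all-pass data.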

  8. Who applies and who gets admitted to UK graduate entry medicine? - an analysis of UK admission statistics

    PubMed Central

    2011-01-01

    Background Graduate-entry medicine is a recent development in the UK, intended to expand and broaden access to medical training. After eight years, it is time to evaluate its success in recruitment. Objectives This study aimed to compare the applications and admissions profiles of graduate-entry programmes in the UK to traditional 5 and 6-year courses. Methods Aggregate data on applications and admissions were obtained from the Universities and Colleges Admission Service covering 2003 to 2009. Data were extracted, grouped as appropriate and analysed with the Statistical Package for the Social Sciences. Results Graduate-entry attracts 10,000 applications a year. Women form the majority of applicants and admissions to graduate-entry and traditional medicine programmes. Graduate-entry age profile is older, typically 20's or 30's compared to 18 or 19 years in traditional programmes. Graduate-entry applications and admissions were higher from white and black UK ethnic communities than traditional programmes, and lower from southern and Chinese Asian groups. Graduate-entry has few applications or admissions from Scotland or Northern Ireland. Secondary educational achievement is poorer amongst graduate-entry applicants and admissions than traditional programmes. Conclusions Graduate-entry has succeeded in recruiting substantial additional numbers of older applicants to medicine, in which white and black groups are better represented and Asian groups more poorly represented than in traditional undergraduate programmes. PMID:21943332

  9. [Construction and analysis of questionnaires on AIDS cough in traditional Chinese medicine diagnosis and treatment procedures].

    PubMed

    Zhang, Ying; Xue, Liu-Hua; Chen, Yu-Xia; Huang, Shi-Jing; Pan, Ju-Hua; Wang, Jie

    2013-08-01

To standardize traditional Chinese medicine (TCM) diagnosis and treatment of AIDS cough, improve the clinical level of cough treatment for HIV/AIDS, and build TCM diagnosis and treatment procedures for AIDS cough. Combined with clinical practice, a questionnaire on TCM diagnosis and treatment of AIDS cough was formulated on the basis of English- and Chinese-language literature research and expert consultation, and the results were statistically verified using the Delphi method. The questionnaire covered an overview, pathogeny, diagnostic standards, syndrome differentiation and medication (phlegm-heat obstructing the lung, lung-kidney yin deficiency, and lung-spleen deficiency), moxibustion treatment, and aftercare including diet and mental care. Item means ranged from 2.93 to 3.00, full-mark rates from 93.10% to 100%, and rank means from 9.91 to 10.67 and from 287.50 to 309.50, the highest values; the coefficient of variation was 0.00, the Kendall coefficient of concordance (Kendall's W) was 0.049, which was statistically significant, and the questionnaire reliability (alpha) was 0.788. The concept, etiology and pathogenesis, diagnosis, and syndrome differentiation treatment of AIDS cough were preliminarily standardized and basically recognized by experts in this field, laying a foundation for developing TCM diagnosis and treatment specifications for AIDS cough.

  10. From Interaction to Co-Association —A Fisher r-To-z Transformation-Based Simple Statistic for Real World Genome-Wide Association Study

    PubMed Central

    Yuan, Zhongshang; Liu, Hong; Zhang, Xiaoshuai; Li, Fangyu; Zhao, Jinghua; Zhang, Furen; Xue, Fuzhong

    2013-01-01

Currently, the genetic variants identified by genome-wide association studies (GWAS) generally account for only a small proportion of the total heritability of complex disease. One crucial reason is the underutilization of gene-gene joint effects commonly encountered in GWAS, which include both main effects and co-association. However, gene-gene co-association is often vaguely subsumed into the framework of gene-gene interaction. From the causal graph perspective, we elucidate in detail the concept and rationale of gene-gene co-association as well as its relationship with traditional gene-gene interaction, and propose two Fisher r-to-z transformation-based simple statistics to detect it. Three series of simulations further highlight that gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, arising not only from traditional interaction under the nearly independent condition but also from correlation between the two genes. The proposed statistics are more powerful than logistic regression under various situations, are unaffected by linkage disequilibrium, and maintain an acceptable false positive rate as long as a reasonable GWAS data analysis roadmap is strictly followed. Furthermore, an application to gene pathway analysis associated with leprosy confirms in practice that our proposed gene-gene co-association concepts, as well as the corresponding proposed statistics, are strongly in line with reality. PMID:23923021
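The Fisher r-to-z transformation at the core of the proposed statistics, and the classical z test for comparing two independent correlations built on it, can be sketched as follows. This is the textbook formulation, not the authors' exact co-association statistics.

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation: maps a correlation in (-1, 1) to an
    approximately normal variable with variance 1/(n - 3)."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_correlations(r1, n1, r2, n2):
    """z statistic for the difference between two independent sample
    correlations r1 (from n1 observations) and r2 (from n2)."""
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (fisher_z(r1) - fisher_z(r2)) / se
```

A co-association-style test in this spirit would compare a correlation estimated in cases with the same correlation estimated in controls, with |z| > 1.96 significant at the 5% level.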

  11. The effects of hands-on-science instruction on the science achievement of middle school students

    NASA Astrophysics Data System (ADS)

    Wiggins, Felita

Student achievement in the twenty-first century demands a new rigor in student science knowledge, since advances in science and technology require students to think and act like scientists. As a result, students must acquire proficient levels of knowledge and skills to support a knowledge base that is expanding exponentially with new scientific advances. This study examined the effects of hands-on science instruction on the science achievement of middle school students. More specifically, this study was concerned with the influence of hands-on science instruction versus traditional science instruction on the science test scores of middle school students. The subjects in this study were 120 sixth-grade students in six classes. Instruction involved lecture/discussion and hands-on activities carried out over a three-week period. Specifically, the study ascertained the influence of the variables gender, ethnicity, and socioeconomic status on the science test scores of middle school students. Additionally, this study assessed the effect of the variables gender, ethnicity, and socioeconomic status on the attitudes of sixth-grade students toward science. The two instruments used to collect data for this study were the Prentice Hall unit ecosystem test and the Scientific Work Experience Programs for Teachers Study (SWEPT) student attitude survey. The data for the study were analyzed using one-way analysis of covariance and one-way analysis of variance. The following findings were made based on the results: (1) A statistically significant difference existed in the science performance of middle school students exposed to hands-on science instruction. These students had significantly higher scores than middle school students exposed to traditional instruction. (2) A statistically significant difference did not exist between the science scores of male and female middle school students.
(3) A statistically significant difference did not exist between the science scores of African American and non-African American middle school students. (4) A statistically significant difference existed by socioeconomic status: students with unassisted lunches had significantly higher science scores than middle school students who were provided with assisted lunches. (5) A statistically significant difference was not found in the attitude scores of middle school students who were exposed to hands-on or traditional science instruction. (6) A statistically significant difference was not found in the observed attitude scores of middle school students exposed to either hands-on or traditional science instruction by socioeconomic status. (7) A statistically significant difference was not found in the observed attitude scores of male and female students. (8) A statistically significant difference was not found in the observed attitude scores of African American and non-African American students.

  12. A meta-analysis of the differential relations of traditional and cyber-victimization with internalizing problems.

    PubMed

    Gini, Gianluca; Card, Noel A; Pozzoli, Tiziana

    2018-03-01

    This meta-analysis examined the associations between cyber-victimization and internalizing problems controlling for the occurrence of traditional victimization. Twenty independent samples with a total of 90,877 participants were included. Results confirmed the significant intercorrelation between traditional and cyber-victimization (r = .43). They both have medium-to-large bivariate correlations with internalizing problems. Traditional victimization (sr = .22) and cyber-victimization (sr = .12) were also uniquely related to internalizing problems. The difference in the relations between each type of victimization and internalizing problems was small (differential d = .06) and not statistically significant (p = .053). Moderation of these effect sizes by sample characteristics (e.g., age and proportion of girls) and study features (e.g., whether a definition of bullying was provided to participants and the time frame used as reference) was investigated. Results are discussed within the extant literature on cyber-aggression and cyber-victimization and future directions are proposed. © 2017 Wiley Periodicals, Inc.
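    The unique (semipartial) relations reported above come from partialling one victimization type out of the other. A minimal sketch of that computation, using the reported intercorrelation r = .43 but hypothetical bivariate correlations with internalizing problems:

```python
import math

r_trad_cyber = 0.43   # intercorrelation reported in the abstract
r_int_trad   = 0.35   # hypothetical: traditional victimization vs internalizing
r_int_cyber  = 0.30   # hypothetical: cyber-victimization vs internalizing

# Semipartial correlation of cyber-victimization with internalizing problems,
# removing traditional victimization from cyber-victimization
sr_cyber = (r_int_cyber - r_int_trad * r_trad_cyber) / math.sqrt(1 - r_trad_cyber ** 2)

# The unique relation is smaller than the bivariate one, mirroring the
# pattern in the meta-analysis (sr = .12 vs a medium bivariate correlation)
```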

  13. A Statistical Analysis Plan to Support the Joint Forward Area Air Defense Test.

    DTIC Science & Technology

    1984-08-02

By establishing a specific significance level prior to performing the statistical test (traditionally α levels are set at .01 or .05). What is often... undesirable increase in β. For constant α levels, the power (1 - β) of a statistical test can be increased by increasing the sample size of the test. (Ref... [flowchart residue: k-sample comparison (ANOVA) test across MOP "A" levels]
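    The snippet's point, that for a fixed α the power 1 - β grows with sample size, can be illustrated with a small Monte Carlo sketch (the effect size and sample sizes here are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def t_test_power(n, effect=0.5, alpha=0.05, reps=2000):
    """Estimate the power of a two-sample t-test by simulation."""
    rejections = 0
    for _ in range(reps):
        a = rng.normal(0.0, 1.0, n)
        b = rng.normal(effect, 1.0, n)
        if stats.ttest_ind(a, b).pvalue < alpha:
            rejections += 1
    return rejections / reps

power_n20 = t_test_power(n=20)   # modest power at small n
power_n80 = t_test_power(n=80)   # substantially higher power at larger n
```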

  14. Topological Cacti: Visualizing Contour-based Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio

    2011-05-26

Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular, their nesting behavior and topology (often represented in the form of a contour tree) have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as width of the cactus trunk and length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.

  15. A study of environmental characterization of conventional and advanced aluminum alloys for selection and design. Phase 2: The breaking load test method

    NASA Technical Reports Server (NTRS)

    Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.

    1984-01-01

A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen load carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress corroded specimen by an effective flaw size calculated from the breaking stress and the material strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
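    A hedged sketch of the kind of extreme-value treatment described, here using a two-parameter Weibull model; the breaking strengths are invented for illustration, and the report's own analysis may differ in distributional choice:

```python
import numpy as np
from scipy import stats

# Hypothetical breaking strengths (ksi) of replicate smooth specimens
# after one exposure period (illustrative values only)
strengths = np.array([62.1, 58.4, 60.3, 55.9, 63.7, 57.2, 61.5, 59.8, 56.6, 60.9])

# Fit a two-parameter Weibull (location fixed at zero), a common
# extreme-value model for strength data
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)

# Estimated probability that a specimen survives an applied stress of 55 ksi
survival_at_55 = stats.weibull_min.sf(55.0, shape, loc, scale)
```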

  16. Learning physics: A comparative analysis between instructional design methods

    NASA Astrophysics Data System (ADS)

    Mathew, Easow

    The purpose of this research was to determine if there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22) who participated in a PBL based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20) who participated in the traditional lecture teaching methodology. Both the courses were taught by experienced professors who have qualifications at the doctoral level. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores, and higher differences between pre- and posttest scores) instructional design approaches to physics curricula. 
Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there was a statistically significant (p = .04) difference within the control group: female average academic improvement was much higher (~63%) than male average academic improvement, which may indicate that traditional teaching methods are more effective for females, whereas no significant difference between male and female participants was noted in the experimental group. There was a statistically significant negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance values in either group as delineated by ethnicity.

  17. The Traditional Chinese Medicine and Relevant Treatment for the Efficacy and Safety of Atopic Dermatitis: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Shi, Zhao-feng; Song, Tie-bing; Xie, Juan; Yan, Yi-quan

    2017-01-01

Background. Atopic dermatitis (AD) has become a common skin disease that requires systematic and comprehensive treatment to achieve adequate clinical control. Traditional Chinese medicines and related treatments have shown clinical effects for AD in many studies, but systematic reviews and meta-analyses of these therapies are lacking. Objective. A systematic review and meta-analysis based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was conducted to evaluate the efficacy and safety of traditional Chinese medicines and related treatments for AD. Methods. Randomized controlled trials (RCTs) were searched under standardized searching rules in eight medical databases from inception up to December 2016, and a total of 24 articles with 1,618 patients were enrolled in this meta-analysis. Results. The results revealed that traditional Chinese medicines and related treatments did not show statistically significant differences from the control group in clinical effectiveness, SCORAD amelioration, or SSRI amelioration for AD treatment. However, EASI amelioration with traditional Chinese medicines and related treatments was superior to that of the control group. Conclusion. Conclusions about the efficacy and safety of traditional Chinese medicine and related treatments for AD therapy must be drawn cautiously. More standard, multicenter, double-blind RCTs of traditional Chinese medicine and related treatments for AD are required to provide more clinical evidence in the future. PMID:28713436

  18. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

Quality of software is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to represent the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.

  19. The Utility of Robust Means in Statistics

    ERIC Educational Resources Information Center

    Goodwyn, Fara

    2012-01-01

    Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…
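    The impact of outliers on the sample mean, and how robust estimators control for it, can be shown in a few lines; the data are invented heuristic scores:

```python
import numpy as np
from scipy import stats

# Six typical scores plus a single outlier
scores = np.array([4.0, 5.0, 5.0, 5.0, 6.0, 6.0, 100.0])

mean    = float(scores.mean())          # dragged far upward by the outlier
median  = float(np.median(scores))      # unaffected by the outlier
trimmed = stats.trim_mean(scores, 0.2)  # drops 20% of cases from each tail
```

    The robust estimates (median, trimmed mean) stay near the bulk of the data while the ordinary mean does not.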

  20. Collected Notes on the Workshop for Pattern Discovery in Large Databases

    NASA Technical Reports Server (NTRS)

    Buntine, Wray (Editor); Delalto, Martha (Editor)

    1991-01-01

These collected notes are a record of material presented at the Workshop. They address core data analysis tasks that have traditionally required statistical or pattern recognition techniques, including classification, discrimination, clustering, supervised and unsupervised learning, and discovery and diagnosis, i.e., general pattern discovery.

  1. Computing Science and Statistics: Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-20

...r is set to 3.569, the population eventually oscillates about 16 fixed values... examples include: kneading ingredients into a bread dough... "fun statistics". My goal is to offer you the equivalent of a fortune cookie which clearly is... cookie of the night reads: "You have good friends who will come to your aid in"... One problem that statisticians traditionally seem to have is that they...

  2. Assessment of general public perceptions toward traditional medicines used for aphrodisiac purpose in state of Penang, Malaysia.

    PubMed

    Hassali, Mohamed Azmi; Saleem, Fahad; Shafie, Asrul Akmal; Al-Qazaz, Harith Khalid; Farooqui, Maryam; Aljadhey, Hisham; Atif, Muhammad; Masood, Imran

    2012-11-01

The study aims to evaluate general public perceptions regarding the use of Traditional and Complementary Medicines (TCM) for aphrodisiac purposes. A questionnaire-based, cross-sectional study was undertaken. Respondents were selected in the state of Penang, Malaysia. A total of 392 respondents were included in the study. Descriptive statistics were used for data analysis; Chi-square/Fisher's exact tests were used where appropriate. Out of 392 respondents, 150 (38.26%) reported using specific traditional medicines for aphrodisiac purposes. Nearly half of the respondents (46.94%) agreed that aphrodisiac medicines were easily available. Moreover, 40.31% of the respondents reported that traditional aphrodisiac medicines were cheaper than modern (prescription) medicines. This study highlights limited public knowledge regarding the use of traditional aphrodisiac medicine. Healthcare professionals should be aware of informal TCM usage when prescribing allopathic medicines. Copyright © 2012 Elsevier Ltd. All rights reserved.
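    A chi-square test of independence of the sort used in the study can be sketched as follows; the 2x2 counts are hypothetical, not the survey's data:

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = TCM aphrodisiac use (yes/no),
# columns = gender (illustrative counts only)
table = np.array([[90, 60],
                  [110, 132]])

chi2, p_value, dof, expected = stats.chi2_contingency(table)
# For small expected counts, stats.fisher_exact(table) would be used instead
```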

  3. Bench press training program with attached chains for female volleyball and basketball athletes.

    PubMed

    Burnham, Timothy R; Ruud, Jason D; McGowan, Robert

    2010-02-01

    Attaching chains to barbells to increase strength and power has become popular for athletes; however, little scientific evidence supports this practice. The present purpose was to compare chain training to traditional training for the bench press. Women collegiate athletes in volleyball and basketball (N = 19) participated in a 16-session bench press program. They were matched into either a Traditional or a Chain training group by 1-repetition maximum (1RM). The Traditional group performed the bench press with conventional equipment, while the Chain group trained with attached chains (5% of weight). Analysis showed a significant increase in 1RM for both groups over 16 sessions, Traditional +11.8% and Chain +17.4%. The difference between the groups was not statistically significant, but suggests the women who trained with attached chains improved their bench press more than the Traditional group.

  4. Comparison of Blood Loss in Laser Lipolysis vs Traditional Liposuction.

    PubMed

    Abdelaal, Mohammed Mahmoud; Aboelatta, Yasser Abdallah

    2014-08-01

Laser-assisted liposuction has been associated with reduced blood loss. However, this clinical finding has not been evaluated objectively. In this study, the authors objectively estimated the blood loss volume associated with laser lipolysis vs traditional liposuction in various anatomic regions. In this prospective study, 56 patients underwent equal amounts of traditional and laser-assisted liposuction at 2 contralateral anatomic sites. Blood loss volumes were calculated from the lipoaspirates by measuring hemoglobin and red blood cell content. The data were analyzed statistically with repeated-measures analysis of variance and the Mann-Whitney U test. Laser lipolysis reduced blood loss by more than 50% compared with traditional liposuction, with significant reductions in mean blood loss volumes in the abdomen, flanks, back, and breast. The authors provide objective evidence that laser lipolysis significantly reduces blood loss compared with traditional liposuction. Level of Evidence: 3. © 2014 The American Society for Aesthetic Plastic Surgery, Inc.
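    The Mann-Whitney U comparison used in the study looks like this in code; the blood loss values are invented for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical blood loss (mL per litre of lipoaspirate) at paired sites
laser       = np.array([8.2, 6.5, 9.1, 7.4, 5.9, 8.8, 7.0, 6.1])
traditional = np.array([16.4, 14.9, 19.2, 15.5, 13.8, 18.1, 17.3, 12.9])

# Nonparametric two-sided comparison of the two treatments
u_stat, p_value = stats.mannwhitneyu(laser, traditional, alternative="two-sided")
```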

  5. [Medication rule for treatment of functional dyspepsia: an analysis of traditional Chinese medicine literature based on China National Knowledge Internet].

    PubMed

    Xiao, Hong-ling; Wu, Yuan-jie; Wang, Xiang; Li, Yi-fang; Fang, Zheng-qing

    2015-10-01

By retrieving the clinical research literature on the treatment of functional dyspepsia by traditional Chinese medicine (TCM) from January 2004 to December 2014 in the China National Knowledge Internet (CNKI), this study established a TCM decoction database for treating functional dyspepsia. One hundred and sixty-four articles were included, involving 159 prescriptions and 377 medicines, with a total of 1,990 herb occurrences. These herbs can be divided into 18 categories according to their effectiveness; qi-regulating herbs, blood-circulation herbs, and antipyretic herbs ranked as the top three by frequency of usage, together accounting for 51.81% of medicine usage. Sixteen herbs were each used more than 30 times, with Atractylodes, Radix, and Poria ranking as the top three by usage frequency. Medicinal properties were divided into 9 kinds according to the frequency statistics, and the top three were warm, flat, and cold. Tastes were classified into 9 kinds, and the top three were acrid, sweet, and bitter. The meridian tropism of the herbs was classified into 11 kinds, and the top three were spleen, stomach, and lung. The analysis can provide a reference for the TCM treatment and study of functional dyspepsia.
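    Frequency statistics like those described reduce to simple counting; a sketch with hypothetical herb-category occurrences (the counts below are not the paper's data):

```python
from collections import Counter

# One entry per herb occurrence, labeled by category (illustrative counts)
records = (["qi-regulating"] * 420 + ["blood-circulation"] * 330 +
           ["antipyretic"] * 281 + ["dampness-resolving"] * 200 +
           ["tonifying"] * 150)

freq = Counter(records)
top_three = [category for category, _ in freq.most_common(3)]
top_three_share = sum(freq[c] for c in top_three) / sum(freq.values())
```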

  6. [Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].

    PubMed

    Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing

    2015-10-01

Roads are typically man-made objects in urban areas. Road extraction from high-resolution images has important applications for urban planning and transportation development. However, due to confusion among spectral characteristics, it is difficult to distinguish roads from other objects merely by using traditional classification methods that depend mainly on spectral information. Edges are an important feature for identifying linear objects (e.g., roads), and the distribution patterns of edges vary greatly among different objects, so it is crucial to merge edge statistical information with spectral information. In this study, a new method that combines spectral information and edge statistical features is proposed. First, edge detection is conducted using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from an edge statistical model, which measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, the SVM algorithm is used to classify the image and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional method. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially on high-resolution images.
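    The final classification step, an SVM over concatenated spectral and edge features, can be sketched on synthetic data (the feature values and dimensions are invented; the paper's feature construction is more elaborate):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical per-pixel descriptors: 4 spectral bands + 2 edge statistics
road     = np.hstack([rng.normal(0.2, 0.05, (50, 4)), rng.normal(0.9, 0.1, (50, 2))])
non_road = np.hstack([rng.normal(0.6, 0.05, (50, 4)), rng.normal(0.2, 0.1, (50, 2))])

X = np.vstack([road, non_road])       # combined spectral + edge features
y = np.array([1] * 50 + [0] * 50)     # 1 = road, 0 = other object

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
train_accuracy = clf.score(X, y)
```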

  7. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  8. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  9. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  10. [Analysis on regularity of prescriptions in "a guide to clinical practice with medical record" for diarrhoea based on traditional Chinese medicine inheritance support system].

    PubMed

    He, Lan-Juan; Zhu, Xiang-Dong

    2016-06-01

To analyze the regularities of prescriptions in "a guide to clinical practice with medical record" (Ye Tianshi) for diarrhoea based on the traditional Chinese medicine inheritance support system (V2.5), and provide a reference for further research and development of new traditional Chinese medicines for treating diarrhoea. The traditional Chinese medicine inheritance support system was used to build a prescription database of Chinese medicines for diarrhoea. The software's integrated data mining methods were used to analyze the prescriptions according to "four natures", "five flavors" and "meridians" in the database, yielding frequency statistics, syndrome distribution, prescription regularity and new prescription analysis. An analysis of 94 prescriptions for diarrhoea determined the frequencies of medicines in prescriptions and the commonly used medicine pairs and combinations, and derived 13 new prescriptions. This study indicated that the prescriptions for diarrhoea in "a guide to clinical practice with medical record" mostly act by eliminating dampness and tonifying deficiency, are neutral in drug property, sweet, bitter or hot in flavor, and reflect the treatment principle of "activating spleen-energy and resolving dampness". Copyright© by the Chinese Pharmaceutical Association.

  11. Evaluation of Adherence to Nutritional Intervention Through Trajectory Analysis.

    PubMed

    Sevilla-Villanueva, B; Gibert, K; Sanchez-Marre, M; Fito, M; Covas, M I

    2017-05-01

Classical pre-post intervention studies are often analyzed using traditional statistics. Nevertheless, nutritional interventions have small effects on the metabolism, and traditional statistics are not enough to detect these subtle nutrient effects. Generally, this kind of study assumes that the participants adhere to the assigned dietary intervention and directly analyzes its effects on the target parameters; the evaluation of adherence is generally omitted, although participants sometimes do not effectively adhere to the assigned dietary guidelines. For this reason, the trajectory map is proposed as a visual tool in which the dietary patterns of individuals can be followed during the intervention and related to nutritional prescriptions. Trajectory analysis is also proposed, allowing both analyses: 1) adherence to the intervention and 2) intervention effects. The analysis is made by projecting the differences of the target parameters over the resulting trajectories between states at different time stamps, which might be considered either individually or by groups. The proposal has been applied to a real nutritional study, showing that some individuals adhere better than others and that some individuals in the control group modify their habits during the intervention. In addition, the intervention effects differ depending on the type of individual; some subgroups even have opposite responses to the same intervention.

  12. Construction of inorganic elemental fingerprint and multivariate statistical analysis of marine traditional Chinese medicine Meretricis concha from Rushan Bay

    NASA Astrophysics Data System (ADS)

    Wu, Xia; Zheng, Kang; Zhao, Fengjia; Zheng, Yongjun; Li, Yantuan

    2014-08-01

Meretricis concha is a kind of marine traditional Chinese medicine (TCM), and has been commonly used for the treatment of asthma and scald burns. In order to investigate the relationship between the inorganic elemental fingerprint and the geographical origin identification of Meretricis concha, the elemental contents of M. concha from five sampling points in Rushan Bay were determined by means of inductively coupled plasma optical emission spectrometry (ICP-OES). Based on the contents of 14 inorganic elements (Al, As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Mo, Ni, Pb, Se, and Zn), an inorganic elemental fingerprint that well reflects the elemental characteristics was constructed. All the data from the five sampling points were discriminated accurately through hierarchical cluster analysis (HCA) and principal component analysis (PCA), indicating that a four-factor model which could explain approximately 80% of the detection data was established, and that the elements Al, As, Cd, Cu, Ni and Pb could be viewed as the characteristic elements. This investigation suggests that the inorganic elemental fingerprint combined with multivariate statistical analysis is a promising method for verifying the geographical origin of M. concha, and this strategy should be valuable for the authenticity discrimination of some marine TCMs.
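    The PCA step can be sketched as follows; the element-content matrix is synthetic, so unlike the paper's four-factor model it need not explain roughly 80% of the variance:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)

# Synthetic stand-in for the data: 25 samples x 14 elements (Al ... Zn)
X = rng.normal(0.0, 1.0, (25, 14))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=25)  # impose correlated structure
X[:, 3] = X[:, 2] + 0.1 * rng.normal(size=25)

pca = PCA(n_components=4).fit(X)
explained = float(pca.explained_variance_ratio_.sum())
```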

  13. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
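    A conceptual Monte Carlo sketch of how elicited cause-of-death probabilities propagate assignment uncertainty; this is an illustration only, not the paper's Bayesian hierarchical data-augmentation model, and the belief values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Observer's elicited probability, per mortality event, of each of 3 causes
beliefs = np.array([
    [0.70, 0.20, 0.10],
    [0.10, 0.80, 0.10],
    [0.40, 0.40, 0.20],
    [0.90, 0.05, 0.05],
    [0.20, 0.30, 0.50],
])

# Each draw resolves every event to a single cause, yielding a
# distribution over cause-specific mortality proportions
draws = 5000
props = np.zeros((draws, 3))
for i in range(draws):
    causes = [rng.choice(3, p=p) for p in beliefs]
    props[i] = np.bincount(causes, minlength=3) / len(beliefs)

mean_props = props.mean(axis=0)  # close to the column means of `beliefs`
spread     = props.std(axis=0)   # variability induced by uncertain assignment
```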

  14. Accounting for standard errors of vision-specific latent trait in regression models.

    PubMed

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of a Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analyses performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and a multiple linear regression model for the assessment of association effects related to vision-specific latent traits. This novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously; in a simulation study, it was compared with the frequently used two-stage "separate-analysis" approach (Rasch analysis followed by traditional statistical analyses without adjustment for the SE of the latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration the SEs of the estimated Rasch-scaled scores. On real data, the two models differed in effect size estimates and in the identification of "independent risk factors." Simulation results showed that the proposed HB one-stage "joint-analysis" approach produces greater accuracy (on average, a 5-fold decrease in bias) with comparable power and precision in the estimation of associations, despite accounting for the greater uncertainty due to the latent trait. Association analyses of patient-reported data scored using Rasch analysis techniques typically do not take the SE of the latent trait into account. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimates and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
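    The attenuation problem the abstract describes can be illustrated with a small simulation (a sketch only, not the authors' HB model): regressing an outcome on a latent-trait estimate that carries measurement error shrinks the slope by the classical reliability factor, which is exactly what a two-stage "separate analysis" ignores.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
beta = 0.5

theta = rng.normal(0.0, 1.0, n)             # true latent trait
se = 0.8                                    # SE of the estimated (Rasch-scaled) score
theta_hat = theta + rng.normal(0.0, se, n)  # estimated trait, measured with error
y = beta * theta + rng.normal(0.0, 1.0, n)  # outcome depends on the true trait

# Two-stage "separate analysis": regress y on the noisy estimate.
b_naive = np.polyfit(theta_hat, y, 1)[0]

# Classical attenuation: the slope shrinks by var(theta) / (var(theta) + se^2).
reliability = 1.0 / (1.0 + se**2)
print(b_naive, beta * reliability)  # naive slope is close to the attenuated value
```

    The naive slope lands near the attenuated value rather than the true 0.5, which is the bias a joint model that propagates the measurement SE is designed to remove.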

  15. Comparative analysis of a nontraditional general chemistry textbook and selected traditional textbooks used in Texas community colleges

    NASA Astrophysics Data System (ADS)

    Salvato, Steven Walter

    The purpose of this study was to analyze questions within the chapters of a nontraditional general chemistry textbook and the four general chemistry textbooks most widely used by Texas community colleges in order to determine if the questions require higher- or lower-order thinking according to Bloom's taxonomy. The study employed quantitative methods. Bloom's taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) was utilized as the main instrument in the study. Additional tools were used to help classify the questions into the proper category of the taxonomy (McBeath, 1992; Metfessel, Michael, & Kirsner, 1969). The top four general chemistry textbooks used in Texas community colleges and Chemistry: A Project of the American Chemical Society (Bell et al., 2005) were analyzed during the fall semester of 2010 in order to categorize the questions within the chapters into one of the six levels of Bloom's taxonomy. Two coders were used to assess reliability. The data were analyzed using descriptive and inferential methods. The descriptive method involved calculation of the frequencies and percentages of coded questions from the books as belonging to the six categories of the taxonomy. Questions were dichotomized into higher- and lower-order thinking questions. The inferential methods involved chi-square tests of association to determine if there were statistically significant differences among the four traditional college general chemistry textbooks in the proportions of higher- and lower-order questions and if there were statistically significant differences between the nontraditional chemistry textbook and the four traditional general chemistry textbooks. Findings indicated statistically significant differences among the four textbooks frequently used in Texas community colleges in the number of higher- and lower-level questions. Statistically significant differences were also found among the four textbooks and the nontraditional textbook. 
After the analysis of the data, conclusions were drawn, implications for practice were delineated, and recommendations for future research were given.
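    The inferential step described above, a chi-square test of association between textbook and the proportion of higher- vs. lower-order questions, can be sketched with `scipy` (the counts below are invented for illustration, not the study's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of higher- vs lower-order questions in two textbooks
# (illustrative numbers only, not the study's coded data).
table = [[120, 380],   # textbook A: higher-order, lower-order
         [200, 300]]   # textbook B: higher-order, lower-order

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
```

    A small p-value would indicate that the proportion of higher-order questions differs between the two books; note that `chi2_contingency` applies Yates' continuity correction to 2x2 tables by default.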

  16. When does biodiversity matter? Assessing ecosystem services across broad regions using forest inventory and analysis data

    Treesearch

    Kevin M. Potter; Christopher W. Woodall; Christopher M. Oswalt; Basil V. III Iannone; Songlin Fei

    2015-01-01

    Biodiversity is expected to convey numerous functional benefits to forested ecosystems, including increased productivity and resilience. When assessing biodiversity, however, statistics that account for evolutionary relationships among species may be more ecologically meaningful than traditional measures such as species richness. In three broad-scale studies, we...

  17. Third Grade Proficiency in DC: Little Progress (2007-2011). Policy Brief

    ERIC Educational Resources Information Center

    O'Keefe, Bonnie

    2012-01-01

    Analysis of DC Comprehensive Assessment System (DC CAS) scores from 2007 to 2011 found no evidence of statistically significant changes in third grade math or reading proficiency at the citywide level, among traditional public schools or public charter schools, among racial and ethnic groups or by economic advantage or disadvantage.…

  18. Can Low Tuition Fee Policy Improve Higher Education Equity and Social Welfare?

    ERIC Educational Resources Information Center

    Zha, Xianyou; Ding, Shouhai

    2007-01-01

    Traditionally there has been a theoretical view that raising tuition fees will undermine education equity and social welfare. This study examines the effects of different tuition policies on both these factors. A statistical analysis is made on the theoretical relationship between higher education tuition fees and dropout probability, which leads…

  19. A Case-Based Curriculum for Introductory Geology

    ERIC Educational Resources Information Center

    Goldsmith, David W.

    2011-01-01

    For the past 5 years I have been teaching my introductory geology class using a case-based method that promotes student engagement and inquiry. This article presents an explanation of how a case-based curriculum differs from a more traditional approach to the material. It also presents a statistical analysis of several years' worth of student…

  20. Introduction of Digital Storytelling in Preschool Education: A Case Study from Croatia

    ERIC Educational Resources Information Center

    Preradovic, Nives Mikelic; Lesin, Gordana; Boras, Damir

    2016-01-01

    Our case study from Croatia showed the benefits of digital storytelling in a preschool as a basis for the formal ICT education. The statistical analysis revealed significant differences between children aged 6-7 who learned mathematics by traditional storytelling compared to those learning through digital storytelling. The experimental group that…

  1. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    PubMed

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modeling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were taken for this study with the aim of understanding the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and of optimizing the two variables for maximum result. Observation-based modelling, as well as the traditional approach, identified NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables with minimal preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and it further reduces the need for a specialised background.

  2. Physical and genetic-interaction density reveals functional organization and informs significance cutoffs in genome-wide screens

    PubMed Central

    Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.

    2013-01-01

    Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890

  3. Statistical Optimality in Multipartite Ranking and Ordinal Regression.

    PubMed

    Uematsu, Kazuki; Lee, Yoonkyung

    2015-05-01

    Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions including exponential loss, the optimal ranking function can be represented as a ratio of weighted conditional probability of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods such as proportional odds model in statistics with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with simulation study and real data analysis.
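    The optimal ranking function described in the abstract, a ratio of cost-weighted conditional probabilities of upper to lower categories, can be sketched schematically (the probabilities and misranking costs below are illustrative assumptions, and this simplified form omits the paper's loss-specific details):

```python
import numpy as np

def ranking_score(p, costs, k):
    """Schematic score at cut k: cost-weighted probability of the
    categories above k divided by that of the categories at or below k."""
    p, costs = np.asarray(p, float), np.asarray(costs, float)
    return (p[k + 1:] @ costs[k + 1:]) / (p[:k + 1] @ costs[:k + 1])

# Two hypothetical items over three ordinal categories (low, mid, high).
p_high = [0.1, 0.2, 0.7]   # item likely to be rated high
p_low = [0.7, 0.2, 0.1]    # item likely to be rated low
costs = [1.0, 1.0, 1.0]    # equal misranking costs for simplicity

print(ranking_score(p_high, costs, 0), ranking_score(p_low, costs, 0))
```

    With equal costs this reduces to an odds-like quantity, so an item with mass on upper categories scores above one and an item with mass on lower categories scores below one, yielding the desired ordering.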

  4. Social inequalities in alcohol consumption in the Czech Republic: a multilevel analysis.

    PubMed

    Dzúrová, Dagmara; Spilková, Jana; Pikhart, Hynek

    2010-05-01

    The Czech Republic traditionally ranks among the countries with the highest alcohol consumption. This paper examines both risk and protective factors for frequent alcohol consumption in the Czech population using multilevel analysis. Risk factors were measured at the individual level and at the area level. The individual-level data were obtained from a survey of a sample of 3526 respondents aged 18-64 years. The area-level data were obtained from the Czech Statistical Office. The group most inclined to risky alcohol consumption and binge drinking consists mainly of men who are single, with low education, and also unemployed. Only the variable for divorce rate showed statistical significance at both levels, i.e., the individual and the aggregated one. No cross-level interactions were found to be statistically significant. Copyright 2010 Elsevier Ltd. All rights reserved.

  5. Using structural equation modeling for network meta-analysis.

    PubMed

    Tu, Yu-Kang; Wu, Yun-Chun

    2017-07-14

    Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, it is very flexible for analysts to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the technique of the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded coefficients and confidence intervals similar to those reported in the previous literature. The point estimates of the two UWLS models were identical to those in the fixed effect model, but the confidence intervals were greater. This is consistent with results from the traditional pairwise meta-analyses. Compared with the UWLS model with a common variance adjustment factor, the UWLS model with a unique variance adjustment factor has greater confidence intervals when the heterogeneity is larger in the pairwise comparison. The UWLS model with a unique variance adjustment factor reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis is still to be explored.

  6. Statistical analysis and interpolation of compositional data in materials science.

    PubMed

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
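    The standard remedy in compositional data analysis is Aitchison's log-ratio methodology; a minimal sketch of the centered log-ratio (clr) transform, which maps a composition from the simplex into Euclidean coordinates where ordinary statistics apply:

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a strictly positive composition."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))   # geometric mean of the parts
    return np.log(x / g)

comp = np.array([0.2, 0.3, 0.5])     # e.g. atomic fractions summing to 1
z = clr(comp)
print(z, z.sum())                    # clr coordinates sum to zero
```

    Because the transform only depends on ratios, rescaling a composition (e.g. reporting the same sample in different units) leaves its clr coordinates unchanged, which is one of the invariances the abstract calls for.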

  7. Māori identity signatures: A latent profile analysis of the types of Māori identity.

    PubMed

    Greaves, Lara M; Houkamau, Carla; Sibley, Chris G

    2015-10-01

    Māori are the indigenous peoples of New Zealand. However, the term 'Māori' can refer to a wide range of people of varying ethnic compositions and cultural identity. We present a statistical model identifying 6 distinct types, or 'Māori Identity Signatures,' and estimate their proportion in the Māori population. The model is tested using a Latent Profile Analysis of a national probability sample of 686 Māori drawn from the New Zealand Attitudes and Values Study. We identify 6 distinct signatures: Traditional Essentialists (22.6%), Traditional Inclusives (16%), High Moderates (31.7%), Low Moderates (18.7%), Spiritually Orientated (4.1%), and Disassociated (6.9%). These distinct Identity Signatures predicted variation in deprivation, age, mixed-ethnic affiliation, and religion. This research presents the first formal statistical model assessing how people's identity as Māori is psychologically structured, documents the relative proportion of these different patterns of structures, and shows that these patterns reliably predict differences in core demographics. We identify a range of patterns of Māori identity far more diverse than has been previously proposed based on qualitative data, and also show that the majority of Māori fit a moderate or traditional identity pattern. The application of our model for studying Māori health and identity development is discussed. (c) 2015 APA, all rights reserved.
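    Latent profile analysis is commonly fit as a finite Gaussian mixture with diagonal covariances; a hedged sketch on synthetic data (not the study's survey items) using scikit-learn:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic responses on 3 continuous indicators from two latent profiles
# (illustrative only; the study used 6 profiles on identity measures).
a = rng.normal([0.0, 0.0, 0.0], 0.5, size=(300, 3))
b = rng.normal([2.0, 2.0, 2.0], 0.5, size=(300, 3))
X = np.vstack([a, b])

# LPA corresponds to a Gaussian mixture with diagonal covariance matrices.
lpa = GaussianMixture(n_components=2, covariance_type="diag",
                      random_state=0).fit(X)
labels = lpa.predict(X)
print(np.bincount(labels))  # class sizes recovered by the mixture
```

    In practice the number of profiles is chosen by comparing fit indices (e.g. BIC) across models with different `n_components`, mirroring how the six signatures would be selected.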

  8. Study on TCM Syndrome Differentiation of Primary Liver Cancer Based on the Analysis of Latent Structural Model.

    PubMed

    Gu, Zhan; Qi, Xiuzhong; Zhai, Xiaofeng; Lang, Qingbo; Lu, Jianying; Ma, Changping; Liu, Long; Yue, Xiaoqiang

    2015-01-01

    Primary liver cancer (PLC) is one of the most common malignant tumors because of its high incidence and high mortality. Traditional Chinese medicine (TCM) plays an active role in the treatment of PLC. As the most important part of the TCM system, syndrome differentiation based on clinical manifestations from the traditional four diagnostic methods has faced great challenges and questions owing to the lack of statistical validation support. In this study, we provided evidence for TCM syndrome differentiation of PLC using latent structural model analysis of clinical data, thus providing a basis for establishing TCM syndrome criteria. We also obtained the common syndromes of PLC as well as their typical clinical manifestations.

  9. A Retrospective Analysis of Hemostatic Techniques in Primary Total Knee Arthroplasty: Traditional Electrocautery, Bipolar Sealer, and Argon Beam Coagulation.

    PubMed

    Rosenthal, Brett D; Haughom, Bryan D; Levine, Brett R

    2016-01-01

    In this retrospective cohort study of 280 primary total knee arthroplasties, clinical outcomes relevant to hemostasis were compared by electrocautery type: traditional electrocautery (TE), bipolar sealer (BS), and argon beam coagulation (ABC). Age, sex, and preoperative diagnosis were not significantly different among the TE, BS, and ABC cohorts. The 3 hemostasis systems were statistically equivalent with respect to estimated blood loss. Wound drainage during the first 48 hours after surgery was equivalent between the BS and ABC cohorts but less for the TE cohort. Transfusion requirements were not significantly different among the cohorts. The 3 hemostasis systems were statistically equivalent with respect to mean change in hemoglobin level during the early postoperative period (levels were measured on postoperative day 1 and on discharge). As BS and ABC are clinically equivalent to TE, their increased cost may not be justified.

  10. [Construction of competency model of 'excellent doctor' in Chinese medicine].

    PubMed

    Jin, Aning; Tian, Yongquan; Zhao, Taiyang

    2014-05-01

    Evaluating outstanding and ordinary practitioners by personal characteristics, with competency as the key criterion, is a future direction of medical education reform. We carried out behavioral event interviews with famous doctors of traditional Chinese medicine, compiled a competency dictionary, and conducted a controlled prediction test. SPSS and AMOS were used as the statistical analysis tools. We adopted a model of peer assessment and contrast to carry out the empirical research. This project carried out exploratory factor analysis and confirmatory factor analysis and established a "5A" competency model comprising moral ability, thinking ability, communication ability, learning ability, and practical ability. The competency model of the "excellent doctor" in Chinese medicine has been validated, with good reliability and validity, and embodies the characteristics of traditional Chinese medicine personnel training, with theoretical and practical significance for the training of excellent physicians.

  11. Principal variance component analysis of crop composition data: a case study on herbicide-tolerant cotton.

    PubMed

    Harrison, Jay M; Howard, Delia; Malven, Marianne; Halls, Steven C; Culler, Angela H; Harrigan, George G; Wolfinger, Russell D

    2013-07-03

    Compositional studies on genetically modified (GM) and non-GM crops have consistently demonstrated that their respective levels of key nutrients and antinutrients are remarkably similar and that other factors such as germplasm and environment contribute more to compositional variability than transgenic breeding. We propose that graphical and statistical approaches that can provide meaningful evaluations of the relative impact of different factors to compositional variability may offer advantages over traditional frequentist testing. A case study on the novel application of principal variance component analysis (PVCA) in a compositional assessment of herbicide-tolerant GM cotton is presented. Results of the traditional analysis of variance approach confirmed the compositional equivalence of the GM and non-GM cotton. The multivariate approach of PVCA provided further information on the impact of location and germplasm on compositional variability relative to GM.

  12. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
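    A typical simulation-based inference exercise of the kind described is a randomization (permutation) test, where the null distribution is built by shuffling group labels instead of invoking a t-distribution; the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
# Illustrative two-group data (invented, not from the paper).
a = rng.normal(0.0, 1.0, 40)
b = rng.normal(1.5, 1.0, 40)
observed = b.mean() - a.mean()

# Randomization test: repeatedly shuffle the pooled data to simulate
# the null hypothesis of no group difference.
pooled = np.concatenate([a, b])
diffs = np.empty(5000)
for i in range(5000):
    rng.shuffle(pooled)
    diffs[i] = pooled[40:].mean() - pooled[:40].mean()

p_value = np.mean(np.abs(diffs) >= abs(observed))
print(p_value)
```

    Students can read the p-value directly as "the fraction of shuffles at least as extreme as what we saw," which is the conceptual payoff of the simulation-based approach over formula-based testing.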

  13. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.

  14. The Relationship of Instructional Methods with Student Responses to the Survey of Attitudes Toward Statistics.

    ERIC Educational Resources Information Center

    Faghihi, Foroozandeh; Rakow, Ernest A.

    This study, conducted at the University of Memphis (Tennessee), compared the effects of a self-paced method of instruction on the attitudes and perceptions of students enrolled in an undergraduate statistics course with those of a comparable group of students taking statistics in a traditional lecture setting. The non-traditional course used a…

  15. Studying Student Benefits of Assigning a Service-Learning Project Compared to a Traditional Final Project in a Business Statistics Class

    ERIC Educational Resources Information Center

    Phelps, Amy L.; Dostilio, Lina

    2008-01-01

    The present study addresses the efficacy of using service-learning methods to meet the GAISE guidelines (http://www.amstat.org/education/gaise/GAISECollege.htm) in a second business statistics course and further explores potential advantages of assigning a service-learning (SL) project as compared to the traditional statistics project assignment.…

  16. A comparative study of traditional lecture methods and interactive lecture methods in introductory geology courses for non-science majors at the college level

    NASA Astrophysics Data System (ADS)

    Hundley, Stacey A.

    In recent years there has been a national call for reform in undergraduate science education. The goal of this reform movement in science education is to develop ways to improve undergraduate student learning with an emphasis on developing more effective teaching practices. Introductory science courses at the college level are generally taught using a traditional lecture format. Recent studies have shown incorporating active learning strategies within the traditional lecture classroom has positive effects on student outcomes. This study focuses on incorporating interactive teaching methods into the traditional lecture classroom to enhance student learning for non-science majors enrolled in introductory geology courses at a private university. Students' experience and instructional preferences regarding introductory geology courses were identified from survey data analysis. The information gained from responses to the questionnaire was utilized to develop an interactive lecture introductory geology course for non-science majors. Student outcomes were examined in introductory geology courses based on two teaching methods: interactive lecture and traditional lecture. There were no statistically significant differences in student outcomes between the two teaching methods. Incorporating interactive lecture methods did not statistically improve student outcomes when compared to traditional lecture teaching methods. However, the responses to the survey revealed students have a preference for introductory geology courses taught with lecture and instructor-led discussions and students prefer to work independently or in small groups. The results of this study are useful to individuals who teach introductory geology courses and individuals who teach introductory science courses for non-science majors at the college level.

  17. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

    For the analysis of low-statistics counting experiments, a traditional nonlinear least squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms which are suited for analyzing low-, as well as high-, statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
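    The abstract's central caveat, that least-squares fitting misbehaves on low-count data while a Poisson likelihood does not, can be illustrated with plain `scipy` (this is a generic sketch, not the SATLAS API):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
true_rate = 1.5
counts = rng.poisson(true_rate, size=200)   # low-statistics counting data

# Poisson maximum likelihood: minimize the negative log-likelihood of a
# constant rate mu (dropping terms that do not depend on mu).
def nll(mu):
    return np.sum(mu - counts * np.log(mu))

mle = minimize_scalar(nll, bounds=(0.01, 10.0), method="bounded").x

# Traditional chi-square fit: least squares with 1/count weights
# (zero counts clipped to weight 1), a common but biased choice.
w = 1.0 / np.clip(counts, 1, None)
lsq = np.sum(w * counts) / np.sum(w)

print(mle, lsq)   # the weighted least-squares estimate is biased low
```

    For a Poisson model the MLE of a constant rate is just the sample mean, whereas the 1/count-weighted least-squares estimate systematically undershoots it when counts are small, which motivates likelihood-based routines for sparse data.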

  18. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  19. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
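    A minimal non-Bayesian stand-in for the surrogate-modeling idea: fit a spline to a handful of runs of a toy "simulator" and then evaluate the cheap surrogate anywhere (the simulator function and design points below are invented for illustration; the paper's method uses Bayesian adaptive splines):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Toy "expensive simulator": one input, one output (illustrative only).
def simulator(x):
    return np.sin(3 * x) + 0.5 * x

x_train = np.linspace(0.0, 2.0, 25)   # a small design of simulator runs
y_train = simulator(x_train)

# Interpolating spline (s=0) as a cheap surrogate for the simulator.
surrogate = UnivariateSpline(x_train, y_train, k=3, s=0)

x_new = np.linspace(0.0, 2.0, 200)
err = np.max(np.abs(surrogate(x_new) - simulator(x_new)))
print(err)
```

    Once fitted, the surrogate can be evaluated thousands of times for sensitivity or uncertainty analysis at negligible cost, which is the role the statistical surrogate plays for the dispersion code.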

  20. Cultural change and traditional ecological knowledge. An empirical analysis from the Tsimane' in the Bolivian Amazon.

    PubMed

    Reyes-García, Victoria; Paneque-Gálvez, Jaime; Luz, Ana C; Gueze, Maximilien; Macía, Manuel J; Orta-Martínez, Martí; Pino, Joan

    2014-01-01

    Among the different factors associated with change in traditional ecological knowledge, the relation between cultural change and traditional ecological knowledge has received scant and inadequate scholarly attention. Using data from indigenous peoples of an Amazonian society facing increasing exposure to mainstream Bolivian society, we analyzed the relation between traditional ecological knowledge, proxied by individual plant use knowledge (n=484), and cultural change, proxied by individual- and village-level (n=47) measures of attachment to traditional beliefs and values. We found that both the individual level of detachment from traditional values and the village-level agreement in detachment from traditional values were associated with individual levels of plant use knowledge, irrespective of other proxy measures of cultural change. Because both the individual- and the village-level variables bear statistically significant associations with plant use knowledge, our results suggest that both individual- and supra-individual-level processes of cultural change are related to the erosion of plant use knowledge. Results from our work highlight the importance of analyzing processes that happen at intermediary social units (the village in our case study) to explain changes in traditional ecological knowledge.

  1. Cultural change and traditional ecological knowledge. An empirical analysis from the Tsimane’ in the Bolivian Amazon

    PubMed Central

    Reyes-García, Victoria; Paneque-Gálvez, Jaime; Luz, Ana C.; Gueze, Maximilien; Macía, Manuel J.; Orta-Martínez, Martí; Pino, Joan

    2016-01-01

    Among the different factors associated with change in traditional ecological knowledge, the relation between cultural change and traditional ecological knowledge has received scant scholarly attention. Using data from indigenous peoples of an Amazonian society facing increasing exposure to mainstream Bolivian society, we analyzed the relation between traditional ecological knowledge, proxied by individual plant use knowledge (n=484), and cultural change, proxied by individual- and village-level (n=47) measures of attachment to traditional beliefs and values. We found that both the individual level of detachment from traditional values and the village level of agreement in detachment from traditional values were associated with individual levels of plant use knowledge, irrespective of other proxy measures of cultural change. Because both the individual- and the village-level variables bear statistically significant associations with plant use knowledge, our results suggest that both individual- and supra-individual-level processes of cultural change are related to the erosion of plant use knowledge. Our results highlight the importance of analyzing processes that occur at intermediate social units (the village in our case study) to explain changes in traditional ecological knowledge. PMID:27642188

  2. A study of the effects of an experimental spiral physics curriculum taught to sixth grade girls and boys

    NASA Astrophysics Data System (ADS)

    Davis, Edith G.

    The pilot study compared the effectiveness of using an experimental spiral physics curriculum to a traditional linear physics curriculum for sixth through eighth grades. The study also surveyed students' parents and principals about students' academic history and background as well as identified resilient children's attributes for academic success. The pilot study was used to help validate the testing instrument as well as help refine the complete study. The purpose of the complete study was to compare the effectiveness of using an experimental spiral physics curriculum and a traditional linear curriculum with sixth graders only; seventh and eighth graders were dropped in the complete study. The study also surveyed students' parents, teachers, and principals about students' academic history and background as well as identified resilient children's attributes for academic success. Both the experimental spiral physics curriculum and the traditional linear physics curriculum increased physics achievement; however, there was no statistically significant difference in effectiveness of teaching experimental spiral physics curriculum in the aggregated sixth grade group compared to the traditional linear physics curriculum. It is important to note that the majority of the subgroups studied did show statistically significant differences in effectiveness for the experimental spiral physics curriculum compared to the traditional linear physics curriculum. The Grounded Theory analysis of resilient student characteristics resulted in categories for future studies including the empathy factor ("E" factor), the tenacity factor ("T" factor), the relational factor ("R" factor), and the spiritual factor ("S" factor).

  3. [Effect of compound Chinese traditional medicine on infected root canal bacteria biofilm].

    PubMed

    Ma, Rui; Huang, Li-li; Xia, Wen-wei; Zhu, Cai-lian; Ye, Dong-xia

    2010-08-01

    To assess the efficacy of a compound Chinese traditional medicine (CTM), composed of gallic acid, magnolol and polysaccharide of Bletilla striata, against infected root canal bacterial biofilm. Actinomyces viscosus (Av), Enterococcus faecalis (Ef) and Fusobacterium nucleatum (Fn) were combined to form a biofilm, and confocal laser scanning microscopy (CLSM) was used to observe and study bacterial activity. The SAS 6.12 software package was used for statistical analysis. Biofilm thickness was reduced after treatment with both CTM and ZnO (P>0.05), while the percentage of viable bacteria decreased significantly after treatment with CTM (P<0.01). The compound Chinese traditional medicine is effective for biofilm control and may therefore be an effective disinfecting drug for root canal sealers. Supported by the Research Fund of the Bureau of Traditional Chinese Medicine of Shanghai Municipality (Grant No. 2008L008A).

  4. On Improving the Experiment Methodology in Pedagogical Research

    ERIC Educational Resources Information Center

    Horakova, Tereza; Houska, Milan

    2014-01-01

    The paper shows how the methodology for a pedagogical experiment can be improved by including a pre-research stage. If the experiment takes the form of a test procedure, the methodology can be improved using, for example, methods of statistical and didactic test analysis that are traditionally used in other areas, i.e.…

  5. Promoting College Students' Construction of Problem Schemata in Statistics Using Schema-Emphasizing Worked Examples

    ERIC Educational Resources Information Center

    Yan, Jie

    2010-01-01

    In this study, the effectiveness of worked examples that emphasize problem features (data type, number of groups, purpose of analysis) associated with specific problem types (t-test, chi-square, correlation) was examined on students' construction of problem schemata, compared to traditional solution-only worked examples. A sample of 96 students…

  6. Are Student Evaluations of Teaching Effectiveness Valid for Measuring Student Learning Outcomes in Business Related Classes? A Neural Network and Bayesian Analyses

    ERIC Educational Resources Information Center

    Galbraith, Craig S.; Merrill, Gregory B.; Kline, Doug M.

    2012-01-01

    In this study we investigate the underlying relational structure between student evaluations of teaching effectiveness (SETEs) and achievement of student learning outcomes in 116 business related courses. Utilizing traditional statistical techniques, a neural network analysis and a Bayesian data reduction and classification algorithm, we find…

  7. Assessing tree and stand biomass: a review with examples and critical comparisons

    Treesearch

    Bernard R. Parresol

    1999-01-01

    There is considerable interest today in estimating the biomass of trees and forests for both practical forestry issues and scientific purposes. New techniques and procedures are brought together along with the more traditional approaches to estimating woody biomass. General model forms and weighted analysis are reviewed, along with statistics for evaluating and...

  8. Fast Facts: Recent Statistics from the Library Research Service, Numbers 283-289. January-December, 2010

    ERIC Educational Resources Information Center

    Library Research Service, 2010

    2010-01-01

    Issues 283 through 289 of "Fast Facts" from the Library Research Service present data collected from libraries in Colorado and throughout the nation. Topics addressed in these "Fast Facts" from 2010 include the relationship between computer access in libraries and use of traditional services, analysis of the third year of data…

  9. A Comparison of Student Attitudes, Statistical Reasoning, Performance, and Perceptions for Web-Augmented Traditional, Fully Online, and Flipped Sections of a Statistical Literacy Class

    ERIC Educational Resources Information Center

    Gundlach, Ellen; Richards, K. Andrew R.; Nelson, David; Levesque-Bristol, Chantal

    2015-01-01

    Web-augmented traditional lecture, fully online, and flipped sections, all taught by the same instructor with the same course schedule, assignments, and exams in the same semester, were compared with regards to student attitudes; statistical reasoning; performance on common exams, homework, and projects; and perceptions of the course and…

  10. [Analysis on composition and medication regularities of prescriptions treating hypochondriac pain based on the traditional Chinese medicine inheritance support platform].

    PubMed

    Zhao, Yan-qing; Teng, Jing

    2015-03-01

    To analyze the composition and medication regularities of prescriptions treating hypochondriac pain in the Chinese journal full-text database (CNKI) based on the traditional Chinese medicine inheritance support system, in order to provide a reference for the further research and development of new traditional Chinese medicines treating hypochondriac pain. The traditional Chinese medicine inheritance support platform software V2.0 was used to build a database of prescriptions treating hypochondriac pain. The software's integrated data mining methods were used to classify the prescriptions in the database according to "four natures", "five flavors" and "meridians" and to perform frequency statistics, syndrome distribution, prescription regularity and new prescription analyses. An analysis of 192 prescriptions treating hypochondriac pain determined the frequencies of medicines in prescriptions and commonly used medicine pairs and combinations, and summarized 15 new prescriptions. This study indicated that the prescriptions treating hypochondriac pain in the Chinese journal full-text database are mostly those for soothing liver-qi stagnation, promoting qi and activating blood, clearing heat and promoting dampness, and invigorating the spleen and removing phlegm, with cold properties and bitter tastes, and that they reflect the principle of "distinguishing deficiency from excess and relieving pain by unblocking the meridians" in treating hypochondriac pain.

  11. Concentric network symmetry grasps authors' styles in word adjacency networks

    NASA Astrophysics Data System (ADS)

    Amancio, Diego R.; Silva, Filipi N.; Costa, Luciano da F.

    2015-06-01

    Several characteristics of written texts have been inferred from statistical analyses of networked models. Even though many network measurements have been adapted to study textual properties at several levels of complexity, some textual aspects have been disregarded. In this paper, we study the symmetry of word adjacency networks, a well-known representation of text as a graph. A statistical analysis of the symmetry distribution performed on several novels showed that most words do not display symmetric patterns of connectivity. More specifically, the merged symmetry displayed a distribution similar to the ubiquitous power-law distribution. Our experiments also revealed that the studied metrics do not correlate with traditional network measurements, such as degree or betweenness centrality. The discriminative power of the symmetry measurements was verified in the authorship attribution task. Interestingly, we found that specific authors prefer particular types of symmetric motifs. As a consequence, the authorship of books could be accurately identified in 82.5% of the cases, in a dataset comprising books written by 8 authors. Because the proposed measurements for text analysis are complementary to the traditional approach, they can be used to improve the characterization of text networks, which might be useful for applications based on stylistic classification.
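The word adjacency representation that both of these abstracts build on can be sketched in a few lines; this is a minimal illustration with plain dictionaries and a made-up sample sentence, not the authors' implementation:

```python
from collections import defaultdict

def word_adjacency_network(text):
    """Nodes are distinct words; an undirected edge links any two words
    that appear next to each other somewhere in the text."""
    words = text.lower().split()
    edges = defaultdict(set)
    for a, b in zip(words, words[1:]):
        if a != b:                     # ignore self-loops
            edges[a].add(b)
            edges[b].add(a)
    return edges

net = word_adjacency_network("the cat sat on the mat and the cat slept")
degree = {w: len(nbrs) for w, nbrs in net.items()}   # a basic network measurement
```

Traditional measurements such as the degree fall out directly; the symmetry measures studied in the paper require a further analysis of each node's concentric neighbourhoods.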

  12. Network analysis of named entity co-occurrences in written texts

    NASA Astrophysics Data System (ADS)

    Amancio, Diego Raphael

    2016-06-01

    The use of methods borrowed from statistics and physics to analyze written texts has allowed the discovery of unprecedented patterns of human behavior and cognition by establishing links between model features and language structure. While current models have been useful to unveil patterns via analysis of syntactic and semantic networks, only a few works have probed the relevance of investigating the structure arising from the relationship between relevant entities such as characters, locations and organizations. In this study, we represent entities appearing in the same context as a co-occurrence network, where links are established according to a null model based on random, shuffled texts. Computational simulations performed on novels revealed that the proposed model displays interesting topological features, such as the small-world property, characterized by high values of the clustering coefficient. The effectiveness of our model was verified in a practical pattern recognition task on real networks. When compared with traditional word adjacency networks, our model displayed better results in identifying unknown references in texts. Because the proposed representation plays a complementary role in characterizing unstructured documents via topological analysis of named entities, we believe that it could be useful to improve the characterization of written texts (and related systems), especially if combined with traditional approaches based on statistical and deeper paradigms.

  13. The use of a cognitive task analysis-based multimedia program to teach surgical decision making in flexor tendon repair.

    PubMed

    Luker, Kali R; Sullivan, Maura E; Peyre, Sarah E; Sherman, Randy; Grunwald, Tiffany

    2008-01-01

    The aim of this study was to compare the surgical knowledge of residents before and after receiving a cognitive task analysis-based multimedia teaching module. Ten plastic surgery residents were evaluated performing flexor tendon repair on 3 occasions. Traditional learning occurred between the first and second trial and served as the control. A teaching module was introduced as an intervention between the second and third trial using cognitive task analysis to illustrate decision-making skills. All residents showed improvement in their decision-making ability when performing flexor tendon repair after each surgical procedure. The group improved through traditional methods as well as exposure to our talk-aloud protocol (P > .01). After being trained using the cognitive task analysis curriculum the group displayed a statistically significant knowledge expansion (P < .01). Residents receiving cognitive task analysis-based multimedia surgical curriculum instruction achieved greater command of problem solving and are better equipped to make correct decisions in flexor tendon repair.

  14. An empirical test of a mediation model of the impact of the traditional male gender role on suicidal behavior in men.

    PubMed

    Houle, Janie; Mishara, Brian L; Chagnon, François

    2008-04-01

    Men die by suicide three to four times more often than women in Western countries. The adverse impact of the traditional male gender role, as well as men's reluctance to seek help, are possible explanations of this gender gap, but these hypotheses have not been well documented empirically. This study compares two groups of men who experienced comparably severe stressful life events during the preceding 12 months: 40 men admitted to hospital emergency following suicide attempts, and 40 men with no history of suicide attempts. Structured interviews were conducted to measure adherence to the traditional male gender role, help-seeking behaviour, social support, suicide acceptability and mental health. ANOVAs indicated that attempters are more likely to adhere to the traditional masculine gender role, and regression analysis revealed that this relationship persists even when the presence of mental disorders is statistically controlled. Sequential regression analysis supports the mediation model and shows that the effects of the traditional male gender role on suicidal behavior are mediated through protective and risk factors for suicide, namely mental state, help seeking and social support. The traditional male gender role appears to increase the risk of suicidal behavior in men by undermining their mental state and by inhibiting the protective factors of help seeking and social support. This study underscores the importance of encouraging men to seek help.

  15. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major component that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or inhabitant density over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units dominated by rural land use. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density can differ by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by the FCT - Portuguese Foundation for Science and Technology.
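The building areal-weighting step described above can be sketched as follows; this is a simplified illustration with hypothetical figures (one census unit, three buildings), not the authors' code:

```python
def areal_weighting(unit_population, building_areas):
    """Dasymetric step: distribute a census unit's population over its
    buildings in proportion to each building's area (areas must be > 0)."""
    total = sum(building_areas.values())
    return {b: unit_population * area / total
            for b, area in building_areas.items()}

# A hypothetical census unit of 300 inhabitants and three buildings (areas in m2).
pops = areal_weighting(300, {"A": 100.0, "B": 50.0, "C": 150.0})
```

The building populations sum back to the unit total, so the disaggregation preserves the Census count while refining its spatial resolution.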

  16. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition; it features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome the limitations of traditional sampling. The Thermal Analysis "Filters" software was developed for cooling curve analysis of 3XX Al alloys using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  17. Authors attain comparable or slightly higher rates of citation publishing in an open access journal (CytoJournal) compared to traditional cytopathology journals - A five year (2007-2011) experience

    PubMed Central

    Frisch, Nora K.; Nathan, Romil; Ahmed, Yasin K.; Shidham, Vinod B.

    2014-01-01

    Background: The era of Open Access (OA) publication, a platform which serves to better disseminate scientific knowledge, is upon us, as more OA journals are in existence than ever before. The idea that peer-reviewed OA publication leads to higher rates of citation has been put forth and shown to be true in several publications. This is a significant benefit to authors, in addition to another less obvious but highly critical component of the OA charter, i.e., retention of the copyright by the authors in the public domain. In this study, we analyzed the citation rates of OA and traditional non-OA publications specifically for authors in the field of cytopathology. Design: We compared the citation patterns of authors who had published in both OA and traditional non-OA peer-reviewed, scientific, cytopathology journals. Citations of an OA publication (CytoJournal) were analyzed comparatively with traditional non-OA cytopathology journals (Acta Cytologica, Cancer Cytopathology, Cytopathology, and Diagnostic Cytopathology) using data from the Web of Science citation analysis site (on which impact factors (IF) are based). After comparing citations per publication, as well as a time-adjusted citation quotient (which takes into account the time since publication), we also analyzed the statistics after excluding the data for meeting abstracts. Results: In total, 28 authors published 314 publications as articles and meeting abstracts (25 authors after excluding the abstracts). The rate of citation and the time-adjusted citation quotient were higher for OA in the group where abstracts were included (P < 0.05 for both). The rates were also slightly higher for OA than non-OA when the meeting abstracts were excluded, but the difference was statistically insignificant (P = 0.57 and P = 0.45). Conclusion: We observed that, for the same authors, publications in the OA journal attained a higher rate of citation than publications in the traditional non-OA journals in the field of cytopathology over a 5-year period (2007-2011). However, this increase was statistically insignificant if the meeting abstracts were excluded from the analysis. Overall, the rates of citation for OA were comparable to slightly higher than for non-OA. PMID:24987441

  18. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013, and compared the results of SPC surveillance to the results of traditional SSI surveillance methods. We then analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts each detected 8 of the 10 SSI outbreaks analysed, in every case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection; conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than the conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse SSI data from separate control hospitals. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks.
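The EWMA chart idea described above can be sketched in a few lines, assuming a known in-control baseline rate and standard deviation; the monthly rates, smoothing factor lambda=0.2 and 3-sigma limit below are illustrative, not the study's actual data or parameters:

```python
import math

def ewma_signals(rates, baseline, sigma, lam=0.2, L=3.0):
    """Flag periods whose exponentially weighted moving average (EWMA)
    of the SSI rate exceeds the upper control limit.  `baseline` and
    `sigma` are the in-control mean and standard deviation."""
    z, signals = baseline, []
    for t, x in enumerate(rates, start=1):
        z = lam * x + (1 - lam) * z                      # EWMA update
        ucl = baseline + L * sigma * math.sqrt(          # time-varying 3-sigma limit
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        signals.append(z > ucl)
    return signals

# Six in-control months followed by a simulated rate shift (illustrative numbers).
rates = [0.02] * 6 + [0.10] * 6
flags = ewma_signals(rates, baseline=0.02, sigma=0.01)
```

Because the EWMA accumulates evidence across months, it signals sustained small shifts sooner than a Shewhart chart, which compares each raw monthly rate to fixed limits.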

  19. Assessing the effectiveness of problem-based learning in physical diagnostics education in China: a meta-analysis

    PubMed Central

    Wang, Jianmiao; Xu, Yongjian; Liu, Xiansheng; Xiong, Weining; Xie, Jungang; Zhao, Jianping

    2016-01-01

    Problem-based learning (PBL) has been extensively applied as an experimental educational method in Chinese medical schools over the past decade. A meta-analysis was performed to assess the effectiveness of PBL on students’ learning outcomes in physical diagnostics education. Related databases were searched for eligible studies evaluating the effects of PBL compared to traditional teaching on students’ knowledge and/or skill scores of physical diagnostics. Standardized mean difference (SMD) with 95% confidence interval (CI) was estimated. Thirteen studies with a total of 2086 medical students were included in this meta-analysis. All of these studies provided usable data on knowledge scores, and the pooled analysis showed a significant difference in favor of PBL compared to the traditional teaching (SMD = 0.76, 95%CI = 0.33–1.19). Ten studies provided usable data on skill scores, and a significant difference in favor of PBL was also observed (SMD = 1.46, 95%CI = 0.89–2.02). Statistically similar results were obtained in the sensitivity analysis, and there was no significant evidence of publication bias. These results suggested that PBL in physical diagnostics education in China appeared to be more effective than traditional teaching method in improving knowledge and skills. PMID:27808158

  20. Assessing the effectiveness of problem-based learning in physical diagnostics education in China: a meta-analysis.

    PubMed

    Wang, Jianmiao; Xu, Yongjian; Liu, Xiansheng; Xiong, Weining; Xie, Jungang; Zhao, Jianping

    2016-11-03

    Problem-based learning (PBL) has been extensively applied as an experimental educational method in Chinese medical schools over the past decade. A meta-analysis was performed to assess the effectiveness of PBL on students' learning outcomes in physical diagnostics education. Related databases were searched for eligible studies evaluating the effects of PBL compared to traditional teaching on students' knowledge and/or skill scores of physical diagnostics. Standardized mean difference (SMD) with 95% confidence interval (CI) was estimated. Thirteen studies with a total of 2086 medical students were included in this meta-analysis. All of these studies provided usable data on knowledge scores, and the pooled analysis showed a significant difference in favor of PBL compared to the traditional teaching (SMD = 0.76, 95%CI = 0.33-1.19). Ten studies provided usable data on skill scores, and a significant difference in favor of PBL was also observed (SMD = 1.46, 95%CI = 0.89-2.02). Statistically similar results were obtained in the sensitivity analysis, and there was no significant evidence of publication bias. These results suggested that PBL in physical diagnostics education in China appeared to be more effective than traditional teaching method in improving knowledge and skills.
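The pooling step behind the SMDs reported above can be sketched with the standard DerSimonian-Laird random-effects estimator; the three study effect sizes below are hypothetical, not the thirteen studies of this meta-analysis:

```python
import math

def dersimonian_laird(smds, variances):
    """Pool standardized mean differences (SMDs) under a
    DerSimonian-Laird random-effects model; returns the pooled
    SMD and its 95% confidence interval."""
    w = [1 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, smds)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, smds))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(smds) - 1)) / c)        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, smds)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies: SMDs and the variance of each SMD.
pooled, ci = dersimonian_laird([0.5, 0.9, 0.7], [0.04, 0.09, 0.06])
```

When between-study heterogeneity is negligible (tau^2 = 0), the estimate reduces to the fixed-effect inverse-variance average.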

  1. Duration of unemployment: geographic mobility and selectivity bias.

    PubMed

    Goss, E P; Paul, C; Wilhite, A

    1994-01-01

    Estimates of the factors affecting the duration of unemployment were found to be influenced by the inclusion of migration factors. Traditional models that did not control for migration factors underestimated movers' probability of finding an acceptable job. The empirical test of the theory, based on analysis of data on US household heads unemployed in 1982 and employed in 1982 and 1983, found that the cumulative probability of reemployment after 30 weeks of searching was .422 in the traditional model and .624 in the migration selectivity model. In addition, controlling for selectivity eliminated the significance of the relationship between race and job search duration in the model. The relationship between search duration and the county unemployment rate in 1982 became statistically significant, and the relationship between search duration and 1980 population per square mile in the 1982 county of residence became statistically insignificant. The finding that non-Whites have a longer duration of unemployment can better be understood in terms of non-Whites' lower geographic mobility and fewer job contacts. The statistical significance of a high unemployment rate in the home labor market reducing the probability of finding employment was more in keeping with expectations. The findings assumed that the duration of unemployment accurately reflected the length of job search. The sample was redrawn to exclude discouraged workers and the analysis was repeated. The findings were similar to those for the full sample, with the coefficient for the migration variable remaining negative and statistically significant and the coefficient for alpha remaining positive and statistically significant. Race in the selectivity model remained statistically insignificant. The findings supported the Schwartz model hypothesizing that expanding the radius of the search reduces the duration of unemployment. The exclusion of the migration factor misspecified the equation for unemployment duration. Policy should be directed to the problems of geographic mobility, particularly among non-Whites.

  2. Meta-analysis inside and outside particle physics: two traditions that should converge?

    PubMed

    Baker, Rose D; Jackson, Dan

    2013-06-01

    The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle physics. In this paper, we use statistical models to study two major differences in practice. The first is the usefulness of systematic errors, which physicists are now beginning to quote in addition to statistical errors. The second is whether it is better to treat heterogeneity by scaling up errors as do the Particle Data Group or by adding a random effect as does the rest of the community. Besides fitting models, we derive and use an exact test of the error-scaling hypothesis. We also discuss the other methodological differences between the two streams of meta-analysis. Our conclusion is that systematic errors are not currently very useful and that the conventional random effects model, as routinely used in meta-analysis, has a useful role to play in particle physics. The moral we draw for statisticians is that we should be more willing to explore 'grassroots' areas of statistical application, so that good statistical practice can flow both from and back to the statistical mainstream. Copyright © 2012 John Wiley & Sons, Ltd.
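The error-scaling convention contrasted above with random-effects modelling can be sketched as follows; the measurement values are made up for illustration, and this is only the basic scale-factor rule, not the paper's exact test:

```python
import math

def pdg_average(values, errors):
    """Inverse-variance weighted average with the Particle Data Group
    convention: when chi2/(N-1) exceeds 1, the quoted error is
    inflated by the scale factor S = sqrt(chi2/(N-1))."""
    w = [1 / e ** 2 for e in errors]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    err = math.sqrt(1 / sum(w))
    chi2 = sum(((x - mean) / e) ** 2 for x, e in zip(values, errors))
    s = math.sqrt(chi2 / (len(values) - 1)) if len(values) > 1 else 1.0
    return mean, err * max(1.0, s)

# Three mutually inconsistent measurements (made-up numbers).
mean, err = pdg_average([1.19, 1.05, 0.98], [0.05, 0.05, 0.05])
```

The random-effects alternative would instead add a common between-measurement variance to each error before averaging, which also re-weights the measurements rather than only inflating the final error.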

  3. First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation

    DOE PAGES

    Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...

    2016-12-02

    Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits, introduced by fixing parameters that are not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific differences between the features of fragmentation functions obtained with the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.

  4. A comparison of Hispanic middle school students' performance, and perceived and actual physical exertion, on the traditional and treadmill one-mile runs.

    PubMed

    Latham, Daniel T; Hill, Grant M; Petray, Clayre K

    2013-04-01

    The purpose of this study was to assess whether a treadmill mile is an acceptable FitnessGram test substitute for the traditional one-mile run for middle school boys and girls. Peak heart rate and perceived physical exertion were also measured to assess students' effort. 48 boys and 40 girls participated, approximately 85% of whom were classified as Hispanic. Boys' mean time for the traditional one-mile run was statistically significantly faster, and their peak heart rate and perceived exertion statistically significantly higher, than for the treadmill mile. Girls' treadmill mile times were not statistically significantly different from their traditional one-mile run times, and there were no statistically significant differences in girls' peak heart rate or perceived exertion. The results suggest that giving middle school students a choice between the traditional one-mile run and the treadmill one-mile format for the FitnessGram mile run may positively affect performance.

  5. Statistical Analysis of a Class: Monte Carlo and Multiple Imputation Spreadsheet Methods for Estimation and Extrapolation

    ERIC Educational Resources Information Center

    Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael

    2017-01-01

    The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
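    As a flavor of what such spreadsheet exercises compute, here is a hedged, stdlib-only sketch of a Monte Carlo multiple-imputation estimate of a class mean when some scores are missing; the hot-deck imputation scheme and the function name `mi_mean` are illustrative assumptions, not the article's method.

    ```python
    import random
    import statistics

    def mi_mean(values, m=100, seed=0):
        """Multiple-imputation estimate of a mean when some values are missing
        (None). Each imputation fills gaps by resampling observed values
        (hot-deck style); the m per-imputation means are then averaged,
        as in Rubin's rule for the point estimate. A toy sketch only."""
        rng = random.Random(seed)
        observed = [v for v in values if v is not None]
        means = []
        for _ in range(m):
            filled = [v if v is not None else rng.choice(observed) for v in values]
            means.append(statistics.fmean(filled))
        return statistics.fmean(means)
    ```

    With no missing values the estimate is exactly the sample mean; with missing values it lands near the observed mean, with Monte Carlo noise that shrinks as m grows.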

  6. The Relationship between External Accountability Policy and Internal Accountability: A Cross-State Analysis of Charter and Traditional Public Schools

    ERIC Educational Resources Information Center

    Poole, Sonja Martin

    2011-01-01

    Using data from the National Center for Educational Statistics, this article examines the relationship between strength of state accountability policy (i.e., external accountability) and internal accountability, defined as a school-level system in which collective behaviors and conditions exist that direct the attention and effort of the internal…

  7. [Application characteristics and situation analysis of volatile oils in database of Chinese patent medicine].

    PubMed

    Wang, Sai-Jun; Wu, Zhen-Feng; Yang, Ming; Wang, Ya-Qi; Hu, Peng-Yi; Jie, Xiao-Lu; Han, Fei; Wang, Fang

    2014-09-01

    Aromatic traditional Chinese medicines have a long history in China and comprise a wide range of varieties. Volatile oils are active ingredients extracted from aromatic herbal medicines; they usually contain tens or hundreds of components and have many biological activities. Volatile oils are therefore often used in combined prescriptions and made into various efficient preparations for oral administration or external use. Based on the database of Newly Edited National Chinese Traditional Patent Medicines (second edition), the authors selected 266 Chinese patent medicines containing volatile oils and established an information sheet covering name, dosage, dosage form, specification and usage, and main functions. Drawing on multidisciplinary knowledge of pharmaceutics, traditional Chinese pharmacology, and the basic theory of traditional Chinese medicine, we then compiled statistics on dosage forms and usage, varieties of volatile oils, and main functions, and analyzed the status of volatile oils in terms of dosage form development, prescription development, drug instructions, and quality control, in order to lay a foundation for further exploration of the market development of volatile oils and their future development orientation.

  8. Analysis of Garment Production Methods. Part 2: Comparison of Cost and Production between a Traditional Bundle System and Modular Manufacturing

    DTIC Science & Technology

    1992-02-01

    configuration. We have spent the last year observing two firms as they experimented with modular manufacturing. The following report will track the progress of...the transitions as they moved through the year. Incorporated into the analysis is the statistical interpretation of data collected from each firm, as...during the year. FEBRUARY The most noticeable change this month was the introduction of the new ergonomic chairs for the operators. Previously the

  9. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased.
We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root cause to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.
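    The equivalence-class idea described above can be illustrated with a toy sketch: group tasks by their stack traces and sample one debug target per class. This is only an illustration of the strategy; STAT's actual implementation builds trees over traces, and the function name here is an assumption.

    ```python
    import random
    from collections import defaultdict

    def debug_targets(stack_traces, seed=0):
        """Group task ranks into equivalence classes by stack trace and pick
        one representative rank per class, reducing the debugging scenario
        from all tasks to one task per distinct behavior."""
        classes = defaultdict(list)
        for rank, trace in enumerate(stack_traces):
            classes[tuple(trace)].append(rank)
        rng = random.Random(seed)
        return {trace: rng.choice(ranks) for trace, ranks in classes.items()}
    ```

    Attaching a traditional debugger only to the returned ranks covers every distinct stack-trace behavior while inspecting a handful of processes instead of the full machine.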

  10. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet, and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and matched these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.stat.psu.edu/~mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997.
It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.

  11. The Interaction of TXNIP and AF1q Genes Increases the Susceptibility of Schizophrenia.

    PubMed

    Su, Yousong; Ding, Wenhua; Xing, Mengjuan; Qi, Dake; Li, Zezhi; Cui, Donghong

    2017-08-01

    Although previous studies have shown a reduced risk of cancer in patients with schizophrenia, it is still unknown whether patients with schizophrenia possess genetic factors that also contribute to tumor suppression. In the present study, based on our previous microarray data, we focused on the tumor suppressor genes TXNIP and AF1q, which were differentially expressed in patients with schizophrenia. A total of 413 patients and 578 healthy controls were recruited. We found no significant differences in genotype, allele, or haplotype frequencies at the five selected single nucleotide polymorphisms (SNPs) (rs2236566 and rs7211 in the TXNIP gene; rs10749659, rs2140709, and rs3738481 in the AF1q gene) between patients with schizophrenia and controls. However, using the MDR method followed by traditional statistical analysis, we found an association between the interaction of TXNIP and AF1q and schizophrenia. The best gene-gene interaction model identified was a three-locus model, TXNIP (rs2236566, rs7211)-AF1q (rs2140709). After traditional statistical analysis, we found that the high-risk genotype combination was rs2236566 (GG)-rs7211 (CC)-rs2140709 (CC) (OR = 1.35 [1.03-1.76]) and the low-risk genotype combination was rs2236566 (GT)-rs7211 (CC)-rs2140709 (CC) (OR = 0.67 [0.49-0.91]). Our findings suggest a statistically significant role for the interaction of TXNIP and AF1q polymorphisms (TXNIP-rs2236566, TXNIP-rs7211, and AF1q-rs2769605) in schizophrenia susceptibility.

  12. Religiosity and Spiritual Engagement in Two American Indian Populations

    PubMed Central

    Garroutte, Eva M.; Beals, Janette; Keane, Ellen M.; Kaufman, Carol; Spicer, Paul; Henderson, Jeff; Henderson, Patricia N.; Mitchell, Christina M.; Manson, Spero M.

    2015-01-01

    Social scientific investigation into the religiospiritual characteristics of American Indians rarely includes analysis of quantitative data. After reviewing information from ethnographic and autobiographical sources, we present analyses of data from a large, population-based sample of two tribes (n = 3,084). We examine salience of belief in three traditions: aboriginal, Christian, and Native American Church. We then investigate patterns in sociodemographic subgroups, determining the significant correlates of salience with other variables controlled. Finally, we examine frequency with which respondents assign high salience to only one tradition (exclusivity) or multiple traditions (nonexclusivity), again investigating subgroup variations. This first detailed, statistical portrait of American Indian religious and spiritual lives links work on tribal ethnic identity to theoretical work on America’s “religious marketplace.” Results may also inform social/behavioral interventions that incorporate religiospiritual elements. PMID:26582963

  13. [Prosthodontic research design from the standpoint of statistical analysis: learning and knowing the research design].

    PubMed

    Tanoue, Naomi

    2007-10-01

    For any kind of research, the research design is the most important element. The design is used to structure the research and to show how all of the major parts of the research project fit together. All researchers should begin their work only after planning the research design: what is the main theme, what are the background and references, what kind of data is needed, and what kind of analysis is required. This may seem a roundabout route, but in fact it is a shortcut. The research methods must be appropriate to the objectives of the study. For hypothesis-testing research, the traditional style of research, a research design grounded in statistics is undoubtedly necessary, since such research essentially proves a hypothesis with data and statistical theory. For a clinical trial, the clinical version of hypothesis-testing research, the statistical method must be specified in the trial plan. This report describes the basics of research design for prosthodontics studies.

  14. Correlation between musical aptitude and learning foreign languages: an epidemiological study in secondary school Italian students

    PubMed Central

    PICCIOTTI, P.M.; BUSSU, F.; CALÒ, L.; GALLUS, R.; SCARANO, E.; DI CINTIO, G.; CASSARÀ, F.; D’ALATRI, L.

    2018-01-01

    SUMMARY The aim of this study was to assess whether a correlation exists between language-learning skills and musical aptitude, through analysis of academic outcomes in foreign languages and music. We enrolled 502 students (10-14 years old) from an Italian secondary school, attending either traditional courses (two hours/week of scheduled music classes) or special courses (six hours/week). For the statistical analysis, we considered grades in English, French, and Music. Our results showed a significant correlation between grades in the two foreign languages and in music, in both the traditional and the special courses, with better results in French in the special courses. These results are discussed and interpreted in light of the literature on the neuroanatomical and physiological mechanisms of foreign language learning and music perception. PMID:29756615

  15. A Comparison of Classroom and Online Asynchronous Problem-Based Learning for Students Undertaking Statistics Training as Part of a Public Health Masters Degree

    ERIC Educational Resources Information Center

    de Jong, N.; Verstegen, D. M. L.; Tan, F. E. S.; O'Connor, S. J.

    2013-01-01

    This case-study compared traditional, face-to-face classroom-based teaching with asynchronous online learning and teaching methods in two sets of students undertaking a problem-based learning module in the multilevel and exploratory factor analysis of longitudinal data as part of a Masters degree in Public Health at Maastricht University. Students…

  16. CALL versus Paper: In Which Context Are L1 Glosses More Effective?

    ERIC Educational Resources Information Center

    Taylor, Alan M.

    2013-01-01

    CALL glossing in first language (L1) or second language (L2) texts has been shown by previous studies to be more effective than traditional, paper-and-pen L1 glossing. Using a pool of studies with much more statistical power and more accurate results, this meta-analysis demonstrates more precisely the degree to which CALL L1 glossing can be more…

  17. Working Towards a Scalable Model of Problem-Based Learning Instruction in Undergraduate Engineering Education

    ERIC Educational Resources Information Center

    Mantri, Archana

    2014-01-01

    The intent of the study presented in this paper is to show that the model of problem-based learning (PBL) can be made scalable by designing curriculum around a set of open-ended problems (OEPs). The detailed statistical analysis of the data collected to measure the effects of traditional and PBL instructions for three courses in Electronics and…

  18. Statistical Analysis of Mineral Concentration for the Geographic Identification of Garlic Samples from Sicily (Italy), Tunisia and Spain

    PubMed Central

    Vadalà, Rossella; Mottese, Antonio F.; Bua, Giuseppe D.; Salvo, Andrea; Mallamace, Domenico; Corsaro, Carmelo; Vasi, Sebastiano; Giofrè, Salvatore V.; Alfa, Maria; Cicero, Nicola; Dugo, Giacomo

    2016-01-01

    We performed a statistical analysis of the concentrations of mineral elements, measured by inductively coupled plasma mass spectrometry (ICP-MS), in different varieties of garlic from Spain, Tunisia, and Italy. Nubia red garlic (Sicily) is one of the best-known Italian varieties and belongs to the traditional Italian food products (P.A.T.) of the Ministry of Agriculture, Food, and Forestry. The results suggest that the concentrations of the considered elements may serve as geographical indicators for discriminating the origin of the different samples. In particular, we found a relatively high content of selenium in the Nubia red garlic variety, which could accordingly be useful as an anticarcinogenic agent. PMID:28231115

  19. Docking studies on NSAID/COX-2 isozyme complexes using Contact Statistics analysis

    NASA Astrophysics Data System (ADS)

    Ermondi, Giuseppe; Caron, Giulia; Lawrence, Raelene; Longo, Dario

    2004-11-01

    The selective inhibition of COX-2 isozymes should lead to a new generation of NSAIDs with significantly reduced side effects; e.g. celecoxib (Celebrex®) and rofecoxib (Vioxx®). To obtain inhibitors with higher selectivity it has become essential to gain additional insight into the details of the interactions between COX isozymes and NSAIDs. Although X-ray structures of COX-2 complexed with a small number of ligands are available, experimental data are missing for two well-known selective COX-2 inhibitors (rofecoxib and nimesulide) and docking results reported are controversial. We use a combination of a traditional docking procedure with a new computational tool (Contact Statistics analysis) that identifies the best orientation among a number of solutions to shed some light on this topic.

  20. Modeling urbanization patterns at a global scale with generative adversarial networks

    NASA Astrophysics Data System (ADS)

    Albert, A. T.; Strano, E.; Gonzalez, M.

    2017-12-01

    Current demographic projections show that, in the next 30 years, global population growth will mostly take place in developing countries. Coupled with a decrease in density, such population growth could potentially double the land occupied by settlements by 2050. The lack of reliable and globally consistent socio-demographic data, coupled with the limited predictive performance of traditional spatially explicit urban models, calls for better predictive methods calibrated on a globally consistent dataset. Richer models of the spatial interplay between urban built-up land, population distribution, and energy use are thus central to the discussion around the expansion and development of cities and their impact on the environment in the context of a changing climate. In this talk we discuss methods for, and present an analysis of, urban form, defined as the spatial distribution of macroeconomic quantities that characterize a city, using modern machine learning methods and the best available remote-sensing data for the world's largest 25,000 cities. We first show that these cities may be described by a small set of patterns in radial building density, nighttime luminosity, and population density, which highlight, to first order, differences in development and land use across the world. We observe significant, spatially dependent variance around these typical patterns, which would be difficult to model using traditional statistical methods. We take a first step in addressing this challenge by developing CityGAN, a conditional generative adversarial network model for simulating realistic urban forms. To guide learning and measure the quality of the simulated synthetic cities, we develop a specialized loss function for GAN optimization that incorporates standard spatial statistics used by urban analysis experts. 
Our framework is a stark departure from both the standard physics-based approaches in the literature (that view urban forms as fractals with a scale-free behavior), and the traditional statistical learning approaches (whereby values of individual pixels are modeled as functions of locally-defined, hand-engineered features). This is a first-of-its-kind analysis of urban forms using data at a planetary scale.

  1. [Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].

    PubMed

    Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta

    2014-01-01

    Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a single comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons combine estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons, which can easily be conducted in a Microsoft Office Excel spreadsheet. We developed a user-friendly spreadsheet for indirect and mixed comparisons aimed at clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
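    Bucher's adjusted indirect comparison, which the spreadsheet implements, is simple enough to sketch directly: for trials of A vs C and B vs C, the indirect A-vs-B effect is the difference of the two direct effects, with variances adding; a mixed estimate then pools direct and indirect evidence by inverse variance. A minimal sketch with hypothetical numbers, not the authors' spreadsheet:

    ```python
    import math

    def bucher_indirect(d_AC, se_AC, d_BC, se_BC):
        """Bucher's adjusted indirect comparison of A vs B via common comparator C."""
        d_AB = d_AC - d_BC
        se_AB = math.sqrt(se_AC**2 + se_BC**2)
        return d_AB, se_AB

    def mixed_estimate(d_dir, se_dir, d_ind, se_ind):
        """Inverse-variance pooling of a direct and an indirect estimate."""
        w_dir, w_ind = 1 / se_dir**2, 1 / se_ind**2
        d = (w_dir * d_dir + w_ind * d_ind) / (w_dir + w_ind)
        se = 1 / math.sqrt(w_dir + w_ind)
        return d, se
    ```

    Note that the indirect standard error is always larger than either direct one, while pooling direct and indirect evidence shrinks it — the statistical-power gain the abstract mentions.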

  2. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art

    PubMed Central

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole), 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time. PMID:29118692
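    To make one of the four measures concrete, here is a hedged, stdlib-only sketch of a first-order edge-orientation entropy on a grayscale image given as a list of rows; the central-difference gradient, magnitude weighting, and 16-bin histogram are illustrative assumptions rather than the authors' exact pipeline.

    ```python
    import math

    def orientation_entropy(img, n_bins=16):
        """First-order Shannon entropy (bits) of edge orientations: low when
        edges share one orientation, high when orientations are uniform."""
        h, w = len(img), len(img[0])
        hist = [0.0] * n_bins
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                gx = img[y][x + 1] - img[y][x - 1]   # central differences
                gy = img[y + 1][x] - img[y - 1][x]
                mag = math.hypot(gx, gy)
                if mag == 0:
                    continue
                theta = math.atan2(gy, gx) % math.pi  # orientation, not direction
                b = min(int(theta / math.pi * n_bins), n_bins - 1)
                hist[b] += mag                        # magnitude-weighted histogram
        total = sum(hist)
        if total == 0:
            return 0.0
        p = [c / total for c in hist]
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    ```

    A smooth vertical luminance ramp has a single edge orientation and thus zero entropy, whereas an image with varied gradient directions scores higher.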

  4. A Retrospective, Cost-minimization Analysis of Disposable and Traditional Negative Pressure Wound Therapy Medicare Paid Claims

    PubMed

    Delhougne, Gary; Hogan, Christopher; Tarka, Kim; Nair, Sunitha

    2018-01-01

    Traditional negative pressure wound therapy (NPWT) systems are considered durable. The pump is designed for use by numerous patients over a period of several years. Recently developed smaller, disposable devices are designed for single-patient use. A retrospective analysis of 2012-2014 national Medicare claims data was used to examine payments associated with the use of traditional and disposable NPWT systems. Data extracted included NPWT episodes from the Limited Data Set Standard Analytic Files including the 5% sample for traditional NPWT and 100% sample for disposable NPWT. NPWT episodes were identified using claim service dates and billing codes. Mean costs per episode were compared and analyzed using chi-squared tests for comparisons between patients who received traditional and those who used disposable NPWT. For continuous variables, statistical significance was assessed using Mann-Whitney U tests. The data included traditional (n = 2938; mean age 66.6 years) and disposable (n = 3522; mean age 67.6 years) episodes for the 2 NPWT groups. Wound types differed for NPWT groups (P <.0001) and included surgical (1134 [39%] versus 764 [22%]), generic open (850 [29%] versus 342 [10%]), skin ulcers (561 [19%] versus 1301 [37%]), diabetic ulcers (240 [8%] versus 342 [10%]), and circulatory system wounds (105 [4%] versus 563 [16%]). Average payment amounts were $4650 ± $2782 for traditional and $1532 ± $1767 per disposable NPWT episode (P <.0001). Payment differences were not affected by wound or comorbidity characteristics. Using the 2016 rates, average payments were $3501 for traditional and $1564 for disposable NPWT. Considering the rate of NPWT use in the United States and the results of this study suggesting substantial potential cost savings, additional analyses and cost-effectiveness studies are warranted.

  5. Applications of "Integrated Data Viewer'' (IDV) in the classroom

    NASA Astrophysics Data System (ADS)

    Nogueira, R.; Cutrim, E. M.

    2006-06-01

    Conventionally, weather products utilized in synoptic meteorology reduce phenomena occurring in four dimensions to a 2-dimensional form. This constitutes a road-block for non-atmospheric-science majors who need to take meteorology as a non-mathematical course complementary to their major programs. This research examines the use of the Integrated Data Viewer (IDV) as a teaching tool, as it allows a 4-dimensional representation of weather products. IDV was tested in the teaching of synoptic meteorology, weather analysis, and weather map interpretation to non-science students in the laboratory sessions of an introductory meteorology class at Western Michigan University. Student exam scores were compared by laboratory teaching technique (traditional lab manual vs. IDV) for short- and long-term learning. Results of the statistical analysis show that the Fall 2004 students in the IDV-based lab sessions retained what they learned. In Spring 2005, however, exam scores did not reflect retention when IDV-based and manual-based lab scores were compared (short-term learning, i.e., an exam taken one week after the lab exercise). Testing long-term learning (seven weeks between the two exams in Spring 2005) showed no statistically significant difference between IDV-based and manual-based group scores, although the IDV group's average exam score was slightly higher. Statistical testing of the principal hypothesis in this study leads to the conclusion that the IDV-based method did not prove to be a better teaching tool than the traditional paper-based method. Future studies could potentially find significant differences in the effectiveness of the manual and IDV methods under more controlled conditions; that is, students in the control group should not be exposed to weather analysis using IDV during lecture.

  6. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  7. Redesigning a Statistical Concepts Course to Improve Retention, Satisfaction, and Success Rates of Non-Traditional Undergraduate Students

    ERIC Educational Resources Information Center

    Alpay, Nimet; Ratvasky, Pamela; Koehler, Natalya; LeVally, Carolyn; Washington, Tawana

    2017-01-01

    This case study investigated the impact of the Statistical Concepts course redesign on the retention, performance, and satisfaction of non-traditional undergraduate students. The redesign used a systematic approach and has been yielding positive impacts over 5 trimesters. Student attrition rates on average decreased by 12% and the number of…

  8. Course Format Effects on Learning Outcomes in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Sami, Fary

    2011-01-01

    The purpose of this study was to determine if course format significantly impacted student learning and course completion rates in an introductory statistics course taught at Harford Community College. In addition to the traditional lecture format, the College offers an online, and a hybrid (blend of traditional and online) version of this class.…

  9. Constrained Stochastic Extended Redundancy Analysis.

    PubMed

    DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco

    2015-06-01

    We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).

  10. Traditional agricultural practices and the sex ratio today

    PubMed Central

    2018-01-01

    We study the historical origins of cross-country differences in the male-to-female sex ratio. Our analysis focuses on the use of the plough in traditional agriculture. In societies that did not use the plough, women tended to participate in agriculture as actively as men. By contrast, in societies that used the plough, men specialized in agricultural work, owing to the physical strength needed to pull the plough or control the animal that pulls it. We hypothesize that this difference caused plough-using societies to value boys more than girls. Today, this belief is reflected in male-biased sex ratios, which arise through sex-selective abortion or infanticide, or through gender differences in access to family resources, which result in higher mortality rates for girls. Testing this hypothesis, we show that descendants of societies that traditionally practiced plough agriculture have higher average male-to-female sex ratios today. We find that this effect systematically increases in magnitude and statistical significance as one looks at older cohorts. Estimates using instrumental variables confirm our findings from multivariate OLS analysis. PMID:29338023
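    The baseline multivariate OLS specification named in the abstract can be sketched in a few lines. Everything below is synthetic: a hypothetical 0/1 plough indicator, two made-up controls, and an invented effect size, not the paper's data.

    ```python
    import numpy as np

    # Synthetic illustration of a multivariate OLS regression of the kind
    # described above: regress a male-to-female sex ratio on a 0/1 indicator
    # of traditional plough use plus two made-up control variables.
    rng = np.random.default_rng(0)
    n = 200
    plough = rng.integers(0, 2, n).astype(float)   # 1 = ancestors used the plough
    controls = rng.normal(size=(n, 2))             # hypothetical controls
    true_effect = 0.02                             # invented for the simulation
    sex_ratio = (1.05 + true_effect * plough
                 + controls @ np.array([0.005, -0.003])
                 + rng.normal(0, 0.01, n))

    X = np.column_stack([np.ones(n), plough, controls])  # intercept + regressors
    beta, *_ = np.linalg.lstsq(X, sex_ratio, rcond=None)
    print(beta[1])  # estimated plough coefficient, close to true_effect
    ```

    The instrumental-variables estimates the abstract mentions would replace this single regression with a two-stage procedure; the OLS sketch covers only the baseline specification.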

  11. Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Wang, W.; Vicente, P.

    2013-12-01

    Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How can we weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, and to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, researchers must often organize multiple related datasets into a coherent framework prior to analysis. Hierarchical organization permits an entire dataset to be stored in nested groups that reflect its intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze when operators are coded to automatically parallelize processing over isomorphic storage units, i.e., groups. The newest generation of the netCDF Operators (NCO) embodies this hierarchical approach, while still supporting traditional analysis approaches. We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.

  12. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important, since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that leave commitments inadequately funded. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysts using traditional time series methods, so that an emerging new phenomenon in the data is overlooked. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally yields more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated by strictly statistical forecasting techniques. In this case, the synergy clearly makes the whole of the portfolio of forecasts more accurate than the sum of its individual parts.

  13. Evaluation of virtual environment as a form of interactive resuscitation exam

    NASA Astrophysics Data System (ADS)

    Leszczyński, Piotr; Charuta, Anna; Kołodziejczak, Barbara; Roszak, Magdalena

    2017-10-01

    There is scientific evidence confirming the effectiveness of e-learning in resuscitation training; however, there is not enough research on modern examination techniques in this field. The aim of this pilot study is to compare exam results in the field of Advanced Life Support in a traditional (paper) and an interactive (computer) form, as well as to evaluate the satisfaction of the participants. A survey was conducted to evaluate the satisfaction of exam participants. Statistical analysis of the collected data was conducted at a significance level of α = 0.05 using STATISTICA v. 12. Final results of the traditional exam (67.5% ± 15.8%) differed significantly (p < 0.001) from the results of the interactive exam (53.3% ± 13.7%). However, comparing the number of students who did not pass the exam (passing point at 51%), no significant differences (p = 0.13) were observed between the two types of exams. Accurate feedback and the presence of well-prepared interactive questions may have influenced participants' satisfaction with the electronic test. The significant differences between the results of the traditional test and the one supported by a computer-based learning system suggest that interactive solutions make possible a more detailed verification of competence in resuscitation.
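    The two comparisons reported above (mean scores, then pass/fail counts at the 51% mark) can be reproduced in outline with SciPy. The scores below are synthetic draws matching the reported means and standard deviations; the group size of 80 is an assumption, not taken from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic sketch of the abstract's two tests. Group size (80) is assumed;
    # the means/SDs echo the reported 67.5 ± 15.8 and 53.3 ± 13.7.
    rng = np.random.default_rng(1)
    paper = rng.normal(67.5, 15.8, 80)         # traditional exam scores
    interactive = rng.normal(53.3, 13.7, 80)   # interactive exam scores

    # Welch's t-test on the mean scores.
    t, p_means = stats.ttest_ind(paper, interactive, equal_var=False)

    # Chi-squared test on pass/fail counts at a 51% passing point.
    table = np.array([[np.sum(paper >= 51), np.sum(paper < 51)],
                      [np.sum(interactive >= 51), np.sum(interactive < 51)]])
    chi2, p_pass, dof, expected = stats.chi2_contingency(table)
    print(p_means, p_pass)
    ```

    With a 14-point gap in means, the first test comes out highly significant while the pass/fail comparison can still be inconclusive, mirroring the pattern the abstract reports.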

  14. Enhancing predictive accuracy and reproducibility in clinical evaluation research: Commentary on the special section of the Journal of Evaluation in Clinical Practice.

    PubMed

    Bryant, Fred B

    2016-12-01

    This paper introduces a special section of the current issue of the Journal of Evaluation in Clinical Practice that includes a set of six empirical articles showcasing a versatile new machine-learning statistical method, known as optimal data (or discriminant) analysis (ODA), specifically designed to produce statistical models that maximize predictive accuracy. As this set of papers clearly illustrates, ODA offers numerous important advantages over traditional statistical methods, advantages that enhance the validity and reproducibility of statistical conclusions in empirical research. This issue of the journal also includes a review of a recently published book that provides a comprehensive introduction to the logic, theory, and application of ODA in empirical research. It is argued that researchers have much to gain by using ODA to analyze their data. © 2016 John Wiley & Sons, Ltd.

  15. Using data warehousing and OLAP in public health care.

    PubMed

    Hristovski, D; Rogac, M; Markota, M

    2000-01-01

    The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general, and then our own experience with these technologies gained during the implementation of a data warehouse of outpatient data at the national level. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches.

  16. Using data warehousing and OLAP in public health care.

    PubMed Central

    Hristovski, D.; Rogac, M.; Markota, M.

    2000-01-01

    The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general, and then our own experience with these technologies gained during the implementation of a data warehouse of outpatient data at the national level. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches. PMID:11079907

  17. Machine Learning-Augmented Propensity Score-Adjusted Multilevel Mixed Effects Panel Analysis of Hands-On Cooking and Nutrition Education versus Traditional Curriculum for Medical Students as Preventive Cardiology: Multisite Cohort Study of 3,248 Trainees over 5 Years

    PubMed Central

    Dart, Lyn; Vanbeber, Anne; Smith-Barbaro, Peggy; Costilla, Vanessa; Samuel, Charlotte; Terregino, Carol A.; Abali, Emine Ercikan; Dollinger, Beth; Baumgartner, Nicole; Kramer, Nicholas; Seelochan, Alex; Taher, Sabira; Deutchman, Mark; Evans, Meredith; Ellis, Robert B.; Oyola, Sonia; Maker-Clark, Geeta; Budnick, Isadore; Tran, David; DeValle, Nicole; Shepard, Rachel; Chow, Erika; Petrin, Christine; Razavi, Alexander; McGowan, Casey; Grant, Austin; Bird, Mackenzie; Carry, Connor; McGowan, Glynis; McCullough, Colleen; Berman, Casey M.; Dotson, Kerri; Sarris, Leah; Harlan, Timothy S.; Co-investigators, on behalf of the CHOP

    2018-01-01

    Background Cardiovascular disease (CVD) annually claims more lives and costs more dollars than any other disease globally amid widening health disparities, despite the known significant reductions in this burden by low cost dietary changes. The world's first medical school-based teaching kitchen therefore launched CHOP-Medical Students as the largest known multisite cohort study of hands-on cooking and nutrition education versus traditional curriculum for medical students. Methods This analysis provides a novel integration of artificial intelligence-based machine learning (ML) with causal inference statistics. 43 ML automated algorithms were tested, with the top performer compared to triply robust propensity score-adjusted multilevel mixed effects regression panel analysis of longitudinal data. Inverse-variance weighted fixed effects meta-analysis pooled the individual estimates for competencies. Results 3,248 unique medical trainees met study criteria from 20 medical schools nationally from August 1, 2012, to June 26, 2017, generating 4,026 completed validated surveys. ML analysis produced similar results to the causal inference statistics based on root mean squared error and accuracy. Hands-on cooking and nutrition education compared to traditional medical school curriculum significantly improved student competencies (OR 2.14, 95% CI 2.00–2.28, p < 0.001) and MedDiet adherence (OR 1.40, 95% CI 1.07–1.84, p = 0.015), while reducing trainees' soft drink consumption (OR 0.56, 95% CI 0.37–0.85, p = 0.007). Overall improved competencies were demonstrated from the initial study site through the scale-up of the intervention to 10 sites nationally (p < 0.001). Discussion This study provides the first machine learning-augmented causal inference analysis of a multisite cohort showing hands-on cooking and nutrition education for medical trainees improves their competencies counseling patients on nutrition, while improving students' own diets. 
This study suggests that the public health and medical sectors can unite population health management and precision medicine for a sustainable model of next-generation health systems providing effective, equitable, accessible care beginning with reversing the CVD epidemic. PMID:29850526

  18. Machine Learning-Augmented Propensity Score-Adjusted Multilevel Mixed Effects Panel Analysis of Hands-On Cooking and Nutrition Education versus Traditional Curriculum for Medical Students as Preventive Cardiology: Multisite Cohort Study of 3,248 Trainees over 5 Years.

    PubMed

    Monlezun, Dominique J; Dart, Lyn; Vanbeber, Anne; Smith-Barbaro, Peggy; Costilla, Vanessa; Samuel, Charlotte; Terregino, Carol A; Abali, Emine Ercikan; Dollinger, Beth; Baumgartner, Nicole; Kramer, Nicholas; Seelochan, Alex; Taher, Sabira; Deutchman, Mark; Evans, Meredith; Ellis, Robert B; Oyola, Sonia; Maker-Clark, Geeta; Dreibelbis, Tomi; Budnick, Isadore; Tran, David; DeValle, Nicole; Shepard, Rachel; Chow, Erika; Petrin, Christine; Razavi, Alexander; McGowan, Casey; Grant, Austin; Bird, Mackenzie; Carry, Connor; McGowan, Glynis; McCullough, Colleen; Berman, Casey M; Dotson, Kerri; Niu, Tianhua; Sarris, Leah; Harlan, Timothy S; Co-Investigators, On Behalf Of The Chop

    2018-01-01

    Cardiovascular disease (CVD) annually claims more lives and costs more dollars than any other disease globally amid widening health disparities, despite the known significant reductions in this burden by low cost dietary changes. The world's first medical school-based teaching kitchen therefore launched CHOP-Medical Students as the largest known multisite cohort study of hands-on cooking and nutrition education versus traditional curriculum for medical students. This analysis provides a novel integration of artificial intelligence-based machine learning (ML) with causal inference statistics. 43 ML automated algorithms were tested, with the top performer compared to triply robust propensity score-adjusted multilevel mixed effects regression panel analysis of longitudinal data. Inverse-variance weighted fixed effects meta-analysis pooled the individual estimates for competencies. 3,248 unique medical trainees met study criteria from 20 medical schools nationally from August 1, 2012, to June 26, 2017, generating 4,026 completed validated surveys. ML analysis produced similar results to the causal inference statistics based on root mean squared error and accuracy. Hands-on cooking and nutrition education compared to traditional medical school curriculum significantly improved student competencies (OR 2.14, 95% CI 2.00-2.28, p < 0.001) and MedDiet adherence (OR 1.40, 95% CI 1.07-1.84, p = 0.015), while reducing trainees' soft drink consumption (OR 0.56, 95% CI 0.37-0.85, p = 0.007). Overall improved competencies were demonstrated from the initial study site through the scale-up of the intervention to 10 sites nationally ( p < 0.001). This study provides the first machine learning-augmented causal inference analysis of a multisite cohort showing hands-on cooking and nutrition education for medical trainees improves their competencies counseling patients on nutrition, while improving students' own diets. 
This study suggests that the public health and medical sectors can unite population health management and precision medicine for a sustainable model of next-generation health systems providing effective, equitable, accessible care beginning with reversing the CVD epidemic.

  19. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This new alternate approach to data processing, for analyses that traditionally employed grid-based counting methods, is necessary because it removes a user-imposed coordinate system that not only limits an analysis but may also introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest neighbour identification, improving the measurements and the statistics obtained and allowing quantitative analysis of smaller datasets and of datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: (i) using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; (ii) 3D data visualisation of block composition and nearest neighbour anisotropy; and (iii) using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
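    The coordinate-free block idea above can be sketched with a k-d tree: each detected atom's first k nearest neighbours define a spherical k-atom block whose solute fraction is recorded. The "atoms" below are a synthetic random solid solution, and the simple variance comparison at the end (ignoring block overlap) only gestures at the paper's z-statistics.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    # Synthetic random solid solution: positions and labels are invented.
    rng = np.random.default_rng(2)
    n_atoms, k = 5000, 20
    pos = rng.uniform(0, 20, size=(n_atoms, 3))   # positions (arbitrary units)
    is_solute = rng.random(n_atoms) < 0.10        # 10% solute, randomly placed

    tree = cKDTree(pos)
    _, idx = tree.query(pos, k=k + 1)             # +1: nearest "neighbour" is the atom itself
    block_comp = is_solute[idx[:, 1:]].mean(axis=1)   # solute fraction per k-atom block

    # Compare the spread of block compositions to the binomial expectation for
    # a random solution (block overlap between neighbouring atoms is ignored).
    p = is_solute.mean()
    expected_var = p * (1 - p) / k
    excess = block_comp.var() / expected_var - 1.0    # near 0 for a random solution
    print(block_comp.mean(), excess)
    ```

    Clustering or ordering in real data would push the observed block-composition variance above or below the binomial expectation, which is what the grid-free statistics are designed to detect.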

  20. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), in which we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), while at the same time allowing almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case, amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are advanced.
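    As a minimal illustration of why wavelets localize frequency in time (the advantage over Fourier claimed above), here is a numpy-only continuous wavelet transform with a Ricker wavelet applied to a synthetic signal; it is a sketch, not the authors' pipeline.

    ```python
    import numpy as np

    # Numpy-only sketch: continuous wavelet transform with a Ricker ("Mexican
    # hat") wavelet. The synthetic signal switches frequency halfway through,
    # which a single global Fourier spectrum cannot localize in time.
    def ricker(points, a):
        t = np.arange(points) - (points - 1) / 2.0
        norm = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
        return norm * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

    def cwt(signal, widths):
        # One row per scale: the signal convolved with a wavelet of that width.
        return np.array([np.convolve(signal,
                                     ricker(min(10 * w, len(signal)), w),
                                     mode="same")
                         for w in widths])

    t = np.linspace(0.0, 1.0, 1000)
    sig = np.where(t < 0.5, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 40 * t))
    coefs = cwt(sig, widths=range(1, 31))

    # The scale carrying the most energy differs between the two halves:
    # the low-frequency half peaks at a larger wavelet width.
    half1 = (coefs[:, :500] ** 2).sum(axis=1)
    half2 = (coefs[:, 500:] ** 2).sum(axis=1)
    print(half1.argmax(), half2.argmax())
    ```

    The dominant scale tracks the instantaneous frequency in each half, which is exactly the "accompaniment of both amplitude and frequency as time goes by" the abstract refers to.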

  1. The Use of Session RPE to Monitor the Intensity of Weight Training in Older Women: Acute Responses to Eccentric, Concentric, and Dynamic Exercises

    PubMed Central

    Ferreira, Sandro S.; Krinski, Kleverton; Alves, Ragami C.; Benites, Mariana L.; Redkva, Paulo E.; Elsangedy, Hassan M.; Buzzachera, Cosme F.; Souza-Junior, Tácito P.; da Silva, Sergio G.

    2014-01-01

    The rating of perceived exertion (RPE) is the ability to detect and interpret organic sensations while performing exercise. This method has been used to measure the level of effort that is felt during weight training at a given intensity. The purpose of this investigation was to compare session RPE values with those of traditional RPE measurements for different weight-training muscle actions, performed together or separately. Fourteen women with no former weight-training experience were recruited for the investigation. All participants completed five sessions of exercise: familiarization, maximum force, concentric-only (CONC-only), eccentric-only (ECC-only), and dynamic (DYN = CONC + ECC). The traditional RPE method was measured after each series of exercises, and the session RPE was measured 30 min after the end of the training session. The statistical analyses used were the paired t-test, one-way analysis of variance, and repeated measures analysis of variance. No significant differences between traditional RPE and session RPE were found for the DYN, CONC, or ECC exercises. This investigation demonstrated that session RPE is similar to traditional RPE for weight training involving concentric, eccentric, or dynamic muscle exercises, and that it can be used to prescribe and monitor weight-training sessions in older subjects. PMID:24834354
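    The core comparison (traditional RPE versus session RPE, paired within subjects) reduces to a paired t-test. The ratings below are invented for n = 14, with offsets constructed to cancel, so the sketch illustrates the "no significant difference" outcome rather than reproducing the study's data.

    ```python
    import numpy as np
    from scipy import stats

    # Paired t-test sketch for n = 14 subjects. Ratings are invented; the
    # session-RPE values are the traditional values plus offsets that cancel,
    # so there is no systematic difference between the two methods here.
    traditional_rpe = np.array([5.0, 6.0, 4.5, 5.5, 6.5, 5.0, 4.0,
                                6.0, 5.5, 5.0, 4.5, 6.5, 5.0, 5.5])
    offsets = np.array([0.4, -0.4, 0.3, -0.3, 0.2, -0.2, 0.1,
                        -0.1, 0.5, -0.5, 0.25, -0.25, 0.0, 0.0])
    session_rpe = traditional_rpe + offsets

    t, p = stats.ttest_rel(traditional_rpe, session_rpe)
    print(t, p)  # t near 0, p near 1: no significant difference
    ```

    The repeated-measures ANOVA the abstract also names would extend this to all three muscle actions at once; the paired test shows the per-action comparison.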

  2. Influence of Geographical Origin and Flour Type on Diversity of Lactic Acid Bacteria in Traditional Belgian Sourdoughs

    PubMed Central

    Scheirlinck, Ilse; Van der Meulen, Roel; Van Schoor, Ann; Vancanneyt, Marc; De Vuyst, Luc; Vandamme, Peter; Huys, Geert

    2007-01-01

    A culture-based approach was used to investigate the diversity of lactic acid bacteria (LAB) in Belgian traditional sourdoughs and to assess the influence of flour type, bakery environment, geographical origin, and technological characteristics on the taxonomic composition of these LAB communities. For this purpose, a total of 714 LAB from 21 sourdoughs sampled at 11 artisan bakeries throughout Belgium were subjected to a polyphasic identification approach. The microbial composition of the traditional sourdoughs was characterized by bacteriological culture in combination with genotypic identification methods, including repetitive element sequence-based PCR fingerprinting and phenylalanyl-tRNA synthase (pheS) gene sequence analysis. LAB from Belgian sourdoughs belonged to the genera Lactobacillus, Pediococcus, Leuconostoc, Weissella, and Enterococcus, with the heterofermentative species Lactobacillus paralimentarius, Lactobacillus sanfranciscensis, Lactobacillus plantarum, and Lactobacillus pontis as the most frequently isolated taxa. Statistical analysis of the identification data indicated that the microbial composition of the sourdoughs is mainly affected by the bakery environment rather than the flour type (wheat, rye, spelt, or a mixture of these) used. In conclusion, the polyphasic approach, based on rapid genotypic screening and high-resolution, sequence-dependent identification, proved to be a powerful tool for studying the LAB diversity in traditional fermented foods such as sourdough. PMID:17675431

  3. Influence of geographical origin and flour type on diversity of lactic acid bacteria in traditional Belgian sourdoughs.

    PubMed

    Scheirlinck, Ilse; Van der Meulen, Roel; Van Schoor, Ann; Vancanneyt, Marc; De Vuyst, Luc; Vandamme, Peter; Huys, Geert

    2007-10-01

    A culture-based approach was used to investigate the diversity of lactic acid bacteria (LAB) in Belgian traditional sourdoughs and to assess the influence of flour type, bakery environment, geographical origin, and technological characteristics on the taxonomic composition of these LAB communities. For this purpose, a total of 714 LAB from 21 sourdoughs sampled at 11 artisan bakeries throughout Belgium were subjected to a polyphasic identification approach. The microbial composition of the traditional sourdoughs was characterized by bacteriological culture in combination with genotypic identification methods, including repetitive element sequence-based PCR fingerprinting and phenylalanyl-tRNA synthase (pheS) gene sequence analysis. LAB from Belgian sourdoughs belonged to the genera Lactobacillus, Pediococcus, Leuconostoc, Weissella, and Enterococcus, with the heterofermentative species Lactobacillus paralimentarius, Lactobacillus sanfranciscensis, Lactobacillus plantarum, and Lactobacillus pontis as the most frequently isolated taxa. Statistical analysis of the identification data indicated that the microbial composition of the sourdoughs is mainly affected by the bakery environment rather than the flour type (wheat, rye, spelt, or a mixture of these) used. In conclusion, the polyphasic approach, based on rapid genotypic screening and high-resolution, sequence-dependent identification, proved to be a powerful tool for studying the LAB diversity in traditional fermented foods such as sourdough.

  4. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.

    PubMed

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-10-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
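    A GLCM and the Cluster Shade statistic highlighted above can be computed with plain numpy. The tiny image and the single (0, 1) pixel offset below are illustrative choices, not the paper's settings.

    ```python
    import numpy as np

    # Gray Level Co-occurrence Matrix for a horizontal offset of one pixel,
    # plus the third-order Cluster Shade statistic. Image is illustrative.
    def glcm(img, levels):
        """Symmetric, normalised co-occurrence matrix for offset (0, 1)."""
        P = np.zeros((levels, levels))
        left, right = img[:, :-1].ravel(), img[:, 1:].ravel()
        np.add.at(P, (left, right), 1)   # count each horizontal pair
        P += P.T                          # make symmetric
        return P / P.sum()

    def cluster_shade(P):
        i, j = np.indices(P.shape)
        mu_i = (i * P).sum()
        mu_j = (j * P).sum()
        return ((i + j - mu_i - mu_j) ** 3 * P).sum()

    img = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [0, 2, 2, 2],
                    [2, 2, 3, 3]])
    P = glcm(img, levels=4)
    print(cluster_shade(P))
    ```

    Cluster Shade is a third-moment texture measure: it is sensitive to asymmetry in the co-occurrence distribution, which is why the study links it to trabecular microarchitecture rather than to mean density alone.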

  5. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES

    PubMed Central

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-01-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512

  6. Possibility of reconstruction of dental plaster cast from 3D digital study models

    PubMed Central

    2013-01-01

    Objectives To compare traditional plaster casts, digital models and 3D printed copies of dental plaster casts based on various criteria; to determine whether 3D printed copies obtained using the open source RepRap system can replace traditional plaster casts in dental practice; and to compare and contrast the qualities of two possible 3D printing options: the open source RepRap system and commercially available 3D printing. Design and settings A method comparison study on 10 dental plaster casts from the Orthodontic Department, Department of Stomatology, 2nd Medical Faculty, Charles University Prague, Czech Republic. Material and methods Each of the 10 plaster casts was scanned with the inEos Blue scanner and then printed on the RepRap 3D printer [10 models] and the ProJet HD3000 3D printer [1 model]. Linear measurements between selected points on the dental arches of the upper and lower jaws on the plaster casts and their 3D copies were recorded and statistically analyzed. Results 3D printed copies have many advantages over traditional plaster casts. The precision and accuracy of the RepRap 3D printed copies of the plaster casts were confirmed by the statistical analysis. Although commercially available 3D printing can reproduce more detail than the RepRap system, it is expensive, and for clinical use it can be replaced by the cheaper RepRap prints. Conclusions Scanning traditional plaster casts to obtain a digital model offers a pragmatic approach. The scans can subsequently be used as a template to print the casts as required. 3D printers can replace traditional plaster casts primarily owing to their accuracy and price. PMID:23721330

  7. Esthetic perception of orthodontic appliances by Brazilian children and adolescents.

    PubMed

    Kuhlman, Deise Caldas; Lima, Tatiana Araújo de; Duplat, Candice Belchior; Capelli, Jonas

    2016-01-01

    The objective of the present study was to understand how children and adolescents perceive the esthetic attractiveness of a variety of orthodontic appliances. It also analyzed preferences according to patients' age, sex and socioeconomic status. A photograph album consisting of eight photographs of different orthodontic appliances and clear tray aligners, placed in a consenting adult with a pleasing smile, was used. A sample of children and adolescents aged between 8 and 17 years old (n = 276) was asked to rate each image for its attractiveness on a visual analog scale. Comparisons of the appliances' attractiveness were performed by means of nonparametric statistics, using Friedman's test followed by Dunn's multiple comparison post-hoc test. Correlation between appliances and individuals' socioeconomic status, age, sex, and esthetic perception was assessed by means of Spearman's correlation analysis. Attractiveness ratings of the orthodontic appliances varied nonsignificantly for children in the following hierarchy: traditional metallic brackets with green elastomeric ligatures > traditional metallic brackets with gray elastomeric ligatures > sapphire esthetic brackets; and for adolescents, as follows: sapphire esthetic brackets > clear aligner without attachments > traditional metallic brackets with green elastomeric ligatures. The correlation between individuals' socioeconomic status and esthetic perception of a given appliance was negative and statistically significant for appliances such as the golden orthodontic brackets and the traditional metallic brackets with green elastomeric ligatures. Metal appliances were considered very attractive, whereas aligners were classified as less attractive by children and adolescents. The correlation between esthetic perception and socioeconomic status revealed that individuals with a higher socioeconomic level judged esthetics as the most attractive attribute. 
For those with higher economic status, golden orthodontic brackets and traditional metallic brackets with green elastomeric ligatures were assessed as the worst esthetic option.

  8. Esthetic perception of orthodontic appliances by Brazilian children and adolescents

    PubMed Central

    Kuhlman, Deise Caldas; de Lima, Tatiana Araújo; Duplat, Candice Belchior; Capelli, Jonas

    2016-01-01

    Objective: The objective of the present study was to understand how children and adolescents perceive the esthetic attractiveness of a variety of orthodontic appliances. It also analyzed preferences according to patients' age, sex and socioeconomic status. Methods: A photograph album consisting of eight photographs of different orthodontic appliances and clear tray aligners placed in a consenting adult with a pleasing smile was used. A sample of 276 children and adolescents aged 8 to 17 years was asked to rate each image for its attractiveness on a visual analog scale. Comparisons of the appliances' attractiveness were performed by means of nonparametric statistics with Friedman's test followed by Dunn's multiple comparison post-hoc test. Correlation between appliances and individuals' socioeconomic status, age, sex, and esthetic perception was assessed by means of Spearman's correlation analysis. Results: Attractiveness ratings of orthodontic appliances varied nonsignificantly for children in the following hierarchy: traditional metallic brackets with green elastomeric ligatures > traditional metallic brackets with gray elastomeric ligatures > sapphire esthetic brackets; and for adolescents, as follows: sapphire esthetic brackets > clear aligner without attachments > traditional metallic brackets with green elastomeric ligatures. The correlation between individuals' socioeconomic status and esthetic perception of a given appliance was negative and statistically significant for appliances such as the golden orthodontic brackets and traditional metallic brackets with green elastomeric ligatures. Conclusion: Metal appliances were considered very attractive, whereas aligners were classified as less attractive by children and adolescents. The correlation between esthetic perception and socioeconomic status revealed that individuals with a higher socioeconomic level judged esthetics as the most important attribute: for those with higher economic status, golden orthodontic brackets and traditional metallic brackets with green elastomeric ligatures were assessed as the worst esthetic option. PMID:27901230
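
    The analysis pipeline named in this abstract (Friedman's omnibus test followed by a post-hoc procedure, plus Spearman correlation) can be sketched with SciPy; the appliance labels below come from the study, but every rating and SES score is invented for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical VAS attractiveness ratings (0-100) of three appliances,
# each rated by the same 10 raters (repeated measures, so Friedman applies).
metallic_green = np.array([72, 80, 65, 90, 77, 85, 70, 88, 75, 82], float)
metallic_gray = metallic_green - 10
sapphire = metallic_green - 20

# Omnibus nonparametric comparison of the three related samples
stat, p = stats.friedmanchisquare(metallic_green, metallic_gray, sapphire)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.2g}")

# Spearman correlation between a hypothetical SES score and the ratings
ses = np.array([5, 4, 2, 5, 3, 4, 1, 5, 2, 3])
rho, p_rho = stats.spearmanr(ses, metallic_green)
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.2f})")
```

    Dunn's multiple-comparison test used in the study is not in SciPy itself; the third-party scikit-posthocs package provides `posthoc_dunn` for that step.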

  9. Data on the application of Functional Data Analysis in food fermentations.

    PubMed

    Ruiz-Bellido, M A; Romero-Gil, V; García-García, P; Rodríguez-Gómez, F; Arroyo-López, F N; Garrido-Fernández, A

    2016-12-01

    This article refers to the paper "Assessment of table olive fermentation by functional data analysis" (Ruiz-Bellido et al., 2016) [1]. The dataset includes pH, titratable acidity, yeast count and area values obtained during the 380-day fermentation process of Aloreña de Málaga olives subjected to five different fermentation systems: i) control of acidified cured olives, ii) highly acidified cured olives, iii) intermediate acidified cured olives, iv) control of traditional cracked olives, and v) traditional olives cracked after 72 h of exposure to air. Many of the tables and figures in that paper were derived by applying Functional Data Analysis to the raw data with a routine executed under R: treatments were compared by transforming the raw data into smooth curves and applying a battery of functional statistical tools (pointwise estimation of means and standard deviations, maxima, minima, first and second derivatives, functional regression, and functional F- and t-tests).
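
    The functional approach summarized above (smoothing raw trajectories, then computing pointwise statistics) was implemented by the authors in R; a minimal NumPy/SciPy analogue on invented pH trajectories might look like this, with all treatment labels and numbers purely illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
days = np.linspace(0, 380, 40)              # sampling times over the 380-day run

def simulate_ph(n, drop):
    """Hypothetical pH trajectories: exponential decay toward a plateau."""
    curve = 6.0 - drop * (1 - np.exp(-days / 60.0))
    return curve + rng.normal(0, 0.05, size=(n, days.size))

treat_a = simulate_ph(5, drop=2.0)          # e.g. an acidified treatment
treat_b = simulate_ph(5, drop=1.2)          # e.g. a traditional treatment

def smooth(curves, degree=5):
    """Least-squares polynomial smoothing of each raw curve."""
    return np.stack([np.polyval(np.polyfit(days, c, degree), days) for c in curves])

sa, sb = smooth(treat_a), smooth(treat_b)

# Functional pointwise estimates: mean, standard deviation, and t-tests
mean_a, sd_a = sa.mean(axis=0), sa.std(axis=0, ddof=1)
t_stat, p_point = stats.ttest_ind(sa, sb, axis=0)
print(f"time points with p < 0.05: {(p_point < 0.05).sum()} of {days.size}")
```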

  10. Principles of Rapid Acquisition and Systems Engineering

    DTIC Science & Technology

    2012-06-14

    The Systems Engineering Research Center (SERC) research team interviewed over 30 organizations from across the DoD which focus on less traditional...enthusiasm • Lt Col John Elshaw, for his guidance on our statistical analysis • Our sponsors, the SERC, SAF/AQR, and the AF Center for Systems...experienced staff of 20-50 people" (Defense Science Board, 2011) Research Focus The Systems Engineering Research Center (SERC) has been charged with

  11. [Application of the nested case-control study to the safety evaluation of post-marketing traditional Chinese medicine injections].

    PubMed

    Xiao, Ying; Zhao, Yubin; Xie, Yanming

    2011-10-01

    The nested case-control study design (or the case-control in a cohort study) is described here as a new study design for the safety evaluation of post-marketing traditional Chinese medicine injections. In a nested case-control study, cases of a disease that occur in a defined cohort are identified and, for each, a specified number of matched controls is selected from among those in the cohort who have not developed the disease by the time of disease occurrence in the case. For many research questions, the nested case-control design offers impressive reductions in the cost and effort of data collection and analysis compared with the full cohort approach, with relatively minor loss in statistical efficiency. The design is particularly advantageous for studies of the safety of post-marketing traditional Chinese medicine injections. Several examples of the application of nested case-control studies are given.
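
    The control-selection rule described above (for each case, sample matched controls from cohort members still event-free at the case's event time, i.e. incidence-density sampling) can be sketched in a few lines; the cohort below is entirely hypothetical:

```python
import random

def nested_case_control(cohort, m=4, seed=1):
    """Incidence-density sampling: for each case, draw m controls at random
    from cohort members still event-free and under follow-up at the case's
    event time.

    cohort: list of (subject_id, event_time_or_None, followup_end) tuples.
    Returns a list of (case_id, [control_ids]) matched sets.
    """
    rng = random.Random(seed)
    cases = sorted((t, sid) for sid, t, end in cohort if t is not None)
    matched_sets = []
    for t, case_id in cases:
        # Risk set: under follow-up at time t and not yet a case
        risk_set = [sid for sid, et, end in cohort
                    if sid != case_id and end >= t and (et is None or et > t)]
        matched_sets.append((case_id, rng.sample(risk_set, min(m, len(risk_set)))))
    return matched_sets

# Hypothetical surveillance cohort: 100 event-free subjects followed 365 days,
# plus two subjects with adverse events at days 30 and 200.
cohort = [(i, None, 365) for i in range(100)] + [(100, 30, 30), (101, 200, 200)]
matched = nested_case_control(cohort)
print(matched)
```

    Note that a subject who becomes a case later (day 200) may legitimately serve as a control for an earlier case (day 30); that is a standard property of incidence-density sampling.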

  12. Correlation between musical aptitude and learning foreign languages: an epidemiological study in secondary school Italian students.

    PubMed

    Picciotti, P M; Bussu, F; Calò, L; Gallus, R; Scarano, E; DI Cintio, G; Cassarà, F; D'Alatri, L

    2018-02-01

    The aim of this study was to assess whether a correlation exists between language-learning skills and musical aptitude, through the analysis of school grades in foreign languages and music. We enrolled 502 students aged 10-14 years from an Italian secondary school, attending either traditional courses (two scheduled hours of music classes per week) or special courses (six hours per week). For statistical analysis, we considered grades in English, French and Music. Our results showed a significant correlation between grades in the two foreign languages and in music, both in the traditional and in the special courses, with better results in French in the special courses. These results are discussed and interpreted through the literature on the neuroanatomical and physiological mechanisms of foreign-language learning and music perception. Copyright © 2018 Società Italiana di Otorinolaringologia e Chirurgia Cervico-Facciale, Rome, Italy.

  13. Power in randomized group comparisons: the value of adding a single intermediate time point to a traditional pretest-posttest design.

    PubMed

    Venter, Anre; Maxwell, Scott E; Bolig, Erika

    2002-06-01

    Adding a pretest as a covariate to a randomized posttest-only design increases statistical power, as does the addition of intermediate time points to a randomized pretest-posttest design. Although typically 5 waves of data are required in this instance to produce meaningful gains in power, a 3-wave intensive design allows the evaluation of the straight-line growth model and may reduce the effect of missing data. The authors identify the statistically most powerful method of data analysis in the 3-wave intensive design. If straight-line growth is assumed, the pretest-posttest slope must assume fairly extreme values for the intermediate time point to increase power beyond the standard analysis of covariance on the posttest with the pretest as covariate, ignoring the intermediate time point.
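
    The power gain from adding a pretest covariate, which this abstract takes as its starting point, follows from analysis of covariance reducing the error variance by the squared pretest-posttest correlation, which inflates the standardized effect to d / sqrt(1 - rho^2). A normal-approximation sketch (effect size, correlation, and sample size all invented):

```python
import numpy as np
from scipy import stats

def power_two_group(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test for effect size d."""
    se = np.sqrt(2.0 / n_per_group)
    z_crit = stats.norm.isf(alpha / 2)
    return stats.norm.sf(z_crit - d / se) + stats.norm.cdf(-z_crit - d / se)

d, rho, n = 0.4, 0.6, 50   # hypothetical effect size, pre-post correlation, group size
p_posttest = power_two_group(d, n)
# ANCOVA with the pretest as covariate removes rho**2 of the error variance,
# which inflates the standardized effect to d / sqrt(1 - rho**2).
p_ancova = power_two_group(d / np.sqrt(1 - rho**2), n)
print(f"posttest-only power: {p_posttest:.2f}, ANCOVA power: {p_ancova:.2f}")
```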

  14. Learning style-based teaching harvests a superior comprehension of respiratory physiology.

    PubMed

    Anbarasi, M; Rajkumar, G; Krishnakumar, S; Rajendran, P; Venkatesan, R; Dinesh, T; Mohan, J; Venkidusamy, S

    2015-09-01

    Students entering medical college generally show vast diversity in their school education. It becomes the responsibility of teachers to motivate students and meet the needs of all diversities. One such measure is teaching students in their own preferred learning style. The present study aimed to incorporate a learning style-based teaching-learning program for medical students and to reveal its significance and utility. Learning styles of students were assessed online using the visual-auditory-kinesthetic (VAK) learning style self-assessment questionnaire. When respiratory physiology was taught, students were divided into three groups, namely, visual (n = 34), auditory (n = 44), and kinesthetic (n = 28), based on their learning style. A fourth group (the traditional group; n = 40) was formed by choosing students randomly from the above three groups. The visual, auditory, and kinesthetic groups were taught following the appropriate teaching-learning strategies, whereas the traditional group was taught via the routine didactic lecture method. The effectiveness of this intervention was evaluated by a pretest and two posttests: posttest 1 immediately after the intervention and posttest 2 after a month. In posttest 1, one-way ANOVA showed a statistically significant difference (P = 0.005), and post hoc analysis showed significance between the kinesthetic group and the traditional group (P = 0.002). One-way ANOVA also showed a significant difference in posttest 2 scores (P < 0.0001); post hoc analysis showed significance between each of the three learning style-based groups and the traditional group [visual vs. traditional (P = 0.002), auditory vs. traditional (P = 0.03), and kinesthetic vs. traditional (P = 0.001)]. This study emphasizes that teaching methods tailored to students' style of learning definitely improve their understanding, performance, and retrieval of the subject. Copyright © 2015 The American Physiological Society.
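
    The omnibus-plus-post-hoc analysis reported above can be reproduced in outline with SciPy; the scores below are invented, and Bonferroni correction stands in for the study's (unspecified) post-hoc procedure:

```python
from itertools import combinations
from scipy import stats

# Hypothetical posttest scores (out of 20) for the four teaching groups
groups = {
    "visual":      [14, 15, 13, 16, 15, 14, 17, 13],
    "auditory":    [14, 13, 15, 14, 16, 13, 15, 14],
    "kinesthetic": [17, 18, 16, 19, 17, 18, 16, 17],
    "traditional": [12, 11, 13, 12, 11, 13, 12, 10],
}

# Omnibus one-way ANOVA across the four groups
f_stat, p_omnibus = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.1f}, p = {p_omnibus:.2g}")

# Bonferroni-corrected pairwise t-tests as a simple post-hoc procedure
pairs = list(combinations(groups, 2))
for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b])
    print(f"{a} vs {b}: corrected p = {min(1.0, p * len(pairs)):.3g}")
```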

  15. [Advantages and problems of traditional Chinese medicine in treatment of acute pharyngitis].

    PubMed

    Zhang, Xia; Xie, Yan-Ming; Li, Guang-Xi; Gao, Yang; Zhao, Yuan-Chen; Tang, Jing-Jing; Yao, Xiao-Yan; Li, Meng

    2017-10-01

    This paper systematically reviews the relevant literature published in China and abroad in recent years. The China National Knowledge Infrastructure (CNKI) was searched for literature on acute pharyngitis treated with traditional Chinese medicine from January 1, 2006 to December 31, 2016, and a bibliometric method was employed for statistical analysis. A total of 493 papers were preliminarily retrieved; according to the inclusion and exclusion criteria, 182 eligible articles were selected. According to the evaluation and analysis of the literature, the Guidelines for Clinical Research of New Drugs are currently used as the common standard for the diagnosis and treatment of acute pharyngitis; Chinese patent medicines are the main form of traditional Chinese medicine for treating this disease; decoctions for the disease commonly include Lonicerae Japonicae Flos, Scutellariae Radix, Platycodonis Radix, Forsythiae Fructus, Glycyrrhizae Radix et Rhizoma, Scrophulariae Radix, Isatidis Radix, and Ophiopogonis Radix; and bloodletting puncture is the common external therapy. Traditional Chinese medicine and Western medicine have their own characteristics in the treatment of this disease. Western medicines for acute pharyngitis are mainly antiviral, antibiotic and glucocorticoid drugs, whose disadvantages are toxicity, side effects, drug resistance and superinfection. Traditional Chinese medicine doctors have rich experience in treating the disease, which is characterized by treatment determined by syndrome differentiation, safe and reliable medication, significant curative effect, low drug resistance, a wide variety of dosage forms, convenience of carrying and taking, low price, and few toxic or side effects. It is an arduous and significant task to explore traditional Chinese medicine and to study and develop new, effective drugs. Copyright© by the Chinese Pharmaceutical Association.

  16. An Analysis of High School Students' Performance on Five Integrated Science Process Skills

    NASA Astrophysics Data System (ADS)

    Beaumont-Walters, Yvonne; Soyibo, Kola

    2001-02-01

    This study determined Jamaican high school students' level of performance on five integrated science process skills and whether there were statistically significant differences in their performance linked to their gender, grade level, school location, school type, student type and socio-economic background (SEB). The 305 subjects comprised 133 males, 172 females, 146 ninth graders, 159 10th graders, 150 traditional and 155 comprehensive high school students, 164 students from the Reform of Secondary Education (ROSE) project and 141 non-ROSE students, 166 urban and 139 rural students, and 110 students from a high SEB and 195 from a low SEB. Data were collected with an integrated science process skills test constructed by the authors. The results indicated that the subjects' mean score was low and unsatisfactory; their performance in decreasing order was: interpreting data, recording data, generalising, formulating hypotheses and identifying variables. There were statistically significant differences in their performance based on grade level, school type, student type, and SEB, in favour of the 10th graders, traditional high school students, ROSE students and students from a high SEB. There was a positive, statistically significant and fairly strong relationship between their performance and school type, but weak relationships between performance and student type, grade level and SEB.

  17. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses, we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods, and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses a Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. Posteriors for these relations were estimated within the Bayesian statistical framework in order to determine whether a variable is directly relevant or whether its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43, 95% CI: 1.2–1.8; p = 3×10⁻⁴). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and in human asthmatics. In the BN-BMLA analysis, altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step, a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035

  18. Study on the relation between tissues pathologies and traditional chinese medicine syndromes in knee osteoarthritis: Medical image diagnostics by preoperative X-ray and surgical arthroscopy.

    PubMed

    Tian, Xiangdong; Zhu, Guangyu; Wang, Jian; Wang, Qingfu; Guan, Lei; Tan, Yetong; Xue, Zhipeng; Qin, Lina; Zhang, Jing

    2016-04-07

    This study aimed to investigate whether the integration of traditional Chinese medicine and modern medicine offers an advantage in improving the diagnosis and treatment of knee osteoarthritis. Ninety patients with knee osteoarthritis were selected from the Department of Minimally Invasive Joint Surgery of the Third Affiliated Hospital of Beijing University of Chinese Medicine between June 2013 and June 2015. They were divided into 3 groups of 30 cases each in accordance with traditional Chinese medicine syndrome differentiation. The patients underwent arthroscopic surgery, and the arthroscopic findings were grouped according to shared characteristics within each syndrome group and distinct differences between groups. Based on the arthroscopic findings, statistical analysis was performed to examine the relation between the appearance of knee osteoarthritis under arthroscopy and traditional Chinese medicine syndromes. The three traditional Chinese medicine syndromes corresponded to distinct arthroscopic findings: synovial proliferation was seen mostly in the syndrome of stagnation of blood stasis; slight damage of the knee-joint cartilage was seen in the syndrome of yang deficiency and cold stagnation, and severe damage in the syndrome of kidney-marrow deficiency. Different degrees of tissue damage at the knee thus showed different pathological expressions, and the knees could be categorized according to syndrome. For knee osteoarthritis, different syndromes of traditional Chinese medicine present different pathological changes of the tissues at the knee joint under arthroscopy, which provides an objective basis for the diagnosis of this medical condition.

  19. Quality assessment of raw and processed Arctium lappa L. through multicomponent quantification, chromatographic fingerprint, and related chemometric analysis.

    PubMed

    Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang

    2015-05-01

    In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for the discrimination between raw and processed herbs. The dried fruit of Arctium lappa L. and their processed products are widely used in traditional Chinese medicine, yet their therapeutic effects are different. In this study, a novel strategy using high-performance liquid chromatography and diode array detection coupled with multivariate statistical analysis to rapidly explore raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect the raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Traditional and Atypical Presentations of Anxiety in Youth with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Kerns, Connor Morrow; Kendall, Philip C.; Berry, Leandra; Souders, Margaret C.; Franklin, Martin E.; Schultz, Robert T.; Miller, Judith; Herrington, John

    2014-01-01

    We assessed anxiety consistent (i.e., "traditional") and inconsistent (i.e., "atypical") with Diagnostic and Statistical Manual (DSM) definitions in autism spectrum disorder (ASD). Differential relationships between traditional anxiety, atypical anxiety, child characteristics, anxiety predictors and ASD-symptomology were…

  1. Human Fear Chemosignaling: Evidence from a Meta-Analysis.

    PubMed

    de Groot, Jasper H B; Smeets, Monique A M

    2017-10-01

    Alarm pheromones are widely used in the animal kingdom. Notably, there are 26 published studies (N = 1652) highlighting a human capacity to communicate fear, stress, and anxiety via body odor from one person (66% males) to another (69% females). The question is whether the findings of this literature reflect a true effect, and what the average effect size is. These questions were answered by combining traditional meta-analysis with novel meta-analytical tools, p-curve analysis and p-uniform, techniques that can indicate whether findings are likely to reflect a true effect based on the distribution of P-values. A traditional random-effects meta-analysis yielded a small-to-moderate effect size (Hedges' g: 0.36, 95% CI: 0.31-0.41), p-curve analysis showed evidence diagnostic of a true effect (ps < 0.0001), and there was no evidence of publication bias. This meta-analysis did not assess the internal validity of the included studies; yet, the combined results illustrate the statistical robustness of a field in human olfaction dealing with the human capacity to communicate certain emotions (fear, stress, anxiety) via body odor. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
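
    The random-effects pooling of Hedges' g reported in this abstract can be sketched from first principles (bias-corrected standardized mean difference, then DerSimonian-Laird weighting); the three studies below are invented, so the pooled value will not match the paper's:

```python
import numpy as np

def hedges_g(m1, m2, s1, s2, n1, n2):
    """Bias-corrected standardized mean difference (Hedges' g) and its variance."""
    sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)      # small-sample correction factor
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2.0 * (n1 + n2))
    return g, var

def dersimonian_laird(gs, vs):
    """Random-effects pooled estimate with a 95% confidence interval."""
    gs, vs = np.asarray(gs), np.asarray(vs)
    w = 1.0 / vs
    fixed = np.sum(w * gs) / w.sum()
    q = np.sum(w * (gs - fixed)**2)
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (len(gs) - 1)) / c)     # between-study variance
    w_re = 1.0 / (vs + tau2)
    pooled = np.sum(w_re * gs) / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three invented studies: (mean1, mean2, sd1, sd2, n1, n2)
studies = [(0.5, 0.0, 1.0, 1.0, 30, 30),
           (0.3, 0.0, 1.0, 1.0, 50, 50),
           (0.4, 0.0, 1.0, 1.1, 25, 25)]
gs, vs = zip(*(hedges_g(*s) for s in studies))
pooled, ci = dersimonian_laird(gs, vs)
print(f"pooled g = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```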

  2. Algorithm for computing descriptive statistics for very large data sets and the exa-scale era

    NASA Astrophysics Data System (ADS)

    Beekman, Izaak

    2017-11-01

    An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that would traditionally be relegated to post-processing, and can be used to monitor statistical convergence and to estimate the error/residual in the quantity of interest, which is also useful for uncertainty quantification. Today, data may be generated at an overwhelming rate by numerical simulations and by proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212): Pébay's algorithms are recast into a converging delta formulation with provably favorable properties. The mean, variance, covariances and arbitrary higher-order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high Reynolds number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
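
    The single-pass, mergeable statistics that SPOCS computes can be illustrated with Welford's online update plus the pairwise combination formula that Pébay's report generalizes; this is a minimal mean/variance sketch, not the full arbitrary-moment algorithm:

```python
class OnlineStats:
    """Single-pass running mean/variance (Welford's update) with a pairwise
    merge step in the spirit of Pebay's combination formulas."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        """Update the running moments with one new sample."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def merge(self, other):
        """Combine two partial results, e.g. from parallel workers."""
        out = OnlineStats()
        out.n = self.n + other.n
        delta = other.mean - self.mean
        out.mean = self.mean + delta * other.n / out.n
        out.m2 = self.m2 + other.m2 + delta**2 * self.n * other.n / out.n
        return out

    @property
    def variance(self):
        """Unbiased sample variance."""
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Two "workers" each see part of the stream; the merge recovers the
# same mean/variance a two-pass computation over [1..5] would give.
a, b = OnlineStats(), OnlineStats()
for x in [1.0, 2.0, 3.0]:
    a.push(x)
for x in [4.0, 5.0]:
    b.push(x)
combined = a.merge(b)
print(combined.mean, combined.variance)
```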

  3. Knowledge level of effect size statistics, confidence intervals and meta-analysis in Spanish academic psychologists.

    PubMed

    Badenes-Ribera, Laura; Frias-Navarro, Dolores; Pascual-Soler, Marcos; Monterde-I-Bort, Héctor

    2016-11-01

    The statistical reform movement and the American Psychological Association (APA) defend the use of estimators of effect size and their confidence intervals, as well as the interpretation of the clinical significance of findings. A survey was conducted in which academic psychologists were asked about their behavior in designing and carrying out their studies. The sample was composed of 472 participants (45.8% men). The mean number of years as a university professor was 13.56 (SD = 9.27). The use of effect-size estimators is becoming generalized, as is the consideration of meta-analytic studies. However, several inadequate practices still persist. A traditional model of methodological behavior centered on statistical significance tests is maintained, marked by the predominance of Cohen's d and the unadjusted R²/η², which are not robust to outliers, departures from normality, or violations of statistical assumptions, and by the under-reporting of confidence intervals for effect-size statistics. The paper concludes with recommendations for improving statistical practice.
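
    The abstract's point that Cohen's d is not robust to outliers is easy to demonstrate: a single extreme score inflates the pooled standard deviation far more than it moves the mean difference, so d shrinks. The data below are invented:

```python
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference with pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) \
                 / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

treatment = np.array([1.2, 0.6, 1.5, 0.3, 0.9, 1.1, 0.7, 1.4, 0.5, 0.8])
control = np.array([0.1, -0.3, 0.4, 0.0, -0.2, 0.3, -0.1, 0.2, -0.4, 0.0])

d_clean = cohens_d(treatment, control)
# One extreme score inflates the pooled SD far more than the mean difference,
# so d collapses even though the raw mean difference got larger.
d_outlier = cohens_d(np.append(treatment, 8.0), control)
print(f"d without outlier: {d_clean:.2f}, with outlier: {d_outlier:.2f}")
```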

  4. [Approaches to medical training among physicians who teach; analysis of two different educational strategies].

    PubMed

    Loría-Castellanos, Jorge; Rivera-Ibarra, Doris Beatriz; Márquez-Avila, Guadalupe

    2009-01-01

    To compare the outreach of a promotional educational strategy focused on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee. We compared the outreach of the two approaches by administering a validated 72-item instrument that analyzes statements measuring educational tasks, arranged as duplets across three indicators. A group of seven physicians actively participating in teaching activities was stratified according to teaching approach: one traditional, the other a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences between the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of the subjects who had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in the 3 main indicators as compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more effective approach than traditional teaching methods.

  5. [Analysis on composition principles of formulae containing Gardeniae Fructus in dictionary of traditional Chinese medicine prescriptions].

    PubMed

    Hu, Yan-Zhen; Wei, Jun-Ying; Tang, Shi-Huan; Yang, Hong-Jun

    2016-04-01

    Gardeniae Fructus, which is widely used in health foods and clinical medicine, is both an edible food and a medicinal herb. The dictionary of traditional Chinese medicine prescriptions provides good material for prescription analysis and for the R&D of traditional Chinese medicines. The composition regularity of formulae containing Gardeniae Fructus in the dictionary was analyzed on the basis of the traditional Chinese medicine inheritance support system (TCMISS), in order to provide a reference for clinical application and the R&D of new drugs. TCMISS was applied to establish a database of prescriptions containing Gardeniae Fructus, and the software's frequency statistics, association rules, and other data mining technologies were adopted to analyze commonly used drugs, combination rules and core combinations of formulae containing Gardeniae Fructus. A total of 3 523 prescriptions involving 1 725 Chinese herbs were included in this study. With a support of 352 (10%) and a confidence of 90%, the 57 most commonly used drug combinations were screened. Drugs adopted in core combinations were relatively concentrated and selected according to definite composition methods, and were used mainly to treat 18 diseases. Gardeniae Fructus has often been combined with herbs for heat-clearing and detoxification, expelling pathogenic wind, relieving exterior syndrome, invigorating the circulation of blood and qi, and promoting blood circulation for removing blood stasis, mainly to treat jaundice, typhoid, headache and other syndromes. Copyright© by the Chinese Pharmaceutical Association.
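
    The frequency and association-rule mining that TCMISS performs (support thresholds, confidence of herb-pair rules) can be illustrated with plain support/confidence counting; the pinyin herb names and the five toy prescriptions below are invented and far smaller than the 3 523-prescription database:

```python
from itertools import combinations
from collections import Counter

# Toy prescription database; "zhizi" stands in for Gardeniae Fructus
prescriptions = [
    {"zhizi", "huangqin", "lianqiao"},
    {"zhizi", "huangqin", "gancao"},
    {"zhizi", "lianqiao", "bohe"},
    {"huangqin", "gancao"},
    {"zhizi", "huangqin", "lianqiao", "gancao"},
]

def frequent_pairs(txns, min_support):
    """Pair frequencies, kept only if support (fraction of txns) is high enough."""
    counts = Counter(pair for t in txns for pair in combinations(sorted(t), 2))
    n = len(txns)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

def confidence(txns, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    with_a = [t for t in txns if antecedent in t]
    return sum(consequent in t for t in with_a) / len(with_a)

print(frequent_pairs(prescriptions, min_support=0.6))
print(f"zhizi -> huangqin: {confidence(prescriptions, 'zhizi', 'huangqin'):.2f}")
```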

  6. Statistical analysis of modal properties of a cable-stayed bridge through long-term structural health monitoring with wireless smart sensor networks

    NASA Astrophysics Data System (ADS)

    Asadollahi, Parisa; Li, Jian

    2016-04-01

    Understanding the dynamic behavior of complex structures such as long-span bridges requires dense deployment of sensors. Traditional wired sensor systems are generally expensive and time-consuming to install due to cabling. With wireless communication and on-board computation capabilities, wireless smart sensor networks have the advantages of being low cost and easy to deploy and maintain, and therefore facilitate dense instrumentation for structural health monitoring. A long-term monitoring project was recently carried out for a cable-stayed bridge in South Korea with a dense array of 113 smart sensors, which formed the world's largest wireless smart sensor network for civil structural monitoring. This paper presents a comprehensive statistical analysis of the modal properties, including natural frequencies, damping ratios and mode shapes, of the monitored cable-stayed bridge. The data analyzed in this paper comprise structural vibration signals recorded during a 12-month period under ambient excitation. The correlation between environmental temperature and the modal frequencies is also investigated. The results show the long-term statistical structural behavior of the bridge, which serves as the basis for Bayesian statistical updating of the numerical model.

  7. Alveolar ridge preservation after tooth extraction: a Bayesian Network meta-analysis of grafting materials efficacy on prevention of bone height and width reduction.

    PubMed

    Iocca, Oreste; Farcomeni, Alessio; Pardiñas Lopez, Simon; Talib, Huzefa S

    2017-01-01

    To conduct a traditional meta-analysis and a Bayesian Network meta-analysis to synthesize the information coming from randomized controlled trials on different socket grafting materials, and to combine the resulting indirect evidence in order to make inferences on treatments that have not been compared directly. RCTs were identified for inclusion in the systematic review and subsequent statistical analysis. Bone height and width remodelling were selected as the summary measures for comparison. First, a series of pairwise meta-analyses was performed and the overall mean difference (MD) in mm with 95% CI was calculated between grafted and non-grafted sockets. Then, a Bayesian Network meta-analysis was performed to draw indirect conclusions on which grafting materials are most likely the best compared to the others. From the six included studies, seven comparisons were obtained. Traditional meta-analysis showed statistically significant results in favour of grafting the socket compared with no graft, both for height (MD 1.02, 95% CI 0.44-1.59, P < 0.001) and for width (MD 1.52, 95% CI 1.18-1.86, P < 0.000001) remodelling. The Bayesian Network meta-analysis allowed a ranking of intervention efficacy to be obtained. On the basis of the results of the present analysis, socket grafting seems to be more favourable than unassisted socket healing. Moreover, the Bayesian Network meta-analysis indicates that freeze-dried bone graft plus membrane is most likely the most effective in reducing bone height remodelling, while autologous bone marrow was most likely the most effective when width remodelling was considered. Studies with larger samples and lower risk of bias should be conducted in the future in order to further strengthen the results of this analysis. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  9. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks, such as climate networks in climatology or functional brain networks in neuroscience, representing the structure of statistical interrelationships in large data sets of time series, and, subsequently, for investigating this structure using advanced methods of complex network theory, such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
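    As one example of the methods listed, the natural visibility graph maps a time series onto a network: each sample becomes a node, and two samples are linked if the straight line between them clears every intermediate sample. A minimal standalone sketch of that criterion (not pyunicorn's actual API) follows:

```python
def natural_visibility_edges(y):
    """Return the edge list of the natural visibility graph of series y.

    Nodes a and b (sampled at integer times) are connected iff every
    sample strictly between them lies below the line joining them.
    """
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)  # empty range for neighbours -> always visible
            )
            if visible:
                edges.append((a, b))
    return edges

edges = natural_visibility_edges([1.0, 3.0, 2.0, 4.0])
```

    Here the peak at index 1 blocks the line of sight between indices 0 and 2 (and 0 and 3), so those pairs are not linked.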

  10. Computing Science and Statistics. Volume 24. Graphics and Visualization

    DTIC Science & Technology

    1993-03-01

    …the dough, turbulent fluid flow, the time between drips of water from a faucet, Brownian motion… behavior changes radically when the population growth… a cookie which clearly is appropriate as after-dinner fun… "the discrete parameter analogue of continuous parameter time series analysis." I strongly… methods. One problem that statisticians traditionally seem to… Your fortune cookie of the night reads: "You have good friends who will come to your aid in…"

  11. Correlation between the different therapeutic properties of Chinese medicinal herbs and delayed luminescence.

    PubMed

    Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang

    2016-03-01

    In the practice and principle of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that this method, combined with statistical analysis, may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Meteor localization via statistical analysis of spatially temporal fluctuations in image sequences

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír.; Klimt, Martin; Šihlík, Jan; Fliegel, Karel

    2015-09-01

    Meteor detection is one of the most important procedures in astronomical imaging. A meteor's path through Earth's atmosphere is traditionally reconstructed from a double-station video observation system generating 2D image sequences. However, atmospheric turbulence and other factors cause spatio-temporal fluctuations of the image background, which make localization of the meteor path more difficult. Our approach is based on nonlinear preprocessing of image intensity using the Box-Cox transform, with the logarithmic transform as a particular case. The transformed image sequences are then differentiated along discrete coordinates to obtain a statistical description of sky background fluctuations, which can be modeled by a multivariate normal distribution. After verification and hypothesis testing, we use the statistical model for outlier detection. While isolated outlier points are ignored, a compact cluster of outliers indicates the presence of a meteoroid after ignition.
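    The pipeline described (log transform, temporal differencing, a normal background model, outlier flagging) can be sketched as follows. This simplified version uses a per-pixel univariate z-score in place of the full multivariate model, and the frame stack is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stack of sky frames: lognormal background fluctuations plus a
# bright streak in the last frame (a hypothetical stand-in for a meteor trail).
frames = rng.lognormal(mean=2.0, sigma=0.1, size=(10, 32, 32))
frames[-1, 16, 5:25] *= 4.0           # synthetic bright trail along row 16

logf = np.log(frames)                 # logarithmic transform (Box-Cox, lambda -> 0)
diffs = np.diff(logf, axis=0)         # temporal differences of the transformed stack

# Model the temporal differences as approximately normal background noise.
mu, sigma = diffs.mean(), diffs.std()
z = np.abs(diffs[-1] - mu) / sigma    # standardized fluctuation, last difference frame
outliers = z > 5.0                    # flag pixels far outside the background model
```

    A subsequent clustering step would then keep only compact groups of flagged pixels and discard isolated outliers.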

  13. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    PubMed

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, e.g., by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
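    A minimal illustration of the idea: run the identical analysis on simulated null data (where accuracy should hover at chance) and on simulated effect data. The nearest-centroid decoder below is a hypothetical stand-in for whatever analysis method is actually in use; the point is that the same function is applied to both data sets.

```python
import numpy as np

rng = np.random.default_rng(42)

def loo_nearest_centroid_accuracy(X, y):
    """Leave-one-out accuracy of a simple nearest-centroid decoder."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i                 # hold out sample i
        c0 = X[mask & (y == 0)].mean(axis=0)          # class centroids from the rest
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
        hits += pred == y[i]
    return hits / len(y)

y = np.repeat([0, 1], 20)

# Simulated null data: features carry no label information,
# so the same analysis should yield roughly chance-level accuracy.
X_null = rng.normal(size=(40, 10))
acc_null = loo_nearest_centroid_accuracy(X_null, y)

# Simulated effect data: a clear class shift, analysed with the same function.
X_eff = X_null + 2.0 * y[:, None]
acc_eff = loo_nearest_centroid_accuracy(X_eff, y)
```

    Any systematic deviation of the null-data accuracy from chance would indicate a confound in the design or analysis rather than a real effect.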

  14. Problematizing Statistical Literacy: An Intersection of Critical and Statistical Literacies

    ERIC Educational Resources Information Center

    Weiland, Travis

    2017-01-01

    In this paper, I problematize traditional notions of statistical literacy by juxtaposing it with critical literacy. At the school level statistical literacy is vitally important for students who are preparing to become citizens in modern societies that are increasingly shaped and driven by data based arguments. The teaching of statistics, which is…

  15. Fracture resistance of endodontically treated teeth restored with a bulkfill flowable material and a resin composite

    PubMed Central

    Isufi, Almira; Plotino, Gianluca; Grande, Nicola Maria; Ioppolo, Pietro; Testarelli, Luca; Bedini, Rossella; Al-Sudani, Dina; Gambarini, Gianluca

    2016-01-01

    Aim: To determine and compare the fracture resistance of endodontically treated teeth restored with a bulk fill flowable material (SDR) and a traditional resin composite. Methods: Thirty maxillary and 30 mandibular first molars were selected based on similar dimensions. After cleaning, shaping and filling of the root canals and adhesive procedures, specimens were assigned to 3 subgroups for each tooth type (n=10): Group A: control group, including intact teeth; Group B: access cavities were restored with a traditional resin composite (EsthetX; Dentsply-Italy, Rome, Italy); Group C: access cavities were restored with a bulk fill flowable composite (SDR; Dentsply-Italy), except for a 1.5 mm layer of the occlusal surface that was restored with the same resin composite as Group B. The specimens were subjected to compressive force in a static material-testing machine until fracture occurred, the maximum fracture load of the specimens was measured (N) and the type of fracture was recorded as favorable or unfavorable. Data were statistically analyzed with one-way analysis of variance (ANOVA) and Bonferroni tests (P<0.05). Results: No statistically significant differences were found among groups (P>0.05). Fracture resistance of endodontically treated teeth restored with a traditional resin composite and with a bulk fill flowable composite (SDR) was similar in both maxillary and mandibular molars and showed no significant decrease in fracture resistance compared to intact specimens. Conclusions: No significant difference was observed in the mechanical fracture resistance of endodontically treated molars restored with traditional resin composite restorations compared to bulk fill flowable composite restorations. PMID:27486505

  16. Fracture resistance of endodontically treated teeth restored with a bulkfill flowable material and a resin composite.

    PubMed

    Isufi, Almira; Plotino, Gianluca; Grande, Nicola Maria; Ioppolo, Pietro; Testarelli, Luca; Bedini, Rossella; Al-Sudani, Dina; Gambarini, Gianluca

    2016-01-01

    To determine and compare the fracture resistance of endodontically treated teeth restored with a bulk fill flowable material (SDR) and a traditional resin composite. Thirty maxillary and 30 mandibular first molars were selected based on similar dimensions. After cleaning, shaping and filling of the root canals and adhesive procedures, specimens were assigned to 3 subgroups for each tooth type (n=10): Group A: control group, including intact teeth; Group B: access cavities were restored with a traditional resin composite (EsthetX; Dentsply-Italy, Rome, Italy); Group C: access cavities were restored with a bulk fill flowable composite (SDR; Dentsply-Italy), except for a 1.5 mm layer of the occlusal surface that was restored with the same resin composite as Group B. The specimens were subjected to compressive force in a static material-testing machine until fracture occurred, the maximum fracture load of the specimens was measured (N) and the type of fracture was recorded as favorable or unfavorable. Data were statistically analyzed with one-way analysis of variance (ANOVA) and Bonferroni tests (P<0.05). No statistically significant differences were found among groups (P>0.05). Fracture resistance of endodontically treated teeth restored with a traditional resin composite and with a bulk fill flowable composite (SDR) was similar in both maxillary and mandibular molars and showed no significant decrease in fracture resistance compared to intact specimens. No significant difference was observed in the mechanical fracture resistance of endodontically treated molars restored with traditional resin composite restorations compared to bulk fill flowable composite restorations.

  17. Maintenance therapy with sucralfate in duodenal ulcer: genuine prevention or accelerated healing of ulcer recurrence?

    PubMed

    Bynum, T E; Koch, G G

    1991-08-08

    We sought to compare the efficacy of sucralfate to placebo for the prevention of duodenal ulcer recurrence and to determine that the efficacy of sucralfate was due to a true reduction in ulcer prevalence and not due to secondary effects such as analgesic activity or accelerated healing. This was a double-blind, randomized, placebo-controlled, parallel-groups, multicenter clinical study with 254 patients. All patients had a past history of at least two duodenal ulcers, with at least one ulcer diagnosed by endoscopic examination 3 months or less before the start of the study. Complete ulcer healing without erosions was required to enter the study. Sucralfate or placebo was dosed as a 1-g tablet twice a day for 4 months, or until ulcer recurrence. Endoscopic examinations, performed once a month and whenever symptoms developed, determined the presence or absence of duodenal ulcers. If a patient developed an ulcer between monthly scheduled visits, the patient was dosed with a 1-g sucralfate tablet twice a day until the next scheduled visit. Statistical analyses of the results determined the efficacy of sucralfate compared with placebo for preventing duodenal ulcer recurrence. Comparisons of therapeutic agents for preventing duodenal ulcers have usually been made by testing for statistical differences in the cumulative rates for all ulcers developed during a follow-up period, regardless of the time of detection. Statistical experts at the United States Food and Drug Administration (FDA) and on the FDA Advisory Panel expressed doubts about clinical study results based on this type of analysis. They suggested three possible mechanisms for reducing the number of observed ulcers: (a) analgesic effects, (b) accelerated healing, and (c) true ulcer prevention. Traditional ulcer analysis could miss recurring ulcers due to an analgesic effect or accelerated healing. Point-prevalence analysis could miss recurring ulcers due to accelerated healing between endoscopic examinations. Maximum ulcer analysis, a novel statistical method, eliminated analgesic effects through regularly scheduled endoscopies and ruled out accelerated healing of recurring ulcers through frequent endoscopies and an open-label phase. Maximum ulcer analysis thus reflects true ulcer recurrence and prevention. Sucralfate was significantly superior to placebo in reducing ulcer prevalence by all analyses. Significance (p less than 0.05) was found at months 3 and 4 for all analyses: all months were significant in the traditional analysis, months 2-4 in the point-prevalence analysis, and months 3-4 in the maximum ulcer prevalence analysis. Sucralfate was shown to be effective for the prevention of duodenal ulcer recurrence by a true reduction in new ulcer development.

  18. Traditional and western medicine: cultural beliefs and practices of South African Indian Muslims with regard to stroke.

    PubMed

    Bham, Zaheerah; Ross, Eleanor

    2005-01-01

    To investigate the beliefs of caregivers and traditional healers within the South African Indian Muslim community regarding the etiology and treatment of stroke and the persons likely to be consulted in this regard. A descriptive case study design was employed which incorporated two groups and was located within a qualitative paradigm. Data were collected within the homes of caregivers and the consulting rooms of traditional healers. Ten caregivers of persons who had sustained strokes and 10 traditional healers were interviewed. Individual interviews were held with participants. Responses to semi-structured interview schedules were analyzed using thematic content analysis and descriptive statistics. For both groups, religion and faith in God played a pertinent role in beliefs regarding etiology of illnesses such as stroke. Caregivers used a combination of traditional and Western medicine approaches. For traditional healers, treatment was based on the premise of restoring the balance between hot and cold in the body, which had been placed in disequilibrium by the stroke. Participants expressed disillusionment with referrals to Western healthcare professionals whose treatment was often regarded as culturally inappropriate. They also emphasized the integral role played by family members in the treatment of illness and disease. Results have implications for: culturally sensitive management of stroke patients in the South African Indian Muslim community; collaboration between Western and traditional healers; involvement of families in the remediation process; and further research.

  19. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
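    A minimal sketch of the core quantity, with hypothetical efficiencies and Cq values (the full method adds well-specific efficiencies, multiple reference genes, and blocking):

```python
import math

def efficiency_weighted_cq(E, Cq):
    """log10(E) * Cq -- the quantity the Common Base Method keeps in log scale.

    E is the amplification efficiency expressed as the per-cycle fold
    change (E = 2 for a perfectly doubling reaction).
    """
    return math.log10(E) * Cq

# Hypothetical target- and reference-gene reactions (efficiency, Cq):
target_ctrl = efficiency_weighted_cq(1.95, 24.0)
target_trt  = efficiency_weighted_cq(1.95, 21.0)
ref_ctrl    = efficiency_weighted_cq(1.98, 18.0)
ref_trt     = efficiency_weighted_cq(1.98, 18.2)

# Reference-normalized log10 expression change (treated vs. control);
# subsequent statistics (t-tests, blocking) stay in this log scale.
delta_log10 = (target_ctrl - target_trt) - (ref_ctrl - ref_trt)
fold_change = 10 ** delta_log10
```

    Working in log10(E) · Cq rather than raw ratios is what lets ordinary linear-model statistics apply directly, with the fold change recovered only at the end.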

  20. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  1. Effect of an orally formulated processed black cumin, from Iranian traditional medicine pharmacopoeia, in relieving symptoms of knee osteoarthritis: A prospective, randomized, double-blind and placebo-controlled clinical trial.

    PubMed

    Salimzadeh, Ahmad; Ghourchian, Anahita; Choopani, Rasool; Hajimehdipoor, Homa; Kamalinejad, Mohammad; Abolhasani, Maryam

    2017-06-01

    Osteoarthritis is a global health problem, especially for the elderly. A good replacement for non-surgical treatments is the use of traditional medicines. We selected a revered plant (Nigella sativa L.), a widely utilized medicinal herb for the treatment of inflammatory conditions, from the Iranian traditional medicine (ITM) pharmacopoeia, with proven anti-inflammatory and analgesic actions. We performed a prospective, randomized, double-blind, and placebo-controlled clinical trial in order to investigate whether the herb is useful in alleviating the symptoms of knee osteoarthritis. American College of Rheumatology clinical criteria were the basis of diagnosis, while the Knee injury and Osteoarthritis Outcome Score (KOOS) questionnaire was considered as the main outcome measure. One hundred and ten eligible patients were assigned to receive a placebo or an active intervention (2 g/day of processed N. sativa seed powder in divided doses). Acetaminophen tablets were the rescue medicine. Finally, 40 patients in the placebo group and 37 patients in the active group completed the trial and were included in the statistical analysis. Both cohorts demonstrated statistically significant within-group differences (P < 0.05) in some subscales that were more prominent in the active group, without any considerable adverse effects. Nevertheless, KOOS score results and the mean number of acetaminophen tablets used by patients showed no statistically significant between-group differences. It can be concluded that future planned studies with larger sample sizes, longer follow-up periods, and other forms of N. sativa seeds as an active intervention are necessary to evaluate its efficacy in relieving the symptoms of knee osteoarthritis. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.

  2. Bone age maturity assessment using hand-held device

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Gilsanz, Vicente; Liu, Xiaodong; Boechat, M. I.

    2004-04-01

    Purpose: Assessment of bone maturity is traditionally performed through visual comparison of hand and wrist radiographs with existing reference images in textbooks. Our goal was to develop a digital index based on idealized hand X-ray images that can be incorporated in a hand-held computer and used for visual assessment of bone age for patients. Material and methods: Due to the large variability in bone maturation among normal subjects, we generated a set of "ideal" images obtained by computer combinations of images from our normal reference data sets. Software for hand-held PDA devices was developed for easy navigation through the set of images and visual selection of matching images. A formula based on our statistical analysis provides the standard deviation from normal based on the chronological age of the patient. The accuracy of the program was compared to traditional interpretation by two radiologists in a double-blind reading of 200 normal Caucasian children (100 boys, 100 girls). Results: Strong correlations were present between chronological age and bone age (r > 0.9), with no statistical difference between the digital and traditional assessment methods. Determinations of carpal bone maturity in adolescents were slightly more accurate using the digital system. The users praised the convenience and effectiveness of the digital Palm Index in clinical practice. Conclusion: An idealized digital Palm Bone Age Index provides a convenient and effective alternative to conventional atlases for the assessment of skeletal maturity.

  3. Complex networks as a unified framework for descriptive analysis and predictive modeling in climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R

    The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
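    A toy version of the network-construction step: build an adjacency matrix by thresholding pairwise correlations between grid-point time series. The series below are synthetic, with two planted clusters standing in for spatially coherent climate regions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical time series at 6 grid points: points 0-2 share one driver,
# points 3-5 another, so two clusters should emerge in the network.
t = rng.normal(size=(2, 200))                          # two independent drivers
series = np.vstack(
    [t[0] + 0.4 * rng.normal(size=200) for _ in range(3)]
    + [t[1] + 0.4 * rng.normal(size=200) for _ in range(3)]
)

corr = np.corrcoef(series)                             # pairwise Pearson correlations
adj = (np.abs(corr) > 0.5) & ~np.eye(6, dtype=bool)    # threshold -> adjacency matrix
degree = adj.sum(axis=1)                               # a simple network measure
```

    On such a network one could then compute clusters and structural measures; here each node ends up linked only to its two cluster-mates.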

  4. Modelling short time series in metabolomics: a functional data analysis approach.

    PubMed

    Montana, Giovanni; Berk, Maurice; Ebbels, Tim

    2011-01-01

    Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
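    A simplified sketch of the idea: fit a smooth curve to each group's short, noisy time course and compare the fitted curves. Here a low-order polynomial stands in for the smoothing splines typically used in functional data analysis, and the data are simulated rather than NMR measurements.

```python
import numpy as np

rng = np.random.default_rng(7)

t = np.linspace(0.0, 1.0, 6)  # very few time points, as in metabolomic studies

def smooth_fit(y, deg=3):
    """Fit a smooth curve (low-order polynomial as a simple basis) to one series."""
    return np.polyval(np.polyfit(t, y, deg), t)

# Two hypothetical groups of subjects: same baseline curve, group B shifted upward
# (e.g. a drug-treated group with elevated metabolite levels).
group_a = np.array([np.sin(2 * t) + 0.1 * rng.normal(size=6) for _ in range(8)])
group_b = np.array([np.sin(2 * t) + 0.5 + 0.1 * rng.normal(size=6) for _ in range(8)])

curve_a = smooth_fit(group_a.mean(axis=0))
curve_b = smooth_fit(group_b.mean(axis=0))
stat = np.max(np.abs(curve_b - curve_a))  # max distance between fitted mean curves
```

    In the actual methodology the curves are random (per-subject) smooth functions and the test statistic's null distribution is derived formally; this sketch only illustrates the curve-fitting-then-comparison structure.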

  5. Pathway-GPS and SIGORA: identifying relevant pathways based on the over-representation of their gene-pair signatures

    PubMed Central

    Foroushani, Amir B.K.; Brinkman, Fiona S.L.

    2013-01-01

    Motivation: Predominant pathway analysis approaches treat pathways as collections of individual genes and consider all pathway members as equally informative. As a result, at times spurious and misleading pathways are inappropriately identified as statistically significant, solely due to components that they share with the more relevant pathways. Results: We introduce the concept of Pathway Gene-Pair Signatures (Pathway-GPS) as pairs of genes that, as a combination, are specific to a single pathway. We devised and implemented a novel approach to pathway analysis, Signature Over-representation Analysis (SIGORA), which focuses on the statistically significant enrichment of Pathway-GPS in a user-specified gene list of interest. In a comparative evaluation of several published datasets, SIGORA outperformed traditional methods by delivering biologically more plausible and relevant results. Availability: An efficient implementation of SIGORA, as an R package with precompiled GPS data for several human and mouse pathway repositories, is available for download from http://sigora.googlecode.com/svn/. PMID:24432194
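    Over-representation of signatures in a gene list is typically assessed with a hypergeometric tail probability; the self-contained sketch below uses hypothetical counts and is not SIGORA's implementation.

```python
from math import comb

def hypergeom_pval(N, K, n, k):
    """P(X >= k) when drawing n items from N of which K are marked.

    The upper-tail hypergeometric probability used in
    over-representation tests.
    """
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical numbers: 5000 observable gene-pair signatures overall (N),
# 40 of them specific to one pathway (K); the user's gene list of interest
# yields 100 signatures (n), 8 of which hit that pathway (k).
p = hypergeom_pval(5000, 40, 100, 8)
```

    With an expected overlap of only 0.8 signatures under the null, observing 8 gives a very small p-value, i.e. a strongly over-represented pathway.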

  6. Beyond the schools of psychology 1: a digital analysis of Psychological Review, 1894-1903.

    PubMed

    Green, Christopher D; Feinerer, Ingo; Burman, Jeremy T

    2013-01-01

    Traditionally, American psychology at the turn of the twentieth century has been framed as a competition among a number of "schools": structuralism, functionalism, behaviorism, etc. But this is only one way in which the "structure" of the discipline can be conceived. Most psychologists did not belong to a particular school, but they still worked within loose intellectual communities, and so their work was part of an implicit psychological "genre," if not a formalized "school." In this study, we began the process of discovering the underlying genres of American psychology at the turn of the twentieth century by taking the complete corpus of articles from the journal Psychological Review during the first decade of its publication and conducting a statistical analysis of the vocabularies they employed to see what clusters of articles naturally emerged. Although the traditional functionalist school was among the clusters we found, we also found distinct research traditions around the topics of color vision, spatial vision, philosophy/metatheory, and emotion. In addition, momentary clusters corresponding to important debates (e.g., the variability hypothesis) appeared during certain years, but not others. © 2013 Wiley Periodicals, Inc.

  7. Meta-analysis of teaching methods: a 50k+ student study

    NASA Astrophysics Data System (ADS)

    Sayre, Eleanor; Archibeque, Benjamin; Gomez, K. Alison; Heckendorf, Tyrel; Madsen, Adrian M.; McKagan, Sarah B.; Schenk, Edward W.; Shepard, Chase; Sorell, Lane; von Korff, Joshua

    2015-04-01

    The Force Concept Inventory (FCI) and the Force and Motion Conceptual Evaluation (FMCE) are the two most widely-used conceptual tests in introductory mechanics. Because they are so popular, they provide an excellent avenue to compare different teaching methods at different kinds of institutions with varying student populations. We conducted a secondary analysis of all peer-reviewed papers which publish data from US and Canadian colleges and universities. Our data include over fifty thousand students drawn from approximately 100 papers; papers were drawn from Scopus, ERIC, ComPADRE, and journal websites. We augment published data about teaching methods with institutional data such as Carnegie Classification and average SAT scores. We statistically determine the effectiveness of different teaching methods as measured by FCI and FMCE gains and mediated by institutional and course factors. As in the landmark 1998 Hake study, we find that classes using interactive engagement (IE) have significantly larger learning gains than classes using traditional instruction. However, we find a broader distribution of normalized gains within both traditional and IE classes, and the differences between IE and traditional instruction have changed over time and are more context dependent.
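    The normalized gain underlying these comparisons is Hake's g = (post − pre) / (100 − pre): the fraction of the available improvement a class actually achieved. A quick sketch with hypothetical class averages:

```python
def normalized_gain(pre, post):
    """Hake's normalized gain g = (post - pre) / (100 - pre), scores in percent."""
    return (post - pre) / (100.0 - pre)

# Hypothetical class-average scores: traditional lecture vs. interactive engagement.
g_trad = normalized_gain(45.0, 58.0)   # ~0.24
g_ie = normalized_gain(45.0, 72.0)     # ~0.49
```

    Normalizing by the available headroom (100 − pre) is what makes gains comparable across classes that start from different pretest scores.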

  8. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
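    The closed-form Weibull case and the Monte Carlo approach mentioned above can be put side by side in a few lines; the Weibull modulus, scale, and applied stress below are hypothetical values chosen for illustration.

```python
import math
import random

random.seed(0)

m, sigma0 = 10.0, 400.0        # hypothetical Weibull modulus and scale (MPa)
stress = 320.0                 # applied stress (MPa)

# Closed-form two-parameter Weibull probability of failure
# (the rare case where a probabilistic failure analysis has a closed form).
pf_exact = 1.0 - math.exp(-((stress / sigma0) ** m))

# Monte Carlo: sample component strengths by inverting the Weibull CDF
# and count how often the applied stress exceeds the sampled strength.
n = 200_000
fails = sum(
    sigma0 * (-math.log(1.0 - random.random())) ** (1.0 / m) < stress
    for _ in range(n)
)
pf_mc = fails / n
```

    For general limit state functions no closed form exists, which is exactly why Monte Carlo sampling and faster approximations such as FPI are needed.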

  9. Curative efficacy and safety of traditional Chinese medicine xuebijing injections combined with ulinastatin for treating sepsis in the Chinese population: A meta-analysis.

    PubMed

    Xiao, Shi-Hui; Luo, Liang; Liu, Xiang-Hong; Zhou, Yu-Ming; Liu, Hong-Ming; Huang, Zhen-Fei

    2018-06-01

    Sepsis is a clinically critical disease. However, it is still controversial whether the combined use of traditional Chinese medicine Xuebijing injections (XBJI) and western medicine can enhance curative efficacy and ensure safety compared with western medicine alone. Thus, this research consisted of a systematic review of the curative efficacy and safety of traditional Chinese medicine XBJI combined with ulinastatin for treating sepsis in the Chinese population. A total of 8 databases were searched: 4 foreign databases, namely, PubMed, The Cochrane Library, Embase, and Web of Science; and 4 Chinese databases, namely, Sino Med, China National Knowledge Infrastructure (CNKI), VIP, and Wanfang Data. The time span of retrieval began from the establishment of each database and ended on August 1, 2017. Published randomized controlled trials about the combined use of traditional Chinese medicine XBJI and western medicine were included, regardless of language. Stata 12.0 software was used for statistical analysis. Finally, 16 papers involving 1335 cases were included. The results of the meta-analysis showed that compared with the single use of ulinastatin, traditional Chinese medicine XBJI combined with ulinastatin could reduce the time of mechanical ventilation, shorten the length of intensive care unit (ICU) stay, improve the 28-day survival rate, and decrease the occurrence rate of multiple organ dysfunction syndrome, case fatality rate, procalcitonin (PCT) content, APACHE II score, tumor necrosis factor (TNF)-α level, and interleukin (IL)-6 level. On the basis of the common basic therapeutic regimen, the combined use of traditional Chinese medicine XBJI and ulinastatin was compared with the use of ulinastatin alone for treating sepsis in the Chinese population. It was found that the number of adverse events of combination therapy is not significantly increased, and its clinical safety is well within the permitted range.
However, considering the limitations of this conclusion due to the low-quality articles included in the present research, it is necessary to conduct high-quality randomized controlled trials.

  10. Factors influencing the use of antenatal care in rural West Sumatra, Indonesia

    PubMed Central

    2012-01-01

    Background Every year, nearly half a million women and girls needlessly die as a result of complications during pregnancy, childbirth or the 6 weeks following delivery. Almost all (99%) of these deaths occur in developing countries. The study aim was to describe the factors related to low visits for antenatal care (ANC) services among pregnant women in Indonesia. Method A total of 145 of 200 married women of reproductive age who were pregnant or had experienced birth responded to the questionnaire about their ANC visits. We developed a questionnaire containing 35 items in four sections. Sections one and two covered the women's sociodemographics, section three covered basic knowledge of pregnancy, and section four contained two subsections: one on preferences about midwives and Traditional Birth Attendants (TBAs), and one on traditional beliefs. Data were collected using a convenience sampling strategy during July and August 2010, from 10 villages in Tanjung Emas. Multiple regression analysis was used for preference for types of providers. Results Three-quarters of respondents (77.9%) received ANC more than four times; the other 22.1% received ANC fewer than four times. Among primiparous women, 59.4% received ANC visits during pregnancy, a statistically significant difference compared with multiparous women (p = 0.001). Women who were encouraged by their family to receive ANC had statistically significantly higher traditional belief scores compared with those who encouraged themselves (p = 0.003). Preference for TBAs was most strongly affected by traditional beliefs (p < 0.001). On the contrary, preference for midwives was negatively correlated with traditional beliefs (p < 0.001). Conclusions Parity was the factor influencing women's receiving fewer than the recommended four ANC visits during pregnancy. Women who were encouraged by their family to get ANC services had higher traditional belief scores than women who encouraged themselves. Moreover, traditional beliefs, followed by lower family income, had the greater influence on preference for TBAs, with the opposite trend for preference for midwives. Increased attention needs to be given to these women; it is also very important to explore women's perceptions of the health services they receive. PMID:22353252

  11. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  12. Enhanced Higgs boson to τ(+)τ(-) search with deep learning.

    PubMed

    Baldi, P; Sadowski, P; Whiteson, D

    2015-03-20

    The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.

  13. A Method for Evaluating the Safety Impacts of Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles

    1998-01-01

    This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.

  14. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
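    Setting the multivariate and kernel machinery aside, the underlying Granger idea (does the history of one series improve prediction of another?) reduces, in the simplest bivariate linear case, to comparing a restricted and an unrestricted autoregression. A minimal lag-1 numpy sketch on synthetic data, not the canonical-correlation method of the paper:

```python
import numpy as np

def granger_f(y, x):
    """Lag-1 linear Granger test: F-statistic for whether adding x[t-1]
    improves prediction of y[t] over y[t-1] alone."""
    Y = y[1:]
    base = np.column_stack([np.ones(len(Y)), y[:-1]])   # restricted model
    full = np.column_stack([base, x[:-1]])              # adds lagged x
    rss_r = np.sum((Y - base @ np.linalg.lstsq(base, Y, rcond=None)[0]) ** 2)
    rss_f = np.sum((Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]) ** 2)
    df2 = len(Y) - full.shape[1]
    return float((rss_r - rss_f) / (rss_f / df2))

# Synthetic system in which x drives y but not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f(y, x) > granger_f(x, y))  # True: the x -> y direction dominates
```

    The canonical-correlation formulation in the abstract generalizes exactly this comparison to multiple series at once, and the kernel extension lifts it to the nonlinear case.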

  15. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.

  16. Causality

    NASA Astrophysics Data System (ADS)

    Pearl, Judea

    2000-03-01

    Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.

  17. A mixed methods inquiry into the determinants of traditional food consumption among three Cree communities of Eeyou Istchee from an ecological perspective.

    PubMed

    Gaudin, Véronique Laberge; Receveur, Olivier; Walz, Leah; Girard, Félix; Potvin, Louise

    2014-01-01

    The Aboriginal nations of Canada have higher incidences of chronic diseases, coinciding with profound changes in their environment, lifestyle and diet. Traditional foods can protect against the risks of chronic disease; however, their consumption is in decline, and little is known about the complex mechanisms underlying this trend. The objective was to identify the factors involved in traditional food consumption by Cree Aboriginal people living in 3 communities in northern Quebec, Canada. The study used a mixed methods explanatory design, including focus group interviews to interpret the results of logistic regression. It comprises a secondary data analysis of a cross-sectional survey of 3 Cree communities (n=374) and 4 focus group interviews (n=23). In the first, quantitative phase of the study, data were collected using a food-frequency questionnaire along with a structured questionnaire. Subsequently, the focus group interviews helped explain and build on the results of the logistic regressions. People who consumed traditional food 3 days or more weekly were more likely to be 40 years old and over, to walk 30 minutes or more per day, not to have completed their schooling, to live in Mistissini and to be a hunter (p<0.05 for all comparisons). The focus group participants provided explanations for the quantitative analysis results or complemented them. For example, although no statistical association was found, focus group participants believed that employment acts as both a facilitator and a barrier to traditional food consumption, rendering the effect undetectable. In addition, focus group participants suggested that traditional food consumption is the result of multiple interconnected influences, including individual, family, community and environmental influences, rather than a single factor. This study sheds light on a number of factors that are unique to traditional foods, factors that have been understudied to date.
Efforts to promote and maintain traditional food consumption could improve the overall health and wellbeing of Cree communities.

  18. Percutaneous versus traditional and paraspinal posterior open approaches for treatment of thoracolumbar fractures without neurologic deficit: a meta-analysis.

    PubMed

    Sun, Xiang-Yao; Zhang, Xi-Nuo; Hai, Yong

    2017-05-01

    This study evaluated differences in outcome variables between percutaneous, traditional, and paraspinal posterior open approaches for traumatic thoracolumbar fractures without neurologic deficit. A systematic review of PubMed, Cochrane, and Embase was performed. In this meta-analysis, we conducted online searches of PubMed, Cochrane, and Embase using the search terms "thoracolumbar fractures", "lumbar fractures", "percutaneous", "minimally invasive", "open", "traditional", "posterior", "conventional", "pedicle screw", "sextant", and "clinical trial". The analysis was performed on individual patient data from all the studies that met the selection criteria. Clinical outcomes were expressed as risk differences for dichotomous outcomes and mean differences for continuous outcomes, with 95% confidence intervals. Heterogeneity was assessed using the χ² test and the I² statistic. There were 4 randomized controlled trials and 14 observational articles included in this analysis. The percutaneous approach was associated with a better ODI score, less Cobb angle correction, less Cobb angle correction loss, less postoperative VBA correction, and a lower infection rate compared with the open approach. The percutaneous approach was also associated with a shorter operative duration, longer intraoperative fluoroscopy, lower postoperative VAS, and better postoperative VBH% in comparison with the traditional open approach. No significant difference was found in Cobb angle correction, postoperative VBA, VBA correction loss, postoperative VBH%, VBH correction loss, or pedicle screw misplacement between the percutaneous and open approaches. There was no significant difference in operative duration, intraoperative fluoroscopy, postoperative VAS, or postoperative VBH% between the percutaneous and paraspinal approaches. The functional and radiological outcomes of the percutaneous approach would be better than those of the open approach in the long term. Although the trans-muscular spatium approach belongs to the open fixation methods, it is strictly defined as a less invasive approach, providing less injury to the paraspinal muscles and a better reposition effect.
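    The I² statistic reported alongside Cochran's Q in meta-analyses like this one quantifies the share of between-study variability beyond chance. A minimal sketch with illustrative effect sizes and variances, not data from the review:

```python
def i_squared(effects, variances):
    """Cochran's Q under a fixed-effect inverse-variance pooling,
    then I^2 = max(0, (Q - df) / Q) * 100, the percentage of total
    variability attributable to between-study heterogeneity."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    return 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0

# Three hypothetical studies: mean differences 0.2, 0.5, 0.8, equal variance.
print(round(i_squared([0.2, 0.5, 0.8], [0.04, 0.04, 0.04]), 1))  # 55.6
```

    Values above roughly 50% are conventionally read as substantial heterogeneity, which is why reviews report I² next to each pooled estimate.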

  19. Mediators of the effects of rice intake on health in individuals consuming a traditional Japanese diet centered on rice

    PubMed Central

    Toyomaki, Atsuhito; Miyazaki, Akane; Nakai, Yukiei; Yamaguchi, Atsuko; Kubo, Chizuru; Suzuki, Junko; Ohkubo, Iwao; Shimizu, Mari; Musashi, Manabu; Kiso, Yoshinobu; Kusumi, Ichiro

    2017-01-01

    Although the Japanese diet is believed to be balanced and healthy, its benefits have been poorly investigated, especially in terms of effects on mental health. We investigated dietary patterns and physical and mental health in the Japanese population using an epidemiological survey to determine the health benefits of the traditional Japanese diet. Questionnaires to assess dietary habits, quality of life, sleep quality, impulsivity, and depression severity were distributed to 550 randomly selected middle-aged and elderly individuals. Participants with any physical or mental disease were excluded. Two-hundred and seventy-eight participants were selected for the final statistical analysis. We determined rice to be one of the most traditional foods in Japanese cuisine. Scores for each questionnaire were computed, and the correlations between rice intake and health indices were assessed. When analyzing the direct correlations between rice intake and health indices, we found only two correlations, namely those with quality of life (vitality) and sleep quality. Path analysis using structural equation modeling was performed to investigate the association between rice intake and health, with indirect effects included in the model. Additional associations between rice intake and health were explained using this model when compared to those using direct correlation analysis. Path analysis was used to identify mediators of the rice-health association. These mediators were miso (soybean paste) soup, green tea, and natto (fermented soybean) intake. Interestingly, these mediators have been major components of the Japanese diet since 1975, which has been considered one of the healthiest diets since the 1960s. Our results indicate that the combination of rice with other healthy foods, which is representative of the traditional Japanese diet, may contribute to improvements in physical and mental health. PMID:28968452

  20. Evidence and Clinical Trials.

    NASA Astrophysics Data System (ADS)

    Goodman, Steven N.

    1989-11-01

    This dissertation explores the use of a mathematical measure of statistical evidence, the log likelihood ratio, in clinical trials. The methods and thinking behind the use of an evidential measure are contrasted with traditional methods of analyzing data, which depend primarily on a p-value as an estimate of the statistical strength of an observed data pattern. It is contended that neither the behavioral dictates of Neyman-Pearson hypothesis testing methods, nor the coherency dictates of Bayesian methods, are realistic models on which to base inference. The use of the likelihood alone is applied to four aspects of trial design or conduct: the calculation of sample size, the monitoring of data, testing for the equivalence of two treatments, and meta-analysis, the combining of results from different trials. Finally, a more general model of statistical inference, using belief functions, is used to see if it is possible to separate the assessment of evidence from our background knowledge. It is shown that traditional and Bayesian methods can be modeled as two ends of a continuum of structured background knowledge: methods that summarize evidence at the point of maximum likelihood assume no structure, while Bayesian methods assume complete knowledge. Both schools are seen to be missing a concept of ignorance: uncommitted belief. This concept provides the key to understanding the problem of sampling to a foregone conclusion and the role of frequency properties in statistical inference. The conclusion is that statistical evidence cannot be defined independently of background knowledge, and that frequency properties of an estimator are an indirect measure of uncommitted belief. Several likelihood summaries need to be used in clinical trials, with the quantitative disparity between summaries being an indirect measure of our ignorance. This conclusion is linked with parallel ideas in the philosophy of science and cognitive psychology.
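    The evidential measure in question, the log likelihood ratio, is easy to state for binomial trial data. A small sketch with illustrative numbers, not data from the dissertation:

```python
import math

def log_likelihood_ratio(successes, n, p1, p0):
    """Log likelihood ratio comparing response rate p1 against p0 for
    binomial data: k*log(p1/p0) + (n-k)*log((1-p1)/(1-p0)). Positive
    values are evidence for p1; magnitude, not a p-value, measures strength."""
    k = successes
    return k * math.log(p1 / p0) + (n - k) * math.log((1.0 - p1) / (1.0 - p0))

# 30 responders out of 50: evidence for a 60% response rate over 40%.
print(round(log_likelihood_ratio(30, 50, 0.6, 0.4), 2))  # 4.05
```

    Unlike a p-value, this summary depends only on the observed data and the two hypotheses compared, which is what makes it attractive for trial monitoring without the multiple-looks penalty.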

  1. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    PubMed Central

    Seeley, Matthew K.; Francom, Devin; Reese, C. Shane; Hopkins, J. Ty

    2017-01-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F(1,18) = 8.56, p = 0.01) and hip joint angle (F(1,18) = 5.77, p = 0.03), but did not for the knee joint angle (F(1,18) = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function. PMID:29339984
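    The contrast between a single-point ANOVA and a functional analysis can be illustrated by computing a test statistic at every point of the gait cycle rather than only at the peak. A schematic numpy sketch on synthetic curves, not the study's FANOVA implementation:

```python
import numpy as np

def pointwise_t(cond_a, cond_b):
    """Paired t-statistic at each gait-cycle point; inputs are
    n_subjects x n_points arrays of kinematic curves."""
    diff = cond_a - cond_b
    n = diff.shape[0]
    return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n))

# 19 synthetic subjects, 101 points (0-100% of the gait cycle); the
# conditions differ only by a Gaussian bump centered in mid-stance.
rng = np.random.default_rng(0)
pts = np.linspace(0.0, 100.0, 101)
bump = np.exp(-((pts - 50.0) ** 2) / (2.0 * 8.0 ** 2))
control = rng.normal(0.0, 0.3, (19, 101))
effusion = bump + rng.normal(0.0, 0.3, (19, 101))

t = pointwise_t(effusion, control)
print(int(np.argmax(np.abs(t))))  # near 50: the difference is localized mid-stance
```

    A peak-only ANOVA would collapse each curve to one number and could miss exactly this kind of localized mid-stance difference, which is the study's argument for the functional approach.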

  2. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach.

    PubMed

    Park, Jihong; Seeley, Matthew K; Francom, Devin; Reese, C Shane; Hopkins, J Ty

    2017-12-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F(1,18) = 8.56, p = 0.01) and hip joint angle (F(1,18) = 5.77, p = 0.03), but did not for the knee joint angle (F(1,18) = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  3. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihong; Seeley, Matthew K.; Francom, Devin

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F(1,18) = 8.56, p = 0.01) and hip joint angle (F(1,18) = 5.77, p = 0.03), but did not for the knee joint angle (F(1,18) = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  4. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE PAGES

    Park, Jihong; Seeley, Matthew K.; Francom, Devin; ...

    2017-12-28

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F(1,18) = 8.56, p = 0.01) and hip joint angle (F(1,18) = 5.77, p = 0.03), but did not for the knee joint angle (F(1,18) = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  5. Statistical analysis of the electric energy production from photovoltaic conversion using mobile and fixed constructions

    NASA Astrophysics Data System (ADS)

    Bugała, Artur; Bednarek, Karol; Kasprzyk, Leszek; Tomczewski, Andrzej

    2017-10-01

    The paper presents the most representative characteristics, drawn from a three-year measurement period, of daily and monthly electricity production from photovoltaic conversion using modules installed in fixed and 2-axis tracking constructions. Results are presented for selected summer, autumn, spring and winter days. The analyzed measuring stand is located on the roof of the Faculty of Electrical Engineering building at Poznan University of Technology. The basic parameters of the statistical analysis, such as mean value, standard deviation, skewness, kurtosis, median, range and coefficient of variation, were used. It was found that the asymmetry factor can be useful in the analysis of daily electricity production from photovoltaic conversion. To determine the repeatability of monthly electricity production between the summer months, and between the summer and winter months, the non-parametric Mann-Whitney U test was used. To analyze the repeatability of daily peak hours, during which hourly electricity production is largest, the non-parametric Kruskal-Wallis test was applied as an extension of the Mann-Whitney U test. Based on the analysis of the electric energy distribution from the prepared monitoring system, it was found that traditional forecasting methods for electricity production from photovoltaic conversion, such as multiple regression models, should not be the preferred methods of analysis.
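    The Mann-Whitney U statistic the authors rely on can be computed directly by counting pairwise wins between two samples. A minimal sketch with illustrative monthly yields, not the paper's measurements:

```python
def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U for sample_a over sample_b: the number of pairs
    (a, b) with a > b, counting ties as one half. No distributional
    assumptions are needed, which is why it suits skewed energy data."""
    u = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Illustrative monthly energy yields (kWh): summer vs. winter months.
summer = [5.1, 4.8, 5.6, 5.3]
winter = [1.2, 0.9, 1.5, 1.1]
print(mann_whitney_u(summer, winter))  # 16.0, the maximum: complete separation
```

    A U at either extreme (0 or len(a)*len(b)) signals that the two monthly distributions do not overlap, i.e. no repeatability between the groups being compared.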

  6. Syndromic surveillance of influenza activity in Sweden: an evaluation of three tools.

    PubMed

    Ma, T; Englund, H; Bjelkmar, P; Wallensten, A; Hulth, A

    2015-08-01

    An evaluation was conducted to determine which syndromic surveillance tools complement traditional surveillance by serving as earlier indicators of influenza activity in Sweden. Web queries, medical hotline statistics, and school absenteeism data were evaluated against two traditional surveillance tools. Cross-correlation calculations utilized aggregated weekly data for all-age, nationwide activity for four influenza seasons, from 2009/2010 to 2012/2013. The surveillance tool indicative of earlier influenza activity, by way of statistical and visual evidence, was identified. The web query algorithm and the medical hotline statistics performed as well as each other and as the traditional surveillance tools. School absenteeism data were not a reliable resource for influenza surveillance. Overall, the syndromic surveillance tools did not perform with enough consistency, in either season lead or earlier peak-week timing, to be considered early indicators. They do, however, capture incident cases before they formally enter the primary healthcare system.
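    Lead-time comparisons of this kind rest on lagged cross-correlation between each candidate series and the reference surveillance series. A minimal numpy sketch with a synthetic weekly signal (a positive lag here means the candidate leads):

```python
import numpy as np

def best_lag(candidate, reference, max_lag=8):
    """Lag k (in weeks) maximizing the correlation of candidate[t]
    with reference[t + k]; a positive result means the candidate
    series leads the reference series."""
    def corr_at(k):
        if k > 0:
            a, b = candidate[:-k], reference[k:]
        elif k < 0:
            a, b = candidate[-k:], reference[:k]
        else:
            a, b = candidate, reference
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

# Synthetic yearly cycle: the candidate indicator runs two weeks ahead.
week = np.arange(104)
activity = np.sin(2.0 * np.pi * week / 52.0)
reference = activity[:100]
candidate = activity[2:102]        # candidate[t] == reference[t + 2]
print(best_lag(candidate, reference))  # 2
```

    Applied to real weekly counts, a consistently positive best lag across seasons is what would qualify a tool as an early indicator; the evaluation above found no tool met that bar consistently.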

  7. A statistical method for measuring activation of gene regulatory networks.

    PubMed

    Esteves, Gustavo H; Reis, Luiz F L

    2018-06-13

    Gene expression data analysis is of great importance for modern molecular biology, given our ability to measure the expression profiles of thousands of genes and to enable studies rooted in systems biology. In this work, we propose a simple statistical model for measuring the activation of gene regulatory networks, instead of the traditional gene co-expression networks. We present the mathematical construction of a statistical procedure for testing hypotheses regarding gene regulatory network activation. The true probability distribution of the test statistic is evaluated by a permutation-based study. To illustrate the functionality of the proposed methodology, we also present a simple example based on a small hypothetical network and the activation measurement of two KEGG networks, both based on gene expression data collected from gastric and esophageal samples. The two KEGG networks were also analyzed for a public database, available through NCBI-GEO, as presented in the Supplementary Material. This method was implemented in an R package that is available at the BioConductor project website under the name maigesPack.
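    The permutation idea used to obtain the null distribution of a test statistic can be sketched generically; the statistic below is a simple difference in means, not the network-activation statistic of the paper, and the data are hypothetical:

```python
import random

def permutation_p(group_a, group_b, n_perm=10_000, seed=1):
    """Two-sided permutation p-value for a difference in group means:
    shuffle the pooled labels repeatedly and count how often the
    permuted statistic is at least as extreme as the observed one."""
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        stat = sum(a) / len(a) - sum(b) / len(b)
        if abs(stat) >= abs(observed):
            extreme += 1
    return extreme / n_perm

# Hypothetical expression summaries for an "activated" vs. control group.
treated = [5.2, 6.1, 5.8, 6.4, 5.9]
control_g = [3.1, 2.8, 3.5, 3.0, 3.3]
print(permutation_p(treated, control_g) < 0.05)  # True: clear separation
```

    Because the null distribution is built from the data themselves, this approach needs no parametric assumptions, which is precisely why the abstract evaluates its "real probability distribution" by permutation.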

  8. The relationship between obligatory cortical auditory evoked potentials (CAEPs) and functional measures in young infants.

    PubMed

    Golding, Maryanne; Pearce, Wendy; Seymour, John; Cooper, Alison; Ching, Teresa; Dillon, Harvey

    2007-02-01

    Finding ways to evaluate the success of hearing aid fittings in young infants has increased in importance with the implementation of hearing screening programs. Cortical auditory evoked potentials (CAEPs) can be recorded in infants and provide evidence of speech detection at the cortical level. The validity of this technique as a tool for hearing aid evaluation, however, needs to be demonstrated. The present study examined the relationship between the presence or absence of CAEPs to speech stimuli and the outcomes of a parental questionnaire in young infants who were fitted with hearing aids. The presence or absence of responses was determined by an experienced examiner as well as by a statistical measure, Hotelling's T². A statistically significant correlation between CAEPs and questionnaire scores was found using the examiner's grading (rs = 0.45) and using the statistical grading (rs = 0.41), and there was reasonably good agreement between traditional response detection methods and the statistical analysis.
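    The objective detection step can be sketched as a one-sample Hotelling's T², applied to a few response features per recording epoch; the data below are synthetic, not CAEP recordings:

```python
import numpy as np

def hotelling_t2(samples):
    """One-sample Hotelling's T^2 against a zero mean vector:
    T^2 = n * m' S^{-1} m, for sample mean m and covariance S.
    Large values indicate the epochs share a consistent nonzero
    response rather than pure noise."""
    n = samples.shape[0]
    m = samples.mean(axis=0)
    s = np.cov(samples, rowvar=False)
    return float(n * m @ np.linalg.solve(s, m))

# Synthetic epochs: 40 recordings x 2 features (e.g. two response amplitudes).
rng = np.random.default_rng(1)
noise_only = rng.normal(0.0, 1.0, (40, 2))   # no response: mean near zero
with_resp = rng.normal(1.0, 1.0, (40, 2))    # response shifts the mean

print(hotelling_t2(with_resp) > hotelling_t2(noise_only))  # True
```

    In practice the statistic is compared against an F-distribution threshold, giving an examiner-independent present/absent decision of the kind the study validated against expert grading.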

  9. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
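As a rough illustration of the statistical comparison described above, the following sketch applies Fisher's exact test with a Bonferroni-style adjustment to hypothetical baseline versus post-intervention compliance counts; the numbers are invented and do not come from the Rhode Island study.

```python
from scipy.stats import fisher_exact

# Hypothetical (in compliance, out of compliance) counts at baseline
# and post-intervention for four performance categories.
categories = {
    "occupational health": ([40, 42], [70, 12]),
    "air pollution":       ([35, 47], [65, 17]),
    "hazardous waste":     ([30, 52], [60, 22]),
    "wastewater":          ([45, 37], [68, 14]),
}

alpha, m = 0.05, len(categories)
p_values = {}
for name, (baseline, post) in categories.items():
    _, p = fisher_exact([baseline, post])   # 2x2 table, two-sided
    p_values[name] = p
    # Bonferroni adjustment: compare each p-value with alpha / m.
    verdict = "significant" if p < alpha / m else "not significant"
    print(f"{name}: p = {p:.2e} -> {verdict} at adjusted alpha = {alpha / m:.4f}")
```

With the invented counts, every category clears even the adjusted threshold, mirroring the pattern the abstract reports.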

  10. Feature-Based Statistical Analysis of Combustion Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, J; Krishnamoorthy, V; Liu, S

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.

  11. HealthStyles: a new psychographic segmentation system for health care marketers.

    PubMed

    Endresen, K W; Wintz, J C

    1988-01-01

    HealthStyles is a new psychographic segmentation system specifically designed for the health care industry. This segmentation system goes beyond traditional geographic and demographic analysis and examines health-related consumer attitudes and behaviors. Four statistically distinct "styles" of consumer health care preferences have been identified. The profiles of the four groups have substantial marketing implications in terms of design and promotion of products and services. Each segment of consumers also has differing expectations of physician behavior.

  12. A statistical shape modelling framework to extract 3D shape biomarkers from medical imaging data: assessing arch morphology of repaired coarctation of the aorta.

    PubMed

    Bruse, Jan L; McLeod, Kristin; Biglino, Giovanni; Ntsinjana, Hopewell N; Capelli, Claudio; Hsia, Tain-Yen; Sermesant, Maxime; Pennec, Xavier; Taylor, Andrew M; Schievano, Silvia

    2016-05-31

    Medical image analysis in clinical practice is commonly carried out on 2D image data, without fully exploiting the detailed 3D anatomical information that is provided by modern non-invasive medical imaging techniques. In this paper, a statistical shape analysis method is presented, which enables the extraction of 3D anatomical shape features from cardiovascular magnetic resonance (CMR) image data, with no need for manual landmarking. The method was applied to repaired aortic coarctation arches that present complex shapes, with the aim of capturing shape features as biomarkers of potential functional relevance. The method is presented from the user-perspective and is evaluated by comparing results with traditional morphometric measurements. Steps required to set up the statistical shape modelling analyses, from pre-processing of the CMR images to parameter setting and strategies to account for size differences and outliers, are described in detail. The anatomical mean shape of 20 aortic arches post-aortic coarctation repair (CoA) was computed based on surface models reconstructed from CMR data. By analysing transformations that deform the mean shape towards each of the individual patient's anatomy, shape patterns related to differences in body surface area (BSA) and ejection fraction (EF) were extracted. The resulting shape vectors, describing shape features in 3D, were compared with traditionally measured 2D and 3D morphometric parameters. The computed 3D mean shape was close to population mean values of geometric shape descriptors and visually integrated characteristic shape features associated with our population of CoA shapes. After removing size effects due to differences in body surface area (BSA) between patients, distinct 3D shape features of the aortic arch correlated significantly with EF (r = 0.521, p = .022) and were well in agreement with trends as shown by traditional shape descriptors. 
The suggested method has the potential to discover previously unknown 3D shape biomarkers from medical imaging data. Thus, it could contribute to improving diagnosis and risk stratification in complex cardiac disease.

  13. A comparison of two microscale laboratory reporting methods in a secondary chemistry classroom

    NASA Astrophysics Data System (ADS)

    Martinez, Lance Michael

    This study attempted to determine whether there was a difference between the laboratory achievement of students who used a modified reporting method and those who used traditional laboratory reporting. The study also determined the relationships between laboratory performance scores and the independent variables: score on the Group Assessment of Logical Thinking (GALT) test, chronological age in months, gender, and ethnicity, for each of the treatment groups. The study was conducted with 113 high school students enrolled in first-year general chemistry classes at Pueblo South High School in Colorado. The research design was the quasi-experimental Nonequivalent Control Group Design. The statistical treatment consisted of multiple regression analysis and analysis of covariance. Based on the GALT, students in the two groups were generally in the concrete and transitional stages of the Piagetian cognitive levels. The findings revealed that the traditional and modified methods of laboratory reporting did not have any effect on the laboratory performance outcomes of the subjects. However, the students who used the traditional method of reporting showed higher laboratory performance scores when evaluation was conducted using the New Standards rubric recommended by the state. Multiple regression analysis revealed a significant relationship between the criterion variable, laboratory performance outcome of students who used traditional laboratory reporting, and the composite set of predictor variables. In contrast, there was no significant relationship between the laboratory performance outcome of students who used modified laboratory reporting and the composite set of predictor variables.

  14. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are then compared to derive the transgene copy number or zygosity estimate. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves are generated for both an internal reference gene and the transgene, and the copy number of the transgene is compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models are proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise, and proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. They can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
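The first experimental design described above (an external calibration curve from serially diluted templates, followed by Ct comparison against a control event) can be illustrated with a small sketch. All Ct values are hypothetical; a slope near -3.32 corresponds to roughly 100% amplification efficiency.

```python
import numpy as np

# Hypothetical standard curve: Ct measured on serially diluted template.
log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
ct_standard  = np.array([30.1, 26.8, 23.4, 20.1, 16.7])

# Fit Ct = a * log10(copies) + b by simple linear regression.
a, b = np.polyfit(log10_copies, ct_standard, 1)
efficiency = 10 ** (-1.0 / a) - 1.0
print(f"slope = {a:.2f}, estimated PCR efficiency = {efficiency:.1%}")

def copies_from_ct(ct):
    # Invert the calibration curve to estimate template quantity.
    return 10 ** ((ct - b) / a)

# Compare a putative event against a known single-copy control event:
ct_control, ct_putative = 24.0, 23.0
ratio = copies_from_ct(ct_putative) / copies_from_ct(ct_control)
print(f"estimated copy number relative to 1-copy control = {ratio:.1f}")
```

In practice the abstract's point stands: the regression residuals and confidence intervals around `a` and `b`, not just the point estimate, determine whether a ratio near 2 can be called two copies with confidence.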

  15. Knowledge, Attitude and Practice of General Practitioners toward Complementary and Alternative Medicine: a Cross-Sectional Study.

    PubMed

    Barikani, Ameneh; Beheshti, Akram; Javadi, Maryam; Yasi, Marzieh

    2015-08-01

    Orientation of the public and physicians toward complementary and alternative medicine (CAM) is one of the most prominent signs of structural change in the health service system. The aim of this study was to determine the knowledge, attitude, and practice of general practitioners regarding complementary and alternative medicine. This cross-sectional study was conducted in Qazvin, Iran in 2013. A self-administered questionnaire was used to collect data in four parts: demographic information, physicians' attitude and knowledge, methods of obtaining information, and practice. A total of 228 physicians in Qazvin comprised the study population, according to the report of the deputy of treatment of Qazvin University of Medical Sciences. Of these, 150 physicians were selected randomly, and the SPSS statistical program was used to enter the questionnaire data. Results were analyzed with descriptive statistics and statistical tests. Sixty percent of all responders were male, and 59.4 percent of participating practitioners had worked less than 10 years; 96.4 percent had a positive attitude towards complementary and alternative medicine. Knowledge of traditional medicine was good in 11 percent of practitioners, while 36.3% had average and 52.7% had little knowledge. Complementary and alternative medicine was offered to patients for treatment by 17.9% of practitioners. Although practitioners had little knowledge of traditional medicine and complementary approaches, a significant percentage of them held a positive attitude.

  16. Reframing Serial Murder Within Empirical Research.

    PubMed

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  17. Evaluating the decision accuracy and speed of clinical data visualizations.

    PubMed

    Pieczkiewicz, David S; Finkelstein, Stanley M

    2010-01-01

    Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available free and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.

  18. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    ERIC Educational Resources Information Center

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…

  19. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
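The central role of loss functions in Bayesian point estimation can be illustrated numerically: under squared-error loss the Bayes estimator is the posterior mean, and under absolute-error loss it is the posterior median. The sketch below verifies this on simulated posterior draws; the Beta posterior is an arbitrary stand-in, not an analysis from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior draws for an ecological parameter
# (e.g., occupancy probability), skewed so mean and median differ.
posterior = rng.beta(2, 5, size=10_000)

# Bayes decision: choose the action minimizing posterior expected loss.
candidates = np.linspace(0.0, 1.0, 501)
sq_risk  = [np.mean((posterior - a) ** 2) for a in candidates]   # squared-error loss
abs_risk = [np.mean(np.abs(posterior - a)) for a in candidates]  # absolute-error loss

best_sq  = candidates[int(np.argmin(sq_risk))]
best_abs = candidates[int(np.argmin(abs_risk))]

print(f"argmin squared loss  = {best_sq:.3f} (posterior mean   = {posterior.mean():.3f})")
print(f"argmin absolute loss = {best_abs:.3f} (posterior median = {np.median(posterior):.3f})")
```

The same machinery extends to applied decisions such as the fire-rotation example: replace the grid of candidate estimates with candidate management actions and the loss with the management cost of each action given each parameter value.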

  20. Predictive validity of the UK clinical aptitude test in the final years of medical school: a prospective cohort study.

    PubMed

    Husbands, Adrian; Mathieson, Alistair; Dowell, Jonathan; Cleland, Jennifer; MacKenzie, Rhoda

    2014-04-23

    The UK Clinical Aptitude Test (UKCAT) was designed to address issues identified with traditional methods of selection. This study aims to examine the predictive validity of the UKCAT and compare this to traditional selection methods in the senior years of medical school. This was a follow-up study of two cohorts of students from two medical schools who had previously taken part in a study examining the predictive validity of the UKCAT in first year. The sample consisted of 4th and 5th Year students who commenced their studies at the University of Aberdeen or University of Dundee medical schools in 2007. Data collected were: demographics (gender and age group), UKCAT scores; Universities and Colleges Admissions Service (UCAS) form scores; admission interview scores; Year 4 and 5 degree examination scores. Pearson's correlations were used to examine the relationships between admissions variables, examination scores, gender and age group, and to select variables for multiple linear regression analysis to predict examination scores. Ninety-nine and 89 students at Aberdeen medical school from Years 4 and 5 respectively, and 51 Year 4 students in Dundee, were included in the analysis. Neither UCAS form nor interview scores were statistically significant predictors of examination performance. Conversely, the UKCAT yielded statistically significant validity coefficients between .24 and .36 in four of five assessments investigated. Multiple regression analysis showed the UKCAT made a statistically significant unique contribution to variance in examination performance in the senior years. Results suggest the UKCAT appears to predict performance better in the later years of medical school compared to earlier years and provides modest supportive evidence for the UKCAT's role in student selection within these institutions. 
Further research is needed to assess the predictive validity of the UKCAT against professional and behavioural outcomes as the cohort commences working life.

  2. Nuclear magnetic resonance and chemometrics to assess geographical origin and quality of traditional food products.

    PubMed

    Consonni, R; Cagliani, L R

    2010-01-01

    In this era of globalization, the opening of markets has put a wide variety of foods at almost everybody's disposal, allowing people to taste flavors and aromas from different nations. Notwithstanding this opportunity, countries try to preserve their markets by developing protection policies. A few countries have adopted denominations to label their "typical food" products in order to give them additional value. Moreover, a "typical food" is widely understood as something anchored in local traditions, with geographical meaning, and made from typical raw materials; a "typical food" comes to be considered "traditional" when it is made following specific, long-established recipes. As a matter of fact, these products acquire particular organoleptic characteristics that are not reproducible when produced elsewhere. In this review, NMR studies coupled with multivariate statistical analysis are presented with the aim of determining geographical origin and key quality characteristics. Copyright © 2010 Elsevier Inc. All rights reserved.

  3. Disaster response team FAST skills training with a portable ultrasound simulator compared to traditional training: pilot study.

    PubMed

    Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J

    2015-03-01

    Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. 
For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between training groups. This pilot study of a deployed mixed-provider disaster response team suggests that a novel portable US simulator may provide equivalent skills training in comparison to traditional live instructor and model training. Further studies with a larger sample size and other measures of short- and long-term clinical performance are warranted.
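The statistical treatment described above (paired t-tests for pre/post knowledge within groups, ANOVA for skills scores between groups) can be sketched with simulated scores. All numbers below are hypothetical, chosen only to show the two tests side by side.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical pre/post knowledge scores (percent correct) for one
# training group of 12 participants, with a simulated training effect.
pre  = rng.normal(60, 8, size=12).round()
post = pre + rng.normal(12, 5, size=12).round()

# Within-group change: paired-samples t-test.
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"paired t = {t_paired:.2f}, p = {p_paired:.4f}")

# Between-group skills comparison: one-way ANOVA across three
# training groups with similar simulated means (no true difference).
group_a = rng.normal(75, 10, size=12)
group_b = rng.normal(76, 10, size=12)
group_c = rng.normal(74, 10, size=12)
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA F = {f_stat:.2f}, p = {p_anova:.4f}")
```

This mirrors the study's pattern: a clear within-group improvement alongside no detectable between-group difference.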

  4. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    PubMed

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance for the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products: quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million, equivalent to an acceptance probability of >99.99%.
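The closing numbers can be checked directly: for a centered, normally distributed process, the acceptance probability is Pa = 1 - 2Φ(-3·Cpk), so Cpk = 4/3 gives about 63 defects per million. A minimal sketch:

```python
from scipy.stats import norm

def acceptance_probability(cpk):
    # Probability a unit falls inside the specification limits for a
    # centered normal process: 1 - 2 * Phi(-3 * Cpk).
    return 1.0 - 2.0 * norm.cdf(-3.0 * cpk)

for cpk in (1.0, 4.0 / 3.0):
    pa = acceptance_probability(cpk)
    print(f"Cpk = {cpk:.2f}: Pa = {pa:.6f}, "
          f"defects = {(1.0 - pa) * 1e6:.0f} per million")
```

Cpk = 1.0 corresponds to the ±3σ case (about 2,700 defects per million), while Cpk = 4/3 places the limits at ±4σ, recovering the ~63 per million figure cited above.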

  5. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One difficulty with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and to express significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of variables and their pre-processing can greatly influence the outcome, several different methods should be employed in order to obtain a more complete picture of the information contained in the data. Here, we report on the analysis of glass samples using refractive index measurements and quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficiently general population databases containing the needed physicochemical measurements, and the potential for confusion arising from statistical analysis, currently hamper this approach; ways of overcoming these obstacles are presented.
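A minimal sketch of the clustering step described above, using hierarchical (Ward) clustering on standardized measurements; the fragment data are simulated from two hypothetical sources, not real casework values, and only a subset of the listed variables is included.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(3)

# Hypothetical glass fragments: refractive index plus elemental
# concentrations (Mg, Al, Ca, Fe; arbitrary units), two true sources.
source_a = np.column_stack([
    rng.normal(1.5181, 0.0001, 8),   # refractive index
    rng.normal(3600, 60, 8),         # Mg
    rng.normal(900, 40, 8),          # Al
    rng.normal(8.6e4, 900, 8),       # Ca
    rng.normal(450, 30, 8),          # Fe
])
source_b = source_a + [0.0008, 400, 150, 3000, 90]   # shifted second source
fragments = np.vstack([source_a, source_b])

# Standardize each variable (so RI and ppm scales are comparable),
# then cluster on Euclidean distance with Ward linkage.
z = zscore(fragments, axis=0)
labels = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print(labels)   # fragments 1-8 and 9-16 should fall into separate clusters
```

Standardization matters here: without it, the Ca concentration (tens of thousands) would dominate the distance and the refractive index would contribute nothing.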

  6. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
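One of the biomedical applications of Bayes' theorem alluded to above is the positive predictive value of a diagnostic test, which depends strongly on prevalence; this is a classic student misconception. The test characteristics below are illustrative, not drawn from the course.

```python
# Bayes' theorem for a diagnostic test: P(disease | positive result).
def ppv(sensitivity, specificity, prevalence):
    true_pos  = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 95%-sensitive, 95%-specific test for a 1-in-1000 condition:
print(f"PPV = {ppv(0.95, 0.95, 0.001):.3f}")   # about 0.019
```

Even with a seemingly accurate test, most positives are false when the condition is rare; the same function shows PPV climbing above 90% once prevalence reaches a few percent.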

  7. A consistent framework for Horton regression statistics that leads to a modified Hack's law

    USGS Publications Warehouse

    Furey, P.R.; Troutman, B.M.

    2008-01-01

    A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for the Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order Ω. Data show that Ω plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.
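
The regression underlying Horton plots and Hack's law (L = c·A^h) is an ordinary least-squares fit in log-log space; a minimal sketch with hypothetical basin data (not the paper's data or its generalized model):

```python
# Hack's law exponent by OLS in log-log space: ln L = ln c + h * ln A.
# Basin areas and lengths below are hypothetical.
import math

def ols(x, y):
    """Return intercept and slope of the least-squares line y = a + b x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical basins: drainage area A (km^2), mainstream length L (km)
A = [2.0, 10.0, 55.0, 320.0, 1800.0]
L = [2.1, 5.3, 14.0, 38.0, 95.0]
intercept, h = ols([math.log(a) for a in A], [math.log(v) for v in L])
print(round(h, 2))  # → 0.56; Hack exponents are classically near 0.6
```

The modified Hack's law in the abstract adds Strahler order as a second predictor, i.e. a multiple regression rather than this single-predictor fit.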

  8. A Statistical Analysis of Activity-Based and Traditional Introductory Algebra Physics Using the Force and Motion Conceptual Evaluation

    NASA Astrophysics Data System (ADS)

    Trecia Markes, Cecelia

    2006-03-01

    With a three-year FIPSE grant, it has been possible at the University of Nebraska at Kearney (UNK) to develop and implement activity-based introductory physics at the algebra level. It has generally been recognized that students enter physics classes with misconceptions about motion and force. Many of these misconceptions persist after instruction. Pretest and posttest responses on the "Force and Motion Conceptual Evaluation" (FMCE) are analyzed to determine the effectiveness of the activity-based method of instruction relative to the traditional (lecture/lab) method of instruction. Data were analyzed to determine the following: student understanding at the beginning of the course, student understanding at the end of the course, how student understanding is related to the type of class taken, and student understanding based on gender and type of class. Some of the tests used are the t-test, the chi-squared test, and analysis of variance. The results of these tests will be presented, and their implications will be discussed.

  9. Data Mining: Going beyond Traditional Statistics

    ERIC Educational Resources Information Center

    Zhao, Chun-Mei; Luan, Jing

    2006-01-01

    The authors provide an overview of data mining, giving special attention to the relationship between data mining and statistics to unravel some misunderstandings about the two techniques. (Contains 1 figure.)

  10. [Prudent use price controls in Chinese medicines market: based on statistical data analysis].

    PubMed

    Yang, Guang; Wang, Nuo; Huang, Lu-Qi; Qiu, Hong-Yan; Guo, Lan-Ping

    2014-01-01

    A dispute about falling prices for traditional Chinese medicine (TCM) has recently arisen. This article analyzes Chinese statistical data from 1995-2011; the results show that expensive health care is not primarily attributable to drug prices. The price index of TCM rose significantly more slowly than medicine prices overall; squeezed by rising raw material prices, the profit margins of TCM production have been diminishing since 1995, and continued price reductions will further depress the profits of the TCM industry. Considering that raw materials vary greatly in quality and price, forced price decreases will push enterprises to use inferior materials in order to maintain corporate profits. These results offer guidance for medicine price management.

  11. Spatial analysis of alcohol-related motor vehicle crash injuries in southeastern Michigan.

    PubMed

    Meliker, Jaymie R; Maio, Ronald F; Zimmerman, Marc A; Kim, Hyungjin Myra; Smith, Sarah C; Wilson, Mark L

    2004-11-01

    Temporal, behavioral and social risk factors that affect injuries resulting from alcohol-related motor vehicle crashes have been characterized in previous research. Much less is known about spatial patterns and environmental associations of alcohol-related motor vehicle crashes. The aim of this study was to evaluate geographic patterns of alcohol-related motor vehicle crashes and to determine if locations of alcohol outlets are associated with those crashes. In addition, we sought to demonstrate the value of integrating spatial and traditional statistical techniques in the analysis of this preventable public health risk. The study design was a cross-sectional analysis of individual-level blood alcohol content, traffic report information, census block group data, and alcohol distribution outlets. Besag and Newell's spatial analysis and traditional logistic regression both indicated that areas of low population density had more alcohol-related motor vehicle crashes than expected (P < 0.05). There was no significant association between alcohol outlets and alcohol-related motor vehicle crashes using distance analyses, logistic regression, and Chi-square. Differences in environmental or behavioral factors characteristic of areas of low population density may be responsible for the higher proportion of alcohol-related crashes occurring in these areas.

  12. Escape the Black Hole of Lecturing: Put Collaborative Ranking Tasks on Your Event Horizon

    NASA Astrophysics Data System (ADS)

    Hudgins, D. W.; Prather, E. E.; Grayson, D. J.

    2005-05-01

    At the University of Arizona, we have been developing and testing a new type of introductory astronomy curriculum material called Ranking Tasks. Ranking Tasks are a form of conceptual exercise that presents students with four to six physical situations, usually as pictures or diagrams, and asks students to rank order the situations based on some resulting effect. Our study developed design guidelines for Ranking Tasks based on learning theory and classroom pilot studies. Our research questions were: Do in-class collaborative Ranking Task exercises result in student conceptual gains when used in conjunction with traditional lecture-based instruction? And are these gains sufficient to justify implementing them into the astronomy classroom? We conducted a single-group repeated measures experiment across eight core introductory astronomy topics with 250 students at the University of Arizona in the Fall of 2004. The study found that traditional lecture-based instruction alone produced statistically significant gains - raising test scores to 61% post-lecture from 32% on the pretest. While significant, we find these gains to be unsatisfactory from a teaching and learning perspective. The study data show that adding a collaborative learning component to the class structured around Ranking Task exercises helped students achieve statistically significant gains - with post-Ranking Task scores over the eight astronomy topics rising to 77%. Interestingly, we found that the normalized gain from the Ranking Tasks was equal to the entire previous gain from traditional instruction. Further analysis of the data revealed that Ranking Tasks equally benefited both genders; they also equally benefited the high- and low-scoring groups formed by a median split on pretest scores. Based on these results, we conclude that adding collaborative Ranking Task exercises to traditional lecture-based instruction can significantly improve student conceptual understanding of core topics in astronomy.
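
Assuming the standard Hake definition of normalized gain, g = (post − pre)/(100 − pre), the "equal gain" claim above can be checked directly from the reported class-average percentages:

```python
# Normalized (Hake) gain: the fraction of the available improvement achieved.
def normalized_gain(pre, post):
    return (post - pre) / (100.0 - pre)

g_lecture = normalized_gain(32, 61)   # traditional lecture alone
g_ranking = normalized_gain(61, 77)   # Ranking Tasks added afterward
print(round(g_lecture, 2), round(g_ranking, 2))  # → 0.43 0.41
```

The two gains (≈0.43 and ≈0.41) are indeed nearly equal, consistent with the abstract's statement.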

  13. Pooling sexes when assessing ground reaction forces during walking: Statistical Parametric Mapping versus traditional approach.

    PubMed

    Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo

    2015-07-16

    Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. The traditional approach found statistically significant differences for the first GRF peak but similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that with the traditional approach one may choose arbitrary metrics and draw arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
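
The methodological contrast drawn above can be sketched with synthetic waveforms: the traditional approach tests one discrete value per curve, while an SPM-style analysis computes a test statistic at every sample of the waveform. This is an illustration only (the real SPM method also sets a random-field-theory threshold, which is omitted here), and the curves are synthetic, not the study's data.

```python
# Pointwise two-sample t statistics across a synthetic "GRF" waveform,
# versus a single t-test on the waveform peak. Data are synthetic.
import math, random

random.seed(1)

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / (sp * math.sqrt(1 / na + 1 / nb))

def noisy(base):
    return [v + random.gauss(0, 0.3) for v in base]

# Two groups whose mean curves differ only in early stance (indices 1-4)
grp_m = [noisy([0, 2, 5, 8, 10, 9, 8, 9, 10, 6, 0]) for _ in range(8)]
grp_w = [noisy([0, 3, 7, 10, 11, 9, 8, 9, 10, 6, 0]) for _ in range(8)]

# SPM-style: a t statistic at every time point
t_curve = [two_sample_t([c[k] for c in grp_w], [c[k] for c in grp_m])
           for k in range(11)]
# traditional: compare a single discrete value (here, the waveform peak)
t_peak = two_sample_t([max(c) for c in grp_w], [max(c) for c in grp_m])
print([round(t, 1) for t in t_curve])
print(round(t_peak, 1))
```

The pointwise curve localizes *where* in stance the groups differ, which a single peak comparison cannot.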

  14. Driving factors of interactions between the exchange rate market and the commodity market: A wavelet-based complex network perspective

    NASA Astrophysics Data System (ADS)

    Wen, Shaobo; An, Haizhong; Chen, Zhihua; Liu, Xueyong

    2017-08-01

    In traditional econometrics, a time series must be a stationary sequence. However, real series usually show time-varying fluctuations, and it remains a challenge to execute a multiscale analysis of the data and discover the topological characteristics of conduction at different scales. Wavelet analysis and complex networks from statistical physics have special advantages in solving these problems. We select the exchange rate variable from the Chinese market and the commodity price index variable from the world market as the time series of our study. We explore the driving factors behind the behavior of the two markets and their topological characteristics in three steps. First, we use the Kalman filter to find the optimal estimation of the relationship between the two markets. Second, wavelet analysis is used to extract the scales of the relationship that are driven by different frequency wavelets. Meanwhile, we search for the actual economic variables corresponding to different frequency wavelets. Finally, a complex network is used to search for the transfer characteristics of the combination of states driven by different frequency wavelets. The results show that statistical physics has a unique advantage over traditional econometrics. The Chinese market has time-varying impacts on the world market: it has greater influence when the world economy is stable and less influence in times of turmoil. The process of forming the state combination is random. Transitions between state combinations have a clustering feature. Based on these characteristics, we can effectively reduce the information burden on investors and respond correctly to the government's policy mix.
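
The multiscale decomposition step can be illustrated with the simplest wavelet, the Haar transform; the series below is hypothetical, and the paper's actual pipeline (Kalman filtering plus continuous wavelets) is considerably more involved:

```python
# One level of the orthonormal Haar wavelet transform: pairwise scaled
# sums give the low-frequency trend, pairwise scaled differences give the
# high-frequency detail. The input series is hypothetical.
import math

def haar_step(x):
    """Split an even-length series into approximation and detail halves."""
    s = 1 / math.sqrt(2)
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

# A slow trend plus a fast alternation of -2 per pair, as a toy stand-in
# for a rate series with both low- and high-frequency drivers
series = [10, 12, 11, 13, 20, 22, 21, 23]
approx, detail = haar_step(series)
print(approx)  # low-frequency content (slow trend)
print(detail)  # high-frequency content; each coefficient is -2/sqrt(2)
```

Applying `haar_step` recursively to `approx` yields the coarser scales, which is the sense in which the analysis above separates drivers by frequency.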

  15. Profiling and analysis of multiple constituents in Baizhu Shaoyao San before and after processing by stir-frying using UHPLC/Q-TOF-MS/MS coupled with multivariate statistical analysis.

    PubMed

    Xu, Yangyang; Cai, Hao; Cao, Gang; Duan, Yu; Pei, Ke; Tu, Sicong; Zhou, Jia; Xie, Li; Sun, Dongdong; Zhao, Jiayu; Liu, Jing; Wang, Xiaoqi; Shen, Lin

    2018-04-15

    Baizhu Shaoyao San (BSS) is a famous traditional Chinese medicinal formula widely used for the treatment of painful diarrhea, intestinal inflammation, and diarrhea-predominant irritable bowel syndrome. According to clinical medication practice, three medicinal herbs (Atractylodis Macrocephalae Rhizoma, Paeoniae Radix Alba, and Citri Reticulatae Pericarpium) included in BSS must be processed using specific methods of stir-frying. On the basis of the classical theories of traditional Chinese medicine, the therapeutic effects of BSS are significantly enhanced after processing. Generally, the changes in curative effects mainly result from variations in the underlying chemical basis caused by the processing procedure. To identify the corresponding changes of chemical compositions in BSS after processing and to elucidate the material basis of the changed curative effects, an optimized ultra-high-performance liquid chromatography-quadrupole/time-of-flight mass spectrometry method, run in positive and negative ion modes and coupled with multivariate statistical analyses, was developed. As a result, a total of 186 compounds were ultimately identified in crude and processed BSS, among which 62 marker compounds with significant differences between crude and processed BSS were found by principal component analysis and t-test. Compared with crude BSS, the contents of 23 compounds were remarkably decreased and the contents of 39 compounds showed a notable increase in processed BSS. The transformation mechanisms of some changed compounds were inferred from the results. Furthermore, compounds with extremely significant differences might strengthen the effects of the whole herbal formula. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. A weighted U-statistic for genetic association analyses of sequencing data.

    PubMed

    Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing

    2014-12-01

    With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
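
A schematic of the weighted-U-statistic idea — not the published WU-SEQ implementation: over all subject pairs, a genotype kernel that up-weights rare variants is combined with a phenotype-similarity kernel. The weighting scheme, kernels, and data below are all illustrative assumptions.

```python
# Illustrative weighted U-statistic: genotype-similar pairs contribute
# with weights that grow as minor allele frequency (MAF) shrinks, times a
# phenotype-similarity term. Toy data; not the WU-SEQ code.
import math
from itertools import combinations

def wu_stat(genotypes, phenotypes, mafs):
    """Average pairwise weighted similarity product over all subject pairs."""
    weights = [1 / math.sqrt(m * (1 - m)) for m in mafs]  # example weighting
    total, npairs = 0.0, 0
    for i, j in combinations(range(len(phenotypes)), 2):
        geno_sim = sum(w * (1 if gi == gj else 0)
                       for w, gi, gj in zip(weights, genotypes[i], genotypes[j]))
        pheno_sim = -abs(phenotypes[i] - phenotypes[j])  # similar -> larger
        total += geno_sim * pheno_sim
        npairs += 1
    return total / npairs

# Toy data: 4 subjects, 3 variants (minor-allele counts), MAFs per variant
G = [[0, 0, 1], [0, 0, 1], [2, 1, 0], [2, 1, 0]]
y = [1.0, 1.1, 3.0, 3.2]
print(wu_stat(G, y, mafs=[0.01, 0.05, 0.2]))
```

When genotype-similar subjects are also phenotype-similar (as in `G`, `y` above), the statistic is larger than under a mismatched phenotype assignment; the real test calibrates this against its null distribution.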

  17. GPUs for statistical data analysis in HEP: a performance study of GooFit on GPUs vs. RooFit on CPUs

    NASA Astrophysics Data System (ADS)

    Pompili, Alexis; Di Florio, Adriano; CMS Collaboration

    2016-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψφ invariant mass in the three-body decay B+ → J/ψφK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, obtained by comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process using multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations, in which the Wilks theorem may apply or may not apply because its regularity conditions are not satisfied.
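
The Wilks-theorem behaviour mentioned at the end can be illustrated with a toy Monte Carlo in a regular setting, where the null distribution of −2 ln λ approaches a χ² law; this sketch is unrelated to the CMS analysis itself and uses a deliberately simple Gaussian-mean model.

```python
# Toy check of Wilks' theorem: for Gaussian data with known sigma = 1,
# testing H0: mu = 0 against a free mu gives -2 ln(lambda) = n * xbar^2,
# which should follow chi-squared(1) under H0. So the statistic should
# exceed the chi2(1) 95th percentile (3.841) about 5% of the time.
import random

random.seed(7)
n, trials = 50, 4000
exceed = 0
for _ in range(trials):
    xbar = sum(random.gauss(0, 1) for _ in range(n)) / n
    lr = n * xbar * xbar          # -2 ln(lambda) for this simple model
    if lr > 3.841:                # chi2(1) 95th percentile
        exceed += 1
frac = exceed / trials
print(frac)  # close to 0.05 when Wilks' theorem applies
```

Near a physical boundary (as in the CMS case) the regularity conditions fail and this χ² calibration breaks down, which is why such Monte Carlo checks are needed at all.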

  18. Atrial Electrogram Fractionation Distribution before and after Pulmonary Vein Isolation in Human Persistent Atrial Fibrillation-A Retrospective Multivariate Statistical Analysis.

    PubMed

    Almeida, Tiago P; Chu, Gavin S; Li, Xin; Dastagir, Nawshin; Tuan, Jiun H; Stafford, Peter J; Schlindwein, Fernando S; Ng, G André

    2017-01-01

    Purpose: Complex fractionated atrial electrograms (CFAE)-guided ablation after pulmonary vein isolation (PVI) has been used for persistent atrial fibrillation (persAF) therapy. This strategy has shown suboptimal outcomes due to, among other factors, undetected changes in the atrial tissue following PVI. In the present work, we investigate CFAE distribution before and after PVI in patients with persAF using a multivariate statistical model. Methods: 207 pairs of atrial electrograms (AEGs) were collected before and after PVI, respectively, from corresponding LA regions in 18 persAF patients. Twelve attributes were measured from the AEGs before and after PVI. Statistical models based on multivariate analysis of variance (MANOVA) and linear discriminant analysis (LDA) were used to characterize the atrial regions and AEGs. Results: PVI significantly reduced CFAEs in the LA (70 vs. 40%; P < 0.0001). Four types of LA regions were identified, based on the AEG characteristics: (i) fractionated before PVI that remained fractionated after PVI (31% of the collected points); (ii) fractionated that converted to normal (39%); (iii) normal prior to PVI that became fractionated (9%); and (iv) normal that remained normal (21%). Individually, the attributes failed to distinguish these LA regions, but multivariate statistical models were effective in their discrimination (P < 0.0001). Conclusion: Our results have revealed that there are LA regions resistant to PVI, while others are affected by it. Although traditional methods were unable to identify these different regions, the proposed multivariate statistical model discriminated LA regions resistant to PVI from those affected by it without prior ablation information.

  19. A simple method of equine limb force vector analysis and its potential applications.

    PubMed

    Hobbs, Sarah Jane; Robinson, Mark A; Clayton, Hilary M

    2018-01-01

    Ground reaction forces (GRF) measured during equine gait analysis are typically evaluated by analyzing discrete values obtained from continuous force-time data for the vertical, longitudinal and transverse GRF components. This paper describes a simple, temporo-spatial method of displaying and analyzing sagittal plane GRF vectors. In addition, the application of statistical parametric mapping (SPM) is introduced to analyse differences between contra-lateral fore and hindlimb force-time curves throughout the stance phase. The overall aim of the study was to demonstrate alternative methods of evaluating functional (a)symmetry within horses. GRF and kinematic data were collected from 10 horses trotting over a series of four force plates (120 Hz). The kinematic data were used to determine clean hoof contacts. The stance phase of each hoof was determined using a 50 N threshold. Vertical and longitudinal GRF for each stance phase were plotted both as force-time curves and as force vector diagrams in which vectors originating at the centre of pressure on the force plate were drawn at intervals of 8.3 ms for the duration of stance. Visual evaluation was facilitated by overlay of the vector diagrams for different limbs. Summary vectors representing the magnitude (VecMag) and direction (VecAng) of the mean force over the entire stance phase were superimposed on the force vector diagram. Typical measurements extracted from the force-time curves (peak forces, impulses) were compared with VecMag and VecAng using partial correlation (controlling for speed). Paired samples t-tests (left v. right diagonal pair comparison and high v. low vertical force diagonal pair comparison) were performed on discrete and vector variables using traditional methods and Hotelling's T² tests on normalized stance phase data using SPM. Evidence from traditional statistical tests suggested that VecMag is more influenced by the vertical force and impulse, whereas VecAng is more influenced by the longitudinal force and impulse. When used to evaluate mean data from the group of ten sound horses, SPM did not identify differences between the left and right contralateral limb pairs or between limb pairs classified according to directional asymmetry. When evaluating a single horse, three periods were identified during which differences in the forces between the left and right forelimbs exceeded the critical threshold (p < .01). Traditional statistical analysis of 2D GRF peak values, summary vector variables and visual evaluation of force vector diagrams gave harmonious results, and both methods identified the same inter-limb asymmetries. As alpha was more tightly controlled using SPM, significance was only found in the individual horse, although T² plots followed the same trends as the discrete analysis for the group. The techniques of force vector analysis and SPM hold promise for investigations of sidedness and asymmetry in horses.
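
The summary-vector computation lends itself to a direct sketch. The force samples below are hypothetical, and the angle convention (degrees from vertical, positive toward propulsion) is an assumption for illustration, not necessarily the paper's convention.

```python
# Summary vector of the mean sagittal-plane GRF over stance: a magnitude
# (VecMag) and a direction relative to vertical (VecAng). Samples are
# hypothetical, in newtons.
import math

def summary_vector(longitudinal, vertical):
    """Mean-force magnitude and angle (degrees from vertical) over stance."""
    fx = sum(longitudinal) / len(longitudinal)
    fz = sum(vertical) / len(vertical)
    vec_mag = math.hypot(fx, fz)
    vec_ang = math.degrees(math.atan2(fx, fz))  # 0 deg = purely vertical
    return vec_mag, vec_ang

# Hypothetical stance-phase samples: braking then propulsion longitudinally,
# vertical force rising and falling
longitudinal = [-120, -200, -80, 60, 180, 110]
vertical = [900, 2600, 3100, 3000, 2400, 800]
mag, ang = summary_vector(longitudinal, vertical)
print(round(mag), round(ang, 1))  # → 2133 -0.2
```

Because braking and propulsive impulses nearly cancel here, VecAng sits close to vertical while VecMag tracks the vertical loading — consistent with the sensitivity pattern reported above.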

  20. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
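
Of the geometric morphometric tools listed, Procrustes superimposition is compact enough to sketch for 2D landmarks: translation, scale, and rotation are removed so only shape differences remain. The landmark coordinates below are hypothetical, and this is a condensed illustration, not any of the cited implementations.

```python
# Two-configuration Procrustes superimposition in 2D: center, scale to
# unit centroid size, then rotate by the closed-form optimal angle.
import math

def procrustes(ref, target):
    """Align target landmarks to ref; return aligned points and residual."""
    def center_scale(pts):
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        c = [(x - cx, y - cy) for x, y in pts]
        size = math.sqrt(sum(x * x + y * y for x, y in c))  # centroid size
        return [(x / size, y / size) for x, y in c]
    a, b = center_scale(ref), center_scale(target)
    # closed-form optimal rotation angle for 2D point sets
    num = sum(ya * xb - xa * yb for (xa, ya), (xb, yb) in zip(a, b))
    den = sum(xa * xb + ya * yb for (xa, ya), (xb, yb) in zip(a, b))
    th = math.atan2(num, den)
    rot = [(x * math.cos(th) - y * math.sin(th),
            x * math.sin(th) + y * math.cos(th)) for x, y in b]
    resid = math.sqrt(sum((xa - xb) ** 2 + (ya - yb) ** 2
                          for (xa, ya), (xb, yb) in zip(a, rot)))
    return rot, resid

# The same triangle translated, doubled in size, and rotated 30 degrees
# has (near-)zero Procrustes residual: identical shape.
ref = [(0, 0), (4, 0), (2, 3)]
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
target = [(2 * (c * x - s * y) + 5, 2 * (s * x + c * y) - 1) for x, y in ref]
_, resid = procrustes(ref, target)
print(resid)  # ≈ 0
```

A nonzero residual after superimposition is pure shape difference — the information that conventional 'size-based' cephalometric measurements cannot isolate.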

  1. Full information acquisition in scanning probe microscopy and spectroscopy

    DOEpatents

    Jesse, Stephen; Belianinov, Alex; Kalinin, Sergei V.; Somnath, Suhas

    2017-04-04

    Apparatus and methods are described for scanning probe microscopy and spectroscopy based on acquisition of full probe response. The full probe response contains valuable information about the probe-sample interaction that is lost in traditional scanning probe microscopy and spectroscopy methods. The full probe response is analyzed post data acquisition using fast Fourier transform and adaptive filtering, as well as multivariate analysis. The full response data is further compressed to retain only statistically significant components before being permanently stored.

  2. A Comparison of the Achievement of Statistics Students Enrolled in Online and Face-to-Face Settings

    ERIC Educational Resources Information Center

    Christmann, Edwin P.

    2017-01-01

    This study compared the achievement of male and female students who were enrolled in an online univariate statistics course to students enrolled in a traditional face-to-face univariate statistics course. The subjects, 47 graduate students enrolled in univariate statistics classes at a public, comprehensive university, were randomly assigned to…

  3. Something old, something new, something borrowed, something blue: a framework for the marriage of health econometrics and cost-effectiveness analysis.

    PubMed

    Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R

    2002-07-01

    Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques. Instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods; hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework - a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the formulation of the cost-effectiveness problem within a standard regression type framework. We provide an example with empirical data to illustrate how a regression type framework can enhance the net-benefit method. We go on to suggest that practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention. Copyright 2002 John Wiley & Sons, Ltd.
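
The net-benefit reformulation is easy to sketch: at willingness-to-pay λ, each patient's net benefit is nb = λ·effect − cost, and regressing nb on a treatment indicator yields the incremental net benefit without ever forming a ratio. The numbers below are toy values, not the paper's empirical example.

```python
# Net-benefit regression with a 0/1 treatment indicator; the OLS slope on
# that indicator equals the between-group mean difference in net benefit,
# i.e. the incremental net benefit. Toy data.
def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)

lam = 50000  # willingness to pay per unit of effect (e.g., per QALY)
treat = [1, 1, 1, 0, 0, 0]
cost = [12000, 15000, 13000, 8000, 9000, 8500]
effect = [0.70, 0.80, 0.75, 0.50, 0.55, 0.52]
nb = [lam * e - c for e, c in zip(effect, cost)]
inb = ols_slope(treat, nb)
print(round(inb, 2))  # → 6500.0; positive favours treatment at this lambda
```

The practical advantages cited in the abstract follow from this form: covariates for imperfect randomisation or subgroup terms can simply be added as extra regressors.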

  4. Epidemiology Characteristics, Methodological Assessment and Reporting of Statistical Analysis of Network Meta-Analyses in the Field of Cancer

    PubMed Central

    Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu

    2016-01-01

    Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analysis. Our study aims to investigate the epidemiology characteristics, conduct of literature searches, methodological quality and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework. Of these, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). 43 NMAs used adjusted indirect comparisons; the methods used were described in 53.49% of these NMAs. Only 4.65% of NMAs described the details of the handling of multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997

  5. Comparative evaluation of traditional and self-priming hydrophilic resin

    PubMed Central

    Singla, Ruchi; Bogra, Poonam; Singal, Bhawana

    2012-01-01

    Background: The purpose of this study was to compare the microleakage of a traditional composite (Charisma/Gluma Comfort Bond) and a self-priming resin (Embrace Wetbond). Materials and Methods: Standardized Class V cavities partly in enamel and cementum were prepared in 20 extracted human premolars. Teeth were divided into two groups. Group 1 was restored with Charisma/Gluma Comfort Bond and Group 2 with Embrace Wetbond. The specimens were stored in distilled water at room temperature for 24 h and then subjected to 200 thermocycles at 5°C and 55°C with a 1 min dwell time. After thermocycling, the teeth were immersed in a 0.2% solution of methylene blue dye for 24 h. Teeth were sectioned vertically approximately midway through the facial and lingual surfaces using a diamond saw blade. Microleakage was evaluated at enamel and cementum surfaces using a 10× stereomicroscope. The statistical analysis was performed using the Wilcoxon signed-rank test. Results: Wetbond showed less microleakage at occlusal and gingival margins as compared with Charisma/Gluma Comfort Bond, and the results were statistically significant (P < 0.05). Conclusion: Class V cavities restored with Embrace Wetbond, which requires fewer steps and fewer materials, offer greater protection against microleakage at the tooth-restorative interface. PMID:22876008

  6. A powerful score-based test statistic for detecting gene-gene co-association.

    PubMed

    Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun

    2016-01-29

    The genetic variants identified by genome-wide association studies (GWAS) can only account for a small proportion of the total heritability for complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one of the possible explanations for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, not only due to the traditional interaction under nearly independent conditions but also due to the correlation between genes. Generally, genes tend to work collaboratively within a specific pathway or network contributing to the disease, and the specific disease-associated loci will often be highly correlated (e.g. single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we proposed a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than other existing methods, including single SNP-based and principal component analysis (PCA)-based logistic regression models, the statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ²) statistic. The real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.

  7. A method for analyzing temporal patterns of variability of a time series from Poincare plots.

    PubMed

    Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E

    2012-07-01

    The Poincaré plot is a popular two-dimensional time-series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally and more consistently than the conventional measures, including to Poincaré plots with multiple clusters, and can address questions regarding potential structure underlying the variability of a data set.
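
The multi-delay construction at the heart of the approach starts from lagged pairs: a Poincaré plot at lag τ is the set of points (x[n], x[n+τ]), and repeating it across several lags probes dynamics beyond the usual lag-1 plot. A minimal sketch with a hypothetical R-R series (the TPV quantification itself is not reproduced here):

```python
# Lagged-pair construction for Poincare plots at multiple delays.
# The R-R-interval-like series (in ms) is hypothetical.
def poincare_points(x, tau):
    """Return (x[n], x[n+tau]) pairs for a given delay tau."""
    return [(x[i], x[i + tau]) for i in range(len(x) - tau)]

rr = [800, 810, 790, 805, 795, 815, 800, 790]
for tau in (1, 2, 3):
    pts = poincare_points(rr, tau)
    print(tau, len(pts), pts[0])
```

Each delay yields one point cloud; TPV then quantifies how the temporal ordering of points within these clouds evolves, rather than only their cumulative spread.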

  8. Statistical Literacy Social Media Project for the Masses

    ERIC Educational Resources Information Center

    Gundlach, Ellen; Maybee, Clarence; O'Shea, Kevin

    2015-01-01

    This article examines a social media assignment used to teach and practice statistical literacy with over 400 students each semester in large-lecture traditional, fully online, and flipped sections of an introductory-level statistics course. Following the social media assignment, students completed a survey on how they approached the assignment.…

  9. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  10. Statistical Significance Testing in Second Language Research: Basic Problems and Suggestions for Reform

    ERIC Educational Resources Information Center

    Norris, John M.

    2015-01-01

    Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…

  11. Hyperspectral imaging coupled with chemometric analysis for non-invasive differentiation of black pens

    NASA Astrophysics Data System (ADS)

    Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna

    2016-11-01

    Differentiation of the written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet was recorded. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups, in a non-invasive manner.

  12. Methodological considerations, such as directed acyclic graphs, for studying "acute on chronic" disease epidemiology: chronic obstructive pulmonary disease example.

    PubMed

    Tsai, Chu-Lin; Camargo, Carlos A

    2009-09-01

    Acute exacerbations of chronic disease are ubiquitous in clinical medicine, and thus far, there has been a paucity of integrated methodological discussion on this phenomenon. We use acute exacerbations of chronic obstructive pulmonary disease as an example to emphasize key epidemiological and statistical issues for this understudied field in clinical epidemiology. Directed acyclic graphs are a useful epidemiological tool to explain the differential effects of risk factor on health outcomes in studies of acute and chronic phases of disease. To study the pathogenesis of acute exacerbations of chronic disease, case-crossover design and time-series analysis are well-suited study designs to differentiate acute and chronic effect. Modeling changes over time and setting appropriate thresholds are important steps to separate acute from chronic phases of disease in serial measurements. In statistical analysis, acute exacerbations are recurrent events, and some individuals are more prone to recurrences than others. Therefore, appropriate statistical modeling should take into account intraindividual dependence. Finally, we recommend the use of "event-based" number needed to treat (NNT) to prevent a single exacerbation instead of traditional patient-based NNT. Addressing these methodological challenges will advance research quality in acute on chronic disease epidemiology.
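
    The distinction drawn above between patient-based and event-based NNT can be illustrated with a small calculation on invented trial counts (all numbers hypothetical):

```python
# Hypothetical trial: 100 patients per arm followed for one year.
# Exacerbations are recurrent, so we count events as well as patients.
control = {"patients": 100, "patients_with_event": 40, "events": 70}
treated = {"patients": 100, "patients_with_event": 30, "events": 45}

# Traditional patient-based NNT: 1 / absolute risk reduction.
arr = (control["patients_with_event"] / control["patients"]
       - treated["patients_with_event"] / treated["patients"])
nnt_patient = 1 / arr

# Event-based NNT: 1 / reduction in events per patient-year.
erd = (control["events"] / control["patients"]
       - treated["events"] / treated["patients"])
nnt_event = 1 / erd

print("patient-based NNT:", round(nnt_patient, 1))  # to keep one patient event-free
print("event-based NNT:  ", round(nnt_event, 1))    # to prevent one exacerbation
```

    With these counts the event-based NNT is smaller, because exacerbations recur within patients and counting events credits the treatment with every prevented recurrence, not just the first.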

  13. The application of latent curve analysis to testing developmental theories in intervention research.

    PubMed

    Curran, P J; Muthén, B O

    1999-08-01

    The effectiveness of a prevention or intervention program has traditionally been assessed using time-specific comparisons of mean levels between the treatment and the control groups. However, many times the behavior targeted by the intervention is naturally developing over time, and the goal of the treatment is to alter this natural or normative developmental trajectory. Examining time-specific mean levels can be both limiting and potentially misleading when the behavior of interest is developing systematically over time. It is argued here that there are both theoretical and statistical advantages associated with recasting intervention treatment effects in terms of normative and altered developmental trajectories. The recently developed technique of latent curve (LC) analysis is reviewed and extended to a true experimental design setting in which subjects are randomly assigned to a treatment intervention or a control condition. LC models are applied to both artificially generated and real intervention data sets to evaluate the efficacy of an intervention program. Not only do the LC models provide a more comprehensive understanding of the treatment and control group developmental processes compared to more traditional fixed-effects models, but LC models have greater statistical power to detect a given treatment effect. Finally, the LC models are modified to allow for the computation of specific power estimates under a variety of conditions and assumptions that can provide much needed information for the planning and design of more powerful but cost-efficient intervention programs for the future.
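
    The idea of recasting a treatment effect as an altered developmental trajectory can be sketched crudely without latent-variable machinery: simulate individual linear growth for control and treated groups, estimate each subject's slope by ordinary least squares, and compare the group mean slopes (a simplistic stand-in for the random-coefficient structure of an LC model; all values are invented):

```python
import random

random.seed(1)

def ols_slope(ts, ys):
    """Ordinary least-squares slope of y regressed on t."""
    n = len(ts)
    mt = sum(ts) / n
    my = sum(ys) / n
    num = sum((t - mt) * (y - my) for t, y in zip(ts, ys))
    den = sum((t - mt) ** 2 for t in ts)
    return num / den

times = [0, 1, 2, 3]  # four measurement waves

def simulate(n, mean_slope):
    slopes = []
    for _ in range(n):
        b0 = random.gauss(10, 1)            # individual intercept
        b1 = random.gauss(mean_slope, 0.2)  # individual growth rate
        ys = [b0 + b1 * t + random.gauss(0, 0.5) for t in times]
        slopes.append(ols_slope(times, ys))
    return slopes

control = simulate(50, mean_slope=1.0)  # normative trajectory
treated = simulate(50, mean_slope=0.5)  # intervention flattens growth
print("mean slope, control:", round(sum(control) / len(control), 2))
print("mean slope, treated:", round(sum(treated) / len(treated), 2))
```

    A true LC analysis would model the intercept and slope variances jointly and test the group difference with proper standard errors; the point here is only that the target of inference is the trajectory, not any single time-specific mean.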

  14. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
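
    The central role of loss functions can be made concrete numerically: for a discrete posterior (values invented), a grid search recovers the classical results that expected squared-error loss is minimized by the posterior mean and expected absolute-error loss by the posterior median:

```python
# Discrete posterior over a parameter theta (values invented for illustration).
support = [0.0, 1.0, 2.0, 3.0, 10.0]
probs   = [0.1, 0.3, 0.3, 0.2, 0.1]

def expected_loss(estimate, loss):
    return sum(p * loss(estimate, th) for th, p in zip(support, probs))

squared  = lambda a, th: (a - th) ** 2
absolute = lambda a, th: abs(a - th)

# Search a fine grid for the minimizer under each loss.
grid = [i / 100 for i in range(0, 1001)]
best_sq  = min(grid, key=lambda a: expected_loss(a, squared))
best_abs = min(grid, key=lambda a: expected_loss(a, absolute))

post_mean = sum(th * p for th, p in zip(support, probs))
print("squared-loss minimizer:", best_sq, "(posterior mean:", post_mean, ")")
print("absolute-loss minimizer:", best_abs)
```

    The skewed outlier at theta = 10 pulls the mean (and hence the squared-loss estimate) upward while leaving the median untouched, which is exactly why the choice of loss function matters for the resulting decision.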

  15. Current application of chemometrics in traditional Chinese herbal medicine research.

    PubMed

    Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke

    2016-07-15

    Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric methods in the quality control of TCHMs are highly useful tools that harness mathematics, statistics and other methods to extract the maximum information from the data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. The application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Development and evaluation of statistical shape modeling for principal inner organs on torso CT images.

    PubMed

    Zhou, Xiangrong; Xu, Rui; Hara, Takeshi; Hirano, Yasushi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Kido, Shoji; Fujita, Hiroshi

    2014-07-01

    The shapes of the inner organs are important information for medical image analysis. Statistical shape modeling provides a way of quantifying and measuring shape variations of the inner organs in different patients. In this study, we developed a universal scheme that can be used for building the statistical shape models for different inner organs efficiently. This scheme combines the traditional point distribution modeling with a group-wise optimization method based on a measure called minimum description length to provide a practical means for 3D organ shape modeling. In experiments, the proposed scheme was applied to the building of five statistical shape models for hearts, livers, spleens, and right and left kidneys by use of 50 cases of 3D torso CT images. The performance of these models was evaluated by three measures: model compactness, model generalization, and model specificity. The experimental results showed that the constructed shape models have good "compactness" and satisfactory "generalization" performance for different organ shape representations; however, the "specificity" of these models should be improved in the future.

  17. Evaluation of surface characteristics of rotary nickel-titanium instruments produced by different manufacturing methods.

    PubMed

    Inan, U; Gurel, M

    2017-02-01

    Instrument fracture is a serious concern in endodontic practice. The aim of this study was to investigate the surface quality of new and used rotary nickel-titanium (NiTi) instruments manufactured by the traditional grinding process and by twisting methods. A total of 16 instruments from two rotary NiTi systems were used in this study. Eight Twisted Files (TF) (SybronEndo, Orange, CA, USA) and 8 Mtwo (VDW, Munich, Germany) instruments were evaluated. Four experimental groups of new and used instruments were evaluated using atomic force microscopy (AFM). New and used instruments were analyzed at 3 points along a 3-mm section at the tip of the instrument. Quantitative measurements of the topographical deviations were recorded. The data were statistically analyzed with paired samples t-test and independent samples t-test. Mean root mean square (RMS) values for new and used TF 25.06 files were 10.70 ± 2.80 nm and 21.58 ± 6.42 nm, respectively, and the difference between them was statistically significant (P < 0.05). Mean RMS values for new and used Mtwo 25.06 files were 24.16 ± 9.30 nm and 39.15 ± 16.20 nm, respectively; the difference between them was also statistically significant (P < 0.05). According to the AFM analysis, instruments produced by the twisting method (TF 25.06) had better surface quality than the instruments produced by the traditional grinding process (Mtwo 25.06 files).

  18. The five elements and Chinese-American mortality.

    PubMed

    Smith, Gary

    2006-01-01

    D. P. Phillips, T. E. Ruth, and L. M. Wagner (1993) reported that 1969-1990 California mortality data show that Chinese Americans are particularly vulnerable to diseases that Chinese astrology and traditional Chinese medicine associate with their birth years. For example, because fire is associated with the heart, a Chinese person born in a fire year (such as 1937) is more likely to die of heart disease than is a Chinese person born in a nonfire year. However, many diseases were excluded from this study, some diseases that were included have ambiguous links to birth years, and the statistical tests were indirect. A more complete statistical analysis and independent California mortality data for the years 1960-1968 and 1991-2002 did not replicate the original results. Copyright 2006 APA, all rights reserved.

  19. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques were used for forecasting and analysis of air pollution over the Delhi urban area: principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN). The forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e. the pollution maximum and minimum cut-offs cannot be determined with such approaches. With the advancement in technology and research, an alternative to these traditional methods has been proposed: coupling statistical techniques with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic was therefore used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement with observations than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over an urban area.
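
    The performance measures named here (R, NMSE, FB, IOA) have standard definitions; a minimal sketch with invented observed and predicted concentrations computes all four:

```python
import math

def metrics(obs, pred):
    n = len(obs)
    mo = sum(obs) / n
    mp = sum(pred) / n
    # Correlation coefficient R
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    r = cov / (so * sp)
    # Normalized mean square error
    nmse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / (n * mo * mp)
    # Fractional bias
    fb = 2 * (mo - mp) / (mo + mp)
    # Index of agreement (Willmott's d)
    denom = sum((abs(p - mo) + abs(o - mo)) ** 2 for o, p in zip(obs, pred))
    ioa = 1 - sum((o - p) ** 2 for o, p in zip(obs, pred)) / denom
    return r, nmse, fb, ioa

obs  = [42.0, 55.0, 61.0, 48.0, 70.0]  # observed concentrations (illustrative)
pred = [40.0, 58.0, 59.0, 50.0, 66.0]  # model predictions (illustrative)
r, nmse, fb, ioa = metrics(obs, pred)
print("R:", round(r, 3), "NMSE:", round(nmse, 4),
      "FB:", round(fb, 4), "IOA:", round(ioa, 3))
```

    An ideal model has R and IOA near 1 and NMSE and FB near 0, which is how the competing forecasting schemes in studies like this one are ranked.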

  20. Novelty or knowledge? A study of using a student response system in non-major biology courses at a community college

    NASA Astrophysics Data System (ADS)

    Thames, Tasha Herrington

    The advancement in technology integration is laying the groundwork for a paradigm shift in the higher education system (Noonoo, 2011). The National Dropout Prevention Center (n.d.) claims that technology offers some of the best opportunities for presenting instruction to engage students in meaningful education, addressing multiple intelligences, and adjusting to students' various learning styles. The purpose of this study was to investigate whether implementing clicker technology would make a statistically significant difference in student retention and student achievement, while controlling for learning styles, for students in non-major biology courses who were and were not subjected to the technology. This study also sought to identify whether students perceived the use of clickers as beneficial to their learning. A quantitative quasi-experimental research design was utilized to determine the significance of differences in pre/posttest achievement scores between students who participated during the fall semester in 2014. Overall, 118 students (n = 118) voluntarily enrolled in the researcher's fall non-major Biology course at a southern community college. A total of 71 students were assigned to the experimental group, who participated in instruction incorporating the ConcepTest Process with clicker technology along with traditional lecture. The remaining 51 students were assigned to the control group, who participated in a traditional lecture format with peer instruction embedded. Statistical analysis revealed that the experimental clicker courses did have higher posttest scores than the non-clicker control courses, but the difference was not significant (p > .05). Results also implied that clickers did not significantly help retain students to complete the course. Lastly, the results indicated that there was no statistically significant difference in students' clicker perception scores among the different learning style preferences.

  1. Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.

    PubMed

    Xiao, Hao; Sun, Tianyang; Meng, Bo; Cheng, Lihong

    2017-01-01

    The rise of global value chains (GVCs) characterized by so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena in 21st-century trade. GVCs can also play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013), in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs, which is generally invisible in traditional trade statistics.
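
    A very small example of the network-analysis viewpoint: given a hypothetical matrix of value-added flows between economies (all figures invented), node strengths already convey more structure than a single gross-export figure:

```python
# Hypothetical value-added trade flows (billion USD) among four economies.
countries = ["A", "B", "C", "D"]
flows = {
    ("A", "B"): 30, ("A", "C"): 10,
    ("B", "C"): 25, ("B", "D"): 5,
    ("C", "D"): 20, ("D", "A"): 15,
}

def strength(node):
    """Out-strength (value-added exported) and in-strength (imported)."""
    out_s = sum(v for (s, t), v in flows.items() if s == node)
    in_s  = sum(v for (s, t), v in flows.items() if t == node)
    return out_s, in_s

for c in countries:
    out_s, in_s = strength(c)
    print(c, "exports", out_s, "imports", in_s)
```

    Real GVC studies go much further (community detection, betweenness, sub-network visualization) on decomposed value-added flows rather than gross flows, but the underlying object is a weighted directed graph like this one.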

  2. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T² control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. This process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
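
    A Hotelling T² statistic of the kind charted here can be sketched for two process variables in a few lines (synthetic reference data, not the paper's NIR measurements):

```python
# Minimal Hotelling T^2 monitoring sketch for two correlated process
# variables; five synthetic "normal batch" observations serve as reference.
ref = [(1.0, 2.0), (1.2, 2.1), (0.9, 1.9), (1.1, 2.2), (1.0, 2.1)]

n = len(ref)
mean = [sum(x[i] for x in ref) / n for i in range(2)]
# Sample covariance matrix (2x2) and its inverse.
c = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in ref) / (n - 1)
      for j in range(2)] for i in range(2)]
det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
inv = [[c[1][1] / det, -c[0][1] / det],
       [-c[1][0] / det, c[0][0] / det]]

def t2(obs):
    """Hotelling T^2 distance of a new observation from the reference mean."""
    d = [obs[0] - mean[0], obs[1] - mean[1]]
    return (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
            + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))

print("in-control point: ", round(t2((1.05, 2.05)), 2))
print("deviating point:  ", round(t2((1.60, 1.50)), 2))
```

    In MSPC practice the control limit for T² is derived from an F-distribution of the reference batches, and points exceeding it flag an abnormal batch for investigation.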

  3. Comparing statistical and machine learning classifiers: alternatives for predictive modeling in human factors research.

    PubMed

    Carnahan, Brian; Meyer, Gérard; Kuntz, Lois-Ann

    2003-01-01

    Multivariate classification models play an increasingly important role in human factors research. In the past, these models have been based primarily on discriminant analysis and logistic regression. Models developed from machine learning research offer the human factors professional a viable alternative to these traditional statistical classification methods. To illustrate this point, two machine learning approaches--genetic programming and decision tree induction--were used to construct classification models designed to predict whether or not a student truck driver would pass his or her commercial driver license (CDL) examination. The models were developed and validated using the curriculum scores and CDL exam performances of 37 student truck drivers who had completed a 320-hr driver training course. Results indicated that the machine learning classification models were superior to discriminant analysis and logistic regression in terms of predictive accuracy. Actual or potential applications of this research include the creation of models that more accurately predict human performance outcomes.

  4. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address such a problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike the traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  5. The application of data mining techniques to oral cancer prognosis.

    PubMed

    Tseng, Wan-Ting; Chiang, Wei-Fan; Liu, Shyun-Yeu; Roan, Jinsheng; Lin, Chun-Nan

    2015-05-01

    This study adopted an integrated procedure that combines the clustering and classification features of data mining technology to determine the differences between the symptoms shown in past cases where patients died from or survived oral cancer. Two data mining tools, namely decision tree and artificial neural network, were used to analyze the historical cases of oral cancer, and their performance was compared with that of logistic regression, the popular statistical analysis tool. Both decision tree and artificial neural network models were superior to the traditional statistical model. However, for clinicians, the trees created by the decision tree models are easier to interpret than the artificial neural network models. Cluster analysis also revealed that stage 4 patients who possess the following four characteristics have an extremely low survival rate: pN is N2b, level of RLNM is level I-III, AJCC-T is T4, and the cell mutation grade (G) is moderate.

  6. Triacylglycerol Analysis in Human Milk and Other Mammalian Species: Small-Scale Sample Preparation, Characterization, and Statistical Classification Using HPLC-ELSD Profiles.

    PubMed

    Ten-Doménech, Isabel; Beltrán-Iturat, Eduardo; Herrero-Martínez, José Manuel; Sancho-Llopis, Juan Vicente; Simó-Alfonso, Ernesto Francisco

    2015-06-24

    In this work, a method for the separation of triacylglycerols (TAGs) present in human milk and from other mammalian species by reversed-phase high-performance liquid chromatography using a core-shell particle packed column with UV and evaporative light-scattering detectors is described. Under optimal conditions, a mobile phase containing acetonitrile/n-pentanol at 10 °C gave an excellent resolution among more than 50 TAG peaks. A small-scale method for fat extraction in these milks (particularly of interest for human milk samples) using minimal amounts of sample and reagents was also developed. The proposed extraction protocol and the traditional method were compared, giving similar results, with respect to the total fat and relative TAG contents. Finally, a statistical study based on linear discriminant analysis on the TAG composition of different types of milks (human, cow, sheep, and goat) was carried out to differentiate the samples according to their mammalian origin.

  7. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    PubMed

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.

  8. Testing the Association Between Traditional and Novel Indicators of County-Level Structural Racism and Birth Outcomes among Black and White Women.

    PubMed

    Chambers, Brittany D; Erausquin, Jennifer Toller; Tanner, Amanda E; Nichols, Tracy R; Brown-Jeffy, Shelly

    2017-12-07

    Despite decreases in infants born preterm and at low birth weight in the United States (U.S.), racial disparities between Black and White women continue. In response, the purpose of this analysis was to examine associations between both traditional and novel indicators of county-level structural racism and birth outcomes among Black and White women. We merged individual-level data from the California Birth Statistical Master Files 2009-2013 with county-level data from the U.S. Census American Community Survey. We used hierarchical linear modeling to examine Black-White differences among 531,170 primiparous women across 33 California counties. Traditional (e.g., dissimilarity index) and novel indicators (e.g., Black to White ratio in elected office) were associated with earlier gestational age and lower birth weight among Black and White women. A traditional indicator was more strongly associated with earlier gestational age for Black women than for White women. This was the first study to empirically demonstrate that structural racism, measured by both traditional and novel indicators, is associated with poor health and wellbeing of infants born to Black and White women. However, findings indicate that traditional indicators of structural racism, rather than novel indicators, better explain racial disparities in birth outcomes. Results also suggest the need to develop more innovative approaches to: (1) measure structural racism at the county level and (2) reform public policies to increase integration and access to resources.

  9. [Clinical application evaluation of Guidelines for Diagnosis and Treatment of Common Diseases of Otolaryngology in Traditional Chinese Medicine].

    PubMed

    Liu, Yu-Qi; Liu, Meng-Yu; Li, Chun; Shi, Nan-Nan; Wang, Yue-Xi; Wang, Li-Ying; Zhao, Xue-Yao; Kou, Shuang; Han, Xue-Jie; Wang, Yan-Ping

    2017-09-01

    This study assesses the Guidelines for Diagnosis and Treatment of Common Diseases of Otolaryngology in Traditional Chinese Medicine in clinical application and provides evidence for further guideline revision. The assessment was divided into an applicability assessment and a practicability assessment. The applicability assessment was based on a questionnaire survey in which traditional Chinese medicine (TCM) practitioners were asked to independently fill in the Questionnaire for Applicability Assessment on the Guidelines for Diagnosis and Treatment in Traditional Chinese Medicine. The practicability assessment was based on a prospective case investigation and analysis method in which the TCM practitioners-in-charge filled in the Case Investigation Questionnaire for Practicability Assessment on the Guidelines for Diagnosis and Treatment in Traditional Chinese Medicine. The data were analyzed with descriptive statistics. 151 questionnaires were collected for the applicability assessment, and 1 016 patients were included in the practicability assessment. The results showed that 88.74% of the practitioners were familiar with the guidelines and 45.70% used them. The guideline quality and related items were rated similarly in the applicability and practicability assessments, scoring highly (more than 85.00%) except for "recuperating and prevention". The results suggested that the quality of the Guidelines for Diagnosis and Treatment of Common Diseases of Otolaryngology in Traditional Chinese Medicine was high and could effectively guide clinical practice. The "recuperating and prevention" part should be improved, and evidence data should be included in future guideline revisions so that the clinical utilization rate can be increased. Copyright© by the Chinese Pharmaceutical Association.

  10. Utilizing Wavelet Analysis to assess hydrograph change in northwestern North America

    NASA Astrophysics Data System (ADS)

    Tang, W.; Carey, S. K.

    2017-12-01

    Historical streamflow data in the mountainous regions of northwestern North America suggest that changes in flows are driven by warming temperatures, declining snowpack and glacier extent, and large-scale teleconnections. However, few sites exist that have robust long-term records for statistical analysis, and previous research has focused on high- and low-flow indices along with trend analysis using the Mann-Kendall test and other similar approaches. Furthermore, there has been less emphasis on ascertaining the drivers of change in the shape of the streamflow hydrograph compared with traditional flow metrics. In this work, we utilize wavelet analysis to evaluate changes in hydrograph characteristics for snowmelt-driven rivers in northwestern North America across a range of scales. Results suggest that wavelets can be used to detect a lengthening and advancement of the freshet with a corresponding decline in peak flows. Furthermore, the gradual transition of flows from nival to pluvial regimes in more southerly catchments is evident in the wavelet spectral power through time. A challenge for this method of change detection is evaluating the statistical significance of changes in wavelet spectra as they relate to hydrograph form; ongoing work seeks to link these patterns to driving weather and climate along with larger-scale teleconnections.
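
    The wavelet viewpoint can be illustrated with the simplest case, a discrete Haar decomposition of a synthetic hydrograph; the study itself uses continuous wavelet spectra, so this is only a stand-in:

```python
# Haar-wavelet multiscale decomposition of a synthetic "hydrograph":
# detail coefficients at each scale, whose mean squared magnitude ("power")
# indicates at which time scale most of the signal's variability lives.
def haar_details(signal):
    levels = []
    approx = list(signal)
    while len(approx) >= 2:
        detail = [(approx[i] - approx[i + 1]) / 2
                  for i in range(0, len(approx) - 1, 2)]
        approx = [(approx[i] + approx[i + 1]) / 2
                  for i in range(0, len(approx) - 1, 2)]
        levels.append(detail)
    return levels

# Synthetic freshet: low winter flows, sharp spring peak, summer recession.
flow = [2, 2, 3, 3, 20, 35, 30, 18, 10, 7, 5, 4, 3, 3, 2, 2]
for lvl, detail in enumerate(haar_details(flow), start=1):
    power = sum(d * d for d in detail) / len(detail)
    print("scale", 2 ** lvl, "mean detail power", round(power, 2))
```

    The power concentrates at the scales spanning the rise and fall of the freshet; shifts in the timing or duration of the freshet across years would appear as shifts in where that power sits, which is the kind of change the wavelet approach is designed to detect.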

  11. Sequential analysis of hydrochemical data for watershed characterization.

    PubMed

    Thyne, Geoffrey; Güler, Cüneyt; Poeter, Eileen

    2004-01-01

    A methodology for characterizing the hydrogeology of watersheds using hydrochemical data, combining statistical, geochemical, and spatial techniques, is presented. Surface water and ground water base flow and spring runoff samples (180 in total) from a single watershed are first classified using hierarchical cluster analysis. The statistical clusters are analyzed for spatial coherence, confirming that the clusters have a geological basis corresponding to topographic flowpaths and showing that the fractured rock aquifer behaves as an equivalent porous medium on the watershed scale. Then principal component analysis (PCA) is used to determine the sources of variation between parameters. PCA shows that the variations within the dataset are related to variations in calcium, magnesium, SO4, and HCO3, which are derived from natural weathering reactions, and to pH, NO3, and chloride, which indicate anthropogenic impact. PHREEQC modeling is used to quantitatively describe the natural hydrochemical evolution of the watershed and to aid in discriminating samples that have an anthropogenic component. Finally, the seasonal changes in the water chemistry of individual sites were analyzed to better characterize the spatial variability of vertical hydraulic conductivity. The integrated result provides a method to characterize the hydrogeology of the watershed that fully utilizes traditional data.
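    The cluster-then-PCA workflow described here can be sketched as follows; the synthetic water-chemistry matrix and its two-cluster structure are illustrative assumptions, using SciPy's hierarchical clustering and a NumPy SVD for the PCA step:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    # Hypothetical 180-sample x 7-parameter matrix (e.g. Ca, Mg, SO4, HCO3, pH, NO3, Cl)
    X = np.vstack([rng.normal(0, 1, (90, 7)), rng.normal(3, 1, (90, 7))])
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize parameters first

    # Hierarchical cluster analysis (Ward linkage), cut into two clusters
    Z = linkage(Xz, method="ward")
    clusters = fcluster(Z, t=2, criterion="maxclust")

    # PCA via SVD to find the main sources of variation between parameters
    U, s, Vt = np.linalg.svd(Xz, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)   # variance explained per component
    scores = Xz @ Vt.T                    # sample scores on each principal component
    ```

    In the study's workflow, the cluster labels would then be checked for spatial coherence, and the PCA loadings (rows of `Vt`) inspected to see which parameters dominate each component.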

  12. A Commercial IOTV Cleaning Study

    DTIC Science & Technology

    2010-04-12

    manufacturer’s list price without taking into consideration of possible volume discount.  Equipment depreciation cost was calculated based on...Laundering with Prewash Spot Cleaning) 32 Table 12 Shrinkage Statistical Data (Traditional Wet Laundering without Prewash Spot Cleaning...Statistical Data (Computer-controlled Wet Cleaning without Prewash Spot Cleaning) 35 Table 15 Shrinkage Statistical Data (Liquid CO2 Cleaning

  13. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    ERIC Educational Resources Information Center

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…

  14. Comparing Student Success and Understanding in Introductory Statistics under Consensus and Simulation-Based Curricula

    ERIC Educational Resources Information Center

    Hldreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade

    2018-01-01

    This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…

  15. Statistical approach to the analysis of olive long-term pollen season trends in southern Spain.

    PubMed

    García-Mozo, H; Yaezel, L; Oteros, J; Galán, C

    2014-03-01

    Analysis of long-term airborne pollen counts makes it possible not only to chart pollen-season trends but also to track changing patterns in flowering phenology. Changes in higher plant response over a long interval are considered among the most valuable bioindicators of climate change impact. Phenological-trend models can also provide information regarding crop production and pollen-allergen emission. The value of this information makes the choice of statistical analysis for time-series study essential. We analysed trends and variations in the olive flowering season over a 30-year period (1982-2011) in southern Europe (Córdoba, Spain), focussing on: the annual Pollen Index (PI), Pollen Season Start (PSS), Peak Date (PD), Pollen Season End (PSE) and Pollen Season Duration (PSD). Apart from traditional linear regression analysis, a Seasonal-Trend decomposition procedure based on Loess (STL) and an ARIMA model were performed. Linear regression results indicated a trend toward delayed PSE and earlier PSS and PD, probably influenced by the rise in temperature. These changes are producing longer flowering periods in the study area. The STL technique provided a clearer picture of phenological behaviour. Decomposition of the pollination dynamics enabled the trend toward an alternate bearing cycle to be distinguished from the influence of other stochastic fluctuations. Results pointed to a rising trend in pollen production. With a view toward forecasting future phenological trends, ARIMA models were constructed to predict PSD, PSS and PI until 2016. Projections displayed a better goodness of fit than those derived from linear regression. Findings suggest that the olive reproductive cycle has changed considerably over the last 30 years due to climate change. 
    Further conclusions are that STL improves on the effectiveness of traditional linear regression in trend analysis, and that ARIMA models can provide reliable trend projections for future years, taking into account the internal fluctuations in the time series. Copyright © 2013 Elsevier B.V. All rights reserved.
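    The "traditional" linear-regression trend step that the abstract contrasts with STL and ARIMA can be sketched in a few lines of NumPy; the synthetic Pollen Season Start series and its assumed advance of about 0.4 days per year are illustrative, not the study's data:

    ```python
    import numpy as np

    def linear_trend(years, values):
        """OLS slope and intercept: the 'traditional' linear-regression trend."""
        slope, intercept = np.polyfit(years, values, deg=1)
        return slope, intercept

    # Hypothetical Pollen Season Start (day of year), 1982-2011,
    # assumed to advance by ~0.4 days per year plus year-to-year noise
    years = np.arange(1982, 2012)
    rng = np.random.default_rng(2)
    pss = 120.0 - 0.4 * (years - 1982) + rng.normal(0.0, 2.0, years.size)

    slope, intercept = linear_trend(years, pss)   # negative slope = earlier starts
    ```

    STL and ARIMA go further by separating such a trend from seasonal and stochastic components, which is exactly the advantage the authors report.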

  16. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
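    The core of the quantitative synthesis the authors call for is an inverse-variance weighted pooled effect. A minimal fixed-effect sketch, with made-up effect sizes (e.g. Fisher-z transformed correlations) and variances, looks like this:

    ```python
    import math

    def fixed_effect_meta(effects, variances):
        """Inverse-variance weighted pooled effect and its standard error."""
        weights = [1.0 / v for v in variances]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))   # SE of the pooled estimate
        return pooled, se

    # Hypothetical per-study effect sizes and sampling variances
    effects = [0.30, 0.45, 0.25, 0.50]
    variances = [0.02, 0.05, 0.01, 0.04]
    pooled, se = fixed_effect_meta(effects, variances)
    ```

    A full meta-analysis would typically also test heterogeneity and, if present, switch to a random-effects model; this block shows only the weighting step that distinguishes a meta-analysis from a narrative secondary analysis.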

  17. Connectivity-based fixel enhancement: Whole-brain statistical analysis of diffusion MRI measures in the presence of crossing fibres

    PubMed Central

    Raffelt, David A.; Smith, Robert E.; Ridgway, Gerard R.; Tournier, J-Donald; Vaughan, David N.; Rose, Stephen; Henderson, Robert; Connelly, Alan

    2015-01-01

    In brain regions containing crossing fibre bundles, voxel-average diffusion MRI measures such as fractional anisotropy (FA) are difficult to interpret, and lack within-voxel single fibre population specificity. Recent work has focused on the development of more interpretable quantitative measures that can be associated with a specific fibre population within a voxel containing crossing fibres (herein we use fixel to refer to a specific fibre population within a single voxel). Unfortunately, traditional 3D methods for smoothing and cluster-based statistical inference cannot be used for voxel-based analysis of these measures, since the local neighbourhood for smoothing and cluster formation can be ambiguous when adjacent voxels may have different numbers of fixels, or ill-defined when they belong to different tracts. Here we introduce a novel statistical method to perform whole-brain fixel-based analysis called connectivity-based fixel enhancement (CFE). CFE uses probabilistic tractography to identify structurally connected fixels that are likely to share underlying anatomy and pathology. Probabilistic connectivity information is then used for tract-specific smoothing (prior to the statistical analysis) and enhancement of the statistical map (using a threshold-free cluster enhancement-like approach). To investigate the characteristics of the CFE method, we assessed sensitivity and specificity using a large number of combinations of CFE enhancement parameters and smoothing extents, using simulated pathology generated with a range of test-statistic signal-to-noise ratios in five different white matter regions (chosen to cover a broad range of fibre bundle features). The results suggest that CFE input parameters are relatively insensitive to the characteristics of the simulated pathology. We therefore recommend a single set of CFE parameters that should give near optimal results in future studies where the group effect is unknown. 
We then demonstrate the proposed method by comparing apparent fibre density between motor neurone disease (MND) patients with control subjects. The MND results illustrate the benefit of fixel-specific statistical inference in white matter regions that contain crossing fibres. PMID:26004503

  18. Improving Nursing Students' Learning Outcomes in Fundamentals of Nursing Course through Combination of Traditional and e-Learning Methods.

    PubMed

    Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin

    2018-01-01

    The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the spread of technology in nursing education, the effectiveness of e-learning methods for the fundamentals of nursing course in the clinical skills laboratory remains unclear. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. A two-group post-test experimental study was conducted from February 2014 to February 2015. Two groups of nursing students taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were assigned over two consecutive semesters to a control group (traditional learning methods only) and an experimental group (e-learning combined with traditional learning methods). Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software, version 16. The findings showed that the mean midterm (t = 2.00, p = 0.04) and final (t = 2.50, p = 0.01) scores of the intervention group were significantly higher than those of the control group. Satisfaction among male students in the intervention group was higher than among females (t = 2.60, p = 0.01). Based on these findings, the study suggests that combining traditional learning methods with e-learning methods, such as educational websites and interactive online resources, for fundamentals of nursing instruction can be an effective supplement for improving nursing students' clinical skills.

  19. Improving Nursing Students' Learning Outcomes in Fundamentals of Nursing Course through Combination of Traditional and e-Learning Methods

    PubMed Central

    Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin

    2018-01-01

    Background: The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the spread of technology in nursing education, the effectiveness of e-learning methods for the fundamentals of nursing course in the clinical skills laboratory remains unclear. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. Materials and Methods: A two-group post-test experimental study was conducted from February 2014 to February 2015. Two groups of nursing students taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were assigned over two consecutive semesters to a control group (traditional learning methods only) and an experimental group (e-learning combined with traditional learning methods). Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software, version 16. Results: The findings showed that the mean midterm (t = 2.00, p = 0.04) and final (t = 2.50, p = 0.01) scores of the intervention group were significantly higher than those of the control group. Satisfaction among male students in the intervention group was higher than among females (t = 2.60, p = 0.01). Conclusions: Based on these findings, the study suggests that combining traditional learning methods with e-learning methods, such as educational websites and interactive online resources, for fundamentals of nursing instruction can be an effective supplement for improving nursing students' clinical skills. PMID:29861761

  20. Effects of huangqi and bear bile on recurrent parotitis in children: a new clinical approach.

    PubMed

    Ruan, Wen-hua; Huang, Mei-li; He, Xiao-lei; Zhang, Feng; Tao, Hai-biao

    2013-03-01

    To evaluate the pharmacological effects of two traditional Chinese medicines, bear bile capsule and Huangqi granule, on recurrent parotitis in children. In this prospective, controlled, randomized study, a total of 151 young children were divided into three groups: in Group A, the children's parotid region was massaged and vitamin C was melted in their mouths daily; in Group B, bear bile capsule and Huangqi granule were swallowed daily; and Group C received the massages and vitamin C of Group A plus the traditional Chinese medicines of Group B. Children were treated individually for one month and then followed up for 1 to 3.5 years. Analysis of variance (ANOVA) and Ridit analysis were employed for statistical analysis. The recurrence rate decreased in every group, but the decrease was significantly greater in Groups B and C than in Group A. Recurrences decreased significantly (P<0.01) in Group B, whose recovery rate was as high as 63%, significantly better than those of the other groups (P<0.01). Huangqi and bear bile could be a novel clinical approach for treating recurrent parotitis in children.
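    A minimal sketch of the one-way ANOVA used for such a three-group comparison, with simulated recurrence counts standing in for the study's data (the group sizes and rates below are assumptions, not the reported results):

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(5)
    # Simulated annual recurrence counts per child in each treatment group
    group_a = rng.poisson(3.0, 50)   # massage + vitamin C
    group_b = rng.poisson(1.5, 50)   # bear bile capsule + Huangqi granule
    group_c = rng.poisson(1.4, 51)   # combined treatment

    # One-way ANOVA across the three groups
    f_stat, p_value = f_oneway(group_a, group_b, group_c)
    ```

    Ridit analysis, the study's second method, is the appropriate complement when the outcome is an ordered category rather than a count.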

  1. Effects of Huangqi and bear bile on recurrent parotitis in children: a new clinical approach*

    PubMed Central

    Ruan, Wen-hua; Huang, Mei-li; He, Xiao-lei; Zhang, Feng; Tao, Hai-biao

    2013-01-01

    Objective: To evaluate the pharmacological effects of two traditional Chinese medicines, bear bile capsule and Huangqi granule, on recurrent parotitis in children. Methods: In this prospective, controlled, randomized study, a total of 151 young children were divided into three groups: in Group A, the children's parotid region was massaged and vitamin C was melted in their mouths daily; in Group B, bear bile capsule and Huangqi granule were swallowed daily; and Group C received the massages and vitamin C of Group A plus the traditional Chinese medicines of Group B. Children were treated individually for one month and then followed up for 1 to 3.5 years. Analysis of variance (ANOVA) and Ridit analysis were employed for statistical analysis. Results: The recurrence rate decreased in every group, but the decrease was significantly greater in Groups B and C than in Group A. Recurrences decreased significantly (P<0.01) in Group B, whose recovery rate was as high as 63%, significantly better than those of the other groups (P<0.01). Conclusions: Huangqi and bear bile could be a novel clinical approach for treating recurrent parotitis in children. PMID:23463769

  2. [Near infrared reflectance spectroscopy (NIRS): a novel approach to reconstructing historical changes of primary productivity in Antarctic lake].

    PubMed

    Chen, Qian-Qian; Liu, Xiao-Dong; Liu, Wen-Qi; Jiang, Shan

    2011-10-01

    Compared with traditional chemical analysis methods, reflectance spectroscopy has the advantages of speed, minimal or no sample preparation, non-destructiveness, and low cost. To explore the potential application of spectroscopy in paleolimnological studies of Antarctic lakes, we took a lake sediment core from Mochou Lake at Zhongshan Station, Antarctica, and analyzed the near infrared reflectance spectroscopy (NIRS) data of the sedimentary samples. The results showed that the factor loadings of principal component analysis (PCA) displayed a depth-profile change pattern very similar to that of the S2 index, a reliable proxy for changes in historical lake primary productivity. Correlation analysis showed that the PCA factor loadings and S2 values were significantly correlated, suggesting that it is feasible to infer paleoproductivity changes recorded in Antarctic lakes using NIRS. Compared with the traditional method based on the trough area between 650 and 700 nm, the PCA statistical approach was more accurate for reconstructing changes in historical lake primary productivity. The results reported here demonstrate that reflectance spectroscopy can provide a rapid method for reconstructing lake palaeoenvironmental change in remote Antarctic regions.

  3. Identification by random forest method of HLA class I amino acid substitutions associated with lower survival at day 100 in unrelated donor hematopoietic cell transplantation.

    PubMed

    Marino, S R; Lin, S; Maiers, M; Haagenson, M; Spellman, S; Klein, J P; Binkowski, T A; Lee, S J; van Besien, K

    2012-02-01

    The identification of important amino acid substitutions associated with low survival in hematopoietic cell transplantation (HCT) is hampered by the large number of observed substitutions compared with the small number of patients available for analysis. Random forest analysis is designed to address these limitations. We studied 2107 HCT recipients with good or intermediate risk hematological malignancies to identify HLA class I amino acid substitutions associated with reduced survival at day 100 post transplant. Random forest analysis and traditional univariate and multivariate analyses were used. Random forest analysis identified amino acid substitutions at 33 positions that were associated with reduced 100-day survival, including HLA-A 9, 43, 62, 63, 76, 77, 95, 97, 114, 116, 152, 156, 166 and 167; HLA-B 97, 109, 116 and 156; and HLA-C 6, 9, 11, 14, 21, 66, 77, 80, 95, 97, 99, 116, 156, 163 and 173. In all, 13 of these had been previously reported by other investigators using classical biostatistical approaches. Using the same data set, traditional multivariate logistic regression identified only five amino acid substitutions associated with lower day 100 survival. Random forest analysis is a novel statistical methodology for analyses of HLA mismatching and outcome, capable of identifying important amino acid substitutions missed by other methods.
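    The idea behind random-forest variable importance, repeatedly fitting trees on bootstrap resamples and recording which variables drive the best splits, can be sketched with single-split "stumps" in plain NumPy. This is a toy illustration of the principle, not the authors' pipeline:

    ```python
    import numpy as np

    def best_split_feature(X, y):
        """Index of the feature whose best single threshold most reduces Gini impurity."""
        best_feat, best_gini = 0, 1.0
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j])[:-1]:   # exclude max so both sides are non-empty
                left, right = y[X[:, j] <= thr], y[X[:, j] > thr]
                gini = 0.0
                for part in (left, right):
                    p = part.mean()               # fraction of positives in this side
                    gini += len(part) / len(y) * 2.0 * p * (1.0 - p)
                if gini < best_gini:
                    best_feat, best_gini = j, gini
        return best_feat

    def forest_importance(X, y, n_trees=30, seed=0):
        """Fraction of bootstrap 'stumps' in which each feature gives the best split."""
        rng = np.random.default_rng(seed)
        counts = np.zeros(X.shape[1])
        for _ in range(n_trees):
            idx = rng.integers(0, len(y), len(y))   # bootstrap resample
            counts[best_split_feature(X[idx], y[idx])] += 1
        return counts / n_trees

    rng = np.random.default_rng(3)
    X = rng.normal(size=(150, 4))
    y = (X[:, 2] > 0).astype(float)   # only feature 2 carries signal
    imp = forest_importance(X, y)     # importance should concentrate on feature 2
    ```

    Real random forests grow full trees, subsample features at each split, and average impurity decreases, which is what lets them rank many correlated amino acid positions at once.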

  4. 1H NMR spectroscopy analysis of metabolites in the kidneys provides new insight into pathophysiological mechanisms: applications for treatment with Cordyceps sinensis.

    PubMed

    Zhong, Fang; Liu, Xia; Zhou, Qiao; Hao, Xu; Lu, Ying; Guo, Shanmai; Wang, Weiming; Lin, Donghai; Chen, Nan

    2012-02-01

    The number of patients with chronic kidney disease (CKD) is growing continuously worldwide. Treatment with traditional Chinese medicine might slow the progression of CKD. In this study, we evaluated the renal protective effects of the Chinese herb Cordyceps sinensis in rats with 5/6 nephrectomy. Male Sprague-Dawley rats (weighing 150-200 g) were subjected to 5/6 nephrectomy and divided into three groups: (i) an untreated nephrectomized group (OP group, n = 16), (ii) an orally administered C. sinensis-treated (4 mg/kg/day) nephrectomized group (CS group, n = 16) and (iii) a sham-operated group (SO group, n = 16). The rats were sacrificed at 4 and 8 weeks after 5/6 nephrectomy, and the kidneys, serum and urine were collected for (1)H nuclear magnetic resonance spectral analysis. Multivariate statistical techniques and statistical metabolic correlation comparison analysis were performed to identify metabolic changes in aqueous kidney extracts between these groups. Significant differences between the groups were discovered in the metabolic profiles of the biofluids and kidney extracts. Pathways including the citrate cycle, branched-chain amino acid metabolism and the metabolites that regulate permeate pressure were disturbed in the OP group compared with the SO group, and these disturbances were reversed by C. sinensis treatment. Biochemistry and electron microscopic images verified that C. sinensis has curative effects on chronic renal failure, consistent with the metabonomics results. Our study demonstrates that C. sinensis has potential curative effects on CKD, and our metabonomics results provide new insight into the mechanism of treatment of this traditional Chinese medicine.

  5. Dietary inflammatory index and risk of lung cancer and other respiratory conditions among heavy smokers in the COSMOS screening study.

    PubMed

    Maisonneuve, Patrick; Shivappa, Nitin; Hébert, James R; Bellomi, Massimo; Rampinelli, Cristiano; Bertolotti, Raffaella; Spaggiari, Lorenzo; Palli, Domenico; Veronesi, Giulia; Gnagnarella, Patrizia

    2016-04-01

    To test whether the inflammatory potential of diet, as measured using the dietary inflammatory index (DII), is associated with risk of lung cancer or other respiratory conditions and to compare results obtained with those based on the aMED score, an established dietary index that measures adherence to the traditional Mediterranean diet. In 4336 heavy smokers enrolled in a prospective, non-randomized lung cancer screening program, we measured participants' diets at baseline using a self-administered food frequency questionnaire from which dietary scores were calculated. Cox proportional hazards and logistic regression models were used to assess association between the dietary indices and lung cancer diagnosed during annual screening, and other respiratory outcomes that were recorded at baseline, respectively. In multivariable analysis, adjusted for baseline lung cancer risk (estimated from age, sex, smoking history, and asbestos exposure) and total energy, both DII and aMED scores were associated with dyspnoea (p trend = 0.046 and 0.02, respectively) and radiological evidence of emphysema (p trend = 0.0002 and 0.02). After mutual adjustment of the two dietary scores, only the association between DII and radiological evidence of emphysema (Q4 vs. Q1, OR 1.30, 95 % CI 1.01-1.67, p trend = 0.012) remained statistically significant. At univariate analysis, both DII and aMED were associated with lung cancer risk, but in fully adjusted multivariate analysis, only the association with aMED remained statistically significant (p trend = 0.04). Among heavy smokers, a pro-inflammatory diet, as indicated by increasing DII score, is associated with dyspnoea and radiological evidence of emphysema. A traditional Mediterranean diet, which is associated with a lower DII, may lower lung cancer risk.

  6. Dietary inflammatory index and risk of lung cancer and other respiratory conditions among heavy smokers in the COSMOS screening study

    PubMed Central

    Shivappa, Nitin; Hébert, James R.; Bellomi, Massimo; Rampinelli, Cristiano; Bertolotti, Raffaella; Spaggiari, Lorenzo; Palli, Domenico; Veronesi, Giulia; Gnagnarella, Patrizia

    2016-01-01

    Purpose To test whether the inflammatory potential of diet, as measured using the dietary inflammatory index (DII), is associated with risk of lung cancer or other respiratory conditions and to compare results obtained with those based on the aMED score, an established dietary index that measures adherence to the traditional Mediterranean diet. Methods In 4336 heavy smokers enrolled in a prospective, non-randomized lung cancer screening program, we measured participants’ diets at baseline using a self-administered food frequency questionnaire from which dietary scores were calculated. Cox proportional hazards and logistic regression models were used to assess association between the dietary indices and lung cancer diagnosed during annual screening, and other respiratory outcomes that were recorded at baseline, respectively. Results In multivariable analysis, adjusted for baseline lung cancer risk (estimated from age, sex, smoking history, and asbestos exposure) and total energy, both DII and aMED scores were associated with dyspnoea (p trend = 0.046 and 0.02, respectively) and radiological evidence of emphysema (p trend = 0.0002 and 0.02). After mutual adjustment of the two dietary scores, only the association between DII and radiological evidence of emphysema (Q4 vs. Q1, OR 1.30, 95 % CI 1.01–1.67, p trend = 0.012) remained statistically significant. At univariate analysis, both DII and aMED were associated with lung cancer risk, but in fully adjusted multivariate analysis, only the association with aMED remained statistically significant (p trend = 0.04). Conclusions Among heavy smokers, a pro-inflammatory diet, as indicated by increasing DII score, is associated with dyspnoea and radiological evidence of emphysema. A traditional Mediterranean diet, which is associated with a lower DII, may lower lung cancer risk. PMID:25953452

  7. Medical students preference of problem-based learning or traditional lectures in King Abdulaziz University, Jeddah, Saudi Arabia.

    PubMed

    Ibrahim, Nahla Khamis; Banjar, Shorooq; Al-Ghamdi, Amal; Al-Darmasi, Moroj; Khoja, Abeer; Turkistani, Jamela; Arif, Rwan; Al-Sebyani, Awatif; Musawa, Al-Anoud; Basfar, Wijdan

    2014-01-01

    Problem-based learning (PBL) is one of the most important educational innovations of the past four decades. The objective of the study was to compare medical students' preference for PBL with their preference for traditional lectures regarding the learning outcomes (e.g., knowledge, attitudes, and skills) gained from each method. A cross-sectional study was conducted among medical students who studied the hybrid curriculum (PBL and traditional lectures) at King Abdulaziz University, Jeddah, in 2011. Data were collected through a pre-constructed, validated, anonymous, and self-administered questionnaire. Students' perceptions of PBL and traditional lectures were assessed through their responses to 20 statements about both methods of learning on a five-point Likert scale. Descriptive and analytic statistics were performed using SPSS, version 21 (SPSS Inc, Chicago, Ill., USA). Learners preferred PBL to traditional lectures for better linking of the knowledge of basic and clinical sciences (t test = 10.15, P < .001). However, no statistically significant difference (P > .05) was observed in the amount of basic knowledge recalled with each method. Students preferred PBL to lectures for better learning attitudes, skills, future outcomes, and learning satisfaction (P < .05). PBL motivates students to learn better than lecturing does (P < .05). In the students' opinion, the mean total skill gained from PBL (47.2 [10.6]) was much higher than that from lectures (33.0 [9.9]), a highly statistically significant difference (t test = 20.9, P < .001). Students preferred PBL to traditional lectures for improving most learning outcome domains, especially learning attitudes and skills. Introducing a hybrid PBL curriculum in all Saudi universities is highly recommended.

  8. Confounding in statistical mediation analysis: What it is and how to address it.

    PubMed

    Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P

    2017-11-01

    Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
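    The traditional regression (product-of-coefficients) approach reviewed first can be sketched as two regressions; the simulated randomized treatment, mediator, and outcome below, along with the true coefficients a = 0.8 and b = 0.5, are illustrative assumptions:

    ```python
    import numpy as np

    def mediated_effect(x, m, y):
        """Product-of-coefficients mediated effect a*b.

        a: regression slope of the mediator m on the treatment x
        b: slope of the outcome y on m, adjusting for x
        """
        a = np.polyfit(x, m, deg=1)[0]
        # multiple regression of y on [m, x, 1]; first coefficient is b
        Z = np.column_stack([m, x, np.ones_like(x)])
        b = np.linalg.lstsq(Z, y, rcond=None)[0][0]
        return a * b

    rng = np.random.default_rng(4)
    n = 500
    x = rng.integers(0, 2, n).astype(float)           # randomized treatment
    m = 0.8 * x + rng.normal(0.0, 1.0, n)             # true a = 0.8
    y = 0.5 * m + 0.2 * x + rng.normal(0.0, 1.0, n)   # true b = 0.5
    ab = mediated_effect(x, m, y)                     # should be near 0.8 * 0.5 = 0.4
    ```

    The article's point is that this estimate is only causal for the b path if mediator-outcome confounders are adjusted for; here the simulation has none by construction.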

  9. Decision Tree Analysis of Traditional Risk Factors of Carotid Atherosclerosis and a Cutpoint-Based Prevention Strategy

    PubMed Central

    Lv, Lihong; Xiao, Yufei; Tu, Jiangfeng; Tao, Lisha; Wu, Jiaqi; Tang, Xiaoxiao; Pan, Wensheng

    2014-01-01

    Background Reducing exposure to risk factors is a crucial issue in the prevention of cardio-cerebrovascular disease. Few reports have described practical interventions for preventing cardiovascular disease across genders and age groups, particularly detailed, cutpoint-based prevention strategies. Methods We collected the health examination data of 5822 subjects between 20 and 80 years of age. Physicians administered medical questionnaires, performed physical examinations, and measured blood pressure, fasting plasma glucose (FPG) and blood lipids [total cholesterol (TC), triglycerides (TG), high density lipoprotein-cholesterol (HDL-C), and low density lipoprotein-cholesterol (LDL-C)]. Carotid ultrasound was performed to measure the carotid intima-media thickness (CIMT); carotid atherosclerosis was defined as CIMT ≥0.9 mm. Decision tree analysis was used to screen for the most important risk factors for carotid atherosclerosis and to identify the relevant cutpoints. Results In the study population, the incidence of carotid atherosclerosis was 12.20% (men: 14.10%, women: 9.20%). The statistical analysis showed significant differences in carotid atherosclerosis incidence between genders (P<0.0001) and age groups (P<0.001). The decision tree analysis showed that in men, the most important traditional risk factors for carotid atherosclerosis were TC (cutpoint [CP]: 6.31 mmol/L) between the ages of 20–40 and FPG (CP: 5.79 mmol/L) between the ages of 41–59; LDL-C (CP: 4.27 mmol/L) became the major risk factor when FPG ≤5.79 mmol/L. FPG (CP: 5.52 mmol/L) and TG (CP: 1.51 mmol/L) were the most important traditional risk factors for women aged 20–40 and 41–59 years, respectively. Conclusion Traditional risk factors and the relevant cutpoints were not identical across genders and age groups. A specific gender- and age-group-based cutpoint strategy might contribute to preventing cardiovascular disease. PMID:25398126
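
    The cutpoint screening described above can be illustrated with a one-split decision stump: scan candidate thresholds on a single risk factor and keep the one that best separates cases from non-cases. This is a minimal sketch of the idea only; the study used full decision tree analysis, and the cholesterol values and labels below are invented for illustration.

```python
# One-split "decision stump" sketch of cutpoint screening.
# All values below are hypothetical illustrations, not study data.

def gini(labels):
    """Gini impurity of a list of 0/1 outcome labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_cutpoint(values, labels):
    """Scan candidate thresholds on one risk factor and return the
    cutpoint minimising the weighted Gini impurity of the two branches."""
    n = len(values)
    best_cp, best_score = None, float("inf")
    for cp in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= cp]
        right = [l for v, l in zip(values, labels) if v > cp]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best_score:
            best_cp, best_score = cp, score
    return best_cp, best_score

# Hypothetical total cholesterol values (mmol/L) and atherosclerosis labels
tc = [4.1, 5.0, 5.8, 6.0, 6.4, 6.9, 7.2, 7.5]
y = [0, 0, 0, 0, 1, 1, 1, 1]
cp, score = best_cutpoint(tc, y)
print(cp, score)  # the threshold that best separates cases from non-cases
```

    Real decision tree software repeats this search recursively over all candidate predictors, which is how factor- and subgroup-specific cutpoints such as TC 6.31 mmol/L emerge.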

  10. [Design and implementation of Chinese materia medica resources survey results display system].

    PubMed

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Wang, Ling; Zhao, Yan-Ping; Jing, Zhi-Xian; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    From the beginning of the fourth national census of traditional Chinese medicine resources in 2011, a large amount of data has been collected and compiled, including data on wild medicinal plant resources, information on cultivated medicinal plants, traditional knowledge, and specimen information. The traditional paper-based recording method is inconvenient for query and application. The B/S architecture, the JavaWeb framework and SOA were used to design and develop the display platform for the results of the fourth national census. Through data integration and sorting, users are provided with integrated data services and data query and display solutions. The platform implements fine-grained data classification and offers simple data retrieval and statistical analysis functions. It uses Echarts components, GeoServer, OpenLayers and other technologies to provide a variety of visualization forms such as charts and maps, intuitively reflecting the quantity, distribution and types of Chinese materia medica resources. It meets the data mapping requirements of users at different levels and provides support for management decision-making. Copyright© by the Chinese Pharmaceutical Association.

  11. Fuzzy fault tree assessment based on improved AHP for fire and explosion accidents for steel oil storage tanks.

    PubMed

    Shi, Lei; Shuai, Jian; Xu, Kui

    2014-08-15

    Fire and explosion accidents of steel oil storage tanks (FEASOST) occur occasionally during production and storage processes in the petroleum and chemical industries and often have a devastating impact on lives, the environment and property. To contribute towards the development of a quantitative approach for assessing the occurrence probability of FEASOST, a fault tree of FEASOST is constructed that identifies various potential causes. Traditional fault tree analysis (FTA) can achieve quantitative evaluation if failure data are available for all of the basic events (BEs), which is almost never the case owing to the lack of detailed data, as well as other uncertainties. This paper performs FTA of FEASOST through a hybrid application of an expert-elicitation-based improved analytic hierarchy process (AHP) and fuzzy set theory, and the occurrence possibility of FEASOST is estimated for an oil depot in China. A comparison between statistical data and data calculated using fuzzy fault tree analysis (FFTA) based on traditional and improved AHP is also made. Sensitivity and importance analyses have been performed to identify the most crucial BEs leading to FEASOST, providing insight into where managers should focus mitigation efforts. Copyright © 2014 Elsevier B.V. All rights reserved.
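
    The quantitative core of FTA can be sketched with crisp probabilities: combine independent basic-event probabilities through OR and AND gates up to the top event. The gate structure and all numbers below are invented for illustration; the paper replaces such point probabilities with fuzzy numbers elicited through the improved AHP.

```python
# Crisp fault-tree evaluation sketch: independent basic events combined
# through OR and AND gates. Gate structure and probabilities are invented.

def p_or(probs):
    """P(at least one of several independent events occurs)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def p_and(probs):
    """P(all independent events occur)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical basic events for a tank fire scenario:
leak = p_or([1e-3, 5e-4, 2e-4])    # corrosion, overfill, seal failure
ignition = p_or([1e-2, 3e-3])      # static spark, hot work
top = p_and([leak, ignition])      # fire needs a leak AND an ignition source
print(f"top event probability ~ {top:.2e}")
```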

  12. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    PubMed

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool for understanding brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling multiple simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially for exploring hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Illustrative examples are provided in closing, discussing issues in inverse analysis (i.e., a nonlinear elastic polymer fiber and arrays of ductile Cu submicron bars) and direct design (i.e., strength prediction).

  13. Analysis of Medication Error Reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Young, Jonathan; Santell, John

    In medicine, as in many areas of research, technological innovation and the shift from paper-based information to electronic records have created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our ability to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, starting with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher not only to ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARX reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed, including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.

  14. Key Aspects of a Computerized Statistics Course.

    ERIC Educational Resources Information Center

    Wells, Karin L.; Marsh, Lawrence C.

    1997-01-01

    Looks at ways in which computer-assisted instruction transforms three traditional aspects of college teaching: lectures are replaced with multimedia presentations; homework becomes electronic, with instant grading and detailed explanations; and traditional office hours are replaced with electronic mail, list-serv, and live screen interaction…

  15. Prognostic value of coronary computed tomographic angiography findings in asymptomatic individuals: a 6-year follow-up from the prospective multicentre international CONFIRM study.

    PubMed

    Cho, Iksung; Al'Aref, Subhi J; Berger, Adam; Ó Hartaigh, Bríain; Gransar, Heidi; Valenti, Valentina; Lin, Fay Y; Achenbach, Stephan; Berman, Daniel S; Budoff, Matthew J; Callister, Tracy Q; Al-Mallah, Mouaz H; Cademartiri, Filippo; Chinnaiyan, Kavitha; Chow, Benjamin J W; DeLago, Augustin; Villines, Todd C; Hadamitzky, Martin; Hausleiter, Joerg; Leipsic, Jonathon; Shaw, Leslee J; Kaufmann, Philipp A; Feuchtner, Gudrun; Kim, Yong-Jin; Maffei, Erica; Raff, Gilbert; Pontone, Gianluca; Andreini, Daniele; Marques, Hugo; Rubinshtein, Ronen; Chang, Hyuk-Jae; Min, James K

    2018-03-14

    The long-term prognostic benefit of coronary computed tomographic angiography (CCTA) findings of coronary artery disease (CAD) in asymptomatic populations is unknown. From the prospective multicentre international CONFIRM long-term study, we evaluated asymptomatic subjects without known CAD who underwent both coronary artery calcium scoring (CACS) and CCTA (n = 1226). CCTA findings included the severity of coronary artery stenosis, plaque composition, and coronary segment location. Using the C-statistic and likelihood ratio tests, we evaluated the incremental prognostic utility of CCTA findings over a base model that included a panel of traditional risk factors (RFs) as well as CACS to predict long-term all-cause mortality. During a mean follow-up of 5.9 ± 1.2 years, 78 deaths occurred. Compared with traditional RFs alone (C-statistic 0.64), CCTA findings including coronary stenosis severity, plaque composition, and coronary segment location demonstrated improved incremental prognostic utility (C-statistic range 0.71-0.73, all P < 0.05; incremental χ2 range 20.7-25.5, all P < 0.001). However, CCTA findings offered no added prognostic benefit when added to a base model containing both traditional RFs and CACS (all C-statistic P > 0.05). In sum, CCTA improved prognostication of 6-year all-cause mortality beyond a set of conventional RFs alone, although no further incremental value was offered when CCTA findings were added to a model incorporating RFs and CACS.
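
    The C-statistic comparison at the heart of this record can be sketched directly: for a binary outcome, the C-statistic is the fraction of event/non-event pairs that a risk score ranks correctly. The scores and outcomes below are invented toy numbers, not CONFIRM data.

```python
# C-statistic (concordance) sketch: the fraction of event/non-event
# pairs ranked correctly by a risk score. Toy values only.

def c_statistic(scores, outcomes):
    """Concordance index for a binary outcome; ties count one half."""
    events = [s for s, o in zip(scores, outcomes) if o == 1]
    nonevents = [s for s, o in zip(scores, outcomes) if o == 0]
    pairs = concordant = 0.0
    for e in events:
        for n in nonevents:
            pairs += 1
            if e > n:
                concordant += 1
            elif e == n:
                concordant += 0.5
    return concordant / pairs

outcomes = [1, 1, 1, 0, 0, 0, 0]
base_model = [0.6, 0.5, 0.3, 0.6, 0.4, 0.3, 0.2]   # risk factors only
plus_ccta = [0.8, 0.7, 0.5, 0.6, 0.3, 0.2, 0.1]    # with extra findings
print(c_statistic(base_model, outcomes), c_statistic(plus_ccta, outcomes))
```

    Here the second score set ranks more event/non-event pairs correctly, which is exactly what a higher C-statistic for the CCTA-augmented model expresses.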

  16. An unsupervised classification technique for multispectral remote sensing data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Cummings, R. E.

    1973-01-01

    Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.

  17. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    PubMed

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
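
    One widely used disproportionality statistic for such spontaneous-reporting databases is the proportional reporting ratio (PRR), which compares how often an event is reported with the drug of interest versus with all other drugs. The 2x2 counts below are invented for illustration; this is a sketch of the general technique, not an analysis described in the paper.

```python
# Proportional reporting ratio (PRR) sketch for spontaneous-report
# disproportionality analysis. The 2x2 counts are invented.

def prr(a, b, c, d):
    """PRR from a 2x2 report table:
       a: drug of interest, event of interest
       b: drug of interest, all other events
       c: other drugs, event of interest
       d: other drugs, all other events"""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: the event is reported proportionally 4x as often
# with the drug of interest as with all other drugs.
signal = prr(a=20, b=180, c=50, d=1950)
print(signal)
```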

  18. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
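
    The two-stage design described above can be sketched in miniature: a single sequential pass proposes initial cluster centres, which then seed standard K-means (Lloyd) iterations. The 1-D data, the distance threshold, and the use of a simple distance test in place of the paper's sequential variance analysis are all illustrative assumptions.

```python
# Two-stage clustering sketch: a sequential pass seeds cluster centres,
# then Lloyd's K-means iterations refine them.

def sequential_seed(data, threshold):
    """Single pass: start a new cluster whenever a point lies farther
    than `threshold` from every existing centre."""
    centres = []
    for x in data:
        if not centres or min(abs(x - c) for c in centres) > threshold:
            centres.append(x)
    return centres

def kmeans_refine(data, centres, iters=10):
    """Standard Lloyd iterations starting from the sequential seeds."""
    for _ in range(iters):
        groups = [[] for _ in centres]
        for x in data:
            i = min(range(len(centres)), key=lambda j: abs(x - centres[j]))
            groups[i].append(x)
        centres = [sum(g) / len(g) for g in groups if g]
    return centres

data = [1.0, 1.2, 0.8, 5.1, 4.9, 5.3, 9.8, 10.1, 10.0]
seeds = sequential_seed(data, threshold=2.0)
centres = sorted(kmeans_refine(data, seeds))
print(centres)  # refined centres, one per group of points
```

    The appeal of the composite scheme is that the unsupervised first pass removes the need to choose the number of clusters or the starting centres by hand.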

  19. Monte Carlo study on pulse response of underwater optical channel

    NASA Astrophysics Data System (ADS)

    Li, Jing; Ma, Yong; Zhou, Qunqun; Zhou, Bo; Wang, Hongyuan

    2012-06-01

    Pulse response of the underwater wireless optical channel is significant for the analysis of channel capacity and error probability. Traditional vector radiative transfer (VRT) theory is not able to deal with the effect of the receiving aperture. On the other hand, typical water tank experiments cannot acquire an accurate pulse response due to the limited time resolution of the photoelectric detector. We present a Monte Carlo simulation model to extract the time-domain pulse response undersea. In comparison with the VRT model, a more accurate pulse response for practical ocean communications can be achieved through statistical analysis of the received photons. The proposed model is more suitable for the study of the underwater optical channel.
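
    A minimal Monte Carlo sketch of the idea: trace many photons, give each a ballistic delay plus random scattering detours, and histogram the arrival times to estimate the pulse response. Every optical parameter below (link range, scattering probability, mean detour) is an invented placeholder, not a value from the paper.

```python
# Toy Monte Carlo pulse-response sketch. Each photon's arrival time is
# its ballistic delay plus detours from a random number of scattering
# events; arrival times are then histogrammed. Parameters are invented.
import random

random.seed(42)
C_WATER = 2.25e8     # approximate speed of light in water, m/s
LINK = 20.0          # assumed transmitter-receiver range, m
SCATTER_P = 0.3      # assumed probability of (another) scattering event
DETOUR = 0.5         # assumed mean extra path length per scatter, m

def photon_delay():
    extra = 0.0
    while random.random() < SCATTER_P:
        extra += random.expovariate(1.0 / DETOUR)
    return (LINK + extra) / C_WATER

delays = sorted(photon_delay() for _ in range(20000))
t0, width = delays[0], (delays[-1] - delays[0]) / 10 or 1.0
bins = [0] * 10
for d in delays:
    bins[min(int((d - t0) / width), 9)] += 1
print(bins)  # energy concentrates at early delays with a decaying tail
```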

  20. A Model for Developing and Assessing Community College Students' Conceptions of the Range, Interquartile Range, and Standard Deviation

    ERIC Educational Resources Information Center

    Turegun, Mikhail

    2011-01-01

    Traditional curricular materials and pedagogical strategies have not been effective in developing conceptual understanding of statistics topics and statistical reasoning abilities of students. Much of the changes proposed by statistics education research and the reform movement over the past decade have supported efforts to transform teaching…

  1. Reform-Oriented Teaching of Introductory Statistics in the Health, Social and Behavioral Sciences--Historical Context and Rationale

    ERIC Educational Resources Information Center

    Hassad, Rossi A.

    2009-01-01

    There is widespread emphasis on reform in the teaching of introductory statistics at the college level. Underpinning this reform is a consensus among educators and practitioners that traditional curricular materials and pedagogical strategies have not been effective in promoting statistical literacy, a competency that is becoming increasingly…

  2. Kidney measures beyond traditional risk factors for cardiovascular prediction: A collaborative meta-analysis

    PubMed Central

    Matsushita, Kunihiro; Coresh, Josef; Sang, Yingying; Chalmers, John; Fox, Caroline; Guallar, Eliseo; Jafar, Tazeen; Jassal, Simerjot K.; Landman, Gijs W.D.; Muntner, Paul; Roderick, Paul; Sairenchi, Toshimi; Schöttker, Ben; Shankar, Anoop; Shlipak, Michael; Tonelli, Marcello; Townend, Jonathan; van Zuilen, Arjan; Yamagishi, Kazumasa; Yamashita, Kentaro; Gansevoort, Ron; Sarnak, Mark; Warnock, David G.; Woodward, Mark; Ärnlöv, Johan

    2015-01-01

    Background The utility of estimated glomerular filtration rate (eGFR) and albuminuria for cardiovascular prediction is controversial. Methods We meta-analyzed individual-level data from 24 cohorts (median follow-up time longer than 4 years, varying from 4.2 to 19.0 years) in the Chronic Kidney Disease Prognosis Consortium (637,315 participants without a history of cardiovascular disease) and assessed C-statistic difference and reclassification improvement for cardiovascular mortality and fatal and non-fatal cases of coronary heart disease, stroke, and heart failure in a 5-year timeframe, contrasting prediction models consisting of traditional risk factors with and without creatinine-based eGFR and/or albuminuria (either albumin-to-creatinine ratio [ACR] or semi-quantitative dipstick proteinuria). Findings The addition of eGFR and ACR significantly improved the discrimination of cardiovascular outcomes beyond traditional risk factors in general populations, but the improvement was greater with ACR than with eGFR and more evident for cardiovascular mortality (C-statistic difference 0.0139 [95%CI 0.0105–0.0174] and 0.0065 [0.0042–0.0088], respectively) and heart failure (0.0196 [0.0108–0.0284] and 0.0109 [0.0059–0.0159]) than for coronary disease (0.0048 [0.0029–0.0067] and 0.0036 [0.0019–0.0054]) and stroke (0.0105 [0.0058–0.0151] and 0.0036 [0.0004–0.0069]). Dipstick proteinuria demonstrated smaller improvement than ACR. The discrimination improvement with kidney measures was especially evident in individuals with diabetes or hypertension, but remained significant with ACR for cardiovascular mortality and heart failure in those without either of these conditions. In participants with chronic kidney disease (CKD), the combination of eGFR and ACR for risk discrimination outperformed most single traditional predictors; the C-statistic for cardiovascular mortality declined by 0.023 [0.016–0.030] when omitting eGFR and ACR vs. <0.007 when omitting any single modifiable traditional predictor. Interpretation Creatinine-based eGFR and albuminuria should be taken into account for cardiovascular prediction, especially when they are already assessed for clinical purposes and/or when cardiovascular mortality and heart failure are the outcomes of interest (e.g., the European guidelines on cardiovascular prevention). ACR may have particularly broad implications for cardiovascular prediction. In CKD populations, the simultaneous assessment of eGFR and ACR will facilitate improved cardiovascular risk classification, supporting current CKD guidelines. Funding US National Kidney Foundation and NIDDK PMID:26028594

  3. STS-based education in non-majors college biology

    NASA Astrophysics Data System (ADS)

    Henderson, Phyllis Lee

    The study explored the effect of the science-technology-society (STS) and traditional teaching methods in non-majors biology classes at a community college. It investigated the efficacy of the two methods in developing cognitive abilities at Bloom's first three levels of learning. It compared retention rates in classes taught in the two methods. Changes in student attitude relating to anxiety, fear, and interest in biology were explored. The effect of each method on grade attainment among men and women was investigated. The effect of each method on grade attainment among older and younger students was examined. Results of the study indicated that no significant differences, relating to retention or student attitude, existed in classes taught in the two methods. The study found no significant cognitive gains at Bloom's first three levels in classes taught in the traditional format. In the STS classes no significant gains were uncovered at Bloom's first level of cognition. Statistically significant gains were found in the STS classes at Bloom's second and third levels of cognition. In the classes taught in the traditional format no difference was identified in grade attainment between males and females. In the STS-based classes a small correlational difference between males and females was found with males receiving lower grades than expected. No difference in grade attainment was found between older and younger students taught in the traditional format. In the STS-based classes a small statistically significant difference in grade attainment was uncovered between older and younger students with older students receiving more A's and fewer C's than expected. This study found no difference in the grades of older, female students as compared to all other students in the traditionally taught classes. 
    A weak statistically significant difference was discovered between the grade attainment of older, female students and all other students in the STS classes, with older, female students earning more A's and fewer C's than expected. It was concluded that among the students examined in this investigation STS teaching methods enhanced cognitive gains at Bloom's second and third levels of cognition. STS also strengthened grade attainment among older students and female students. Recommendations for further study included replication of the study to include a larger sample size, other types of institutions, and other academic disciplines in science. Expansion of the study to Bloom's fourth and fifth levels, use of standardized testing instruments to determine attitude, analysis using qualitative methods of investigation, and refinement of the study to provide a true experimental design were also suggested.

  4. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework, along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, while they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature and a comparison against the meta-analyses that use single nucleotide polymorphisms suggests that the studies reporting meta-analysis of haplotypes contain approximately half of the included studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440

  5. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for the formulation of clinical protocols and guidelines, and for decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the comparative profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how to conduct a network meta-analysis, highlighting its risks and benefits for evidence-based practice, and including information on the evolution of the statistical methods, the underlying assumptions and the steps for performing the analysis. PMID:28503228
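
    The simplest building block of indirect comparison is the Bucher method: given direct estimates of A vs C and B vs C, the indirect A-vs-B effect is their difference and the variances add. The effect sizes below are hypothetical log odds ratios, chosen only to show the mechanics; full network meta-analysis generalizes this to whole networks of treatments.

```python
# Bucher adjusted indirect comparison: the simplest network
# meta-analysis building block. Effects are hypothetical log odds
# ratios from A-vs-C and B-vs-C trials.
import math

def bucher(d_ac, se_ac, d_bc, se_bc):
    """Indirect A-vs-B estimate and its standard error."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)
    return d_ab, se_ab

d_ab, se_ab = bucher(d_ac=-0.5, se_ac=0.2, d_bc=-0.2, se_bc=0.15)
lo, hi = d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab
print(d_ab, se_ab, (lo, hi))  # indirect effect with 95% CI
```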

  6. Examining the effectiveness of discriminant function analysis and cluster analysis in species identification of male field crickets based on their calling songs.

    PubMed

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed to identify species accurately owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls of several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding their appropriate usage in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in the identification and classification of field cricket species belonging to the subfamily Gryllinae based on their acoustic signals. Using a comparative approach, we evaluated, for both methods, the optimal number of species and calling song characteristics that lead to the most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using DFA is the need for an a priori classification of songs. The accuracy of classification using cluster analysis, which does not require a priori knowledge, was highest for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals, and that cluster analysis of acoustic signals in crickets works effectively for species classification and identification.
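
    The mechanics of a two-class discriminant function can be sketched with Fisher's linear discriminant: project two call features onto w = Sw^-1 (m1 - m2) and classify by the nearer projected class mean. The two acoustic features and all call measurements below are invented, and real DFA extends this to many species and many song characteristics.

```python
# Minimal two-class Fisher discriminant sketch on two made-up acoustic
# features (say, carrier frequency in kHz and chirp rate).

def mean(vs):
    n = len(vs)
    return [sum(v[0] for v in vs) / n, sum(v[1] for v in vs) / n]

def scatter(vs, m):
    """Within-class scatter matrix contribution (2x2)."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_w(a, b):
    """Discriminant axis w = Sw^-1 (m_a - m_b) via a 2x2 inverse."""
    ma, mb = mean(a), mean(b)
    sa, sb = scatter(a, ma), scatter(b, mb)
    sw = [[sa[i][j] + sb[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

species_a = [(4.8, 1.0), (5.0, 1.2), (5.2, 0.9)]   # hypothetical calls
species_b = [(3.0, 2.1), (3.2, 2.0), (2.9, 2.3)]
w = fisher_w(species_a, species_b)

def classify(x):
    """Assign a call to the species whose projected mean is nearer."""
    proj = w[0] * x[0] + w[1] * x[1]
    pa = sum(w[0] * v[0] + w[1] * v[1] for v in species_a) / 3
    pb = sum(w[0] * v[0] + w[1] * v[1] for v in species_b) / 3
    return "A" if abs(proj - pa) < abs(proj - pb) else "B"

print(classify((5.1, 1.1)), classify((3.1, 2.2)))
```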

  7. Meta-Analysis of Rare Binary Adverse Event Data

    PubMed Central

    Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.

    2013-01-01

    We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for the analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study the estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of heterogeneity of the treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrap test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and that the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
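
    The standard moment-based machinery such work builds on is the DerSimonian-Laird estimator: weight each study by inverse variance, estimate between-study heterogeneity (tau^2) from Cochran's Q, then re-weight. The study effects and variances below are invented log odds ratios for illustration, not data from the paper.

```python
# Moment-based (DerSimonian-Laird) random-effects sketch.
# Inputs are hypothetical study effects with within-study variances.

def dersimonian_laird(effects, variances):
    """Return (pooled random-effects estimate, tau^2 heterogeneity)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)          # truncated at zero
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(wr, effects)) / sum(wr)
    return pooled, tau2

effects = [0.0, 0.2, 0.5, 0.9]     # hypothetical log odds ratios
variances = [0.04, 0.05, 0.06, 0.09]
pooled, tau2 = dersimonian_laird(effects, variances)
print(round(pooled, 3), round(tau2, 3))
```

    The truncation of tau^2 at zero is one source of the small-sample bias that the paper's simulations quantify for rare events.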

  8. Connected Text Reading and Differences in Text Reading Fluency in Adult Readers

    PubMed Central

    Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke

    2013-01-01

    The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is partly due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also partly due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers who differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177

  9. Comparison of repair techniques in small and medium-sized rotator cuff tears in cadaveric sheep shoulders.

    PubMed

    Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz

    2013-01-01

    The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with the new knotless single-row suture anchor technique, and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with the double-row suture anchor technique, were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded at a static velocity of 5 mm/min with a 2.5 kN load cell in an Instron 8874 machine until repair failure. The 1 cm transosseous group was statistically superior to the 1 cm control group (p=0.021, p<0.05), and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and the traditional transosseous suture techniques.

  10. Multivariate analysis of fears in dental phobic patients according to a reduced FSS-II scale.

    PubMed

    Hakeberg, M; Gustafsson, J E; Berggren, U; Carlsson, S G

    1995-10-01

    This study analyzed and assessed dimensions of a questionnaire developed to measure general fears and phobias. A previous factor analysis among 109 dental phobics had revealed a five-factor structure with 22 items and an explained total variance of 54%. The present study analyzed the same material using a multivariate statistical procedure (LISREL) to reveal structural latent variables. The LISREL analysis, based on the correlation matrix, yielded a chi-square of 216.6 with 195 degrees of freedom (P = 0.138) and showed a model with seven latent variables. One was a general fear factor correlated to all 22 items. The other six factors concerned "Illness & Death" (5 items), "Failures & Embarrassment" (5 items), "Social situations" (5 items), "Physical injuries" (4 items), "Animals & Natural phenomena" (4 items). One item (opposite sex) was included in both "Failures & Embarrassment" and "Social situations". The last factor, "Social interaction", combined all the items in "Failures & Embarrassment" and "Social situations" (9 items). In conclusion, this multivariate statistical analysis (LISREL) revealed and confirmed a factor structure similar to our previous study, but added two important dimensions not shown with a traditional factor analysis. This reduced FSS-II version measures general fears and phobias and may be used on a routine clinical basis as well as in dental phobia research.

  11. Characterization and discrimination of raw and vinegar-baked Bupleuri radix based on UHPLC-Q-TOF-MS coupled with multivariate statistical analysis.

    PubMed

    Lei, Tianli; Chen, Shifeng; Wang, Kai; Zhang, Dandan; Dong, Lin; Lv, Chongning; Wang, Jing; Lu, Jincai

    2018-02-01

Bupleuri Radix is a commonly used herb in clinic, and raw and vinegar-baked Bupleuri Radix are both documented in the Pharmacopoeia of the People's Republic of China. According to the theories of traditional Chinese medicine, Bupleuri Radix possesses different therapeutic effects before and after processing. However, the chemical mechanism of this processing is still unknown. In this study, ultra-high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry coupled with multivariate statistical analysis, including principal component analysis and orthogonal partial least squares-discriminant analysis, was developed to holistically compare the difference between raw and vinegar-baked Bupleuri Radix for the first time. As a result, 50 peaks were detected in each of the raw and processed Bupleuri Radix samples, and a total of 49 chemical compounds were identified. Saikosaponin a, saikosaponin d, saikosaponin b3, saikosaponin e, saikosaponin c, saikosaponin b2, saikosaponin b1, 4''-O-acetyl-saikosaponin d, hyperoside and 3',4'-dimethoxy quercetin were identified as potential markers of raw and vinegar-baked Bupleuri Radix. This method has been successfully applied for global analysis of raw and vinegar-processed samples. Furthermore, the underlying hepatoprotective mechanism of Bupleuri Radix was predicted, which was related to the changes in chemical profiling. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Analysis of the economic structure of the eating-out sector: The case of Spain.

    PubMed

    Cabiedes-Miragaya, Laura

    2017-12-01

The objective of this article is to analyse the structure of the Spanish eating-out sector from an economic point of view, and more specifically, from the supply perspective. This aspect has been studied less than the demand side, almost certainly due to the gaps which exist in available official statistics in Spain, and which have been filled basically with consumer surveys. For this reason, focus is also placed on the economic relevance of the sector and attention is drawn to the serious shortcomings regarding official statistics in this domain, in contrast to the priority that hotel industry statistics have traditionally received in Spain. Based on official statistics, a descriptive analysis was carried out, focused mainly, though not exclusively, on diverse structural aspects of the sector. Special emphasis was placed on issues such as business demography (for instance, number and types of enterprises, survival rates, size distribution, and age structure), market concentration and structure of costs. The analysis allowed us to conclude, among other things, that: part of the sector is more concentrated than it may at first appear to be; the dual structure of the sector described by the literature in relation to other countries is also present in the Spanish case; and the impact of ICTs (Information and Communication Technologies) on the sector is, and will foreseeably continue to be, particularly relevant. The main conclusion of this study refers to the fact that consumers have gained prominence in their contribution to shaping the structure of the sector. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. A simple method of equine limb force vector analysis and its potential applications

    PubMed Central

    Robinson, Mark A.; Clayton, Hilary M.

    2018-01-01

Background Ground reaction forces (GRF) measured during equine gait analysis are typically evaluated by analyzing discrete values obtained from continuous force-time data for the vertical, longitudinal and transverse GRF components. This paper describes a simple, temporo-spatial method of displaying and analyzing sagittal plane GRF vectors. In addition, the application of statistical parametric mapping (SPM) is introduced to analyze differences between contralateral fore and hindlimb force-time curves throughout the stance phase. The overall aim of the study was to demonstrate alternative methods of evaluating functional (a)symmetry within horses. Methods GRF and kinematic data were collected from 10 horses trotting over a series of four force plates (120 Hz). The kinematic data were used to determine clean hoof contacts. The stance phase of each hoof was determined using a 50 N threshold. Vertical and longitudinal GRF for each stance phase were plotted both as force-time curves and as force vector diagrams in which vectors originating at the centre of pressure on the force plate were drawn at intervals of 8.3 ms for the duration of stance. Visual evaluation was facilitated by overlay of the vector diagrams for different limbs. Summary vectors representing the magnitude (VecMag) and direction (VecAng) of the mean force over the entire stance phase were superimposed on the force vector diagram. Typical measurements extracted from the force-time curves (peak forces, impulses) were compared with VecMag and VecAng using partial correlation (controlling for speed). Paired samples t-tests (left v. right diagonal pair comparison and high v. low vertical force diagonal pair comparison) were performed on discrete and vector variables using traditional methods and Hotelling's T2 tests on normalized stance phase data using SPM.
Results Evidence from traditional statistical tests suggested that VecMag is more influenced by the vertical force and impulse, whereas VecAng is more influenced by the longitudinal force and impulse. When used to evaluate mean data from the group of ten sound horses, SPM did not identify differences between the left and right contralateral limb pairs or between limb pairs classified according to directional asymmetry. When evaluating a single horse, three periods were identified during which differences in the forces between the left and right forelimbs exceeded the critical threshold (p < .01). Discussion Traditional statistical analysis of 2D GRF peak values, summary vector variables and visual evaluation of force vector diagrams gave concordant results, and both methods identified the same inter-limb asymmetries. As alpha was more tightly controlled using SPM, significance was only found in the individual horse, although T2 plots followed the same trends as the discrete analysis for the group. Conclusions The techniques of force vector analysis and SPM hold promise for investigations of sidedness and asymmetry in horses. PMID:29492341
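The pointwise Hotelling's T2 comparison that underlies the SPM analysis of paired force curves can be sketched as below. This is a simplified illustration on simulated data: the paired GRF differences, the built-in midstance asymmetry, and the pointwise threshold are all assumptions, and a real SPM analysis (e.g. the spm1d package) would control alpha across the curve with random field theory rather than pointwise.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
T = 101          # stance phase normalized to 101 time points
horses = 10      # paired left/right curves from 10 horses (simulated)

# Paired left-minus-right differences of (vertical, longitudinal) GRF,
# arbitrary units; inject a midstance asymmetry on the vertical channel
diff = rng.normal(0.0, 1.0, (horses, T, 2))
diff[:, 40:60, 0] += 2.0

mu = diff.mean(axis=0)
t2 = np.empty(T)
for i in range(T):
    S = np.cov(diff[:, i, :], rowvar=False)           # 2x2 sample covariance
    t2[i] = horses * mu[i] @ np.linalg.solve(S, mu[i])

# Convert T2 to an F statistic and apply a pointwise (uncorrected) threshold
p_dim, nu = 2, horses - 1
F = (nu - p_dim + 1) / (nu * p_dim) * t2
crit = stats.f.ppf(0.99, p_dim, nu - p_dim + 1)
print("supra-threshold time points:", int(np.sum(F > crit)))
```

Supra-threshold runs of time points, rather than single discrete values, are what SPM reports as periods of significant inter-limb difference.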

  14. Comparison of the effects block and traditional schedules have on the number of students who are proficient on the Biology End-of-Course Test in forty public high schools in the state of North Carolina

    NASA Astrophysics Data System (ADS)

    Bonner, Tonia Anita

This study examined the difference between the number of overall students, African-American students, and students with disabilities on a semester 4 x 4 block schedule who were proficient on the North Carolina Biology End-of-Course Test and the number of the same groups of students on a traditional 45-50 minute yearlong schedule who were proficient on the NC Biology End-of-Course Test in the state of North Carolina during the 2009--2010 school year. A causal-comparative design was used and three null hypotheses were tested using chi-square analysis. Archival data were used. The results showed a significant association between schedule type (4 x 4 semester block versus traditional) and the number of overall students and African-American students who were proficient on the NC Biology EOC Test. However, no statistically significant relationship existed between the number of students with disabilities who were educated on a 4 x 4 semester block schedule and those students with disabilities who were educated on a six or seven period traditional schedule in biology. Suggestions for further research are included.
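A chi-square test of association of the kind described can be sketched with scipy; the counts below are purely illustrative assumptions, not the study's archival data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table (illustrative counts):
# rows = schedule type, columns = proficient / not proficient
table = np.array([[420, 180],    # 4 x 4 block schedule
                  [360, 240]])   # traditional schedule

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

For a 2x2 table `chi2_contingency` applies Yates' continuity correction by default; `expected` holds the counts implied by the null hypothesis of no association.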

  15. Modeling work zone crash frequency by quantifying measurement errors in work zone length.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Yildirimoglu, Mehmet

    2013-06-01

Work zones are temporary traffic control zones that can potentially cause safety problems. Maintaining safety, while implementing necessary changes on roadways, is an important challenge traffic engineers and researchers have to confront. In this study, the risk factors in work zone safety evaluation were identified through the estimation of a crash frequency (CF) model. Measurement errors in explanatory variables of a CF model can lead to unreliable estimates of certain parameters. Among these, work zone length raises a major concern in this analysis because it may change as the construction schedule progresses, generally without being properly documented. This paper proposes an improved modeling and estimation approach that involves the use of a measurement error (ME) model integrated with the traditional negative binomial (NB) model. The proposed approach was compared with the traditional NB approach. Both models were estimated using a large dataset that consists of 60 work zones in New Jersey. Results showed that the proposed improved approach outperformed the traditional approach in terms of goodness-of-fit statistics. Moreover, it is shown that the use of the traditional NB approach in this context can lead to the overestimation of the effect of work zone length on the crash occurrence. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Analysis of longitudinal "time series" data in toxicology.

    PubMed

    Cox, C; Cory-Slechta, D A

    1987-02-01

    Studies focusing on chronic toxicity or on the time course of toxicant effect often involve repeated measurements or longitudinal observations of endpoints of interest. Experimental design considerations frequently necessitate between-group comparisons of the resulting trends. Typically, procedures such as the repeated-measures analysis of variance have been used for statistical analysis, even though the required assumptions may not be satisfied in some circumstances. This paper describes an alternative analytical approach which summarizes curvilinear trends by fitting cubic orthogonal polynomials to individual profiles of effect. The resulting regression coefficients serve as quantitative descriptors which can be subjected to group significance testing. Randomization tests based on medians are proposed to provide a comparison of treatment and control groups. Examples from the behavioral toxicology literature are considered, and the results are compared to more traditional approaches, such as repeated-measures analysis of variance.
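The proposed approach, summarizing each subject's curvilinear profile by cubic orthogonal polynomial coefficients and then comparing groups with a median-based randomization test, can be sketched as follows. The group sizes, effect shapes, and the choice of a Legendre basis are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
times = np.linspace(-1, 1, 8)    # 8 repeated measurements, rescaled to [-1, 1]

def profile_coeffs(y):
    """Cubic orthogonal (Legendre) polynomial fit to one subject's profile."""
    return np.polynomial.legendre.legfit(times, y, deg=3)

# Simulated control and treated profiles (10 subjects per group, illustrative)
control = np.array([profile_coeffs(0.2 * times + rng.normal(0, 0.1, times.size))
                    for _ in range(10)])
treated = np.array([profile_coeffs(0.8 * times - 0.3 * times ** 2
                                   + rng.normal(0, 0.1, times.size))
                    for _ in range(10)])

# Randomization test on medians of the linear (degree-1) coefficient
obs = np.median(treated[:, 1]) - np.median(control[:, 1])
pooled = np.concatenate([treated[:, 1], control[:, 1]])
perms = np.array([np.median(g1) - np.median(g2)
                  for g1, g2 in (np.split(rng.permutation(pooled), 2)
                                 for _ in range(2000))])
p = np.mean(np.abs(perms) >= abs(obs))
print(f"observed median difference = {obs:.3f}, permutation p = {p:.4f}")
```

Each row of `control`/`treated` holds four coefficients (constant, linear, quadratic, cubic trend), so the same permutation scheme can be applied to any of the four descriptors.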

  17. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    PubMed Central

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and in the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda. PMID:23930037

  18. The impact of blended teaching on knowledge, satisfaction, and self-directed learning in nursing undergraduates: a randomized, controlled trial.

    PubMed

    Gagnon, Marie-Pierre; Gagnon, Johanne; Desmartis, Marie; Njoya, Merlin

    2013-01-01

    This study aimed to assess the effectiveness of a blended-teaching intervention using Internet-based tutorials coupled with traditional lectures in an introduction to research undergraduate nursing course. Effects of the intervention were compared with conventional, face-to-face classroom teaching on three outcomes: knowledge, satisfaction, and self-learning readiness. A two-group, randomized, controlled design was used, involving 112 participants. Descriptive statistics and analysis of covariance (ANCOVA) were performed. The teaching method was found to have no direct impact on knowledge acquisition, satisfaction, and self-learning readiness. However, motivation and teaching method had an interaction effect on knowledge acquisition by students. Among less motivated students, those in the intervention group performed better than those who received traditional training. These findings suggest that this blended-teaching method could better suit some students, depending on their degree of motivation and level of self-directed learning readiness.

  19. Exploring and revitalizing Indigenous food networks in Saskatchewan, Canada, as a way to improve food security.

    PubMed

    Gendron, Fidji; Hancherow, Anna; Norton, Ashley

    2017-10-01

    The project discussed in this paper was designed to expand research and instigate revitalization of Indigenous food networks in Saskatchewan, Canada, by exploring the current state of local Indigenous food networks, creating a Facebook page, organizing volunteer opportunities and surveying workshop participants regarding their knowledge and interest in Indigenous foods. The survey included Likert scale questions and qualitative questions. Project activities and survey results are discussed using statistical and qualitative analysis of the themes. Results indicate that participants are very interested in learning more about, and having greater access to, traditional foods and suggest that supporting Indigenous food networks may be an appropriate response to food insecurity in communities. Elders and community members are vital players in Indigenous foods exploration and revitalization in Saskatchewan by passing on traditional education. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Anxiety, depression, and cognitive impairment in dementia-specific and traditional assisted living.

    PubMed

    Kang, Hyunwook; Smith, Marianne; Buckwalter, Kathleen C; Ellingrod, Vicki; Schultz, Susan K

    2010-01-01

    With the rapid growth of the assisted living (AL) industry, the number of AL residences providing dementia care continues to increase. The purpose of this article is to describe and compare demographic characteristics; frequency and type of psychiatric diagnoses; level of cognition, depression, and anxiety symptoms; and use of psychotropic medication among older adults in dementia-specific assisted living (DSAL) and traditional assisted living (TAL) residences. Secondary analysis of screening data collected during a cross-sectional, descriptive pilot project compared 18 participants from two DSAL facilities and 28 participants from three TAL facilities. DSAL participants with dementia were more cognitively impaired than TAL participants with dementia (p < 0.001) and used more antipsychotic (67%), anxiolytic (60%), antidepressant (53%), and cognitive-enhancing (87%) medications. No statistically significant differences in demographic factors or levels of anxiety or depression were observed among residents in either setting. Copyright 2010, SLACK Incorporated.

  1. Evaluating the use of simulation with beginning nursing students.

    PubMed

    Alfes, Celeste M

    2011-02-01

    The purpose of this quasi-experimental study was to evaluate and compare the effectiveness of simulation versus a traditional skills laboratory method in promoting self-confidence and satisfaction with learning among beginning nursing students. A single convenience sample of 63 first-semester baccalaureate nursing students learning effective comfort care measures were recruited to compare the two teaching methods. Students participating in the simulation experience were statistically more confident than students participating in the traditional group. There was a slight, nonsignificant difference in satisfaction with learning between the two groups. Bivariate analysis revealed a significant positive relationship between self-confidence and satisfaction. Students in both groups reported higher levels of self-confidence following the learning experiences. Findings may influence the development of simulation experiences for beginning nursing students and encourage the implementation of simulation as a strand from beginning to end in nursing curricula. Copyright 2011, SLACK Incorporated.

  2. Patient responses to Er:YAG laser when used for conservative dentistry.

    PubMed

    Fornaini, Carlo; Riceputi, David; Lupi-Pegurier, Laurence; Rocca, Jean Paul

    2012-11-01

    The utilization of laser technology in conservative dentistry offers several advantages compared with traditional instruments, but one of the still unsolved problems is the difficulty in describing and explaining these advantages to patients. The aims of this study were to verify the efficacy of the way patients are informed and to evaluate their satisfaction with laser-assisted treatment. Before treatment, 100 patients were given a brochure that explained the relevant laser-assisted dental procedures, and after dental treatment an 11-item questionnaire was administered to the patients to evaluate their satisfaction with the treatment. Statistical analysis showed high levels of satisfaction for all the questions, especially those regarding the choice between laser therapy and traditional instruments (100%), choosing laser in the future (89%), and recommending it to family and friends (84%). This study may be relevant when determining the overall satisfaction of patients with this new technology.

  3. Anxiety, Depression, and Cognitive Impairment in Dementia-Specific and Traditional Assisted Living

    PubMed Central

    Kang, Hyunwook; Smith, Marianne; Buckwalter, Kathleen C.; Ellingrod, Vicki; Schultz, Susan K.

    2010-01-01

    With the rapid growth of the assisted living (AL) industry, the number of AL residences providing dementia care continues to increase. The purpose of this article is to describe and compare demographic characteristics; frequency and type of psychiatric diagnoses; level of cognition, depression, and anxiety symptoms; and use of psychotropic medication among older adults in dementia-specific assisted living (DSAL) and traditional assisted living (TAL) residences. Secondary analysis of screening data collected during a cross-sectional, descriptive pilot project compared 18 participants from two DSAL facilities and 28 participants from three TAL facilities. DSAL participants with dementia were more cognitively impaired than TAL participants with dementia (p < 0.001) and used more antipsychotic (67%), anxiolytic (60%), antidepressant (53%), and cognitive-enhancing (87%) medications. No statistically significant differences in demographic factors or levels of anxiety or depression were observed among residents in either setting. PMID:20047249

  4. Pancrustacean phylogeny: hexapods are terrestrial crustaceans and maxillopods are not monophyletic

    PubMed Central

    Regier, Jerome C.; Shultz, Jeffrey W.; Kambic, Robert E.

    2005-01-01

    Recent molecular analyses indicate that crustaceans and hexapods form a clade (Pancrustacea or Tetraconata), but relationships among its constituent lineages, including monophyly of crustaceans, are controversial. Our phylogenetic analysis of three protein-coding nuclear genes from 62 arthropods and lobopods (Onychophora and Tardigrada) demonstrates that Hexapoda is most closely related to the crustaceans Branchiopoda (fairy shrimp, water fleas, etc.) and Cephalocarida+Remipedia, thereby making hexapods terrestrial crustaceans and the traditionally defined Crustacea paraphyletic. Additional findings are that Malacostraca (crabs, isopods, etc.) unites with Cirripedia (barnacles, etc.) and they, in turn, with Copepoda, making the traditional crustacean class Maxillopoda paraphyletic. Ostracoda (seed shrimp)—either all or a subgroup—is associated with Branchiura (fish lice) and likely to be basal to all other pancrustaceans. A Bayesian statistical (non-clock) estimate of divergence times suggests a Precambrian origin for Pancrustacea (600 Myr ago or more), which precedes the first unambiguous arthropod fossils by over 60 Myr. PMID:15734694

  5. A pre-admission program for underrepresented minority and disadvantaged students: application, acceptance, graduation rates and timeliness of graduating from medical school.

    PubMed

    Strayhorn, G

    2000-04-01

To determine whether students' performances in a pre-admission program predicted whether participants would (1) apply to medical school, (2) get accepted, and (3) graduate. Using prospectively collected data from participants in the University of North Carolina at Chapel Hill's Medical Education Development Program (MEDP) and data from the Association of American Medical Colleges Student and Applicant Information Management System, the author identified 371 underrepresented minority (URM) students who were full-time participants and completed the program between 1984 and 1989, prior to their acceptance into medical school. Logistic regression analysis was used to determine whether MEDP performance significantly predicted (after statistically controlling for traditional predictors of these outcomes) the proportions of URM participants who applied to medical school and were accepted, the timeliness of graduating, and the proportion graduating. Odds ratios with 95% confidence intervals were calculated to determine the associations between the independent and outcome variables. In separate logistic regression models, MEDP performance significantly predicted each of the study's outcomes after statistically controlling for traditional predictors. Pre-admission programs with similar outcomes can improve the diversity of the physician workforce and the access to health care for underrepresented minority and economically disadvantaged populations.

  6. First arrival time picking for microseismic data based on DWSW algorithm

    NASA Astrophysics Data System (ADS)

    Li, Yue; Wang, Yue; Lin, Hongbo; Zhong, Tie

    2018-03-01

The first arrival time picking is a crucial step in microseismic data processing. When the signal-to-noise ratio (SNR) is low, however, it is difficult to pick the first arrival time accurately with traditional methods. In this paper, we propose the double-sliding-window SW (DWSW) method based on the Shapiro-Wilk (SW) test. The DWSW method detects the first arrival time by making full use of the differences in statistical properties between background noise and effective signals. Specifically, we take the moment at which the statistic of our method reaches its maximum as the first arrival time of the microseismic data. Hence, in our method, there is no need to select a threshold, which makes the algorithm more straightforward to apply when the SNR of microseismic data is low. To verify the reliability of the proposed method, a series of experiments is performed on both synthetic and field microseismic data. Our method is compared with the traditional short-term average/long-term average (STA/LTA) method, the Akaike information criterion, and the kurtosis method. Analysis results indicate that the accuracy rate of the proposed method is superior to that of the other three methods when the SNR is as low as -10 dB.
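The core idea, that Gaussian background noise passes a Shapiro-Wilk normality test while windows containing signal do not, can be sketched with a single sliding window (the actual DWSW statistic combines two windows; the synthetic waveform, window length, and sampling rate below are assumptions).

```python
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(2)
fs = 1000                     # sampling rate in Hz (illustrative)
n = 2000
onset = 1200                  # true first-arrival sample
trace = rng.normal(0.0, 1.0, n)                 # Gaussian background noise
t = np.arange(n - onset) / fs
trace[onset:] += 5.0 * np.sin(2 * np.pi * 60 * t) * np.exp(-2 * t)  # decaying arrival

win = 100                     # sliding-window length in samples
W = np.array([shapiro(trace[i:i + win]).statistic for i in range(n - win)])
pick = int(np.argmin(W))      # window where the data look least Gaussian
print("picked window start:", pick, "(true onset:", onset, ")")
```

The Shapiro-Wilk statistic W stays near 1 for pure-noise windows and drops once a window overlaps the arrival, so the minimum localizes the onset without any user-chosen threshold.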

  7. Size of the Dynamic Bead in Polymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agapov, Alexander L; Sokolov, Alexei P

    2010-01-01

The presented analysis of neutron, mechanical, and MD simulation data available in the literature demonstrates that the dynamic bead size (the smallest subchain that still exhibits the Rouse-like dynamics) in most polymers is significantly larger than the traditionally defined Kuhn segment. Moreover, our analysis emphasizes that even the static bead size (e.g., chain statistics) disagrees with the Kuhn segment length. We demonstrate that the deficiency of the Kuhn segment definition stems from the assumption of a chain being completely extended inside a single bead. The analysis suggests that representation of a real polymer chain by the bead-and-spring model with a single parameter C cannot be correct. One needs more parameters to reflect correctly the details of the chain structure in the bead-and-spring model.

  8. A randomized study of new sling exercise treatment vs traditional physiotherapy for patients with chronic whiplash-associated disorders with unsettled compensation claims.

    PubMed

    Vikne, John; Oedegaard, Arit; Laerum, Even; Ihlebaek, Camilla; Kirkesola, Gitle

    2007-04-01

    Many patients with chronic whiplash-associated disorders have reduced neuromuscular control of the neck and head. It has been proposed that a new sling exercise therapy may promote neuromuscular control of the neck. To compare the effects of traditional physiotherapy vs traditional physiotherapy combined with a new sling exercise therapy on discomfort and function in patients with chronic whiplash-associated disorders who have unsettled compensation claims; and to investigate possible additional effects of guided, long-term home training. A randomized multi-centre trial with 4 parallel groups. A total of 214 patients were assigned randomly to 4 treatment groups, and received either traditional physiotherapy with or without home training, or new sling exercise therapy with or without home training. Outcome measures were pain, disability, psychological distress, sick leave and physical tests. A total of 171 patients (80%) completed the study. There were no important statistical or clinical differences between the groups after 4 months of treatment. There was a small statistically significant effect at 12-month follow-up in both groups with home training regarding pain during rest (p = 0.05) and reported fatigue in the final week (p = 0.02). No statistically significant differences were found between the traditional physiotherapy group and the new sling exercise group, with or without home training. Since the groups were not compared with a control group without treatment, we cannot conclude that the studied treatments are effective for patients with whiplash-associated disorder, only that they did not differ in our study.

  9. Lateral ventricle morphology analysis via mean latitude axis.

    PubMed

    Paniagua, Beatriz; Lyall, Amanda; Berger, Jean-Baptiste; Vachet, Clement; Hamer, Robert M; Woolson, Sandra; Lin, Weili; Gilmore, John; Styner, Martin

    2013-03-29

Statistical shape analysis has emerged as an insightful method for evaluating brain structures in neuroimaging studies; however, most shape frameworks are surface-based and thus directly depend on the quality of surface alignment. In contrast, medial descriptions employ thickness information as an alignment-independent shape metric. We propose a joint framework that computes local medial thickness information via a mean latitude axis from the well-known spherical harmonic (SPHARM-PDM) shape framework. In this work, we applied SPHARM-derived medial representations to the morphological analysis of lateral ventricles in neonates. Mild ventriculomegaly (MVM) subjects are compared to healthy controls to highlight the potential of the methodology. Lateral ventricles were obtained from MRI scans of neonates (9-144 days of age) from 30 MVM subjects as well as age- and sex-matched normal controls (60 total). SPHARM-PDM shape analysis was extended to compute a mean latitude axis directly from the spherical parameterization. Local thickness and area were straightforwardly determined. MVM and healthy controls were compared using local MANOVA, and the results were compared with the traditional SPHARM-PDM analysis. Both surface and mean latitude axis findings successfully differentiate MVM and healthy lateral ventricle morphology. Lateral ventricles in MVM neonates show enlarged shapes in tail and head. The mean latitude axis is able to find significant differences all along the lateral ventricle shape, demonstrating that local thickness analysis provides significant insight over traditional SPHARM-PDM. This study is the first to precisely quantify 3D lateral ventricle morphology in MVM neonates using shape analysis.

  10. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of subspaces and difficult fault isolation are the common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage requirement of the subspace information. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling (T2) statistic are also realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of different variables. For fault isolation of subspaces based on the T2 statistic, the relationship between the statistic indicator and state variables is constructed, and the constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, the statistical method is adopted to relate single subspaces to multiple subspaces to increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and to improve the robustness of the principal component model, and establishes the relationship between the state variables and fault detection indicators for fault isolation.
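The two monitoring statistics can be sketched for an ordinary (non-multi-way) PCA model; the sensor data, the injected fault, and the number of retained components below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# Normal operating data (illustrative): 200 samples from 5 correlated sensors
scores = rng.normal(0, 1, (200, 2))
loadings = rng.normal(0, 1, (2, 5))
X = scores @ loadings + 0.1 * rng.normal(0, 1, (200, 5))

mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2                                   # retained principal components
P = Vt[:k].T                            # loading matrix (5 x 2)
lam = s[:k] ** 2 / (len(X) - 1)         # variances of the retained PCs

def t2_spe(x):
    """Hotelling T2 (model subspace) and SPE (residual subspace) for one sample."""
    t = P.T @ (x - mu)                  # scores of the new sample
    resid = (x - mu) - P @ t            # part not explained by the PCA model
    return float(t @ (t / lam)), float(resid @ resid)

t2_ok, spe_ok = t2_spe(X[0])
t2_fault, spe_fault = t2_spe(X[0] + np.array([0, 0, 3.0, 0, 0]))  # sensor-3 step fault
print(f"normal: T2={t2_ok:.2f} SPE={spe_ok:.4f}  fault: T2={t2_fault:.2f} SPE={spe_fault:.4f}")
```

A fault that breaks the learned sensor correlations inflates SPE; for isolation, the per-variable terms of `resid ** 2` (residual contributions) point to the offending sensor, which mirrors the SPE-based contribution analysis described in the abstract.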

  11. Clinically significant and practical! Enhancing precision does make a difference. Reply to McGlinchey and Jacobson, Hsu, and Speer.

    PubMed

    Hageman, W J; Arrindell, W A

    1999-12-01

    Based on a secondary analysis of the Jacobson and Truax [Jacobson, N.S. & Truax, P. (1991). A statistical approach to defining meaningful change in psychotherapy research. Journal of Consulting and Clinical Psychology, 59, 12-19.] data using both their own traditional approach and the refined method advanced by Hageman and Arrindell [Hageman, W.J.J.M., & Arrindell, W.A. (1999). Establishing clinically significant change: increment of precision and the distinction between individual and group level of analysis. Behaviour Research and Therapy, 37, 1169-1193], McGlinchey and Jacobson [McGlinchey, J. B., & Jacobson, N. S. (1999). Clinically significant but impractical? A response to Hageman and Arrindell. Behaviour Research and Therapy, 37, 1211-1217.] reported practically identical findings on reliable and clinically significant change across the two approaches. This led McGlinchey and Jacobson to conclude that there is little practical gain in utilizing the refined method over the traditional approach. Close inspection of the data used by McGlinchey and Jacobson, however, revealed a serious mistake with respect to the value of the standard error of measurement that was employed in their calculations. When the proper index value was utilized, further re-analysis by the present authors disclosed clear differences (i.e., different classifications of subjects) across the two approaches. Importantly, these differences followed exactly the same pattern as depicted in Table 2 in Hageman and Arrindell (1999). The theoretical advantages of the refined method, i.e. enhanced precision, appropriate distinction between analysis at the individual and group levels, and maximal comparability of findings across studies, exceed those of the traditional method.
Application of the refined method may be carried out within approximately half an hour, which not only supports its practical manageability, but also challenges the suggestion of McGlinchey and Jacobson (1999) that the relevant method would be too complex (impractical) for the average scientist. The reader is offered the opportunity of obtaining an SPSS setup in the form of an ASCII text file by means of which the relevant calculations can be carried out. The ways in which the valuable commentaries by Hsu [Hsu, L. M. (1999). A comparison of three methods of identifying reliable and clinically significant client changes: commentary on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1195-1202.] and Speer [Speer, D. C. (1999). What is the role of two-wave designs in clinical research? Comment on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1203-1210.] contribute to a better understanding of the technical/statistical backgrounds of the traditional and refined methods are also discussed.
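    The traditional Jacobson-Truax index debated in this record has a simple closed form. A minimal sketch with hypothetical scores follows; the refined Hageman-Arrindell indices are more involved and are not reproduced here:

```python
# Traditional Jacobson-Truax reliable change index; all numbers are hypothetical.
from math import sqrt

def reliable_change(pre, post, sd_pre, r_xx):
    """RC = (post - pre) / S_diff, with S_diff = sqrt(2)*SEm and SEm = SD*sqrt(1 - r_xx)."""
    se_m = sd_pre * sqrt(1.0 - r_xx)   # standard error of measurement
    s_diff = sqrt(2.0) * se_m          # standard error of the difference score
    return (post - pre) / s_diff

# Hypothetical pre/post symptom scores, pretest SD, and test-retest reliability.
rc = reliable_change(pre=40.0, post=25.0, sd_pre=7.5, r_xx=0.88)
reliable = abs(rc) > 1.96              # conventional 5% criterion for reliable change
```

    Note how sensitive the index is to the standard error of measurement, which is exactly the quantity the reply says was mis-specified in the earlier re-analysis.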

  12. Using Technology to Promote Mathematical Discourse Concerning Women in Mathematics

    ERIC Educational Resources Information Center

    Phy, Lyn

    2008-01-01

    This paper discusses uses of technology to facilitate mathematical discourse concerning women in mathematics. Such a topic can be introduced in various traditional courses such as algebra, geometry, trigonometry, probability and statistics, or calculus, but it is not included in traditional textbooks. Through the ideas presented here, you can…

  13. Attitudes and Achievement in Introductory Psychological Statistics Classes: Traditional versus Computer-Supported Instruction.

    ERIC Educational Resources Information Center

    Gratz, Zandra S.; And Others

    A study was conducted at a large, state-supported college in the Northeast to establish a mechanism by which a popular software package, Statistical Package for the Social Sciences (SPSS), could be used in psychology program statistics courses in such a way that no prior computer expertise would be needed on the part of the faculty or the…

  14. Predicting Acquisition of Learning Outcomes: A Comparison of Traditional and Activity-Based Instruction in an Introductory Statistics Course.

    ERIC Educational Resources Information Center

    Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.

    The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…

  15. The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study examines the effect of project based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. A quasi-experimental research model was used in this article. In this context, statistics was taught with the traditional method in the control group and was taught using project based…

  16. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. A quasi-experimental research model was used in this study. Following this model, in the control group the traditional method was applied to teach statistics…

  17. Comparison between Complementary Dietary Treatment of Alzheimer Disease in Iranian Traditional Medicine and Modern Medicine

    PubMed Central

    AHMADIAN-ATTARI, Mohammad Mahdi; MOSADDEGH, Mahmoud; KAZEMNEJAD, Anooshiravan; NOORBALA, Ahmad Ali

    2013-01-01

    Abstract Background Dietary recommendations have recently been introduced for Alzheimer Disease (AD). In Iranian old medical manuscripts, there are some nutritional recommendations related to Nesyan (the AD equivalent). The aim of this article was to compare dietary recommendations of Iranian traditional medicine (ITM) with novel medical outcomes. Methods 1) Searching for dietary recommendations and abstinences described in credible ITM manuscripts; 2) Extracting fatty components of the ITM diet according to the database of the Department of Agriculture of the USA; 3) Statistical analysis of fatty elements of traditionally recommended foods via the Mann-Whitney test in comparison with elements of the abstinent ones; 4) Searching for AD dietary recommendations and abstinences currently published in medical journals; 5) Comparing traditional and new dietary suggestions with each other. Results 1) Traditionally recommended foods are fattier than abstinent ones (P<0.001). There are meaningful differences between the unsaturated fatty acids (UFAs) (P<0.001), saturated fatty acids (P<0.001), and cholesterol (P<0.05) of recommended foods and abstinent ones. 2) The traditionally recommended diet is also fattier than the abstinent diet (4.5 times); the UFAs of the recommended diet are 11 times those of the abstinent one; the same holds for cholesterol (1.4 times). 3) Recent studies show that diets with high amounts of UFAs have positive effects on AD; a considerable number of papers emphasize a probable positive role of cholesterol in AD. 4) The traditionally recommended diet is in agreement with recent studies. Conclusion The ITM-recommended diet, which is rich in unsaturated fatty acids and cholesterol, can be utilized for complementary treatment of AD. PMID:26060643

  18. Up Close and Personal: Using Narrative Inquiry to Examine Persistence Strategies of Non-Traditional African American Women Students on a Traditionally Oriented University Campus

    ERIC Educational Resources Information Center

    White, Sharon Lee

    2009-01-01

    According to the National Center for Education Statistics (NCES), nearly half of the enrolled college students in the United States of America (USA) are 24 years of age or older. Over one-third are at least 35 years old, which translates into over four million students being part of a growing mature and/or non-traditional student population. Women…

  19. Tele-Ophthalmology for Age-Related Macular Degeneration and Diabetic Retinopathy Screening: A Systematic Review and Meta-Analysis.

    PubMed

    Kawaguchi, Atsushi; Sharafeldin, Noha; Sundaram, Aishwarya; Campbell, Sandy; Tennant, Matthew; Rudnisky, Christopher; Weis, Ezekiel; Damji, Karim F

    2018-04-01

    To synthesize high-quality evidence comparing traditional in-person screening and tele-ophthalmology screening. Only randomized controlled trials (RCTs) were included in this systematic review and meta-analysis. The intervention of interest was any type of tele-ophthalmology, including screening of diseases using remote devices. Studies involved patients receiving care from any trained provider via tele-ophthalmology, compared with those receiving equivalent face-to-face care. A search was executed on the following databases: Medline, EMBASE, EBM Reviews, Global Health, EBSCO-CINAHL, SCOPUS, ProQuest Dissertations and Theses Global, OCLC Papers First, and Web of Science Core Collection. Six outcomes of care for age-related macular degeneration (AMD), diabetic retinopathy (DR), or glaucoma were measured and analyzed. Two hundred thirty-seven records were assessed at the full-text level; six RCTs fulfilled the inclusion criteria and were included in this review. Four studies involved participants with diabetes mellitus, and two studies examined choroidal neovascularization in AMD. Only data on detection of disease and participation in the screening program were used for the meta-analysis. Tele-ophthalmology had 14% higher odds of detecting disease than traditional examination; however, the result was not statistically significant (n = 2,012, odds ratio: 1.14, 95% confidence interval (CI): 0.52-2.53, p = 0.74). Meta-analysis results show that the odds of having DR screening in the tele-ophthalmology group were 13.15 times those in the traditional screening program (95% CI: 8.01-21.61; p < 0.001). The current evidence suggests that tele-ophthalmology for DR and age-related macular degeneration is as effective as in-person examination and potentially increases patient participation in screening.

  20. Plasma Lipidomic Profiles Improve on Traditional Risk Factors for the Prediction of Cardiovascular Events in Type 2 Diabetes Mellitus.

    PubMed

    Alshehry, Zahir H; Mundra, Piyushkumar A; Barlow, Christopher K; Mellett, Natalie A; Wong, Gerard; McConville, Malcolm J; Simes, John; Tonkin, Andrew M; Sullivan, David R; Barnes, Elizabeth H; Nestel, Paul J; Kingwell, Bronwyn A; Marre, Michel; Neal, Bruce; Poulter, Neil R; Rodgers, Anthony; Williams, Bryan; Zoungas, Sophia; Hillis, Graham S; Chalmers, John; Woodward, Mark; Meikle, Peter J

    2016-11-22

    Clinical lipid measurements do not show the full complexity of the altered lipid metabolism associated with diabetes mellitus or cardiovascular disease. Lipidomics enables the assessment of hundreds of lipid species as potential markers for disease risk. Plasma lipid species (310) were measured by a targeted lipidomic analysis with liquid chromatography electrospray ionization-tandem mass spectrometry on a case-cohort (n=3779) subset from the ADVANCE trial (Action in Diabetes and Vascular Disease: Preterax and Diamicron-MR Controlled Evaluation). The case-cohort was 61% male with a mean age of 67 years. All participants had type 2 diabetes mellitus with ≥1 additional cardiovascular risk factors, and 35% had a history of macrovascular disease. Weighted Cox regression was used to identify lipid species associated with future cardiovascular events (nonfatal myocardial infarction, nonfatal stroke, and cardiovascular death) and cardiovascular death during a 5-year follow-up period. Multivariable models combining traditional risk factors with lipid species were optimized with the Akaike information criteria. C statistics and NRIs were calculated within a 5-fold cross-validation framework. Sphingolipids, phospholipids (including lyso- and ether- species), cholesteryl esters, and glycerolipids were associated with future cardiovascular events and cardiovascular death. The addition of 7 lipid species to a base model (14 traditional risk factors and medications) to predict cardiovascular events increased the C statistic from 0.680 (95% confidence interval [CI], 0.678-0.682) to 0.700 (95% CI, 0.698-0.702; P<0.0001) with a corresponding continuous NRI of 0.227 (95% CI, 0.219-0.235). 
The prediction of cardiovascular death was improved with the incorporation of 4 lipid species into the base model, showing an increase in the C statistic from 0.740 (95% CI, 0.738-0.742) to 0.760 (95% CI, 0.757-0.762; P<0.0001) and a continuous net reclassification index of 0.328 (95% CI, 0.317-0.339). The results were validated in a subcohort with type 2 diabetes mellitus (n=511) from the LIPID trial (Long-Term Intervention With Pravastatin in Ischemic Disease). The improvement in the prediction of cardiovascular events, above traditional risk factors, demonstrates the potential of plasma lipid species as biomarkers for cardiovascular risk stratification in diabetes mellitus. URL: https://clinicaltrials.gov. Unique identifier: NCT00145925. © 2016 American Heart Association, Inc.
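    The C statistic used to compare the models above is, in its simplest binary-outcome form, the probability that an event case receives a higher risk score than a non-event case (the study itself used a survival-adapted version within weighted Cox models). A minimal sketch with made-up scores, not ADVANCE data:

```python
# C statistic (concordance / AUC) for a binary outcome; made-up risk scores.
from itertools import product

def c_statistic(scores, events):
    """Fraction of (event, non-event) pairs in which the event case scores higher."""
    pairs = [(s1, s0)
             for (s1, e1), (s0, e0) in product(zip(scores, events), repeat=2)
             if e1 == 1 and e0 == 0]
    if not pairs:
        return float("nan")
    concordant = sum(1.0 if s1 > s0 else 0.5 if s1 == s0 else 0.0
                     for s1, s0 in pairs)            # ties count as half
    return concordant / len(pairs)

risk  = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # hypothetical predicted risks
event = [1,   1,   0,   1,   0,   0]     # hypothetical observed outcomes
auc = c_statistic(risk, event)
```

    A gain such as 0.680 to 0.700 reported above means the lipid-augmented model correctly orders a randomly chosen event/non-event pair 2 percentage points more often.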

  1. Effectiveness of Neuromuscular Electrical Stimulation on Patients With Dysphagia With Medullary Infarction.

    PubMed

    Zhang, Ming; Tao, Tao; Zhang, Zhao-Bo; Zhu, Xiao; Fan, Wen-Guo; Pu, Li-Jun; Chu, Lei; Yue, Shou-Wei

    2016-03-01

    To evaluate and compare the effects of neuromuscular electrical stimulation (NMES) acting on the sensory input or motor muscle in treating patients with dysphagia with medullary infarction. Prospective randomized controlled study. Department of physical medicine and rehabilitation. Patients with dysphagia with medullary infarction (N=82). Participants were randomized over 3 intervention groups: traditional swallowing therapy, sensory approach combined with traditional swallowing therapy, and motor approach combined with traditional swallowing therapy. Electrical stimulation sessions were for 20 minutes, twice a day, for 5d/wk, over a 4-week period. Swallowing function was evaluated by the water swallow test and Standardized Swallowing Assessment, oral intake was evaluated by the Functional Oral Intake Scale, quality of life was evaluated by the Swallowing-Related Quality of Life (SWAL-QOL) Scale, and cognition was evaluated by the Mini-Mental State Examination (MMSE). There were no statistically significant differences between the groups in age, sex, duration, MMSE score, or severity of the swallowing disorder (P>.05). All groups showed improved swallowing function (P≤.01); the sensory approach combined with traditional swallowing therapy group showed significantly greater improvement than the other 2 groups, and the motor approach combined with traditional swallowing therapy group showed greater improvement than the traditional swallowing therapy group (P<.05). SWAL-QOL Scale scores increased more significantly in the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups than in the traditional swallowing therapy group, and the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups showed statistically significant differences (P=.04). 
NMES that targets either sensory input or motor muscle coupled with traditional therapy is conducive to recovery from dysphagia and improves quality of life for patients with dysphagia with medullary infarction. A sensory approach appears to be better than a motor approach. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Building gene expression profile classifiers with a simple and efficient rejection option in R.

    PubMed

    Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco; Savino, Alessandro; Hafeezurrehman, Hafeez

    2011-01-01

    The collection of gene expression profiles from DNA microarrays and their analysis with pattern recognition algorithms is a powerful technology applied to several biological problems. Common pattern recognition systems classify samples by assigning them to a set of known classes. However, in a clinical diagnostics setup, novel and unknown classes (new pathologies) may appear, and one must be able to reject those samples that do not fit the trained model. The problem of implementing a rejection option in a multi-class classifier has not been widely addressed in the statistical literature. Gene expression profiles represent a critical case study since they suffer from the curse of dimensionality, which negatively reflects on the reliability of both traditional rejection models and more recent approaches such as one-class classifiers. This paper presents a set of empirical decision rules that can be used to implement a rejection option in a set of multi-class classifiers widely used for the analysis of gene expression profiles. In particular, we focus on the classifiers implemented in the R Language and Environment for Statistical Computing (R for short in the remainder of this paper). The main contribution of the proposed rules is their simplicity, which enables an easy integration with available data analysis environments. Since tuning the involved parameters of a rejection model is often a complex and delicate task, in this paper we exploit an evolutionary strategy to automate this process. This allows the final user to maximize the rejection accuracy with minimum manual intervention. This paper shows how simple decision rules can help apply complex machine learning algorithms in real experimental setups.
The proposed approach is almost completely automated and therefore a good candidate for being integrated into data analysis flows in labs where the machine learning expertise required to tune traditional classifiers might not be available.
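    The paper's empirical rules are tuned to specific R classifiers; the sketch below only illustrates the general idea of a rejection option, refusing to classify when the top posterior probability falls below a threshold. Class names, probabilities, and the threshold are hypothetical placeholders:

```python
# Generic rejection option: refuse to classify when the top posterior is weak.
# Class names, probabilities, and the threshold are hypothetical placeholders,
# not the paper's evolved decision rules.
def classify_with_reject(posteriors, labels, threshold=0.75):
    """Return the most probable label, or None (reject) below the threshold."""
    best = max(range(len(posteriors)), key=lambda i: posteriors[i])
    return labels[best] if posteriors[best] >= threshold else None

labels = ["class_A", "class_B", "class_C"]
confident = classify_with_reject([0.92, 0.05, 0.03], labels)  # accepted
ambiguous = classify_with_reject([0.40, 0.35, 0.25], labels)  # rejected
```

    In the paper's setting the threshold itself is what the evolutionary strategy optimizes, trading rejection rate against classification accuracy.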

  3. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data is downloaded, and all selected data querying occurs transparently on the server side.
Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.
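    As a small illustration of that closing point, a running average over a single data rod (one grid cell's history) reduces to a 1-D convolution; the series and window length below are made up:

```python
# A data rod as a 1-D time series; centered running average via convolution.
# The series values and the window length are made up for illustration.
import numpy as np

rod = np.array([0.0, 0.2, 0.4, 0.3, 0.8, 0.9, 0.7, 0.6])  # one cell's history
window = 3
kernel = np.ones(window) / window
running_avg = np.convolve(rod, kernel, mode="valid")      # one value per full window
```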

  4. Orexinergic Neurotransmission in Temperature Responses to Methamphetamine and Stress: Mathematical Modeling as a Data Assimilation Approach

    PubMed Central

    Behrouzvaziri, Abolhassan; Fu, Daniel; Tan, Patrick; Yoo, Yeonjoo; Zaretskaia, Maria V.; Rusyniak, Daniel E.; Molkov, Yaroslav I.; Zaretsky, Dmitry V.

    2015-01-01

    Experimental Data Orexinergic neurotransmission is involved in mediating temperature responses to methamphetamine (Meth). In experiments in rats, SB-334867 (SB), an antagonist of orexin receptors (OX1R), at a dose of 10 mg/kg decreases late temperature responses (t>60 min) to an intermediate dose of Meth (5 mg/kg). A higher dose of SB (30 mg/kg) attenuates temperature responses to a low dose (1 mg/kg) of Meth and to stress. In contrast, it significantly exaggerates early responses (t<60 min) to intermediate and high doses (5 and 10 mg/kg) of Meth. As pretreatment with SB also inhibits the temperature response to the stress of injection, traditional statistical analysis of temperature responses is difficult. Mathematical Modeling We have developed a mathematical model that explains the complexity of temperature responses to Meth as the interplay between excitatory and inhibitory nodes. We have extended the developed model to include the stress of manipulations and the effects of SB. Stress is synergistic with Meth in its action on the excitatory node. Orexin receptors mediate an activation of both excitatory and inhibitory nodes by low doses of Meth, but not of the node activated by high doses (HD). Exaggeration of early responses to high doses of Meth involves disinhibition: a low dose of SB decreases tonic inhibition of HD and lowers the activation threshold, while the higher dose suppresses the inhibitory component. Using a modeling approach for data assimilation appears efficient in separating individual components of a complex response, an analysis unachievable by traditional data processing methods. PMID:25993564

  5. Mathematical Analysis of Vehicle Delivery Scale of Bike-Sharing Rental Nodes

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Liu, J.; Liu, L.

    2018-04-01

    Aiming at the lack of scientific and reasonable judgment of vehicle delivery scale and insufficient optimization of scheduling decisions, and based on features of bike-sharing usage, this paper analyses the applicability of a discrete-time, discrete-state Markov chain and proves the chain to be irreducible, aperiodic, and positive recurrent. Based on the above analysis, the paper concludes that the limit-state (steady-state) probability distribution of the bike-sharing Markov chain exists and is independent of the initial probability distribution. The paper then analyses the difficulty, in the traditional solving algorithm for the bike-sharing Markov chain, of estimating the transition probability matrix parameters and of solving the system of linear equations. To improve feasibility, this paper proposes a "virtual two-node vehicle scale solution" algorithm, which treats all nodes other than the node to be solved as a single virtual node, and gives the transition probability matrix, the steady-state system of linear equations, and the computational methods for the steady-state scale, steady-state arrival time, and scheduling decision of the node to be solved. Finally, the paper evaluates the rationality and accuracy of the steady-state probability of the proposed algorithm by comparing it with the traditional algorithm. By solving the steady-state scale of the nodes one by one, the proposed algorithm is shown to be highly feasible because it lowers the level of computational difficulty and reduces the number of statistics required, which will help bike-sharing companies to optimize the scale and scheduling of nodes.
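    The steady-state distribution this record solves for satisfies pi P = pi with the entries of pi summing to 1. A generic sketch for a small made-up transition matrix (not the paper's virtual two-node construction):

```python
# Steady-state distribution of a finite, irreducible Markov chain:
# solve pi P = pi subject to sum(pi) = 1. The 3x3 matrix is made up.
import numpy as np

P = np.array([[0.5, 0.3, 0.2],    # row-stochastic transition probabilities
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# (P.T - I) pi = 0 has a rank deficiency of one for an irreducible chain,
# so replace one balance equation with the normalization constraint.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)        # steady-state distribution
```

    For an irreducible, aperiodic, positive-recurrent chain, exactly the properties proved in the paper, this solution is unique and independent of the starting distribution.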

  6. Unwanted pregnancy and traditional self-induced abortion methods known among women aged 15 to 49.

    PubMed

    Sensoy, Nazli; Dogan, Nurhan; Sen, Kubra; Aslan, Halit; Tore-Baser, Ayca

    2015-05-01

    To determine the traditional methods known and used to terminate an unwanted pregnancy and the fertility characteristics of married women. The descriptive cross-sectional study was conducted in Turkey at Afyonkarahisar Zübeyde Hanim Child and Maternity Hospital's outpatient clinic between December 27, 2010 and January 7, 2011, and comprised married women aged 17 to 49 who presented for an examination. Questions related to socio-demographic and fertility characteristics as well as known and used traditional abortion methods were included in the questionnaire, which was administered through face-to-face interviews. SPSS 18.0 was used for statistical analysis. The median age of the 600 women in the study was 29.5 (range: 17-49) years. Overall, 134 (22.3%) women had experienced an unwanted pregnancy. In 53 (39.6%) cases, the unwanted pregnancy had occurred between the ages of 30 and 39, and 116 (86.6%) women had married when they were between 15 and 24 (p<0.008) years old. Pregnancy had been concluded normally in 78 (58.2%) women with an unwanted pregnancy, and 34 (35.8%) preferred the withdrawal method for contraception. Traditional abortion methods were known to 413 (68.8%) women, but only 8 (1.3%) had used any of them. The harms of using a traditional abortion method were known to 464 (77.3%) women. Very few women used traditional abortion methods to terminate pregnancy. Knowing the characteristics of women and their need for family planning should be the first priority for the prevention of unwanted pregnancies.

  7. Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging

    PubMed Central

    Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. 
Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895

  8. Effect and safety of early weight-bearing on the outcome after open-wedge high tibial osteotomy: a systematic review and meta-analysis.

    PubMed

    Lee, O-Sung; Ahn, Soyeon; Lee, Yong Seuk

    2017-07-01

    The purpose of this systematic review and meta-analysis was to evaluate the effectiveness and safety of early weight-bearing by comparing clinical and radiological outcomes between early and traditional delayed weight-bearing after open-wedge high tibial osteotomy (OWHTO). A rigorous and systematic approach was used, and the methodological quality of the studies was assessed. Results that could be compared across two or more articles were presented as forest plots. A 95% confidence interval was calculated for each effect size, and we calculated the I² statistic, which gives the percentage of total variation attributable to heterogeneity among studies. The random-effects model was used to calculate the effect size. Six articles were included in the final analysis. All case groups were composed of early full weight-bearing within 2 weeks. All control groups were composed of late full weight-bearing between 6 weeks and 2 months. Pooled analysis was possible for the improvement in Lysholm score, but no statistically significant difference was shown between groups. Other clinical results were also similar between groups. Four studies reported the mechanical femorotibial angle (mFTA), and this result showed no statistically significant difference between groups in the pooled analysis. Furthermore, early weight-bearing showed more favorable results in some radiologic results (osseointegration and patellar height) and complications (thrombophlebitis and recurrence). Our analysis supports that early full weight-bearing after OWHTO using a locking plate leads to improvement in outcomes and is comparable to delayed weight-bearing in terms of clinical and radiological outcomes. Moreover, early weight-bearing was more favorable with respect to some radiologic parameters and complications compared with delayed weight-bearing.
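    The I² statistic and random-effects pooling named in this record can be sketched for made-up study effects. I² is computed from Cochran's Q, and the DerSimonian-Laird estimator used below is one common choice for the between-study variance, not necessarily the review's exact implementation; none of these numbers come from the review:

```python
# Generic heterogeneity (Q, I^2) and DerSimonian-Laird random-effects pooling;
# effect sizes and variances below are made up, not the review's data.
import numpy as np

y = np.array([0.8, 0.1, 0.6, -0.2])   # hypothetical study effect sizes
v = np.array([0.02, 0.02, 0.03, 0.02])  # their within-study variances

w = 1.0 / v                            # fixed-effect (inverse-variance) weights
y_fe = np.sum(w * y) / np.sum(w)       # fixed-effect pooled estimate
Q = np.sum(w * (y - y_fe) ** 2)        # Cochran's Q
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100      # % of variation due to heterogeneity

# DerSimonian-Laird between-study variance, then random-effects pooling.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = (y_re - 1.96 * se_re, y_re + 1.96 * se_re)  # 95% CI, as in the forest plots
```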

  9. Simultaneous analysis and quality assurance for diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Asman, Andrew J; Esparza, Michael L; Burns, Scott S; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W; Davis, Nicole; Cutting, Laurie E; Landman, Bennett A

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible.

  10. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables, whereas they are fixed quantities in traditional statistics, which is not founded on Bayes' theorem. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable number of derivatives that would otherwise have to be computed, and errors of linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
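The error-propagation application described above can be sketched as follows: draw samples of the measurement vector from its distribution, push them through the nonlinear transformation, and take the sample mean and covariance, with no derivatives required. The distribution, transformation, and numerical values below are illustrative assumptions, not taken from the paper; the first-order (Jacobian) propagation is included only for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed measurement vector: mean and covariance (illustrative values).
mu = np.array([2.0, 0.5])
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

def f(x):
    """Nonlinear transformation of the measurement vector
    (polar to Cartesian, as a stand-in example)."""
    return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

# Monte Carlo propagation: sample, transform, summarize.
samples = rng.multivariate_normal(mu, Sigma, size=50_000)
fx = np.apply_along_axis(f, 1, samples)
mc_mean = fx.mean(axis=0)
mc_cov = np.cov(fx, rowvar=False)

# First-order linearized propagation J Sigma J^T for comparison;
# the Monte Carlo estimate needs no such Jacobian.
J = np.array([[np.cos(mu[1]), -mu[0] * np.sin(mu[1])],
              [np.sin(mu[1]),  mu[0] * np.cos(mu[1])]])
lin_cov = J @ Sigma @ J.T
```

For this mildly nonlinear map the two covariance estimates agree closely, while the Monte Carlo mean also captures the small second-order bias that linearization ignores.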

  11. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  12. Identification of Major Histocompatibility Complex-Regulated Body Odorants by Statistical Analysis of a Comparative Gas Chromatography/Mass Spectrometry Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willse, Alan R.; Belcher, Ann; Preti, George

    2005-04-15

    Gas chromatography (GC), combined with mass spectrometry (MS) detection, is a powerful analytical technique that can be used to separate, quantify, and identify volatile compounds in complex mixtures. This paper examines the application of GC-MS in a comparative experiment to identify volatiles that differ in concentration between two groups. A complex mixture might comprise several hundred or even thousands of volatile compounds. Because their number and location in a chromatogram generally are unknown, and because components overlap in populous chromatograms, the statistical problems offer significant challenges beyond traditional two-group screening procedures. We describe a statistical procedure to compare two-dimensional GC-MS profiles between groups, which entails (1) signal processing: baseline correction and peak detection in single ion chromatograms; (2) aligning chromatograms in time; (3) normalizing differences in overall signal intensities; and (4) detecting chromatographic regions that differ between groups. Compared to existing approaches, the proposed method is robust to errors made at earlier stages of analysis, such as missed peaks or slightly misaligned chromatograms. To illustrate the method, we identify differences in GC-MS chromatograms of ether-extracted urine collected from two nearly identical inbred groups of mice, to investigate the relationship between odor and genetics of the major histocompatibility complex.

  13. Immediate Feedback Assessment Technique in a Chemistry Classroom

    NASA Astrophysics Data System (ADS)

    Taylor, Kate R.

    The Immediate Feedback Assessment Technique, or IFAT, is a new testing system that turns a student's traditional multiple-choice test into a chance for hands-on learning and provides teachers with an opportunity to obtain more information about a student's knowledge during testing. In the current study we wanted to know: when students are given the second chance afforded by the IFAT system, are they guessing or using prior knowledge when making their second-chance choice? Additionally, while there has been some adoption of this testing system in non-science disciplines, we wanted to study whether the IFAT system would be well received among faculty in the sciences, more specifically chemistry faculty. By comparing the students' rate of success on the second chance afforded by the IFAT system with the statistical likelihood of guessing correctly, statistical analysis was used to determine whether we observed enough students earning the second-chance points to reject the hypothesis that students were randomly guessing. Our data analysis revealed that it is statistically highly unlikely that students were only guessing when the IFAT system was utilized. (It is important to note that while we can find that students are getting the answer correct at a much higher rate than random guessing would predict, we can never truly know whether every student is using thought or not.)

  14. Identifying currents in the gene pool for bacterial populations using an integrative approach.

    PubMed

    Tang, Jing; Hanage, William P; Fraser, Christophe; Corander, Jukka

    2009-08-01

    The evolution of bacterial populations has recently become considerably better understood due to large-scale sequencing of population samples. It has become clear that DNA sequences from a multitude of genes, as well as a broad sample coverage of a target population, are needed to obtain a relatively unbiased view of its genetic structure and the patterns of ancestry connected to the strains. However, the traditional statistical methods for evolutionary inference, such as phylogenetic analysis, are associated with several difficulties under such an extensive sampling scenario, in particular when a considerable amount of recombination is anticipated to have taken place. To meet the needs of large-scale analyses of population structure for bacteria, we introduce here several statistical tools for the detection and representation of recombination between populations. Also, we introduce a model-based description of the shape of a population in sequence space, in terms of its molecular variability and affinity towards other populations. Extensive real data from the genus Neisseria are utilized to demonstrate the potential of an approach where these population genetic tools are combined with a phylogenetic analysis. The statistical tools introduced here are freely available in BAPS 5.2 software, which can be downloaded from http://web.abo.fi/fak/mnf/mate/jc/software/baps.html.

  15. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    PubMed

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the comparable applicability of orthogonal projections to latent structures (OPLS) statistical model vs traditional linear regression in order to investigate the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation on the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single vessel involvement as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.

  16. Duality between Time Series and Networks

    PubMed Central

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
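One published map of this family is quantile-based: series values are partitioned into quantile bins (one network node per bin) and weighted directed edges record transition frequencies between bins; the approximate inverse is a random walk on the network, which reproduces a coarse-grained version of the series. A minimal sketch of that idea, in which the bin count and test series are illustrative choices rather than anything from the paper:

```python
import numpy as np

def series_to_network(x, q=4):
    """Map a time series to a weighted directed network: node i is
    quantile bin i, and edge weight (i, j) is the relative frequency
    of a transition from bin i to bin j at consecutive time steps."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1))
    bins = np.digitize(x, edges[1:-1])      # bin indices 0..q-1
    W = np.zeros((q, q))
    for i, j in zip(bins[:-1], bins[1:]):
        W[i, j] += 1
    return W / W.sum()

def network_to_series(W, n, rng):
    """Approximate inverse: a random walk on the network generates a
    sequence of quantile-bin indices (a coarse-grained series)."""
    P = W / W.sum(axis=1, keepdims=True)    # row-stochastic transitions
    state, out = 0, []
    for _ in range(n):
        out.append(state)
        state = rng.choice(len(P), p=P[state])
    return np.array(out)

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 40 * np.pi, 2000)) + 0.1 * rng.normal(size=2000)
W = series_to_network(x, q=4)
walk = network_to_series(W, 500, rng)
```

For a slowly varying series like the noisy sine above, most mass sits on and near the diagonal of W (strong self-transitions), whereas a white-noise series would spread mass across all entries; network statistics on W thus distinguish the two regimes.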

  17. Foreign exchange market data analysis reveals statistical features that predict price movement acceleration.

    PubMed

    Nacher, Jose C; Ochiai, Tomoshiro

    2012-05-01

    Increasingly accessible financial data allow researchers to infer market-dynamics-based laws and to propose models that are able to reproduce them. In recent years, several stylized facts have been uncovered. Here we perform an extensive analysis of foreign exchange data that leads to the unveiling of a statistical financial law. First, our findings show that, on average, volatility increases more when the price exceeds the highest (or lowest) value, i.e., breaks the resistance line. We call this the breaking-acceleration effect. Second, our results show that the probability P(T) to break the resistance line in the past time T follows a power law in both real data and theoretically simulated data. However, the probability calculated using real data is rather lower than the one obtained using a traditional Black-Scholes (BS) model. Taken together, the present analysis characterizes a different stylized fact of financial markets and shows that the market exceeds a past (historical) extreme price fewer times than expected by the BS model (the resistance effect). However, when the market does, we predict that the average volatility at that time point will be much higher. These findings indicate that any Markovian model does not faithfully capture the market dynamics.
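The quantity P(T) can be estimated empirically as the fraction of observations that exceed the running maximum of the preceding T observations. A useful baseline: for an i.i.d. continuous series, exchangeability gives P(T) = 1/(T+1) exactly, itself a power law with exponent -1. The sketch below uses an assumed Gaussian series, not market data, purely to illustrate the estimator:

```python
import numpy as np

rng = np.random.default_rng(2)

def breaking_probability(x, T):
    """Empirical probability that an observation exceeds the maximum
    of the preceding T observations ("breaks the resistance line")."""
    hits = 0
    for t in range(T, len(x)):
        if x[t] > x[t - T:t].max():
            hits += 1
    return hits / (len(x) - T)

# i.i.d. case: P(T) = 1/(T+1), a power law with exponent -1.
x = rng.normal(size=100_000)
p1 = breaking_probability(x, 1)   # expected near 1/2
p4 = breaking_probability(x, 4)   # expected near 1/5
p9 = breaking_probability(x, 9)   # expected near 1/10
```

Departures of an empirical P(T) from such a baseline (here i.i.d.; in the paper, a Black-Scholes model) are what the abstract terms the resistance effect.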

  18. Foreign exchange market data analysis reveals statistical features that predict price movement acceleration

    NASA Astrophysics Data System (ADS)

    Nacher, Jose C.; Ochiai, Tomoshiro

    2012-05-01

    Increasingly accessible financial data allow researchers to infer market-dynamics-based laws and to propose models that are able to reproduce them. In recent years, several stylized facts have been uncovered. Here we perform an extensive analysis of foreign exchange data that leads to the unveiling of a statistical financial law. First, our findings show that, on average, volatility increases more when the price exceeds the highest (or lowest) value, i.e., breaks the resistance line. We call this the breaking-acceleration effect. Second, our results show that the probability P(T) to break the resistance line in the past time T follows a power law in both real data and theoretically simulated data. However, the probability calculated using real data is rather lower than the one obtained using a traditional Black-Scholes (BS) model. Taken together, the present analysis characterizes a different stylized fact of financial markets and shows that the market exceeds a past (historical) extreme price fewer times than expected by the BS model (the resistance effect). However, when the market does, we predict that the average volatility at that time point will be much higher. These findings indicate that any Markovian model does not faithfully capture the market dynamics.

  19. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I2 statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I2 statistics and prediction intervals for c-statistics. Conclusion This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
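The pooling step described above (hospital-specific estimates combined by random-effects meta-analysis, with I² quantifying heterogeneity and a prediction interval for a new hospital) can be sketched with the DerSimonian-Laird estimator. The c-statistics and variances below are hypothetical, and the prediction interval uses a simple normal approximation rather than the t-based interval often recommended:

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooling of per-hospital estimates
    (DerSimonian-Laird), returning the pooled estimate, the
    between-hospital variance tau^2, the I^2 statistic, and an
    approximate 95% prediction interval for a new hospital."""
    k = len(estimates)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q, then the method-of-moments between-study variance.
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)
    i2 = max(0.0, (Q - (k - 1)) / Q) if Q > 0 else 0.0
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    half = 1.96 * math.sqrt(se ** 2 + tau2)   # normal approximation
    return pooled, tau2, i2, (pooled - half, pooled + half)

# Hypothetical hospital-specific c-statistics and their variances.
cstats = [0.72, 0.75, 0.78, 0.74, 0.76]
vars_ = [0.0004, 0.0005, 0.0004, 0.0006, 0.0005]
pooled, tau2, i2, pred = dersimonian_laird(cstats, vars_)
```

A wide prediction interval relative to the confidence interval signals limited geographic transportability even when the pooled c-statistic looks adequate.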

  20. Analysis of stationary and dynamic factors affecting highway accident occurrence: A dynamic correlated grouped random parameters binary logit approach.

    PubMed

    Fountas, Grigorios; Sarwar, Md Tawfiq; Anastasopoulos, Panagiotis Ch; Blatt, Alan; Majka, Kevin

    2018-04-01

    Traditional accident analysis typically explores non-time-varying (stationary) factors that affect accident occurrence on roadway segments. However, the impact of time-varying (dynamic) factors is not thoroughly investigated. This paper seeks to simultaneously identify pre-crash stationary and dynamic factors of accident occurrence, while accounting for unobserved heterogeneity. Using highly disaggregate information for the potential dynamic factors, and aggregate data for the traditional stationary elements, a dynamic binary random parameters (mixed) logit framework is employed. With this approach, the dynamic nature of weather-related, and driving- and pavement-condition information is jointly investigated with traditional roadway geometric and traffic characteristics. To additionally account for the combined effect of the dynamic and stationary factors on the accident occurrence, the developed random parameters logit framework allows for possible correlations among the random parameters. The analysis is based on crash and non-crash observations between 2011 and 2013, drawn from urban and rural highway segments in the state of Washington. The findings show that the proposed methodological framework can account for both stationary and dynamic factors affecting accident occurrence probabilities, for panel effects, for unobserved heterogeneity through the use of random parameters, and for possible correlation among the latter. The comparative evaluation among the correlated grouped random parameters, the uncorrelated random parameters logit models, and their fixed parameters logit counterpart demonstrates the potential of the random parameters modeling, in general, and the benefits of the correlated grouped random parameters approach, specifically, in terms of statistical fit and explanatory power. Published by Elsevier Ltd.

  1. Tackling Misconceptions about Linear Associations

    ERIC Educational Resources Information Center

    Huey, Maryann E.; Baker, Deidra L.

    2015-01-01

    Many teachers of required secondary school mathematics classes are introducing statistics and probability topics traditionally relegated to college or AP Statistics courses. As a result, they need guidance in preparing lesson plans and orchestrating effective classroom discussions. In this article, the authors will describe the students' learning…

  2. Effects of group sexual counseling on the traditional perceptions and attitudes of Iranian pregnant women

    PubMed Central

    Navidian, Ali; Rigi, Shahindokht Navabi; Soltani, Parvin

    2016-01-01

    Background Marital relationships may fluctuate due to physical and psychological changes during pregnancy. This study aimed to investigate the effect of group sexual counseling on the traditional perceptions and attitudes of pregnant women. Methods This was a quasiexperimental intervention study. Among the pregnant women who were referred to health care centers in Zahedan, Iran, in 2015 for routine care during pregnancy, 100 individuals were chosen and randomly categorized into two groups: intervention (n=50) and control (n=50). Variables were the participant’s attitudes and beliefs on sexual activity during pregnancy. The data were collected during pregnancy using the Sexual Activities and Attitudes Questionnaire. The questionnaire was completed before and 6 weeks after five sessions of group sexual counseling. Data were analyzed using SPSS software (Version 20) with descriptive and analytical statistics. Results The mean of score changes for sexual attitudes and traditional perceptions in the intervention group was significantly higher than that in the control group (P<0.0001). Analysis of covariance also showed that the mean score of the participant’s traditional perceptions and sexual attitudes in both groups was significantly different after the group sexual counseling. Discussion Due to the positive effect of group sexual counseling on improving the attitudes of pregnant women about sexual issues and reframing the traditional perceptions over sexual activities during pregnancy, it is recommended that this educational intervention should be integrated into counseling and prenatal care for pregnant women. PMID:27366105

  3. Genetics problem solving and worldview

    NASA Astrophysics Data System (ADS)

    Dale, Esther

    The research goal was to determine whether worldview relates to traditional and real-world genetics problem solving. Traditionally, scientific literacy emphasized content knowledge alone because it was sufficient to solve traditional problems. The contemporary definition of scientific literacy is, "The knowledge and understanding of scientific concepts and processes required for personal decision-making, participation in civic and cultural affairs and economic productivity" (NRC, 1996). An expanded definition of scientific literacy is needed to solve socioscientific issues (SSI), complex social issues with conceptual, procedural, or technological associations with science. Teaching content knowledge alone assumes that students will find the scientific explanation of a phenomenon to be superior to a non-science explanation. Formal science and everyday ways of thinking about science are two different cultures (Palmer, 1999). Students address this rift with cognitive apartheid, the boxing away of science knowledge from other types of knowledge (Jegede & Aikenhead, 1999). By addressing worldview, cognitive apartheid may decrease and scientific literacy may increase. Introductory biology students at the University of Minnesota during fall semester 2005 completed a written questionnaire, including a genetics content-knowledge test, four genetic dilemmas, the Worldview Assessment Instrument (WAI), and some items about demographics and religiosity. Six students responded to the interview protocol. Based on statistical analysis and interview data, this study concluded the following: (1) Worldview, in the form of metaphysics, relates to solving traditional genetic dilemmas. (2) Worldview, in the form of agency, relates to solving traditional genetics problems. (3) Thus, worldview must be addressed in curriculum, instruction, and assessment.

  4. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude only, sequence only, and combined magnitude and sequence errors. The performance measures include error analysis, coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude or sequence related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool...

  5. Breaking the Tradition of Summer Vacation to Raise Academic Achievement

    ERIC Educational Resources Information Center

    Ramos, Barbara Kay

    2011-01-01

    This study found that students in settings with a year-round calendar statistically outperformed students with traditional calendars in a school-within-a-school setting in mathematics. The study included reading and math achievement of fifth graders in three school-within-a-school year-round elementary schools. Overall, the study made 16…

  6. The Runners and Injury Longitudinal Study: Injury Recovery Supplement (TRAILS_IR)

    DTIC Science & Technology

    2013-08-01

    (2) develop statistical models that integrate biomechanical, behavioral, and psychological risk factors for injury; (3) determine the length of... a presentation at the annual meeting entitled "Differences in Running Mechanics and Flexibility Between Runners in Minimalist and Traditional Footwear". The following

  7. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared: a traditional method and a mixed-model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.

  8. Managing Clustered Data Using Hierarchical Linear Modeling

    ERIC Educational Resources Information Center

    Warne, Russell T.; Li, Yan; McKyer, E. Lisako J.; Condie, Rachel; Diep, Cassandra S.; Murano, Peter S.

    2012-01-01

    Researchers in nutrition research often use cluster or multistage sampling to gather participants for their studies. These sampling methods often produce violations of the assumption of data independence that most traditional statistics share. Hierarchical linear modeling is a statistical method that can overcome violations of the independence…

  9. A peaking-regulation-balance-based method for wind & PV power integrated accommodation

    NASA Astrophysics Data System (ADS)

    Zhang, Jinfang; Li, Nan; Liu, Jun

    2018-02-01

    The rapid development of China's new energy sector, now and in the future, should focus on the cooperation of wind and PV power. Based on the analysis of the system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for the comprehensive integrated accommodation analysis of wind and PV power is put forward. From the electric power balance during the night peak-load period of a typical day, the wind power installed capacity is determined first; the PV power installed capacity can then be figured out from the midday peak-load hours, which effectively resolves the uncertainty that arises when traditional methods try to determine the combination of wind and solar power simultaneously. The simulation results have validated the effectiveness of the proposed method.

  10. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

    Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double station video observation system. Precise localization of meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using Box-Cox transform as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. Resulting local patterns are processed by data whitening technique and obtained vectors are classified via cluster analysis and Self-Organized Map (SOM).

  11. Application of meta-analysis methods for identifying proteomic expression level differences.

    PubMed

    Amess, Bob; Kluge, Wolfgang; Schwarz, Emanuel; Haenisch, Frieder; Alsaif, Murtada; Yolken, Robert H; Leweke, F Markus; Guest, Paul C; Bahn, Sabine

    2013-07-01

    We present new statistical approaches for identification of proteins with expression levels that are significantly changed when applying meta-analysis to two or more independent experiments. We showed that the Euclidean distance measure has reduced risk of false positives compared to the rank product method. Our Ψ-ranking method has advantages over the traditional fold-change approach by incorporating both the fold-change direction as well as the p-value. In addition, the second novel method, Π-ranking, considers the ratio of the fold-change and thus integrates all three parameters. We further improved the latter by introducing our third technique, Σ-ranking, which combines all three parameters in a balanced nonparametric approach. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
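The abstract does not give the exact definitions of the Ψ-, Π-, and Σ-ranking scores, but the general idea of combining fold-change direction with the p-value for meta-analysis across independent experiments can be illustrated with a generic signed score. Everything below (the score, protein names, and data) is a hypothetical illustration, not the authors' method:

```python
import math

def signed_score(log2_fold_change, p_value):
    """Generic signed score combining fold-change direction with
    significance: sign(log2 FC) * -log10(p). Illustrative only; not
    the paper's Psi/Pi/Sigma definitions."""
    return math.copysign(-math.log10(p_value), log2_fold_change)

# Hypothetical proteins measured in two independent experiments,
# each yielding a (log2 fold change, p-value) pair.
proteins = {
    "P1": [(1.2, 0.001), (0.9, 0.004)],   # up in both, significant
    "P2": [(0.3, 0.40), (-0.2, 0.70)],    # inconsistent, weak
    "P3": [(-1.1, 0.002), (-0.8, 0.01)],  # down in both, significant
}

# Meta-score: sum per-experiment signed scores. Consistent, significant
# changes accumulate; conflicting directions partly cancel.
meta = {name: sum(signed_score(fc, p) for fc, p in runs)
        for name, runs in proteins.items()}
ranked = sorted(meta, key=lambda k: abs(meta[k]), reverse=True)
```

Ranking by |meta| places consistently changed proteins at the top regardless of direction, which is the behavior a fold-change-plus-p-value scheme aims for over ranking by fold change alone.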

  12. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), for analyzing nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach on time series from both the logistic map and experimental fluid flows, showing that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately while using a reduced set of points from the original recurrence plots.
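
    The underlying recurrence-network idea can be sketched as below: a recurrence matrix of a logistic-map series is reinterpreted as an adjacency matrix and summarized by degrees. The threshold eps and series length are illustrative assumptions; the RDE point-reduction step itself is omitted.

```python
def logistic_map(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def recurrence_network(series, eps):
    """Adjacency matrix of a recurrence network: states i and j are linked
    when |x_i - x_j| < eps (self-loops excluded)."""
    n = len(series)
    return [[1 if i != j and abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

series = logistic_map(0.4, 4.0, 50)
adj = recurrence_network(series, 0.1)
degrees = [sum(row) for row in adj]  # a simple network statistical measure
```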

  13. Younger African American Adults' Use of Religious Songs to Manage Stressful Life Events.

    PubMed

    Hamilton, Jill B; Stewart, Jennifer M; Thompson, Keitra; Alvarez, Carmen; Best, Nakia C; Amoah, Kevin; Carlton-LaNey, Iris B

    2017-02-01

    The aim of this study was to explore the use of religious songs in response to stressful life events among young African American adults. Fifty-five young African American adults aged 18-49 participated in a qualitative study involving criterion sampling and open-ended interviews. Data analysis included content analysis and descriptive statistics. Stressful life events were related to work or school; caregiving and death of a family member; and relationships. Religious songs represented five categories: Instructive, Communication with God, Thanksgiving and Praise, Memory of Forefathers, and Life after Death. The tradition of using religious songs in response to stressful life events continues among these young adults. Incorporating religious songs into health-promoting interventions might enhance their cultural relevance to this population.

  14. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis. These data include historical records, a highly correlated secondary contaminant, or expert judgment. Geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus in environmental remediation.
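
    The probability-map logic described above can be sketched as follows, assuming Gaussian estimation errors around each kriged mean; the locations and numbers are invented for illustration.

```python
import math

def exceedance_probability(mean, sd, threshold):
    """P(true concentration > threshold), assuming a Gaussian
    estimation error around the kriged mean."""
    z = (threshold - mean) / sd
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

def cleanup_zones(estimates, threshold, acceptable_risk):
    """Flag locations whose probability of exceeding the decision
    threshold is above the acceptable risk level."""
    return [loc for loc, (m, s) in estimates.items()
            if exceedance_probability(m, s, threshold) > acceptable_risk]

# Invented kriging output: location -> (estimated mean, estimation std. dev.)
estimates = {"A": (12.0, 1.0), "B": (5.0, 1.0), "C": (9.5, 2.0)}
print(cleanup_zones(estimates, threshold=10.0, acceptable_risk=0.05))  # → ['A', 'C']
```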

  15. Optimizing α for better statistical decisions: a case study involving the pace-of-life syndrome hypothesis: optimal α levels set to minimize Type I and II errors frequently result in different conclusions from those using α = 0.05.

    PubMed

    Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E

    2012-12-01

    Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making compared to the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life history characteristics under one theory, over different scales of biological organization. While some of the conclusions reached using optimal α were consistent with those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced probabilities of Type I and Type II errors, and ensured statistical significance was associated with biological relevance. Biologists should seriously consider their choice of α when conducting null hypothesis significance tests, as there are serious disadvantages with consistent reliance on the traditional but arbitrary α = 0.05 significance level. Copyright © 2012 WILEY Periodicals, Inc.
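
    The idea of choosing α to minimize a weighted sum of Type I and Type II error rates can be sketched for a one-sided z-test; the effect sizes, equal weighting, and grid search are illustrative assumptions, not the authors' exact procedure.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    # Inverse standard normal CDF by bisection (adequate for a sketch)
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def optimal_alpha(effect_z, w1=1.0, w2=1.0, grid=1000):
    """Alpha minimizing w1*alpha + w2*beta for a one-sided z-test with
    standardized effect size effect_z under the alternative."""
    best_alpha, best_cost = None, float("inf")
    for i in range(1, grid):
        alpha = i / grid
        beta = norm_cdf(norm_ppf(1.0 - alpha) - effect_z)  # Type II error rate
        cost = w1 * alpha + w2 * beta
        if cost < best_cost:
            best_alpha, best_cost = alpha, cost
    return best_alpha

# An easily detected effect favors an alpha well below 0.05
print(optimal_alpha(4.0))
```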

  16. Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing

    PubMed Central

    Meng, Bo; Cheng, Lihong

    2017-01-01

    The rise of global value chains (GVCs), characterized by so-called “outsourcing”, “fragmentation production”, and “trade in tasks”, has been considered one of the most important phenomena of 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013), in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs, as well as the interdependency of countries in these GVCs that is generally invisible from traditional trade statistics. PMID:28081201

  17. Statistical analysis of atom probe data: detecting the early stages of solute clustering and/or co-segregation.

    PubMed

    Hyde, J M; Cerezo, A; Williams, T J

    2009-04-01

    Statistical analysis of atom probe data has improved dramatically in the last decade and it is now possible to determine the size, the number density and the composition of individual clusters or precipitates such as those formed in reactor pressure vessel (RPV) steels during irradiation. However, the characterisation of the onset of clustering or co-segregation is more difficult and has traditionally focused on the use of composition frequency distributions (for detecting clustering) and contingency tables (for detecting co-segregation). In this work, the authors investigate the possibility of directly examining the neighbourhood of each individual solute atom as a means of identifying the onset of solute clustering and/or co-segregation. The methodology involves comparing the mean observed composition around a particular type of solute with that expected from the overall composition of the material. The methodology has been applied to atom probe data obtained from several irradiated RPV steels. The results show that the new approach is more sensitive to fine scale clustering and co-segregation than that achievable using composition frequency distribution and contingency table analyses.
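
    The neighbourhood-composition idea described above can be sketched in one dimension: compare the mean fraction of solute atoms around each solute atom with the bulk fraction. The positions, species labels, and radius below are invented toy values, not atom probe data.

```python
def mean_neighbor_fraction(atoms, target, radius):
    """Mean fraction of `target` atoms among the neighbours (within `radius`)
    of each `target` atom; values well above the bulk fraction of `target`
    indicate clustering."""
    fractions = []
    for pos, species in atoms:
        if species != target:
            continue
        neighbours = [s for p, s in atoms if p != pos and abs(p - pos) <= radius]
        if neighbours:
            fractions.append(sum(1 for s in neighbours if s == target) / len(neighbours))
    return sum(fractions) / len(fractions)

# Toy 1D "atom probe" data: 10% Cu in Fe, dispersed vs. clustered
dispersed = [(i, "Cu" if i % 10 == 0 else "Fe") for i in range(100)]
clustered = [(i, "Cu" if i < 10 else "Fe") for i in range(100)]
print(mean_neighbor_fraction(dispersed, "Cu", 3))  # → 0.0 (no local enrichment)
print(mean_neighbor_fraction(clustered, "Cu", 3))  # well above the 10% bulk fraction
```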

  18. Bayesian Estimation of Thermonuclear Reaction Rates for Deuterium+Deuterium Reactions

    NASA Astrophysics Data System (ADS)

    Gómez Iñesta, Á.; Iliadis, C.; Coc, A.

    2017-11-01

    The study of d+d reactions is of major interest since their reaction rates affect the predicted abundances of D, 3He, and 7Li. In particular, recent measurements of primordial D/H ratios call for reduced uncertainties in the theoretical abundances predicted by Big Bang nucleosynthesis (BBN). Different authors have studied reactions involved in BBN by incorporating new experimental data and a careful treatment of systematic and probabilistic uncertainties. To analyze the experimental data, Coc et al. used results of ab initio models for the theoretical calculation of the energy dependence of S-factors in conjunction with traditional statistical methods based on χ 2 minimization. Bayesian methods have now spread to many scientific fields and provide numerous advantages in data analysis. Astrophysical S-factors and reaction rates using Bayesian statistics were calculated by Iliadis et al. Here we present a similar analysis for two d+d reactions, d(d, n)3He and d(d, p)3H, which translates into a total decrease of the predicted D/H value by 0.16%.

  19. Identifying hearing loss by means of iridology.

    PubMed

    Stearn, Natalie; Swanepoel, De Wet

    2006-11-13

    Isolated reports of hearing loss presenting as markings on the iris exist, but to date the effectiveness of iridology to identify hearing loss has not been investigated. This study therefore aimed to determine the efficacy of iridological analysis in the identification of moderate to profound sensorineural hearing loss in adolescents. A controlled trial was conducted with an iridologist, blind to the actual hearing status of participants, analyzing the irises of participants with and without hearing loss. Fifty hearing impaired and fifty normal hearing subjects, between the ages of 15 and 19 years, controlled for gender, participated in the study. An experienced iridologist analyzed the randomised set of participants' irises. A 70% correct identification of hearing status was obtained by iridological analyses, with a false negative rate of 41% compared to a 19% false positive rate. The respective sensitivity and specificity rates therefore came to 59% and 81%. Iridological analysis of hearing status indicated a statistically significant relationship to actual hearing status (P < 0.05). Although statistically significant, the sensitivity and specificity rates for identifying hearing loss by iridology were not comparable to those of traditional audiological screening procedures.
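
    The reported rates are internally consistent, as a quick computation shows (sensitivity = 1 − false-negative rate, specificity = 1 − false-positive rate, and accuracy is their prevalence-weighted average with 50 subjects per group):

```python
def screening_stats(fnr, fpr, n_pos, n_neg):
    """Sensitivity, specificity, and overall accuracy of a screening test
    from its false-negative and false-positive rates."""
    sensitivity = 1.0 - fnr
    specificity = 1.0 - fpr
    accuracy = (sensitivity * n_pos + specificity * n_neg) / (n_pos + n_neg)
    return sensitivity, specificity, accuracy

sens, spec, acc = screening_stats(0.41, 0.19, 50, 50)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # → 0.59 0.81 0.7
```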

  20. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    PubMed

    Patankar, Ravindra

    2003-10-01

    Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely, crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show a wide variation and hence a big spread on the cycles versus crack length graph. Relatively, large crack growth shows a lesser variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating different stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on the information available about the other phenomenon. Experimentally, it is easier to carry out crack length measurements of large cracks compared to nucleating cracks and small cracks. Thus, it is easier to collect statistical data for large crack growth compared to the painstaking effort it would take to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) for crack propagation against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information of fatigue crack nucleation and small crack growth properties that uses the statistical properties of large crack growth under constant amplitude stress excitation. The statistical properties of large crack growth under constant amplitude stress excitation can be obtained via experiments.

  1. Ultrasonic test of resistance spot welds based on wavelet package analysis.

    PubMed

    Liu, Jing; Xu, Guocheng; Gu, Xiaopeng; Zhou, Guanghao

    2015-02-01

    In this paper, ultrasonic testing of spot welds for stainless steel sheets has been studied. It is indicated that traditional ultrasonic signal analysis in either the time domain or the frequency domain remains inadequate to evaluate the nugget diameter of spot welds. However, the method based on wavelet packet analysis in the time-frequency domain can easily distinguish the nugget from the corona bond by extracting high-frequency signals at different positions of the spot welds, thereby quantitatively evaluating the nugget diameter. The results of the ultrasonic test fit the actual measured values well. The mean of the normal distribution of the error statistics is 0.00187, and the standard deviation is 0.1392. Furthermore, the quality of the spot welds was evaluated, and it was shown that ultrasonic nondestructive testing based on wavelet packet analysis can be used to evaluate the quality of spot welds more reliably than a single destructive tensile test. Copyright © 2014 Elsevier B.V. All rights reserved.
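
    A wavelet-packet decomposition can be sketched with Haar filters: unlike the plain wavelet transform, every node (detail as well as approximation) is split again at each level, so high-frequency bands can be examined at different positions. Haar filters and the toy signal are simplifying assumptions; the paper's wavelet family is not specified here.

```python
def haar_step(signal):
    """One Haar analysis step: half-band approximation and detail coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_packet(signal, depth):
    """Full wavelet-packet tree: every node is split again at each level,
    giving a uniform time-frequency tiling."""
    nodes = [signal]
    for _ in range(depth):
        next_nodes = []
        for node in nodes:
            a, d = haar_step(node)
            next_nodes.extend([a, d])
        nodes = next_nodes
    return nodes

# High-frequency energy per node could then discriminate nugget from corona bond
print(wavelet_packet([1.0, 1.0, 2.0, 2.0], 1))  # → [[1.0, 2.0], [0.0, 0.0]]
```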

  2. Study of Aided Diagnosis of Hepatic Carcinoma Based on Artificial Neural Network Combined with Tumor Marker Group

    NASA Astrophysics Data System (ADS)

    Tan, Shanjuan; Feng, Feifei; Wu, Yongjun; Wu, Yiming

    To develop a computer-aided diagnostic scheme by using an artificial neural network (ANN) combined with tumor markers for diagnosis of hepatic carcinoma (HCC) as a clinical assistant method. 140 serum samples (50 malignant, 40 benign and 50 normal) were analyzed for α-fetoprotein (AFP), carbohydrate antigen 125 (CA125), carcinoembryonic antigen (CEA), sialic acid (SA) and calcium (Ca). The five tumor marker values were then used as ANN input data. The result of the ANN was compared with that of discriminant analysis using receiver operating characteristic (ROC) curve and area under the curve (AUC) analysis. The diagnostic accuracy of the ANN and discriminant analysis among all samples of the test group was 95.5% and 79.3%, respectively. Analysis of multiple tumor markers based on an ANN may be a better choice than traditional statistical methods for differentiating HCC from benign or normal samples.
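
    The ROC AUC used for the comparison equals the probability that a randomly chosen positive case scores above a randomly chosen negative one; a minimal empirical sketch (the classifier scores below are invented):

```python
def auc(pos_scores, neg_scores):
    """Empirical ROC AUC: the probability that a randomly chosen positive
    case scores higher than a randomly chosen negative one (ties count 0.5)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented classifier outputs for malignant vs. non-malignant samples
print(auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.75]))
```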

  3. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
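
    Combining independent relative uncertainty components "mathematically" is commonly done by root-sum-of-squares; the component values below are invented for illustration and are not the study's data.

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical components: microorganism type, product effect, reading error
u_c = combined_relative_uncertainty([0.20, 0.15, 0.25])
print(round(u_c, 3))  # → 0.354
```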

  4. Comparison of Response Surface and Kriging Models in the Multidisciplinary Design of an Aerospike Nozzle

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.

    1998-01-01

    The use of response surface models and kriging models is compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial models for approximation, kriging is presented as an alternative statistical-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models using only a constant for the underlying global model and a Gaussian correlation function perform as well as the second order polynomial response surface models.
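
    A minimal sketch of kriging with a Gaussian correlation function, illustrating the interpolation property that distinguishes it from least-squares polynomial response surfaces. This is simple kriging with a known zero mean and an arbitrary theta, a simplification of the paper's formulation.

```python
import math

def gauss_corr(x1, x2, theta):
    """Gaussian correlation function R(x1, x2) = exp(-theta*(x1-x2)^2)."""
    return math.exp(-theta * (x1 - x2) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def simple_krige(xs, ys, x_new, theta=1.0):
    """Simple kriging predictor with zero mean: weights w solve R w = r."""
    R = [[gauss_corr(a, b, theta) for b in xs] for a in xs]
    r = [gauss_corr(a, x_new, theta) for a in xs]
    w = solve(R, r)
    return sum(wi * yi for wi, yi in zip(w, ys))

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
print(simple_krige(xs, ys, 1.0))  # reproduces the observed value at a data point
```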

  5. Accessing and integrating data and knowledge for biomedical research.

    PubMed

    Burgun, A; Bodenreider, O

    2008-01-01

    To review the issues that have arisen with the advent of translational research in terms of integration of data and knowledge, and survey current efforts to address these issues. Using examples from the biomedical literature, we identified new trends in biomedical research and their impact on bioinformatics. We analyzed the requirements for effective knowledge repositories and studied issues in the integration of biomedical knowledge. New diagnostic and therapeutic approaches based on gene expression patterns have brought about new issues in the statistical analysis of data, and new workflows are needed to support translational research. Interoperable data repositories based on standard annotations, infrastructures and services are needed to support the pooling and meta-analysis of data, as well as their comparison to earlier experiments. High-quality, integrated ontologies and knowledge bases serve as a source of prior knowledge used in combination with traditional data mining techniques and contribute to the development of more effective data analysis strategies. As biomedical research evolves from traditional clinical and biological investigations towards omics sciences and translational research, specific needs have emerged, including integrating data collected in research studies with patient clinical data, linking omics knowledge with medical knowledge, modeling the molecular basis of diseases, and developing tools that support in-depth analysis of research data. As such, translational research illustrates the need to bridge the gap between bioinformatics and medical informatics, and opens new avenues for biomedical informatics research.

  6. Interrupted time-series analysis: studying trends in neurosurgery.

    PubMed

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.

  7. A Value Analysis of Lean Processes in Target Value Design and Integrated Project Delivery.

    PubMed

    Nanda, Upali; K Rybkowski, Zofia; Pati, Sipra; Nejati, Adeleh

    2017-04-01

    To investigate what key stakeholders consider to be the advantages and the opportunities for improvement in using lean thinking and tools in the integrated project delivery (IPD) process. A detailed literature review was followed by case study of a Lean-IPD project. Interviews with members of the project leadership team, focus groups with the integrated team as well as the design team, and an online survey of all stakeholders were conducted. Statistical analysis and thematic content analysis were used to analyze the data, followed by a plus-delta analysis. (1) Learning is a large, implicit benefit of Lean-IPD that is not currently captured by any success metric; (2) the cardboard mock-up was the most successful lean strategy; (3) although a collaborative project, the level of influence of different stakeholder groups was perceived to be different by different stakeholders; (4) overall, Lean-IPD was rated as better than traditional design-bid-build methods; and (5) opportunities for improvement reported were increase in accurate cost estimating, more efficient use of time, perception of imbalance of control/influence, and need for facilitation (which represents different points of view). While lean tools and an IPD method are preferred to traditional design-bid-build methods, the perception of different stakeholders varies and more work needs to be done to allow a truly shared decision-making model. Learning was identified as one of the biggest advantages.

  8. Effect of the Earth's rotation on subduction processes

    NASA Astrophysics Data System (ADS)

    Levin, B. W.; Rodkin, M. V.; Sasorova, E. V.

    2017-09-01

    The role played by the Earth's rotation is very important in problems of physics of the atmosphere and ocean. The importance of inertia forces is traditionally estimated by the value of the Rossby number: if this parameter is small, the Coriolis force considerably affects the character of movements. In the case of convection in the Earth's mantle and movements of lithospheric plates, the Rossby number is quite small; therefore, the effect of the Coriolis force is reflected in the character of movements of the lithospheric plates. Analysis of statistical data on subduction zones verifies this suggestion.
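
    The Rossby number argument can be checked with rough numbers, Ro = U/(fL) with Coriolis parameter f = 2Ω sin(latitude); the plate speed and length scales below are order-of-magnitude assumptions for illustration.

```python
import math

def rossby_number(speed, length, latitude_deg, omega=7.2921e-5):
    """Ro = U / (f L), with Coriolis parameter f = 2*omega*sin(latitude)."""
    f = 2.0 * omega * math.sin(math.radians(latitude_deg))
    return speed / (f * length)

# Lithospheric plate: ~5 cm/yr over a ~1000 km scale at mid-latitude
plate_speed = 0.05 / (365.25 * 24 * 3600.0)  # m/s
print(rossby_number(plate_speed, 1.0e6, 45.0))  # vanishingly small Ro

# For comparison, an ocean current: ~1 m/s over ~100 km gives Ro ~ 0.1
print(rossby_number(1.0, 1.0e5, 45.0))
```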

  9. Systematic review on traditional medicinal plants used for the treatment of malaria in Ethiopia: trends and perspectives.

    PubMed

    Alebie, Getachew; Urga, Befikadu; Worku, Amha

    2017-08-01

    Ethiopia is endowed with abundant medicinal plant resources and traditional medicinal practices. However, available research evidence on indigenous anti-malarial plants is highly fragmented in the country. The present systematic review attempted to explore, synthesize and compile ethno-medicinal research evidence on anti-malarial medicinal plants in Ethiopia. A systematic web search analysis and review was conducted on research literature pertaining to medicinal plants used for traditional malaria treatment in Ethiopia. Data were collected from a total of 82 Ethiopian studies meeting specific inclusion criteria, including published research articles and unpublished thesis reports. SPSS Version 16 was used to summarize relevant ethno-botanical/medicinal information using descriptive statistics, frequency, percentage, tables, and bar graphs. A total of 200 different plant species (from 71 families) used for traditional malaria treatment were identified in different parts of Ethiopia. Distribution and usage patterns of anti-malarial plants showed substantial variability across different geographic settings. A higher diversity of anti-malarial plants was reported from western and southwestern parts of the country. Analysis of ethno-medicinal recipes indicated that mainly fresh leaves were used for preparation of remedies. Decoction, concoction and eating/chewing were found to be the most frequently employed herbal remedy preparation methods. Notably, anti-malarial herbal remedies were administered by the oral route. Information on potential side effects of anti-malarial herbal preparations was patchy; however, some anti-malarial plants were reported to have potentially serious side effects, managed using different local antidotes and specific contra-indications. The study highlighted a rich diversity of indigenous anti-malarial medicinal plants with equally divergent herbal remedy preparation and use patterns in Ethiopia. Baseline information gaps were observed in key geographic settings. Likewise, herbal remedy toxicity risks and countermeasures generally warrant more exhaustive investigation. Experimental research and advanced chemical analysis are also required to validate the therapeutic potential of anti-malarial compounds from promising plant species.

  10. Impact of a Single Unusually Large Rainfall Event on the Level of Risk Used for Infrastructure Design

    NASA Astrophysics Data System (ADS)

    Dhakal, N.; Jain, S.

    2013-12-01

    Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and that, as a result, the parameters of the GEV distribution change with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and consequently on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated based on past rainfall records) for civil infrastructure? To answer these questions, we performed sensitivity analysis of the distribution parameters of the GEV as well as the return periods to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, addition of each outlier event caused an increase in the shape parameter with a huge decrease in the corresponding return period. This is a key consideration for time-varying engineering design. These isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
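
    The sensitivity of an estimated return period to a single outlier can be illustrated with a Gumbel fit (GEV with zero shape) by the method of moments; the annual-maximum values and design event below are invented, and the study itself fits the full GEV.

```python
import math
from statistics import mean, stdev

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(sample):
    """Method-of-moments fit of the Gumbel distribution (GEV, zero shape)."""
    beta = stdev(sample) * math.sqrt(6.0) / math.pi
    mu = mean(sample) - EULER_GAMMA * beta
    return mu, beta

def return_period(x, mu, beta):
    """Return period (years) of the annual maximum exceeding x."""
    cdf = math.exp(-math.exp(-(x - mu) / beta))
    return 1.0 / (1.0 - cdf)

annual_max = [60.0, 72.0, 55.0, 80.0, 65.0, 70.0, 58.0, 75.0, 62.0, 68.0]
design_event = 110.0
t_before = return_period(design_event, *gumbel_fit(annual_max))
t_after = return_period(design_event, *gumbel_fit(annual_max + [150.0]))
print(t_before, t_after)  # a single outlier slashes the estimated return period
```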

  11. Characteristics of genomic signatures derived using univariate methods and mechanistically anchored functional descriptors for predicting drug- and xenobiotic-induced nephrotoxicity.

    PubMed

    Shi, Weiwei; Bugrim, Andrej; Nikolsky, Yuri; Nikolskya, Tatiana; Brennan, Richard J

    2008-01-01

    The ideal toxicity biomarker is characterized by prediction (it is detected prior to traditional pathological signs of injury), accuracy (high sensitivity and specificity), and a mechanistic relationship to the endpoint measured (biological relevance). Gene expression-based toxicity biomarkers ("signatures") have shown good predictive power and accuracy, but are difficult to interpret biologically. We have compared different statistical methods of feature selection with knowledge-based approaches, using GeneGo's database of canonical pathway maps, to generate gene sets for the classification of renal tubule toxicity. The gene set selection algorithms include four univariate analyses: t-statistics, fold-change, B-statistics, and RankProd, and their combination and overlap for the identification of differentially expressed probes. Enrichment analysis following the results of the four univariate analyses, the Hotelling T-square test, and, finally, out-of-bag selection, a variant of cross-validation, were used to identify canonical pathway maps (sets of genes coordinately involved in key biological processes) with classification power. Differentially expressed genes identified by the different statistical univariate analyses all generated reasonably performing classifiers of tubule toxicity. Maps identified by enrichment analysis or Hotelling T-square had lower classification power, but highlighted perturbed lipid homeostasis as a common discriminator of nephrotoxic treatments. The out-of-bag method yielded the best functionally integrated classifier. The map "ephrins signaling" performed comparably to a classifier derived using sparse linear programming, a machine learning algorithm, and represents a signaling network specifically involved in renal tubule development and integrity. Such functional descriptors of toxicity promise to better integrate predictive toxicogenomics with mechanistic analysis, facilitating the interpretation and risk assessment of predictive genomic investigations.

  12. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extract extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed sample. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values, which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would be expected when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.

  13. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to under- or overestimation of the standard error and may consequently produce erroneous inferences. Having established the adverse consequences of ignoring serial dependency, the paper then describes rigorous statistical techniques for overcoming them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
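The paper's central point, that serial dependency among disturbances distorts the naive standard error, can be illustrated with a textbook AR(1) sketch in numpy. The series and the effective-sample-size correction n(1-ρ)/(1+ρ) are standard illustrations, not the paper's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a monthly accident-count anomaly as an AR(1) process: the
# disturbances are serially dependent, violating the i.i.d. assumption
# behind standard inference.
n, rho = 600, 0.7
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()

# Naive standard error of the mean assumes independent observations.
naive_se = x.std(ddof=1) / np.sqrt(n)

# A corrected SE for an AR(1) process uses the effective sample size
# n * (1 - rho) / (1 + rho), estimated from the lag-1 autocorrelation.
rho_hat = np.corrcoef(x[:-1], x[1:])[0, 1]
n_eff = n * (1 - rho_hat) / (1 + rho_hat)
corrected_se = x.std(ddof=1) / np.sqrt(n_eff)
```

With positive autocorrelation the corrected standard error is larger than the naive one, which is exactly the underestimation the abstract warns about.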

  14. The political context of social inequalities and health.

    PubMed

    Navarro, V; Shi, L

    2001-02-01

    This analysis reflects on the importance of political parties, and the policies they implement when in government, in determining the level of equalities/inequalities in a society, the extent of the welfare state (including the level of health care coverage by the state), the employment/unemployment rate, and the level of population health. The study looks at the impact of the major political traditions in the advanced OECD countries during the golden years of capitalism (1945-1980) -- social democratic, Christian democratic, liberal, and ex-fascist -- in four areas: (1) the main determinants of income inequalities, such as the overall distribution of income derived from capital versus labor, wage dispersion in the labor force, the redistributive effect of the welfare state, and the levels and types of employment/ unemployment; (2) levels of public expenditures and health care benefits coverage; (3) public support of services to families, such as child care and domiciliary care; and (4) the level of population health as measured by infant mortality rates. The results indicate that political traditions more committed to redistributive policies (both economic and social) and full-employment policies, such as the social democratic parties, were generally more successful in improving the health of populations, such as reducing infant mortality. The erroneous assumption of a conflict between social equity and economic efficiency, as in the liberal tradition, is also discussed. The study aims at filling a void in the growing health and social inequalities literature, which rarely touches on the importance of political forces in influencing inequalities. The data used in the study are largely from OECD health data for 1997 and 1998; the OECD statistical services; the comparative welfare state data set assembled by Huber, Ragin and Stephens; and the US Bureau of Labor Statistics.

  15. Reducing Wait Time for Lung Cancer Diagnosis and Treatment: Impact of a Multidisciplinary, Centralized Referral Program.

    PubMed

    Common, Jessica L; Mariathas, Hensley H; Parsons, Kaylah; Greenland, Jonathan D; Harris, Scott; Bhatia, Rick; Byrne, Suzanne C

    2018-06-04

    A multidisciplinary, centralized referral program was established at our institution in 2014 to reduce delays in lung cancer diagnosis and treatment following diagnostic imaging observed with the traditional, primary care provider-led referral process. The main objectives of this retrospective cohort study were to determine if referral to a Thoracic Triage Panel (TTP): 1) expedites lung cancer diagnosis and treatment initiation; and 2) leads to more appropriate specialist consultation. Patients with a diagnosis of lung cancer and initial diagnostic imaging between March 1, 2015, and February 29, 2016, at a Memorial University-affiliated tertiary care centre in St John's, Newfoundland, were identified and grouped according to whether they were referred to the TTP or managed through a traditional referral process. Wait times (in days) from first abnormal imaging to biopsy and treatment initiation were recorded. Statistical analysis was performed using the Wilcoxon rank-sum test. A total of 133 patients who met inclusion criteria were identified. Seventy-nine patients were referred to the TTP and 54 were managed by traditional means. There was a statistically significant reduction in median wait times for patients referred to the TTP. Wait time from first abnormal imaging to biopsy decreased from 61.5 to 36.0 days (P < .0001). Wait time from first abnormal imaging to treatment initiation decreased from 118.0 to 80.0 days (P < .001). The percentage of specialist consultations that led to treatment was also greater for patients referred to the TTP. A collaborative, centralized intake and referral program helps to reduce wait time for diagnosis and treatment of lung cancer. Copyright © 2018 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.
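The Wilcoxon rank-sum comparison of wait times used in the study can be reproduced in outline with SciPy. The samples below are synthetic lognormal draws matched only to the reported group sizes and medians (36.0 vs 61.5 days), not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical wait times (days) from first abnormal imaging to biopsy;
# synthetic draws matched to the reported group sizes and medians.
ttp_wait = rng.lognormal(mean=np.log(36.0), sigma=0.5, size=79)
traditional_wait = rng.lognormal(mean=np.log(61.5), sigma=0.5, size=54)

# Wilcoxon rank-sum test (Mann-Whitney U), as used in the study, which
# compares the two groups without assuming normally distributed wait times.
stat, p_value = stats.mannwhitneyu(ttp_wait, traditional_wait,
                                   alternative="two-sided")
```

A rank-based test is a sensible choice here because wait-time distributions are typically right-skewed, making medians more informative than means.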

  16. Time Management in the Operating Room: An Analysis of the Dedicated Minimally Invasive Surgery Suite

    PubMed Central

    Hsiao, Kenneth C.; Machaidze, Zurab

    2004-01-01

    Background: Dedicated minimally invasive surgery suites are available that contain specialized equipment to facilitate endoscopic surgery. Laparoscopy performed in a general operating room is hampered by the multitude of additional equipment that must be transported into the room. The objective of this study was to compare preparation times between procedures performed in traditional operating rooms and dedicated minimally invasive surgery suites to see whether operating room efficiency is improved in the specialized room. Methods: The records of 50 patients who underwent laparoscopic procedures between September 2000 and April 2002 were retrospectively reviewed. Twenty-three patients underwent surgery in a general operating room and 18 patients in a minimally invasive surgery suite. Nine patients were excluded because of cystoscopic procedures undergone prior to laparoscopy. Several time points were recorded, from which time intervals such as preanesthesia time, anesthesia induction time, and total preparation time were derived. A 2-tailed, unpaired Student t test was used for statistical analysis. Results: The mean preanesthesia time was significantly faster in the minimally invasive surgery suite (12.2 minutes) compared with that in the traditional operating room (17.8 minutes) (P=0.013). Mean anesthesia induction time in the minimally invasive surgery suite (47.5 minutes) was similar to that in the traditional operating room (45.7 minutes) (P=0.734). The average total preparation time for the minimally invasive surgery suite (59.6 minutes) was not significantly faster than that in the general operating room (63.5 minutes) (P=0.481). Conclusion: The amount of time that elapses between the patient entering the room and anesthesia induction is statistically shorter in a dedicated minimally invasive surgery suite. Laparoscopic surgery is performed more efficiently in a dedicated minimally invasive surgery suite than in a traditional operating room.
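The 2-tailed, unpaired Student t test used here can be sketched with SciPy. The samples below are synthetic normal draws whose means mirror the reported preanesthesia times (12.2 vs 17.8 minutes) with an assumed SD of 5 minutes; they are not the study's records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic preanesthesia times (minutes); means mirror the reported values,
# and the SD of 5 minutes is an assumption for illustration.
mis_suite = rng.normal(loc=12.2, scale=5.0, size=18)
general_or = rng.normal(loc=17.8, scale=5.0, size=23)

# Two-tailed, unpaired Student t test, as in the study.
t_stat, p_value = stats.ttest_ind(mis_suite, general_or)
```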
PMID:15554269

  17. CLASSIFICATION OF IRANIAN NURSES ACCORDING TO THEIR MENTAL HEALTH OUTCOMES USING GHQ-12 QUESTIONNAIRE: A COMPARISON BETWEEN LATENT CLASS ANALYSIS AND K-MEANS CLUSTERING WITH TRADITIONAL SCORING METHOD

    PubMed Central

    Jamali, Jamshid; Ayatollahi, Seyyed Mohammad Taghi

    2015-01-01

    Background: Nurses constitute the majority of health care providers. Their mental health can affect the quality of services and patients’ satisfaction. The General Health Questionnaire (GHQ-12) is a general screening tool used to detect mental disorders. The scoring method and thresholds for this questionnaire are debatable, and the cut-off points can vary from sample to sample. This study was conducted to estimate the prevalence of mental disorders among Iranian nurses using the GHQ-12 and also to compare Latent Class Analysis (LCA) and K-means clustering with the traditional scoring method. Methodology: A cross-sectional study was carried out in the Fars and Bushehr provinces of southern Iran in 2014. Participants were 771 Iranian nurses, who filled out the GHQ-12 questionnaire. The traditional scoring method, LCA and K-means were used to estimate the prevalence of mental disorder among Iranian nurses. Cohen’s kappa statistic was applied to assess the agreement of LCA and K-means with the traditional scoring method of the GHQ-12. Results: The proportions of nurses with mental disorder identified by the scoring method, LCA and K-means were 36.3% (n=280), 32.2% (n=248), and 26.5% (n=204), respectively. LCA and logistic regression revealed that the prevalence of mental disorder in females was significantly higher than in males. Conclusion: Mental disorder in nurses was at a medium level compared to other people living in Iran. There was little difference between the prevalences of mental disorder estimated by the scoring method, K-means and LCA. Given the advantages of LCA over K-means and the differing results of the scoring method, we suggest LCA for classification of Iranian nurses according to their mental health outcomes using the GHQ-12 questionnaire. PMID:26622202
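Cohen's kappa, used above to quantify agreement between the classification methods, is straightforward to compute for two binary classifications. A sketch with invented toy labels (not the study's data):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary classifications (0/1 arrays)."""
    a, b = np.asarray(a), np.asarray(b)
    p_observed = np.mean(a == b)
    # Expected agreement if the two classifications were independent.
    p_expected = (np.mean(a == 1) * np.mean(b == 1)
                  + np.mean(a == 0) * np.mean(b == 0))
    return (p_observed - p_expected) / (1.0 - p_expected)

# Toy example: traditional GHQ-12 scoring vs a clustering assignment
# (1 = mental disorder, 0 = no disorder); illustrative values only.
scoring = np.array([1, 1, 1, 0, 0, 0, 1, 0, 0, 0])
clusters = np.array([1, 1, 0, 0, 0, 0, 1, 0, 0, 1])
kappa = cohens_kappa(scoring, clusters)
```

Here observed agreement is 0.8 against an expected chance agreement of 0.52, giving kappa = 0.28/0.48 ≈ 0.583, i.e. moderate agreement beyond chance.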

  18. CLASSIFICATION OF IRANIAN NURSES ACCORDING TO THEIR MENTAL HEALTH OUTCOMES USING GHQ-12 QUESTIONNAIRE: A COMPARISON BETWEEN LATENT CLASS ANALYSIS AND K-MEANS CLUSTERING WITH TRADITIONAL SCORING METHOD.

    PubMed

    Jamali, Jamshid; Ayatollahi, Seyyed Mohammad Taghi

    2015-10-01

    Nurses constitute the majority of health care providers. Their mental health can affect the quality of services and patients' satisfaction. The General Health Questionnaire (GHQ-12) is a general screening tool used to detect mental disorders. The scoring method and thresholds for this questionnaire are debatable, and the cut-off points can vary from sample to sample. This study was conducted to estimate the prevalence of mental disorders among Iranian nurses using the GHQ-12 and also to compare Latent Class Analysis (LCA) and K-means clustering with the traditional scoring method. A cross-sectional study was carried out in the Fars and Bushehr provinces of southern Iran in 2014. Participants were 771 Iranian nurses, who filled out the GHQ-12 questionnaire. The traditional scoring method, LCA and K-means were used to estimate the prevalence of mental disorder among Iranian nurses. Cohen's kappa statistic was applied to assess the agreement of LCA and K-means with the traditional scoring method of the GHQ-12. The proportions of nurses with mental disorder identified by the scoring method, LCA and K-means were 36.3% (n=280), 32.2% (n=248), and 26.5% (n=204), respectively. LCA and logistic regression revealed that the prevalence of mental disorder in females was significantly higher than in males. Mental disorder in nurses was at a medium level compared to other people living in Iran. There was little difference between the prevalences of mental disorder estimated by the scoring method, K-means and LCA. Given the advantages of LCA over K-means and the differing results of the scoring method, we suggest LCA for classification of Iranian nurses according to their mental health outcomes using the GHQ-12 questionnaire.

  19. Topological signatures of interstellar magnetic fields - I. Betti numbers and persistence diagrams

    NASA Astrophysics Data System (ADS)

    Makarenko, Irina; Shukurov, Anvar; Henderson, Robin; Rodrigues, Luiz F. S.; Bushby, Paul; Fletcher, Andrew

    2018-04-01

    The interstellar medium (ISM) is a magnetized system in which transonic or supersonic turbulence is driven by supernova explosions. This leads to the production of intermittent, filamentary structures in the ISM gas density, whilst the associated dynamo action also produces intermittent magnetic fields. The traditional theory of random functions, restricted to second-order statistical moments (or power spectra), does not adequately describe such systems. We apply topological data analysis (TDA), sensitive to all statistical moments and independent of the assumption of Gaussian statistics, to the gas density fluctuations in a magnetohydrodynamic simulation of the multiphase ISM. This simulation admits dynamo action, so produces physically realistic magnetic fields. The topology of the gas distribution, with and without magnetic fields, is quantified in terms of Betti numbers and persistence diagrams. Like the more standard correlation analysis, TDA shows that the ISM gas density is sensitive to the presence of magnetic fields. However, TDA gives us important additional information that cannot be obtained from correlation functions. In particular, the Betti numbers per correlation cell are shown to be physically informative. Magnetic fields make the ISM more homogeneous, reducing the abundance of both isolated gas clouds and cavities, with a stronger effect on the cavities. Remarkably, the modification of the gas distribution by magnetic fields is captured by the Betti numbers even in regions more than 300 pc from the mid-plane, where the magnetic field is weaker and correlation analysis fails to detect any signatures of magnetic effects.
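One building block of the TDA described above, the Betti number b0 (the count of connected components) of a superlevel set, can be sketched for a 2-D scalar field with scipy.ndimage. The smoothed random field below is a stand-in for a simulated gas-density slice, not the paper's MHD data:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic 2-D "gas density" field standing in for an ISM simulation slice.
density = ndimage.gaussian_filter(rng.normal(size=(128, 128)), sigma=3)

def betti0(field, level):
    """Betti number b0 (connected-component count) of the superlevel set
    {field >= level}, a building block of a persistence analysis."""
    mask = field >= level
    _, n_components = ndimage.label(mask)
    return n_components

# Sweeping the level traces how isolated "clouds" appear and merge; the
# birth/death levels of components are what a persistence diagram records.
levels = np.linspace(density.min(), density.max(), 20)
counts = [betti0(density, lv) for lv in levels]
```

Because b0 counts components at every threshold rather than averaging fluctuations, it is sensitive to intermittent, non-Gaussian structure that power spectra miss, which is the point the abstract makes.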

  20. Quantifying predictability in a model with statistical features of the atmosphere

    PubMed Central

    Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya

    2002-01-01

    The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed, using a theoretical framework introduced by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly, this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
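For Gaussian prediction and climatology distributions, the relative entropy used as the predictive-utility measure splits into exactly the "signal" (mean-shift) and "dispersion" (spread-change) components discussed above. A one-dimensional sketch using the standard Gaussian KL-divergence formula (the numbers are illustrative):

```python
import math

def gaussian_relative_entropy(mu_p, sig_p, mu_c, sig_c):
    """Relative entropy (KL divergence) of a Gaussian prediction
    N(mu_p, sig_p^2) from a Gaussian climatology N(mu_c, sig_c^2),
    split into the "signal" term (shift of the mean relative to the
    climatological spread) and the "dispersion" term (change of spread)."""
    signal = (mu_p - mu_c) ** 2 / (2.0 * sig_c ** 2)
    dispersion = (math.log(sig_c / sig_p)
                  + sig_p ** 2 / (2.0 * sig_c ** 2) - 0.5)
    return signal, dispersion, signal + dispersion

# Example: a forecast whose mean shifts by one climatological SD while its
# spread shrinks to half the climatological spread.
signal, dispersion, total = gaussian_relative_entropy(1.0, 0.5, 0.0, 1.0)
```

In this example the signal term (0.5) exceeds the dispersion term (~0.318), mirroring the paper's finding that the mean shift usually dominates predictive utility.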

  1. Adaptation of Lorke's method to determine and compare ED50 values: the cases of two anticonvulsants drugs.

    PubMed

    Garrido-Acosta, Osvaldo; Meza-Toledo, Sergio Enrique; Anguiano-Robledo, Liliana; Valencia-Hernández, Ignacio; Chamorro-Cevallos, Germán

    2014-01-01

    We determined the median effective dose (ED50) values for the anticonvulsants phenobarbital and sodium valproate using a modification of Lorke's method. This modification allowed appropriate statistical analysis and the use of a smaller number of mice per compound tested. The anticonvulsant activities of phenobarbital and sodium valproate were evaluated in male CD1 mice by maximal electroshock (MES) and intraperitoneal administration of pentylenetetrazole (PTZ). The anticonvulsant ED50 values were obtained through modifications of Lorke's method that involved changes in the selection of the first three doses in the initial test and the fourth dose in the second test. Furthermore, a test was added to evaluate the ED50 calculated by the modified Lorke's method, allowing statistical analysis of the data and determination of the confidence limits for the ED50. The ED50 for phenobarbital against MES- and PTZ-induced seizures was 16.3 mg/kg and 12.7 mg/kg, respectively. The sodium valproate values were 261.2 mg/kg and 159.7 mg/kg, respectively. These results are similar to those found using the traditional methods of determining the ED50, suggesting that the modifications made to Lorke's method generate equal results using fewer mice while increasing confidence in the statistical analysis. This adaptation of Lorke's method can be used to determine the median lethal dose (LD50) or ED50 for compounds with other pharmacological activities. Copyright © 2014 Elsevier Inc. All rights reserved.
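An ED50 is commonly estimated from a quantal dose-response fit. A hedged sketch using a two-parameter log-logistic curve and scipy.optimize.curve_fit; the doses and response fractions are invented for illustration and are not Lorke's data or the modified protocol:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical quantal dose-response data: dose in mg/kg, fraction of mice
# protected from seizures. Illustrative values only.
dose = np.array([4.0, 8.0, 12.0, 16.0, 24.0, 32.0])
protected = np.array([0.10, 0.20, 0.45, 0.55, 0.85, 0.95])

def logistic(d, ed50, slope):
    """Two-parameter log-logistic dose-response curve: response is 0.5
    exactly at d == ed50."""
    return 1.0 / (1.0 + np.exp(-slope * (np.log(d) - np.log(ed50))))

params, covariance = curve_fit(logistic, dose, protected, p0=(15.0, 2.0))
ed50, slope = params
```

The parameter covariance returned by `curve_fit` is what lets one attach confidence limits to the ED50, the statistical improvement the modified method is designed to support.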

  2. The neuronal correlates of intranasal trigeminal function – An ALE meta-analysis of human functional brain imaging data

    PubMed Central

    Albrecht, Jessica; Kopietz, Rainer; Frasnelli, Johannes; Wiesmann, Martin; Hummel, Thomas; Lundström, Johan N.

    2009-01-01

    Almost every odor we encounter in daily life has the capacity to produce a trigeminal sensation. Surprisingly, few functional imaging studies exploring human neuronal correlates of intranasal trigeminal function exist, and results are to some degree inconsistent. We utilized activation likelihood estimation (ALE), a quantitative voxel-based meta-analysis tool, to analyze functional imaging data (fMRI/PET) following intranasal trigeminal stimulation with carbon dioxide (CO2), a stimulus known to exclusively activate the trigeminal system. Meta-analysis tools are able to identify activations common across studies, thereby enabling activation mapping with higher certainty. Activation foci of nine studies utilizing trigeminal stimulation were included in the meta-analysis. We found significant ALE scores, indicating consistent activation across studies, in the brainstem, ventrolateral posterior thalamic nucleus, anterior cingulate cortex, insula, precentral gyrus, as well as in primary and secondary somatosensory cortices, a network known for the processing of intranasal nociceptive stimuli. Significant ALE values were also observed in the piriform cortex, insula, and the orbitofrontal cortex, areas known to process chemosensory stimuli, and in association cortices. Additionally, the trigeminal ALE statistics were directly compared with ALE statistics originating from olfactory stimulation, demonstrating considerable overlap in activation. In conclusion, the results of this meta-analysis map the human neuronal correlates of intranasal trigeminal stimulation with high statistical certainty and demonstrate that the cortical areas recruited during the processing of intranasal CO2 stimuli include those outside traditional trigeminal areas. Moreover, by illustrating the considerable overlap between brain areas that process trigeminal and olfactory information, these results demonstrate the interconnectivity of flavor processing. PMID:19913573

  3. Evaluation of different PCR primers for denaturing gradient gel electrophoresis (DGGE) analysis of fungal community structure in traditional fermentation starters used for Hong Qu glutinous rice wine.

    PubMed

    Lv, Xu-Cong; Jiang, Ya-Jun; Liu, Jie; Guo, Wei-Ling; Liu, Zhi-Bin; Zhang, Wen; Rao, Ping-Fan; Ni, Li

    2017-08-16

    Denaturing gradient gel electrophoresis (DGGE) has become a widely used tool to examine microbial community structure. However, when DGGE is applied to evaluate the fungal community of traditional fermentation starters, the choice of hypervariable ribosomal RNA gene regions is still controversial. In the current study, several previously published fungal PCR primer sets were compared and evaluated using PCR-DGGE, with the purpose of screening a suitable primer set to study the fungal community of traditional fermentation starters for Hong Qu glutinous rice wine. Firstly, different primer sets were used to amplify different hypervariable regions from pure fungal cultures. Except for NS1/FR1+ and ITS1fGC/ITS4, all primer sets (NL1+/LS2R, NL3A/NL4GC, FF390/FR1+, NS1/GCFung, NS3+/YM951r and ITS1fGC/ITS2r) amplified the target DNA sequences successfully. Secondly, the selected primer sets were further evaluated on their resolution in distinguishing different fungal cultures through DGGE fingerprints. Three primer sets (NL1+/LS2R, NS1/GCFung and ITS1fGC/ITS2r) were finally selected for investigating the fungal community structure of different traditional fermentation starters for Hong Qu glutinous rice wine. The internal transcribed spacer (ITS) region amplified by ITS1fGC/ITS2r, which is more hypervariable than the 18S rRNA gene and 26S rRNA gene, provides an excellent tool to separate amplification products of different fungal species. Results indicated that the PCR-DGGE profile using ITS1fGC/ITS2r revealed more fungal species than those using NL1+/LS2R and NS1/GCFung. Therefore, ITS1fGC/ITS2r is the most suitable primer set for PCR-DGGE analysis of fungal community structure in traditional fermentation starters for Hong Qu glutinous rice wine. DGGE profiles based on ITS1fGC/ITS2r revealed the presence of twenty-four fungal species in the traditional fermentation starters. Significant differences in fungal communities could be observed directly from the DGGE fingerprints and principal component analysis. Statistical analysis based on the band intensities of the fungal DGGE profiles showed that Saccharomyces cerevisiae, Saccharomycopsis fibuligera, Rhizopus oryzae, Monascus purpureus and Aspergillus niger were the dominant fungal species. In conclusion, the comparison of several primer sets for fungal PCR-DGGE should enrich our knowledge of the fungal community structures associated with traditional fermentation starters, which may facilitate the development of better starter cultures for manufacturing Chinese Hong Qu glutinous rice wine. Copyright © 2017 Elsevier B.V. All rights reserved.
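The principal component analysis of DGGE band intensities mentioned above can be sketched with a plain SVD. The band-intensity matrix here is random placeholder data (rows = starters, columns = bands), not the study's gel measurements:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical band-intensity matrix: 8 fermentation starters x 24 DGGE
# bands (fungal species). Synthetic values for illustration only.
intensities = rng.random((8, 24))

# Principal component analysis via SVD of the column-centered matrix.
centered = intensities - intensities.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u * s                        # starter coordinates on the PCs
explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component
```

Plotting the first two columns of `scores` is the usual way to visualize whether starters from different sources separate into distinct fungal communities.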

  4. Development of a method for efficient cost-effective screening of Aspergillus niger mutants having increased production of glucoamylase.

    PubMed

    Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping

    2017-05-01

    To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.

  5. Forming a joint dialogue among faith healers, traditional healers and formal health workers in mental health in a Kenyan setting: towards common grounds.

    PubMed

    Musyimi, Christine W; Mutiso, Victoria N; Nandoya, Erick S; Ndetei, David M

    2016-01-07

    Qualitative evidence on dialogue formation and collaboration is very scanty in Kenya. This study thus aimed at the formation of dialogue and establishment of collaboration among informal (faith and traditional healers) and formal health workers (clinicians) in enhancing community-based mental health in rural Kenya. A qualitative approach was used to identify barriers and solutions for dialogue formation by conducting nine Focus Group Discussions, each consisting of 8-10 participants. Information on age, gender and role in the health care setting, as well as practitioners' (henceforth used to mean informal (faith and traditional healers) and formal health workers) perceptions of dialogue, was collected to evaluate dialogue formation. Qualitative and quantitative data analyses were performed using thematic content analysis and the Statistical Package for the Social Sciences (SPSS) software, respectively. We identified four dominant themes: (i) basic understanding of mental illnesses, (ii) interaction and treatment skills of the respondents with mentally ill persons, (iii) referral gaps and mistrust among the practitioners and (iv) dialogue formation among the practitioners. Although participants were conversant with the definition of mental illness and had interacted with a mentally ill person in their routine practice, they had only basic information on the causes and types of mental illness. Traditional and faith healers felt demeaned by the clinicians, who disregarded their mode of treatment and stereotyped them as "dirty". After various discussions, the majority of practitioners showed interest in collaborating with each other and stated that they had joined the dialogue in order to interact with people committed to improving the lives of patients. Dialogue formation between the formal and the informal health workers is crucial in establishing trust and respect between both groups of practitioners and in improving mental health care in Kenya. This approach could be scaled up among all the registered traditional and faith healers in Kenya.

  6. Comparison of traditional and sensor-based electronic stethoscopes in beagle dogs.

    PubMed

    Szilvási, Viktória; Vörös, Károly; Manczur, Ferenc; Reiczigel, Jenő; Novák, István; Máthé, Akos; Fekete, Dániel

    2013-03-01

    The objective of this study was to compare the auscultatory findings obtained using traditional and electronic sensor-based stethoscopes. Thirty-three adult healthy Beagles (20 females, 13 males, mean age: 4.8 years, range 1.4-8 years) were auscultated by four investigators with different levels of experience (INVEST-1, -2, -3 and -4) independently with both stethoscopes. Final cardiological diagnoses were established by echocardiography. Mitral murmurs were heard with both stethoscopes by all investigators, and echocardiography revealed mild mitral valve insufficiency in 7 dogs (21%, 4 females, 3 males). The statistical sensitivity (Se) in recognising cardiac murmurs proved to be 82% using the traditional stethoscope and 75% using the electronic one, averaged over the four examiners, whilst statistical specificity (Sp) was 99% with the traditional and 100% with the electronic stethoscope. The mean auscultatory sensitivity differences between the two stethoscopes were 0.36 on the left and 0.59 on the right hemithorax, demonstrating an advantage for the electronic stethoscope that was most obvious over the right hemithorax (P = 0.0340). The electronic stethoscope proved to be superior to the traditional one in excluding cardiac murmurs and especially in auscultation over the right hemithorax. Mitral valve disease was relatively common in this clinically healthy research Beagle population.
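Statistical sensitivity and specificity as used above reduce to simple confusion-matrix ratios. A sketch with illustrative labels (not the study's raw auscultation records; echocardiography is treated as ground truth):

```python
import numpy as np

def se_sp(predicted, actual):
    """Sensitivity and specificity of a binary test (1 = murmur present)."""
    predicted, actual = np.asarray(predicted), np.asarray(actual)
    tp = np.sum((predicted == 1) & (actual == 1))  # murmurs correctly heard
    fn = np.sum((predicted == 0) & (actual == 1))  # murmurs missed
    tn = np.sum((predicted == 0) & (actual == 0))  # healthy correctly cleared
    fp = np.sum((predicted == 1) & (actual == 0))  # false murmur calls
    return tp / (tp + fn), tn / (tn + fp)

# Toy results for 33 dogs: 7 with echocardiography-confirmed mitral
# insufficiency. Illustrative labels, not the study's data.
actual = np.array([1] * 7 + [0] * 26)
heard = np.array([1] * 6 + [0] * 1 + [0] * 26)  # 6 of 7 murmurs detected
sensitivity, specificity = se_sp(heard, actual)
```

With these toy labels sensitivity is 6/7 ≈ 0.86 and specificity is 1.0, the same quantities the study averages over its four examiners.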

  7. Standardized data collection to build prediction models in oncology: a prototype for rectal cancer.

    PubMed

    Meldolesi, Elisa; van Soest, Johan; Damiani, Andrea; Dekker, Andre; Alitto, Anna Rita; Campitelli, Maura; Dinapoli, Nicola; Gatta, Roberto; Gambacorta, Maria Antonietta; Lanzotti, Vito; Lambin, Philippe; Valentini, Vincenzo

    2016-01-01

    Advances in diagnostic and treatment technology are responsible for a remarkable transformation of internal medicine, with the establishment of a new idea of personalized medicine. Inter- and intra-patient tumor heterogeneity and the complexity of clinical outcomes and treatment toxicity justify the effort to develop predictive models for decision support systems. However, the number of evaluated variables coming from multiple disciplines (oncology, computer science, bioinformatics, statistics, genomics and imaging, among others) can be very large, making traditional statistical analysis difficult to exploit. Automated data-mining processes and machine learning approaches can be a solution for organizing the massive amount of data and trying to unravel important interactions. The purpose of this paper is to describe a strategy to collect and analyze data properly for decision support, and to introduce the concept of an 'umbrella protocol' within the framework of 'rapid learning healthcare'.

  8. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
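Strong FWER control via permutation of the maximum statistic, one of the four corrections reviewed, can be sketched in numpy for a paired two-condition design. The "ERP" data are synthetic, with an effect injected at time points 20-29:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: 20 subjects x 50 time points per condition, with a real
# condition difference injected at time points 20-29.
n_sub, n_time = 20, 50
cond_a = rng.normal(size=(n_sub, n_time))
cond_b = rng.normal(size=(n_sub, n_time))
cond_a[:, 20:30] += 1.5

def paired_t(diff):
    """Paired t statistic at every time point."""
    return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(len(diff)))

diff = cond_a - cond_b
t_obs = paired_t(diff)

# Permutation distribution of the maximum |t| across all time points:
# randomly flipping each subject's difference sign respects the null of
# exchangeable conditions and gives strong FWER control.
n_perm = 1000
max_t = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))
    max_t[i] = np.abs(paired_t(diff * signs)).max()

threshold = np.quantile(max_t, 0.95)
significant = np.abs(t_obs) > threshold
```

Any time point whose observed |t| exceeds the 95th percentile of the max-statistic distribution is significant with familywise error controlled at 5%, with no a priori window needed.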

  9. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    ERIC Educational Resources Information Center

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  10. Multiple-Solution Problems in a Statistics Classroom: An Example

    ERIC Educational Resources Information Center

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-01-01

    The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact…

  11. An Automated Circulation System for a Small Technical Library.

    ERIC Educational Resources Information Center

    Culnan, Mary J.

    The traditional manually controlled circulation records of the Burroughs Corporation Library in Goleta, California, presented problems of inaccuracy, time-consuming searches, and a lack of use statistics. An automated system with the capacity to do file maintenance and statistical record-keeping was implemented on a Burroughs B1700 computer.…

  12. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  13. Inverting an Introductory Statistics Classroom

    ERIC Educational Resources Information Center

    Kraut, Gertrud L.

    2015-01-01

    The inverted classroom allows more in-class time for inquiry-based learning and for working through more advanced problem-solving activities than does the traditional lecture class. The skills acquired in this learning environment offer benefits far beyond the statistics classroom. This paper discusses four ways that can make the inverted…

  14. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War, nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In recent decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have been slowly decreasing. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series, linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the building and calibration of conceptual water quality models, or to select appropriate calibration periods, in order to produce reliable predictions. The objective of this contribution is to analyse two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and to compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes, represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models.
The analysis showed that, based on the value of the residual sum of squares (RSS), SETAR and MSW models described both datasets better than models of the ARMA class. In most cases the relative improvement of SETAR models over first-order AR models was low, ranging between 1% and 4%, with the exception of the three-regime model for the River Stour time series, where the improvement was 48.9%. In comparison, the relative improvement of MSW models ranged from 44.6% to 52.5% for two-regime models and from 60.4% to 75% for three-regime models. However, visual assessment of the models plotted against the original datasets showed that, despite a higher RSS, some ARMA models described the analysed time series better than AR, MA and SETAR models with lower RSS values. In both datasets MSW models provided a very good visual fit, describing most of the extreme values.
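
A two-regime SETAR(1) model of the kind compared above can be fitted by grid-searching the threshold and running separate least-squares AR(1) fits in each regime. The stdlib-only sketch below is illustrative (the study's models, regimes and nitrate data are not reproduced; the simulated series and candidate thresholds are made up):

```python
import random

def ols_ar1(pairs):
    """Least-squares fit of y = a + b*x over (x, y) pairs; returns (a, b, rss)."""
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    rss = sum((y - a - b * x) ** 2 for x, y in pairs)
    return a, b, rss

def fit_setar2(series, thresholds):
    """Two-regime SETAR(1): the regime is chosen by whether y[t-1] <= r.
    Grid-search r over candidate thresholds; returns (best_r, best_rss)."""
    pairs = list(zip(series[:-1], series[1:]))
    best = None
    for r in thresholds:
        low = [p for p in pairs if p[0] <= r]
        high = [p for p in pairs if p[0] > r]
        if len(low) < 5 or len(high) < 5:   # need enough points per regime
            continue
        rss = ols_ar1(low)[2] + ols_ar1(high)[2]
        if best is None or rss < best[1]:
            best = (r, rss)
    return best

# demo: simulate a two-regime series with the true threshold at 0
random.seed(7)
y = [0.0]
for _ in range(500):
    x = y[-1]
    mean_next = 0.5 * x + 0.8 if x <= 0 else 0.5 * x - 0.8
    y.append(mean_next + random.gauss(0, 0.3))

best_r, setar_rss = fit_setar2(y, [-1.0, -0.5, 0.0, 0.5, 1.0])
global_rss = ols_ar1(list(zip(y[:-1], y[1:])))[2]   # single AR(1) fit
```

Because the per-regime fits nest the single global fit, the SETAR residual sum of squares can never exceed the AR(1) RSS, which mirrors the RSS-based comparison reported above.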

  15. Environmental Literacy Development: A Comparison between Online and Traditional Campus Courses

    NASA Astrophysics Data System (ADS)

    Taylor, James Young

    As traditional educational efforts expand into the online environment, academic research is needed to determine whether effective environmental education can be replicated in the virtual classroom in higher education. Although previous research showed that online course delivery could be an effective means of teaching environmental facts, what had yet to be determined was whether there was a significant difference in the development of environmental literacy, represented by attitudes and behaviors, between online and traditional campus students at a university in the Western United States. To determine whether there was a measured statistical difference in environmental literacy following course completion, this causal-comparative quantitative study built on the theoretical foundations of environmental literacy development and used the Measures of Ecological Attitudes and Knowledge Scale and the New Ecological Paradigm. From a sample of 205 undergraduate environmental science students it was determined, through the use of two-tailed t-tests at the 0.05 significance level, that no statistical difference in environmental knowledge, actual commitment, or global environmental awareness was evident. However, statistical differences existed in verbal commitment and emotional connection to the environment. Both the online and the traditional campus classroom are shown to be effective in the development of environmental literacy. As technology continues to be incorporated in higher education, environmental educators should see technology as an additional tool in environmental literacy development. However, the identified differences in emotional and verbal commitment should be further investigated.
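
Two-tailed comparisons of group means like those above can be computed from summary statistics alone. With the study's sample size, a normal approximation to the two-sample test is a reasonable stdlib-only sketch; the numbers in the demo below are made up for illustration and are not the study's data.

```python
import math

def two_tailed_z(m1, s1, n1, m2, s2, n2):
    """Two-sample z statistic and two-tailed p-value from summary
    statistics (normal approximation, reasonable for large n)."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    z = (m1 - m2) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

For small groups one would use the t distribution instead, but at n ≈ 100 per group the normal and t critical values are nearly indistinguishable.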

  16. Application of a Fuzzy Verification Technique for Assessment of the Weather Running Estimate-Nowcast (WRE-N) Model

    DTIC Science & Technology

    2016-10-01

    comes when considering numerous scores and statistics during a preliminary evaluation of the applicability of the fuzzy-verification minimum coverage...The selection of thresholds with which to generate categorical-verification scores and statistics from the application of both traditional and...of statistically significant numbers of cases; the latter presents a challenge of limited application for assessment of the forecast models’ ability

  17. A holistic water balance of Austria - how does the quantitative proportion of urban water requirements relate to other users?

    PubMed

    Vanham, D

    2012-01-01

    Traditional water use statistics only include the blue water withdrawal/consumption of municipalities, industry and irrigated agriculture. When, however, green water use of the agricultural sector is included, as well as the virtual water use/water footprint (WF), water use quantity statistics become very different. According to common water use statistics, Austria withdraws in total about 2.5 km³ per year, only 3% of available resources (total discharge of 81.4 km³ of surface and ground water). The total water consumption (0.5 km³) is less than 1% of available resources. Urban (municipal) water requirements account for 27% of total withdrawal, or 33% of consumption. When agricultural green water use (cropland) is included in the statistics, the fraction of municipal water requirements diminishes to 7.6% of total withdrawal and 2.5% of total consumption. If the evapotranspiration of grassland and alpine meadows is also included in agricultural green water use, this fraction decreases to 3.2% and 0.9%, respectively. When the WF is used as the base value for water use in Austria, municipal water use represents 5.8% of this value. In this globalized world, traditional water use statistics are no longer adequate. Only a holistic water balance approach truly represents water use.
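
The blue-water shares quoted above follow directly from the stated volumes; a quick arithmetic check:

```python
# Stated volumes (km^3 per year) from the abstract
total_discharge = 81.4   # surface and ground water
withdrawal = 2.5
consumption = 0.5

withdrawal_share = withdrawal / total_discharge * 100    # about 3%
consumption_share = consumption / total_discharge * 100  # below 1%
municipal_withdrawal = 0.27 * withdrawal                 # 27% of withdrawal, in km^3
```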

  18. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in Bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible for demonstrating the use of statistics in Bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics were shown to be very useful for trying many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications of self-directed learning and continuing education.

  19. Cat dissection and human cadaver prosection versus sculpting human structures from clay: A comparison of alternate approaches to human anatomy laboratory education

    NASA Astrophysics Data System (ADS)

    Waters, John R.

    Dissection and vivisection are traditional approaches to biology laboratory education. In the case of human anatomy teaching laboratories, there is a long tradition of using human and animal cadaver specimens in the classroom. In a review of the literature comparing traditional dissection and vivisection lessons to alternative lessons designed to reduce the time spent dissecting or the numbers of animals used, we conclude that it is difficult to come to any conclusion regarding the efficacy of different approaches. An analysis of the literature is confounded because many studies have very low statistical power or other methodological weaknesses, and investigators rely on a wide variety of testing instruments to measure an equally varied number of course objectives. Additional well designed studies are necessary before educators can reach any informed conclusions about the efficacy of traditional versus alternative approaches to laboratory education. In our experiments, we compared a traditional cat dissection based undergraduate human anatomy lesson to an alternative where students sculpted human muscles onto plastic human skeletons. Students in the alternative treatment performed significantly better than their peers in the traditional treatment when answering both lower and higher order human anatomy questions. In a subsequent experiment with a similar design, we concluded that the superior performance of the students in the alternative treatment on anatomy exams was likely due to the similarity between the human anatomy representation studied in lab, and the human anatomy questions asked on the exams. When the anatomy questions were presented in the context of a cat specimen, students in the traditional cat dissection treatment outperformed their peers in the alternative treatment. 
In a final experiment, where student performance on a human anatomy exam was compared between a traditional prosected human cadaver treatment and the alternative clay sculpting treatment, no significant difference was detected, suggesting that the complexity or simplicity of the anatomy representation is less important than the similarity between the learning experience and the testing experience.

  20. A Bayesian approach to the statistical analysis of device preference studies.

    PubMed

    Fu, Haoda; Qu, Yongming; Zhu, Baojin; Huster, William

    2012-01-01

    Drug delivery devices are required to have excellent technical specifications to deliver drugs accurately, and, in addition, the devices should provide a satisfactory experience to patients, because this can have a direct effect on drug compliance. To compare patients' experience with two devices, cross-over studies with patient-reported outcomes (PRO) as response variables are often used. Because of the strength of cross-over designs, each subject can directly compare the two devices by using the PRO variables, and variables indicating preference (preferring A, preferring B, or no preference) can easily be derived. Traditionally, frequentist methods can be used to analyze such preference data, but they have some limitations. Recently, Bayesian methods have come to be considered acceptable by the US Food and Drug Administration for designing and analyzing device studies. In this paper, we propose a Bayesian statistical method for analyzing the data from preference trials. We demonstrate that the new Bayesian estimator enjoys some optimal properties relative to the frequentist estimator. Copyright © 2012 John Wiley & Sons, Ltd.
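
The abstract does not specify the authors' estimator, but a conjugate Dirichlet-multinomial model is the standard starting point for three-category preference data (prefer A / prefer B / no preference). A stdlib-only sketch under that assumption, with made-up counts in the test, is:

```python
import random

def dirichlet_posterior_mean(counts, prior=1.0):
    """Posterior mean of category probabilities under a symmetric
    Dirichlet(prior) prior and observed multinomial counts."""
    total = sum(counts) + prior * len(counts)
    return [(c + prior) / total for c in counts]

def prob_a_preferred(counts, prior=1.0, draws=5000, seed=0):
    """Monte Carlo estimate of P(p_A > p_B | data), sampling the
    Dirichlet posterior via independent gamma draws."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        g = [rng.gammavariate(c + prior, 1.0) for c in counts]
        wins += g[0] > g[1]   # comparing normalized shares == comparing gammas
    return wins / draws
```

A quantity such as P(p_A > p_B | data) is a direct probabilistic statement about device preference, which is one practical appeal of the Bayesian formulation over a frequentist point estimate plus p-value.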

  1. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion.

    PubMed

    Gautestad, Arild O

    2012-09-07

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox-from a composite Brownian motion consisting of a superposition of independent movement processes at different scales-may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
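
The role of observational scale can be illustrated with the simplest scale-specific case: for Brownian motion the mean squared displacement grows linearly with the time lag, so doubling the relocation interval doubles the MSD. The stdlib-only simulation below is illustrative only and is not the paper's protocol.

```python
import random

def brownian_msd_ratio(n=20000, seed=42):
    """Simulate a 1-D Brownian walk and return MSD(lag=2) / MSD(lag=1).
    For scale-specific (Brownian) motion the ratio is ~2; scale-free
    movement would deviate from this linear-in-lag growth."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n):
        x.append(x[-1] + rng.gauss(0.0, 1.0))

    def msd(lag):
        sq = [(x[i + lag] - x[i]) ** 2 for i in range(0, n - lag, lag)]
        return sum(sq) / len(sq)

    return msd(2) / msd(1)
```

Repeating this comparison across several lags is the essence of a multi-scale protocol: the scaling of displacement statistics with observation interval, rather than the pattern at a single interval, distinguishes the movement classes.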

  2. Sensitivity of the Halstead and Wechsler Test Batteries to brain damage: Evidence from Reitan's original validation sample.

    PubMed

    Loring, David W; Larrabee, Glenn J

    2006-06-01

    The Halstead-Reitan Battery has been instrumental in the development of neuropsychological practice in the United States. Although Reitan administered both the Wechsler-Bellevue Intelligence Scale and Halstead's test battery when evaluating Halstead's theory of biologic intelligence, the relative sensitivity of each test battery to brain damage continues to be an area of controversy. Because Reitan did not perform direct parametric analysis to contrast group performances, we reanalyze Reitan's original validation data from both Halstead (Reitan, 1955) and Wechsler batteries (Reitan, 1959a) and calculate effect sizes and probability levels using traditional parametric approaches. Eight of the 10 tests comprising Halstead's original Impairment Index, as well as the Impairment Index itself, statistically differentiated patients with unequivocal brain damage from controls. In addition, 13 of 14 Wechsler measures including Full-Scale IQ also differed statistically between groups (Brain Damage Full-Scale IQ = 96.2; Control Group Full-Scale IQ = 112.6). We suggest that differences in the statistical properties of each battery (e.g., raw scores vs. standardized scores) likely contribute to classification characteristics including test sensitivity and specificity.
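
The reanalysis reports group means (Full-Scale IQ 96.2 vs. 112.6) but the abstract omits standard deviations and group sizes. Assuming the conventional IQ-scale SD of 15 and equal groups of 50 purely for illustration, a pooled-SD effect size works out to about d ≈ 1.1:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# Full-Scale IQ means are from the abstract; SD = 15 and n = 50 per group
# are assumptions for illustration only (the abstract does not report them).
d_fsiq = cohens_d(112.6, 15.0, 50, 96.2, 15.0, 50)
```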

  3. Lagrangian single-particle turbulent statistics through the Hilbert-Huang transform.

    PubMed

    Huang, Yongxiang; Biferale, Luca; Calzavarini, Enrico; Sun, Chao; Toschi, Federico

    2013-04-01

    The Hilbert-Huang transform is applied to analyze single-particle Lagrangian velocity data from numerical simulations of hydrodynamic turbulence. The velocity trajectory is described in terms of a set of intrinsic mode functions C(i)(t) and of their instantaneous frequency ω(i)(t). On the basis of this decomposition we define the ω-conditioned statistical moments of the C(i) modes, named q-order Hilbert spectra (HS). We show that such quantities have enhanced scaling properties as compared to traditional Fourier transform- or correlation-based (structure functions) statistical indicators, thus providing better insights into the turbulent energy transfer process. We present clear empirical evidence that the energylike quantity, i.e., the second-order HS, displays a linear scaling in time in the inertial range, as expected from a dimensional analysis. We also measure high-order moment scaling exponents in a direct way, without resorting to the extended self-similarity procedure. This leads to an estimate of the Lagrangian structure function exponents which are consistent with the multifractal prediction in the Lagrangian frame as proposed by Biferale et al. [Phys. Rev. Lett. 93, 064502 (2004)].
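
The analysis above hinges on instantaneous frequency, obtained from the phase of the analytic signal. A naive frequency-domain sketch is given below (stdlib only, O(n²) DFT, fine for short demo signals); the paper's Hilbert-Huang pipeline additionally extracts intrinsic mode functions before this step, which is not reproduced here.

```python
import cmath
import math

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert construction
    (naive O(n^2) DFT; adequate for short demo signals)."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2.0      # double positive frequencies
        elif k > n / 2:
            X[k] = 0.0       # zero out negative frequencies
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped phase increments
    of the analytic signal."""
    phase = [cmath.phase(v) for v in analytic_signal(x)]
    freqs = []
    for t in range(1, len(phase)):
        d = phase[t] - phase[t - 1]
        while d <= -math.pi:
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        freqs.append(d * fs / (2 * math.pi))
    return freqs

# demo: a pure 5 Hz tone sampled at 100 Hz for 2 s
fs, f0 = 100.0, 5.0
tone = [math.sin(2 * math.pi * f0 * t / fs) for t in range(200)]
ifreq = instantaneous_frequency(tone, fs)
```

For a pure tone the instantaneous frequency away from the edges recovers the tone frequency; conditioning moments of mode amplitudes on this frequency is what the paper's q-order Hilbert spectra do.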

  4. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and to suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, seven of which additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and on the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques, to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.

  5. A study on rate of infestation to Sarcocystis cysts in supplied raw hamburgers.

    PubMed

    Nematollahia, Ahmad; Khoshkerdar, Afsaneh; Helan, Javad Ashrafi; Shahbazi, Parisa; Hassanzadeh, Parviz

    2015-06-01

    This study was carried out to determine the presence of Sarcocystis cysts in raw hamburgers in Tabriz, north-west Iran. Ninety-six samples of industrial (70% meat content) and traditional (30% meat content) hamburgers (80 industrial and 16 traditional samples) were obtained from retail fast food stores. The samples were examined by gross examination and by microscopic examination methods consisting of impression smears and peptic digestion. Macroscopic cysts were not observed in any of the samples on gross examination. Microscopic study showed that 54 of the 96 samples (56.25%) were infected by at least one Sarcocystis bradyzoite. Of the 54 infected samples, 45 were industrial and nine were traditional hamburgers. Statistical analysis showed no significant difference between industrial and traditional hamburgers in Sarcocystis infection. Infestation of hamburgers with Sarcocystis was higher in summer than in other seasons, but this difference was not significant. In Iran, beef is used for the preparation of 70% of hamburgers, and infestation of cattle with sarcocystosis has been reported in many investigations in Iran. With regard to the high prevalence of Sarcocystis infection in meat products such as hamburgers in this study, it is strongly recommended to avoid eating raw or under-cooked hamburgers, or to keep them at freezing temperature for at least 3-5 days.
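
The industrial/traditional comparison above is a 2x2 contingency problem (45 of 80 vs. 9 of 16 infected; both proportions equal 56.25%, so no difference is expected). The abstract does not name the test used; a stdlib-only two-sided Fisher exact test is one standard choice for a table with a small margin like 16:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of all tables with the same margins that are
    no more likely than the observed one."""
    r1, r2, c1 = a + b, c + d, a + c
    n = r1 + r2

    def p_table(x):   # hypergeometric probability with top-left cell = x
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# infected / not infected: industrial 45/35, traditional 9/7
p_value = fisher_exact_two_sided(45, 35, 9, 7)
```

Because the two infection proportions are identical, the observed table is the most probable one given the margins, and the two-sided p-value comes out at 1: no evidence of a difference, consistent with the abstract.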

  6. The impact of teacher preparation on student achievement in rural secondary schools

    NASA Astrophysics Data System (ADS)

    Barnes, Shontier Prescott

    The primary purpose of this study was to examine significant differences, if any, in math and science achievement between students taught by traditionally certified teachers and students taught by alternatively certified teachers. This study examined alternatively certified teachers, as identified through the Georgia TAPP (Teacher Alternative Preparation Program), and traditionally certified teachers in rural high schools in the Central Savannah River Area of Georgia. Student achievement was measured by student scores on the Algebra I and Physical Science End-Of-Course Tests, criterion-referenced tests aligned with state-adopted curriculum standards. The study utilized frequency distributions, correlations, descriptive statistics, and univariate analysis of variance (ANOVA) to examine the data. Univariate tests were done to find individual differences for each dependent variable. The ANOVA was used for the single dependent variable (student achievement) to form comparisons and track the effect of the independent variable (teacher preparation), whose levels (traditional and alternative) may interact with other factors to affect the dependent variable. The covariates (independent variables not manipulated but still affecting the response) of student ethnicity, gender, and school socioeconomic status were also analyzed to predict student achievement. KEY WORDS: Teacher Preparation, Student Achievement, Math, Science, Traditionally certified teachers, Alternatively certified teachers, Georgia TAPP (Teacher Alternative Preparation Program), End-Of-Course Test (EOCT), Performance standard.

  7. Quantitative Assessment of Blood Pressure Measurement Accuracy and Variability from Visual Auscultation Method by Observers without Receiving Medical Training

    PubMed Central

    Feng, Yong; Chen, Aiqing

    2017-01-01

    This study aimed to quantify blood pressure (BP) measurement accuracy and variability with different techniques. Thirty video clips of BP recordings from the BHS training database were converted to Korotkoff sound waveforms. Ten observers without medical training were asked to determine BPs using (a) the traditional manual auscultatory method and (b) a visual auscultation method based on visualizing the Korotkoff sound waveform; this was repeated three times on different days. The measurement error was calculated against the reference answers, and the measurement variability was calculated from the SD of the three repeats. Statistical analysis showed that, in comparison with the auscultatory method, the visual method significantly reduced overall variability from 2.2 to 1.1 mmHg for SBP and from 1.9 to 0.9 mmHg for DBP (both p < 0.001). It also showed that BP measurement errors were significant for both techniques (all p < 0.01, except DBP from the traditional method). Although significant, the overall mean errors were small (−1.5 and −1.2 mmHg for SBP and −0.7 and 2.6 mmHg for DBP, respectively, from the traditional auscultatory and visual auscultation methods). In conclusion, the visual auscultation method was able to achieve an acceptable degree of BP measurement accuracy, with smaller variability than the traditional auscultatory method. PMID:29423405
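
The variability figures above are the SD of each observer's three repeated readings, averaged across observers. A minimal sketch of that computation, with made-up readings:

```python
from statistics import mean, stdev

def repeat_variability(readings_by_observer):
    """Measurement variability: SD of each observer's repeated readings
    (mmHg), averaged across observers."""
    return mean(stdev(reads) for reads in readings_by_observer)

# made-up demo: two observers, three repeats each
demo = repeat_variability([[120.0, 122.0, 121.0], [118.0, 118.0, 118.0]])
```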

  8. Designing, implementing and evaluating an online problem-based learning (PBL) environment--a pilot study.

    PubMed

    Ng, Manwa L; Bridges, Susan; Law, Sam Po; Whitehill, Tara

    2014-01-01

    Problem-based learning (PBL) has been shown to be effective for promoting student competencies in self-directed and collaborative learning, critical thinking, self-reflection and tackling novel situations. However, the need for face-to-face interactions at the same place and time severely limits the potential of traditional PBL. The requirements for space and for meeting at a specific location at the same time create timetabling difficulties. Such limitations need to be tackled before the full potential of PBL can be realized. The present study aimed at designing and implementing an online PBL environment for undergraduate speech/language pathology students, and assessing the associated pedagogical effectiveness. A group of eight PBL students were randomly selected to participate in the study. They underwent 4 weeks of online PBL using Adobe Connect. Upon completion of the experiment, they were assessed via a self-reported questionnaire and a quantitative comparison with traditional PBL students based on the same written assignment. The questionnaire revealed that all participating students enjoyed online PBL, without any perceived negative effects on learning. Online PBL also saved every student travel time to and from school. Statistical analysis indicated no significant difference in assignment grades between the online and traditional PBL groups, indicating that online PBL appears to be as effective as traditional face-to-face PBL learning.

  9. The Impact of Learning Styles on Student Achievement in a Web-Based versus an Equivalent Face-to-Face Course

    ERIC Educational Resources Information Center

    Zacharis, Nick Z.

    2010-01-01

    This study investigated the relationship between students' learning styles and their achievement in two different learning environments: online instruction and traditional instruction. The results indicated that a) students in the traditional learning group had higher, but not statistically significant higher, levels of achievement than students…

  10. A practical guide to environmental association analysis in landscape genomics.

    PubMed

    Rellstab, Christian; Gugerli, Felix; Eckert, Andrew J; Hancock, Angela M; Holderegger, Rolf

    2015-09-01

    Landscape genomics is an emerging research field that aims to identify the environmental factors that shape adaptive genetic variation and the gene variants that drive local adaptation. Its development has been facilitated by next-generation sequencing, which allows for screening thousands to millions of single nucleotide polymorphisms in many individuals and populations at reasonable costs. In parallel, data sets describing environmental factors have greatly improved and increasingly become publicly accessible. Accordingly, numerous analytical methods for environmental association studies have been developed. Environmental association analysis identifies genetic variants associated with particular environmental factors and has the potential to uncover adaptive patterns that are not discovered by traditional tests for the detection of outlier loci based on population genetic differentiation. We review methods for conducting environmental association analysis including categorical tests, logistic regressions, matrix correlations, general linear models and mixed effects models. We discuss the advantages and disadvantages of different approaches, provide a list of dedicated software packages and their specific properties, and stress the importance of incorporating neutral genetic structure in the analysis. We also touch on additional important aspects such as sampling design, environmental data preparation, pooled and reduced-representation sequencing, candidate-gene approaches, linearity of allele-environment associations and the combination of environmental association analyses with traditional outlier detection tests. We conclude by summarizing expected future directions in the field, such as the extension of statistical approaches, environmental association analysis for ecological gene annotation, and the need for replication and post hoc validation studies. © 2015 John Wiley & Sons Ltd.

  11. Development of a Reference Image Collection Library for Histopathology Image Processing, Analysis and Decision Support Systems Research.

    PubMed

    Kostopoulos, Spiros; Ravazoula, Panagiota; Asvestas, Pantelis; Kalatzis, Ioannis; Xenogiannopoulos, George; Cavouras, Dionisis; Glotsos, Dimitris

    2017-06-01

    Histopathology image processing, analysis and computer-aided diagnosis have been shown to be effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer respectively, collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/ . Potential exploitations of the proposed library may span a broad spectrum, such as image processing to improve visualization, segmentation for nuclei detection, decision support systems for second-opinion consultations, statistical analysis for the investigation of potential correlations between clinical annotations and imaging findings and, generally, fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards the creation of a reference image collection library in the field of traditional histopathology that is publicly and freely available to the scientific community.

  12. The Feynman-Y Statistic in Relation to Shift-Register Neutron Coincidence Counting: Precision and Dead Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, Stephen; Santi, Peter A.; Henzlova, Daniela

    The Feynman-Y statistic is a type of autocorrelation analysis. It is defined as the excess variance-to-mean ratio, Y = VMR - 1, of the number count distribution formed by sampling a pulse train using a series of non-overlapping gates. It is a measure of the degree of correlation present on the pulse train, with Y = 0 for Poisson data. In the context of neutron coincidence counting we show that the same information can be obtained from the accidentals histogram acquired using the multiplicity shift-register method, which is currently the common autocorrelation technique applied in nuclear safeguards. In the case of multiplicity shift-register analysis, however, overlapping gates, either triggered by the incoming pulse stream or by a periodic clock, are used. The overlap introduces additional covariance but does not alter the expectation values. In this paper we discuss, for a particular data set, the relative merits of the Feynman and shift-register methods in terms of both precision and dead time correction. Traditionally the Feynman approach is applied with a relatively long gate width compared to the dieaway time. The main reason for this is so that the gate utilization factor can be taken as unity rather than being treated as a system parameter to be determined at characterization/calibration. But because the random-trigger-interval gate utilization factor is slow to saturate, this procedure requires a gate width many times the effective 1/e dieaway time. In the traditional approach this limits the number of gates that can be fitted into a given assay duration. We empirically show that much shorter gates, similar in width to those used in traditional shift-register analysis, can be used. Because the way in which the correlated information present on the pulse train is extracted differs between the moments-based method of Feynman and the various shift-register approaches, the dead time losses are manifested differently for the two approaches.
The resulting estimates for the dead time corrected first and second order reduced factorial moments should be independent of the method however and this allows the respective dead time formalism to be checked. We discuss how to make dead time corrections in both the shift register and the Feynman approaches.« less
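The Feynman-Y construction described in this record (count events in a series of non-overlapping gates, then take the excess variance-to-mean ratio) can be sketched in a few lines. This is an illustrative reimplementation, not code from the paper:

```python
import statistics

def gate_counts(event_times, gate_width):
    """Bin absolute event times into consecutive non-overlapping gates."""
    if not event_times:
        return []
    n_gates = int(max(event_times) // gate_width) + 1
    counts = [0] * n_gates
    for t in event_times:
        counts[int(t // gate_width)] += 1
    return counts

def feynman_y(counts):
    """Excess variance-to-mean ratio Y = VMR - 1 of the gate-count
    distribution; Y = 0 for Poisson (uncorrelated) data."""
    mean = statistics.fmean(counts)
    var = statistics.pvariance(counts)  # population variance of the counts
    return var / mean - 1.0
```

For a purely Poisson pulse train the counts have variance equal to their mean, so Y fluctuates around zero; correlated bursts (e.g., fission chains) inflate the variance and push Y above zero.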

  13. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    PubMed

    Schroeder, Mark J; Perreault, Bill; Ewert, Daniel L; Koenig, Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework within which Good Laboratory Practice (GLP) compliance can be achieved. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study-specific analysis files according to their needs. Copyright 2003 Elsevier Ltd.
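As a minimal illustration of the kind of beat-to-beat processing such a package performs (HEART itself is a Matlab package; the threshold-crossing beat detector below is a hypothetical Python sketch, not its API):

```python
def split_beats(pressure, threshold):
    """Segment a pressure waveform into beats at upward threshold
    crossings; the trailing partial beat is discarded."""
    starts = [i for i in range(1, len(pressure))
              if pressure[i - 1] < threshold <= pressure[i]]
    return [pressure[a:b] for a, b in zip(starts, starts[1:])]

def beat_parameters(beat):
    """Traditional per-beat estimates: systolic, diastolic, and the
    time-averaged (true mean) pressure over the beat."""
    return {
        "systolic": max(beat),
        "diastolic": min(beat),
        "mean": sum(beat) / len(beat),
    }
```

Each detected beat then yields one row of parameter values, which is what makes beat-to-beat statistics (and visual quality assurance per beat) possible.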

  14. Novel method of fabricating individual trays for maxillectomy patients by computer-aided design and rapid prototyping.

    PubMed

    Huang, Zhi; Wang, Xin-zhi; Hou, Yue-Zhong

    2015-02-01

    Making impressions for maxillectomy patients is an essential but difficult task. This study developed a novel method to fabricate individual trays by computer-aided design (CAD) and rapid prototyping (RP) to simplify the process and enhance patient safety. Five unilateral maxillectomy patients were recruited for this study. For each patient, a computed tomography (CT) scan was taken. Based on the 3D surface reconstruction of the target area, an individual tray was manufactured by CAD/RP. With a conventional custom tray as control, two final impressions were made using the different types of tray for each patient. The trays were sectioned, and in each section the thickness of the material was measured at six evenly distributed points. Descriptive statistics and a paired t-test were used to examine the difference in impression thickness. SAS 9.3 was applied in the statistical analysis. Afterwards, all casts were optically 3D scanned and compared digitally to evaluate the feasibility of this method. Impressions of all five maxillectomy patients were successfully made with individual trays fabricated by CAD/RP and traditional trays. The descriptive statistics of the impression thickness measurements showed slightly more uneven results in the traditional trays, but no statistical significance was shown. A 3D digital comparison showed acceptable discrepancies within 1 mm in the majority of cast areas. The largest difference of 3 mm was observed in the buccal wall of the defective areas. Moderate deviations of 1 to 2 mm were detected in the buccal and labial vestibular groove areas. This study confirmed the feasibility of a novel method of fabricating individual trays by CAD/RP. Impressions made by individual trays manufactured using CAD/RP had a uniform thickness, with an acceptable level of accuracy compared to those made through conventional processes. © 2014 by the American College of Prosthodontists.
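The paired t-test applied to the matched thickness measurements reduces to a simple statistic on the per-point differences. A generic stdlib sketch (not the SAS 9.3 procedure used in the study):

```python
import statistics

def paired_t(x, y):
    """Paired t statistic for matched measurements, e.g. impression
    thickness at the same points for CAD/RP vs conventional trays."""
    d = [a - b for a, b in zip(x, y)]       # per-point differences
    n = len(d)
    mean_d = statistics.fmean(d)
    sd_d = statistics.stdev(d)              # sample SD of the differences
    return mean_d / (sd_d / n ** 0.5)
```

The resulting t value is compared against the t distribution with n - 1 degrees of freedom to obtain the p-value.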

  15. Effectiveness of modified seminars as a teaching-learning method in pharmacology

    PubMed Central

    Palappallil, Dhanya Sasidharan; Sushama, Jitha; Ramnath, Sai Nathan

    2016-01-01

    Context: Student-led seminars (SLS) are adopted as a teaching-learning (T-L) method in pharmacology. Previous studies assessing feedback on T-L methods in pharmacology point out that traditional seminars consistently received poor feedback, as they were not a favorite among students. Aims: This study aimed to obtain feedback on traditional SLS, introduce modified SLS, and compare the modified seminars with the traditional ones. Settings and Design: This was a prospective interventional study conducted over 2 months among fifth-semester medical undergraduates attending pharmacology seminars at a Government Medical College in South India. Subjects and Methods: A structured questionnaire was used to elicit feedback from participants. The responses were coded on a 5-point Likert scale. Modifications to the seminar sessions, such as role plays, quizzes, tests, group discussions, and patient-oriented problem-solving exercises, were introduced along with SLS. Statistical Analysis Used: The data were analyzed using SPSS version 16. The descriptive data were expressed using frequencies and percentages. Wilcoxon signed-rank and Friedman tests were used to compare traditional with modified seminars. Results: The participants identified interaction as the most important component of a seminar. The majority opined that the teacher should summarize at the end of SLS. Student feedback showed that modified seminars created more interest, enthusiasm, and inspiration to learn the topic when compared to traditional SLS. They also increased peer coordination and group dynamics. Students opined that communication skills and teacher-student interactions were not improved with modified seminars. Conclusions: Interventions in the form of modified SLS may be adopted to break the monotony of traditional seminars through active participation, peer interaction, and teamwork. PMID:27563587

  16. Three-dimensional virtual bronchoscopy using a tablet computer to guide real-time transbronchial needle aspiration.

    PubMed

    Fiorelli, Alfonso; Raucci, Antonio; Cascone, Roberto; Reginelli, Alfonso; Di Natale, Davide; Santoriello, Carlo; Capuozzo, Antonio; Grassi, Roberto; Serra, Nicola; Polverino, Mario; Santini, Mario

    2017-04-01

    We proposed a new virtual bronchoscopy tool to improve the accuracy of traditional transbronchial needle aspiration for mediastinal staging. Chest computed tomographic images (1 mm thickness) were reconstructed with Osirix software to produce a virtual bronchoscopic simulation. The target adenopathy was identified by measuring its distance from the carina on multiplanar reconstruction images. The static images were uploaded into iMovie software, which produced a virtual bronchoscopic movie from the images; the movie was then transferred to a tablet computer to provide real-time guidance during a biopsy. To test the validity of our tool, we retrospectively divided all consecutive patients undergoing transbronchial needle aspiration into two groups based on whether the biopsy was guided by virtual bronchoscopy (virtual bronchoscopy group) or not (traditional group). The intergroup diagnostic yields were statistically compared. Our analysis included 53 patients in the traditional and 53 in the virtual bronchoscopy group. The sensitivity, specificity, positive predictive value, negative predictive value and diagnostic accuracy for the traditional group were 66.6%, 100%, 100%, 10.53% and 67.92%, respectively, and for the virtual bronchoscopy group were 84.31%, 100%, 100%, 20% and 84.91%, respectively. The sensitivity (P = 0.011) and diagnostic accuracy (P = 0.011) of sampling the paratracheal station were better for the virtual bronchoscopy group than for the traditional group; no significant differences were found for the subcarinal lymph node. Our tool is simple, economical and available in all centres. It guided the needle insertion in real time, thereby improving the accuracy of traditional transbronchial needle aspiration, especially when target lesions are located in a difficult site like the paratracheal station. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
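The diagnostic-yield figures above all follow from one 2x2 table per group. As a sketch, the traditional group's reported values are consistent with the counts TP=34, FP=0, FN=17, TN=2 (these counts are inferred from the percentages, not stated in the abstract):

```python
def diagnostic_yield(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics, as percentages.
    Assumes no denominator is zero (true for the counts used here)."""
    return {
        "sensitivity": 100 * tp / (tp + fn),
        "specificity": 100 * tn / (tn + fp),
        "ppv": 100 * tp / (tp + fp),
        "npv": 100 * tn / (tn + fn),
        "accuracy": 100 * (tp + tn) / (tp + fp + fn + tn),
    }
```

With TP=34, FP=0, FN=17, TN=2 (n=53) this reproduces sensitivity 66.7%, specificity 100%, PPV 100%, NPV 10.53% and accuracy 67.92%, matching the traditional group.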

  17. Comparative study between 2 methods of mounting models in semiadjustable articulator for orthognathic surgery.

    PubMed

    Mayrink, Gabriela; Sawazaki, Renato; Asprino, Luciana; de Moraes, Márcio; Fernandes Moreira, Roger William

    2011-11-01

    Compare the traditional method of mounting dental casts on a semiadjustable articulator and the new method suggested by Wolford and Galiano, 1 analyzing the inclination of the maxillary occlusal plane in relation to the FHP. Two casts of 10 patients were obtained. One was used for mounting models on a traditional articulator, using a face-bow transfer system, and the other was used for mounting models on the Occlusal Plane Indicator platform (OPI), using the SAM articulator. After that, an analysis of the accuracy of the mounted models was performed. The angle made by the occlusal plane and the FHP on the cephalogram should equal the angle between the occlusal plane and the upper member of the articulator. The measures were tabulated in Microsoft Excel(®) and analyzed using a one-way analysis of variance. Statistically, the results did not reveal significant differences among the measures. The OPI and face bow present similar results, but more studies are needed to verify accuracy relative to the maxillary cant in the OPI or to develop new techniques able to overcome the disadvantages of each technique. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.

  19. Antibacterial and phytochemical screening of Anethum graveolens, Foeniculum vulgare and Trachyspermum ammi

    PubMed Central

    Kaur, Gurinder J; Arora, Daljit S

    2009-01-01

    Background Anethum graveolens Linn., Foeniculum vulgare Mill. and Trachyspermum ammi L. are widely used traditional medicinal plants to treat various ailments. To provide a scientific basis for the traditional uses of these plants, their aqueous and organic seed extracts, as well as isolated phytoconstituents, were evaluated for their antibacterial potential. Methods Antibacterial activity of aqueous and organic seed extracts was assessed using agar diffusion assay, minimum inhibitory concentration and viable cell count studies, and their antibacterial effect was compared with some standard antibiotics. The presence of major phytoconstituents was detected qualitatively and quantitatively. The isolated phytoconstituents were subjected to disc diffusion assay to ascertain their antibacterial effect. Results Hot water and acetone seed extracts showed considerable antibacterial activity against all the bacteria except Klebsiella pneumoniae and one strain of Pseudomonas aeruginosa. Minimum inhibitory concentrations for aqueous and acetone seed extracts ranged from 20 to 80 mg/ml and from 5 to 15 mg/ml, respectively. Viable cell count studies revealed the bactericidal nature of the seed extracts. Statistical analysis indicated the better or equal efficacy of some of these seed extracts compared to standard antibiotics. Phytochemical analysis showed the presence of 2.80–4.23% alkaloids, 8.58–15.06% flavonoids, 19.71–27.77% tannins, 0.55–0.70% saponins, and cardiac glycosides. Conclusion The antibacterial efficacy shown by these plants provides a scientific basis for, and thus validates, their traditional uses as homemade remedies. Isolation and purification of different phytochemicals may further yield significant antibacterial agents. PMID:19656417

  20. Clinical simulation training improves the clinical performance of Chinese medical students

    PubMed Central

    Zhang, Ming-ya; Cheng, Xin; Xu, An-ding; Luo, Liang-ping; Yang, Xuesong

    2015-01-01

    Background Modern medical education promotes medical students’ clinical operating capacity rather than the mastery of theoretical knowledge. To accomplish this objective, clinical skill training using various simulations was introduced into medical education to cultivate creativity and develop the practical ability of students. However, quantitative analysis of the efficiency of clinical skill training with simulations is lacking. Methods In the present study, we compared the mean scores of medical students (Jinan University) who graduated in 2013 and 2014 on 16 stations between traditional training (control) and simulative training groups. In addition, in a clinical skill competition, the objective structured clinical examination (OSCE) scores of participating medical students trained using traditional and simulative training were compared. The data were statistically analyzed and qualitatively described. Results The results revealed that simulative training could significantly enhance the graduate score of medical students compared with the control. The OSCE scores of participating medical students in the clinical skill competition, trained using simulations, were dramatically higher than those of students trained through traditional methods, and we also observed that the OSCE marks were significantly increased for the same participant after simulative training for the clinical skill competition. Conclusions Taken together, these data indicate that clinical skill training with a variety of simulations could substantially promote the clinical performance of medical students and optimize the resources used for medical education, although a precise analysis of each specialization is needed in the future. PMID:26478142

  1. Diabetes Care Management Teams Did Not Reduce Utilization When Compared With Traditional Care: A Randomized Cluster Trial.

    PubMed

    Kearns, Patrick

    2017-10-01

    PURPOSE: Health services research evaluates redesign models for primary care. Care management is one alternative. Evaluation includes resource utilization as a criterion. This study compared the impact of care-manager teams on resource utilization, for entire panels of patients and for the subset of patients with diabetes. DESIGN: Randomized, prospective, cohort study comparing change in utilization rates between groups, pre- and post-intervention. METHODOLOGY: Ten primary care physician panels in a safety-net setting. Ten physicians were randomized to either a care-management approach (Group 1) or a traditional approach (Group 2). Care managers focused on diabetes and the cardiovascular cluster of diseases. Analysis compared rates of hospitalization, 30-day readmission, emergency room visits, and urgent care visits, with baseline rates compared to annual rates after a yearlong run-in, for entire panels and for the subset of patients with diabetes. RESULTS: Resource utilization showed no statistically significant change between baseline and Year 3 (P=.79). Emergency room visits and hospital readmissions increased for both groups (P=.90), while hospital admissions and urgent care visits decreased (P=.73). Similarly, utilization was not significantly different for patients with diabetes (P=.69). CONCLUSIONS: A care-management team approach failed to improve resource utilization rates for entire panels and for the subset of diabetic patients compared to traditional care. This reinforces the need for further evidentiary support for the care-management model's hypothesis in the safety net.

  2. Modeling neuroendocrine stress reactivity in salivary cortisol: adjusting for peak latency variability.

    PubMed

    Lopez-Duran, Nestor L; Mayer, Stefanie E; Abelson, James L

    2014-07-01

    In this report, we present growth curve modeling (GCM) with landmark registration as an alternative statistical approach for the analysis of time series cortisol data. This approach addresses an often-ignored but critical source of variability in salivary cortisol analyses: individual and group differences in the time latency of post-stress peak concentrations. It allows for the simultaneous examination of cortisol changes before and after the peak while controlling for timing differences, and thus provides additional information that can help elucidate group differences in the underlying biological processes (e.g., intensity of response, regulatory capacity). We tested whether GCM with landmark registration is more sensitive than traditional statistical approaches (e.g., repeated measures ANOVA--rANOVA) in identifying sex differences in salivary cortisol responses to a psychosocial stressor (Trier Social Stress Test--TSST) in healthy adults (mean age 23). We used plasma ACTH measures as our "standard" and showed that the new approach confirms in salivary cortisol the ACTH finding that males had longer peak latencies, higher post-stress peaks, but a more intense post-peak decline. This finding would have been missed if only salivary cortisol were available and only more traditional analytic methods were used. This new approach may provide neuroendocrine researchers with a highly sensitive complementary tool to examine the dynamics of the cortisol response in a way that reduces the risk of false negative findings when blood samples are not feasible.
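The core of landmark registration here is simple: re-anchor each subject's series at its own peak before modeling, so that individual differences in peak latency do not smear the group-average curve. A minimal sketch of that registration step (not the authors' GCM code):

```python
def landmark_register(times, values):
    """Re-express a cortisol time series relative to its peak (the landmark).

    Returns (shifted_times, values) where time 0 is the peak, so pre- and
    post-peak dynamics can be modeled separately while controlling for
    individual differences in peak latency.
    """
    peak_idx = max(range(len(values)), key=values.__getitem__)
    peak_time = times[peak_idx]
    return [t - peak_time for t in times], list(values)
```

After registration, growth curves are fit to the aligned pre-peak and post-peak segments, and the per-subject peak latencies themselves become an additional outcome to compare across groups.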

  3. Are the correct herbal claims by Hildegard von Bingen only lucky strikes? A new statistical approach.

    PubMed

    Uehleke, Bernhard; Hopfenmueller, Werner; Stange, Rainer; Saller, Reinhard

    2012-01-01

    Ancient and medieval herbal books are often believed to describe the same claims still in use today. Medieval herbal books, however, provide long lists of claims for each herb, most of which are not approved today, while the herb's modern use is often missing. So the hypothesis arises that a medieval author could have randomly hit on 'correct' claims among his many 'wrong' ones. We developed a statistical procedure based on a simple probability model. We applied our procedure to the herbal books of Hildegard von Bingen (1098-1179) as an example of its usefulness. Claim attributions for a certain herb were classified as 'correct' if approximately the same as indicated in actual monographs. The number of 'correct' claim attributions was significantly higher than it could have been by pure chance, even though the vast majority of Hildegard von Bingen's claims were not 'correct'. The hypothesis that Hildegard would have achieved her 'correct' claims purely by chance can be clearly rejected. The finding that medical claims provided by a medieval author are significantly related to modern herbal use supports the importance of traditional medicinal systems as an empirical source. However, since many traditional claims are not in accordance with modern applications, they should be used carefully and analyzed in a systematic, statistics-based manner. Our statistical approach can be used for further systematic comparison of herbal claims of traditional sources as well as in the fields of ethnobotany and ethnopharmacology. Copyright © 2012 S. Karger AG, Basel.
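The "lucky strikes" question is at heart a binomial tail probability: if each of n claim attributions were independently 'correct' with some chance probability p, how likely is it to observe at least k hits? A stdlib sketch of that simple probability model (the n, k, p used below are placeholders, not the study's actual counts):

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability of at least k
    'correct' claim attributions if every hit were a pure lucky strike."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

If the observed number of hits gives a tail probability far below a conventional threshold (e.g., 0.05), the pure-chance hypothesis is rejected, which is the logic of the paper's conclusion.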

  4. Beyond the floor effect on the WISC-IV in individuals with Down syndrome: are there cognitive strengths and weaknesses?

    PubMed

    Pezzuti, L; Nacinovich, R; Oggiano, S; Bomba, M; Ferri, R; La Stella, A; Rossetti, S; Orsini, A

    2018-07-01

    Individuals with Down syndrome generally show a floor effect on Wechsler Scales that is manifested by flat profiles and with many or all of the weighted scores on the subtests equal to 1. The main aim of the present paper is to use the statistical Hessl method and the extended statistical method of Orsini, Pezzuti and Hulbert with a sample of individuals with Down syndrome (n = 128; 72 boys and 56 girls), to underline the variability of performance on Wechsler Intelligence Scale for Children-Fourth Edition subtests and indices, highlighting any strengths and weaknesses of this population that otherwise appear to be flattened. Based on results using traditional transformation of raw scores into weighted scores, a very high percentage of subtests with weighted score of 1 occurred in the Down syndrome sample, with a floor effect and without any statistically significant difference between four core Wechsler Intelligence Scale for Children-Fourth Edition indices. The results, using traditional transformation, confirm a deep cognitive impairment of those with Down syndrome. Conversely, using the new statistical method, it is immediately apparent that the variability of the scores, both on subtests and indices, is wider with respect to the traditional method. Children with Down syndrome show a greater ability in the Verbal Comprehension Index than in the Working Memory Index. © 2018 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.

  5. Using the Flipped Classroom to Bridge the Gap to Generation Y

    PubMed Central

    Gillispie, Veronica

    2016-01-01

    Background: The flipped classroom is a student-centered approach to learning that increases active learning for the student compared to traditional classroom-based instruction. In the flipped classroom model, students are first exposed to the learning material through didactics outside of the classroom, usually in the form of written material, voice-over lectures, or videos. During the formal teaching time, an instructor facilitates student-driven discussion of the material via case scenarios, allowing for complex problem solving, peer interaction, and a deep understanding of the concepts. A successful flipped classroom should have three goals: (1) allow the students to become critical thinkers, (2) fully engage students and instructors, and (3) stimulate the development of a deep understanding of the material. The flipped classroom model includes teaching and learning methods that can appeal to all four generations in the academic environment. Methods: During the 2015 academic year, we implemented the flipped classroom in the obstetrics and gynecology clerkship for the Ochsner Clinical School in New Orleans, LA. Voice-over presentations of the lectures that had been given to students in prior years were recorded and made available to the students through an online classroom. Weekly problem-based learning sessions matched to the subjects of the traditional lectures were held, and the faculty who had previously presented the information in the traditional lecture format facilitated the problem-based learning sessions. The knowledge base of students was evaluated at the end of the rotation via a multiple-choice question examination and the Objective Structured Clinical Examination (OSCE) as had been done in previous years. We compared demographic information and examination scores for traditional teaching and flipped classroom groups of students. 
The traditional teaching group consisted of students from Rotation 2 and Rotation 3 of the 2014 academic year who received traditional classroom-based instruction. The flipped classroom group consisted of students from Rotation 2 and Rotation 3 of the 2015 academic year who received formal didactics via voice-over presentation and had the weekly problem-based learning sessions. Results: When comparing the students taught by traditional methods to those taught in the flipped classroom model, we saw a statistically significant increase in test scores on the multiple-choice question examination in both the obstetrics and gynecology sections in Rotation 2. While the average score for the flipped classroom group increased in Rotation 3 on the obstetrics section of the multiple-choice question examination, the difference was not statistically significant. Unexpectedly, the average score on the gynecology portion of the multiple-choice question examination decreased among the flipped classroom group compared to the traditional teaching group, and this decrease was statistically significant. For both the obstetrics and the gynecology portions of the OSCE, we saw statistically significant increases in the scores for the flipped classroom group in both Rotation 2 and Rotation 3 compared to the traditional teaching group. With the exception of the gynecology portion of the multiple-choice question examination in Rotation 3, we saw improvement in scores after the implementation of the flipped classroom. Conclusion: The flipped classroom is a feasible and useful alternative to the traditional classroom. It is a method that embraces Generation Y's need for active learning in a group setting while maintaining a traditional classroom method for introducing the information. Active learning increases student engagement and can lead to improved retention of material as demonstrated on standard examinations. PMID:27046401

  6. Using the Flipped Classroom to Bridge the Gap to Generation Y.

    PubMed

    Gillispie, Veronica

    2016-01-01

    The flipped classroom is a student-centered approach to learning that increases active learning for the student compared to traditional classroom-based instruction. In the flipped classroom model, students are first exposed to the learning material through didactics outside of the classroom, usually in the form of written material, voice-over lectures, or videos. During the formal teaching time, an instructor facilitates student-driven discussion of the material via case scenarios, allowing for complex problem solving, peer interaction, and a deep understanding of the concepts. A successful flipped classroom should have three goals: (1) allow the students to become critical thinkers, (2) fully engage students and instructors, and (3) stimulate the development of a deep understanding of the material. The flipped classroom model includes teaching and learning methods that can appeal to all four generations in the academic environment. During the 2015 academic year, we implemented the flipped classroom in the obstetrics and gynecology clerkship for the Ochsner Clinical School in New Orleans, LA. Voice-over presentations of the lectures that had been given to students in prior years were recorded and made available to the students through an online classroom. Weekly problem-based learning sessions matched to the subjects of the traditional lectures were held, and the faculty who had previously presented the information in the traditional lecture format facilitated the problem-based learning sessions. The knowledge base of students was evaluated at the end of the rotation via a multiple-choice question examination and the Objective Structured Clinical Examination (OSCE) as had been done in previous years. We compared demographic information and examination scores for traditional teaching and flipped classroom groups of students. 
The traditional teaching group consisted of students from Rotation 2 and Rotation 3 of the 2014 academic year who received traditional classroom-based instruction. The flipped classroom group consisted of students from Rotation 2 and Rotation 3 of the 2015 academic year who received formal didactics via voice-over presentation and had the weekly problem-based learning sessions. When comparing the students taught by traditional methods to those taught in the flipped classroom model, we saw a statistically significant increase in test scores on the multiple-choice question examination in both the obstetrics and gynecology sections in Rotation 2. While the average score for the flipped classroom group increased in Rotation 3 on the obstetrics section of the multiple-choice question examination, the difference was not statistically significant. Unexpectedly, the average score on the gynecology portion of the multiple-choice question examination decreased among the flipped classroom group compared to the traditional teaching group, and this decrease was statistically significant. For both the obstetrics and the gynecology portions of the OSCE, we saw statistically significant increases in the scores for the flipped classroom group in both Rotation 2 and Rotation 3 compared to the traditional teaching group. With the exception of the gynecology portion of the multiple-choice question examination in Rotation 3, we saw improvement in scores after the implementation of the flipped classroom. The flipped classroom is a feasible and useful alternative to the traditional classroom. It is a method that embraces Generation Y's need for active learning in a group setting while maintaining a traditional classroom method for introducing the information. Active learning increases student engagement and can lead to improved retention of material as demonstrated on standard examinations.

  7. Analysis of correlation between pediatric asthma exacerbation and exposure to pollutant mixtures with association rule mining.

    PubMed

    Toti, Giulia; Vilalta, Ricardo; Lindner, Peggy; Lefer, Barry; Macias, Charles; Price, Daniel

    2016-11-01

    Traditional studies on the effects of outdoor pollution on asthma have been criticized for questionable statistical validity and inefficacy in exploring the effects of multiple air pollutants, alone and in combination. Association rule mining (ARM), a method that is easily interpretable and suitable for the analysis of the effects of multiple exposures, could be of use, but the traditional interest metrics of support and confidence need to be substituted with metrics that focus on risk variations caused by different exposures. We present an ARM-based methodology that produces rules associated with relevant odds ratios and limits the number of final rules even at very low support levels (0.5%), thanks to post-pruning criteria that limit rule redundancy and control for statistical significance. The methodology has been applied to a case-crossover study to explore the effects of multiple air pollutants on the risk of asthma in pediatric subjects. We identified 27 rules with interesting odds ratios among more than 10,000 having the required support. The only single-chemical rule is exposure to ozone on the day before the reported asthma attack (OR = 1.14). The 26 combinatory rules highlight the limitations of air quality policies based on single-pollutant thresholds and suggest that exposure to mixtures of chemicals is more harmful, with odds ratios as high as 1.54 (for the combination day0 SO2, day0 NO, day0 NO2, day1 PM). The proposed method can be used to analyze risk variations caused by single and multiple exposures. The method is reliable and requires fewer assumptions on the data than parametric approaches. Rules including more than one pollutant highlight interactions that deserve further investigation, while helping to limit the search field. Copyright © 2016 Elsevier B.V. All rights reserved.
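Replacing support/confidence with an odds-ratio interest metric means scoring each candidate exposure itemset against the case/control split. A toy sketch of that scoring step (the record format is hypothetical, and the paper's pruning and significance controls are not shown):

```python
def rule_odds_ratio(records, rule):
    """Odds ratio of an exposure itemset `rule` (e.g. {'day0_SO2', 'day0_NO'})
    between case and control periods.

    `records` is a list of (exposures, is_case) pairs, where `exposures` is
    the set of pollutant-day items present in that period. Assumes at least
    one exposed control and one unexposed case (nonzero denominator).
    """
    a = b = c = d = 0
    for exposures, is_case in records:
        exposed = rule <= exposures     # rule fires iff all items present
        if is_case:
            a += exposed                # exposed cases
            c += not exposed            # unexposed cases
        else:
            b += exposed                # exposed controls
            d += not exposed            # unexposed controls
    return (a * d) / (b * c)
```

Mining then amounts to enumerating itemsets above the support floor and retaining those whose odds ratio is both large and statistically significant after redundancy pruning.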

  8. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    PubMed

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models for simulating hydrological time series. However, when nonlinear phenomena are significant, multiple linear regression fails to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, the multiple linear regression analysis used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flows, while low and medium flow magnitudes were estimated closer to the observed data. The comparison of prediction accuracy indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics, improving the root mean square error (RMSE) and mean absolute percentage error (MAPE) of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling flow dynamics in the study area.
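    The RMSE and MAPE indices used to compare the two models follow their standard definitions; a minimal sketch (the data here are illustrative, not the Citarum series):

```python
from math import sqrt

def rmse(observed, predicted):
    """Root mean square error: sqrt of the mean squared residual."""
    return sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

def mape(observed, predicted):
    """Mean absolute percentage error, in percent; assumes no zero observations."""
    return 100 * sum(abs((o - p) / o) for o, p in zip(observed, predicted)) / len(observed)
```

    A relative improvement such as the 13.52% RMSE gain reported above is then `(rmse_regression - rmse_neurofuzzy) / rmse_regression * 100`.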

  9. The Applicability of Standard Error of Measurement and Minimal Detectable Change to Motor Learning Research-A Behavioral Study.

    PubMed

    Furlan, Leonardo; Sterr, Annette

    2018-01-01

    Motor learning studies face the challenge of differentiating between real changes in performance and random measurement error. While traditional p-value-based analyses of difference (e.g., t-tests, ANOVAs) provide information on the statistical significance of a reported change in performance scores, they do not inform as to the likely cause or origin of that change, that is, the respective contributions of real modifications in performance and of random measurement error. One way of differentiating between real change and random measurement error is through the statistics of standard error of measurement (SEM) and minimal detectable change (MDC). SEM is estimated from the standard deviation of a sample of scores at baseline and a test-retest reliability index of the measurement instrument or test employed. MDC, in turn, is estimated from SEM and a degree of confidence, usually 95%. The MDC value can be regarded as the minimum amount of change that needs to be observed for it to be considered a real change, that is, a change to which the contribution of real modifications in performance is likely to be greater than that of random measurement error. A computer-based motor task was designed to illustrate the applicability of SEM and MDC to motor learning research. Two studies were conducted with healthy participants. Study 1 assessed the test-retest reliability of the task, and Study 2 consisted of a typical motor learning study in which participants practiced the task for five consecutive days. In Study 2, the data were analyzed with a traditional p-value-based analysis of difference (ANOVA) and also with SEM and MDC. The findings showed good test-retest reliability for the task, and the p-value-based analysis alone identified statistically significant improvements in performance over time even when the observed changes could in fact have been smaller than the MDC and thereby caused mostly by random measurement error rather than by learning. We therefore suggest that motor learning studies complement their p-value-based analyses of difference with statistics such as SEM and MDC in order to inform as to the likely cause or origin of any reported changes in performance.
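    The SEM and MDC estimates described above follow the standard formulas SEM = SD_baseline × sqrt(1 − r) and MDC = z × sqrt(2) × SEM (z = 1.96 for 95% confidence). A minimal sketch, assuming the reliability index r is a test-retest coefficient such as an ICC:

```python
from math import sqrt

def sem(baseline_sd, reliability):
    """Standard error of measurement: baseline SD scaled by sqrt(1 - reliability)."""
    return baseline_sd * sqrt(1 - reliability)

def mdc(sem_value, z=1.96):
    """Minimal detectable change for a two-measurement comparison
    (the sqrt(2) accounts for error in both test and retest; z=1.96 -> 95%)."""
    return z * sqrt(2) * sem_value
```

    For example, a baseline SD of 10 units with r = 0.91 gives SEM = 3.0 and MDC95 ≈ 8.3, so an observed improvement smaller than about 8.3 units could plausibly be measurement error rather than learning.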

  10. Carotid Artery Plaque Morphology and Composition in Relation to Incident Cardiovascular Events: The Multi-Ethnic Study of Atherosclerosis (MESA)

    PubMed Central

    Zavodni, Anna E. H.; Wasserman, Bruce A.; McClelland, Robyn L.; Gomes, Antoinette S.; Folsom, Aaron R.; Polak, Joseph F.; Lima, João A. C.

    2014-01-01

    Purpose To determine whether carotid plaque morphology and composition assessed with magnetic resonance (MR) imaging can be used to identify asymptomatic subjects at risk for cardiovascular events. Materials and Methods Institutional review boards at each site approved the study, and all sites were Health Insurance Portability and Accountability Act (HIPAA) compliant. A total of 946 participants in the Multi-Ethnic Study of Atherosclerosis (MESA) were evaluated with MR imaging and ultrasonography (US). MR imaging was used to define carotid plaque composition and the remodeling index (wall area divided by the sum of wall area and lumen area), while US was used to assess carotid intima-media thickness (IMT). Incident cardiovascular events, including myocardial infarction, resuscitated cardiac arrest, angina, stroke, and death, were ascertained for an average of 5.5 years. Multivariable Cox proportional hazards models, C statistics, and net reclassification improvement (NRI) for event prediction were determined. Results Cardiovascular events occurred in 59 participants (6%). Carotid IMT as well as MR imaging remodeling index, lipid core, and calcium in the internal carotid artery were significant predictors of events in univariate analysis (P < .001 for all). For traditional risk factors, the C statistic for event prediction was 0.696. With the addition of the MR imaging remodeling index and lipid core, the C statistic was 0.734, and the NRI was 7.4% and 15.8% for participants with and without cardiovascular events, respectively (P = .02). The NRI for US IMT in addition to traditional risk factors was not significant. Conclusion The identification of vulnerable plaque characteristics with MR imaging aids in cardiovascular disease prediction and improves the reclassification of baseline cardiovascular risk. © RSNA, 2014 PMID:24592924
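    The C statistic reported above (0.696 vs. 0.734) is a concordance index: for a binary outcome it equals the probability that a randomly chosen subject with an event received a higher predicted risk than a randomly chosen subject without one. A minimal all-pairs sketch (real survival analyses use censoring-aware variants, which this toy omits):

```python
def c_statistic(risk_scores, events):
    """Concordance: over all (event, non-event) pairs, the fraction in which
    the subject with the event received the higher risk score (ties count 0.5)."""
    cases = [s for s, e in zip(risk_scores, events) if e]
    controls = [s for s, e in zip(risk_scores, events) if not e]
    pairs = len(cases) * len(controls)
    concordant = sum((c > n) + 0.5 * (c == n) for c in cases for n in controls)
    return concordant / pairs
```

    A model that ranks every event above every non-event scores 1.0; random ranking scores 0.5, which is why gains such as 0.696 to 0.734 are meaningful even though they look small.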

  11. A new statistical framework to assess structural alignment quality using information compression

    PubMed Central

    Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.

    2014-01-01

    Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241
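    The abstract does not give enough detail to reproduce I-value itself, but the underlying minimum-message-length principle can be sketched: an alignment hypothesis is worth stating only if the bits spent describing it are repaid by a shorter encoding of the data. The toy below (symbol sequences rather than protein structures, and an idealized model cost of zero) shows only that two-part-message comparison:

```python
from math import log2
from collections import Counter

def uniform_bits(seq, alphabet_size):
    """Bits to encode seq under a null model treating all symbols as equally likely."""
    return len(seq) * log2(alphabet_size)

def empirical_bits(seq):
    """Bits to encode seq under its (idealized) empirical symbol distribution."""
    n = len(seq)
    return -sum(c * log2(c / n) for c in Counter(seq).values())

def compression_gain(seq, alphabet_size, hypothesis_bits):
    """Two-part MML comparison: positive gain means the hypothesis (whose own
    statement costs hypothesis_bits) compresses the data beyond the null model."""
    return uniform_bits(seq, alphabet_size) - (hypothesis_bits + empirical_bits(seq))
```

    In the actual framework, the "hypothesis" is the structural alignment and the "data" are the structures; the same inequality decides whether an alignment is statistically significant.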

  12. Information-dependent enrichment analysis reveals time-dependent transcriptional regulation of the estrogen pathway of toxicity.

    PubMed

    Pendse, Salil N; Maertens, Alexandra; Rosenberg, Michael; Roy, Dipanwita; Fasani, Rick A; Vantangoli, Marguerite M; Madnick, Samantha J; Boekelheide, Kim; Fornace, Albert J; Odwin, Shelly-Ann; Yager, James D; Hartung, Thomas; Andersen, Melvin E; McMullen, Patrick D

    2017-04-01

    The twenty-first century vision for toxicology involves a transition away from high-dose animal studies to in vitro and computational models (NRC in Toxicity testing in the 21st century: a vision and a strategy, The National Academies Press, Washington, DC, 2007). This transition requires mapping pathways of toxicity by understanding how in vitro systems respond to chemical perturbation. Uncovering transcription factors/signaling networks responsible for gene expression patterns is essential for defining pathways of toxicity and, ultimately, for determining the chemical modes of action through which a toxicant acts. Traditionally, transcription factor identification is achieved via chromatin immunoprecipitation studies and summarized by calculating which transcription factors are statistically associated with up- and downregulated genes. These lists are commonly determined via statistical or fold-change cutoffs, a procedure that is sensitive to statistical power and may not be as useful for determining transcription factor associations. To move away from an arbitrary statistical or fold-change-based cutoff, we developed, in the context of the Mapping the Human Toxome project, an enrichment paradigm called information-dependent enrichment analysis (IDEA) to guide identification of the transcription factor network. As a test case, we used the activation of MCF-7 cells by 17β-estradiol (E2). Using this new approach, we established a time course for transcriptional and functional responses to E2. ERα and ERβ were associated with short-term transcriptional changes in response to E2. Sustained exposure led to recruitment of additional transcription factors and alteration of cell cycle machinery. TFAP2C and SOX2 were the transcription factors most highly correlated with dose. E2F7, E2F1, and Foxm1, which are involved in cell proliferation, were enriched only at 24 h. IDEA should be useful for identifying candidate pathways of toxicity. IDEA outperforms gene set enrichment analysis (GSEA) and provides similar results to weighted gene correlation network analysis, a platform that helps to identify genes not annotated to pathways.
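    IDEA itself is not specified in enough detail here to reproduce, but the traditional cutoff-based approach it moves away from is typically a hypergeometric overrepresentation test: given a cutoff-derived gene list, ask whether a transcription factor's known targets appear in it more often than chance allows. A minimal sketch of that baseline:

```python
from math import comb

def enrichment_p(k, K, n, N):
    """One-sided hypergeometric tail P(X >= k): the probability that at least k
    of the n cutoff-selected genes fall in a target set of size K, out of N genes.
    This is the classic overrepresentation test behind cutoff-based TF enrichment."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

    The abstract's criticism applies here: k depends entirely on where the statistical or fold-change cutoff is drawn, which is exactly the arbitrariness IDEA is designed to avoid.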

  13. Evaluation of Cepstrum Algorithm with Impact Seeded Fault Data of Helicopter Oil Cooler Fan Bearings and Machine Fault Simulator Data

    DTIC Science & Technology

    2013-02-01

    of a bearing must be put into practice. There are many potential methods, the most traditional being the use of statistical time-domain features...accelerate degradation to test multiple bearings to gain statistical relevance and extrapolate results to scale for field conditions. Temperature...as time statistics, frequency estimation to improve the fault frequency detection. For future investigations, one can further explore the

  14. Cooperative Learning in Virtual Environments: The Jigsaw Method in Statistical Courses

    ERIC Educational Resources Information Center

    Vargas-Vargas, Manuel; Mondejar-Jimenez, Jose; Santamaria, Maria-Letica Meseguer; Alfaro-Navarro, Jose-Luis; Fernandez-Aviles, Gema

    2011-01-01

    This document sets out a novel teaching methodology as used in subjects with statistical content, traditionally regarded by students as "difficult". In a virtual learning environment, instructional techniques seldom used in mathematics courses were employed, such as the Jigsaw cooperative learning method, which had to be adapted to the…

  15. An Empirical Consideration of a Balanced Amalgamation of Learning Strategies in Graduate Introductory Statistics Classes

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.

    2009-01-01

    This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…

  16. Asset Attribution Stability and Portfolio Construction: An Educational Example

    ERIC Educational Resources Information Center

    Chong, James T.; Jennings, William P.; Phillips, G. Michael

    2014-01-01

    This paper illustrates how a third statistic from asset pricing models, the R-squared statistic, may contain information that can help in portfolio construction. Using a traditional CAPM model in comparison to an 18-factor Arbitrage Pricing Style Model, a portfolio separation test is conducted. Portfolio returns and risk metrics are compared using…

  17. Using Information Technology in Teaching of Business Statistics in Nigeria Business School

    ERIC Educational Resources Information Center

    Hamadu, Dallah; Adeleke, Ismaila; Ehie, Ike

    2011-01-01

    This paper discusses the use of Microsoft Excel software in the teaching of statistics in the Faculty of Business Administration at the University of Lagos, Nigeria. Problems associated with existing traditional methods are identified and a novel pedagogy using Excel is proposed. The advantages of using this software over other specialized…

  18. The Development and Demonstration of Multiple Regression Models for Operant Conditioning Questions.

    ERIC Educational Resources Information Center

    Fanning, Fred; Newman, Isadore

    Based on the assumption that inferential statistics can make the operant conditioner more sensitive to possible significant relationships, regression models were developed to test the statistical significance of differences between the slopes and Y-intercepts of the experimental and control group subjects. These results were then compared to the traditional operant…

  19. Online, Instructional Television and Traditional Delivery: Student Characteristics and Success Factors in Business Statistics

    ERIC Educational Resources Information Center

    Dotterweich, Douglas P.; Rochelle, Carolyn F.

    2012-01-01

    Distance education has surged in recent years while research on student characteristics and factors leading to successful outcomes has not kept pace. This study examined characteristics of regional university students in undergraduate Business Statistics and factors linked to their success based on three modes of delivery - Online, Instructional…

  20. Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course

    ERIC Educational Resources Information Center

    Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan

    2015-01-01

    In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…
