Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu
2014-05-01
Traditional Chinese medicine (TCM) formulae are complex component systems containing enormous amounts of information. Applying mathematical-statistical methods to research on the compatibility of TCM formulae has great significance for promoting the modernization of TCM and for improving clinical efficacy and formula optimization. As a tool for quantitative analysis, data inference, and exploring the inherent rules of substances, mathematical statistics can reveal the working mechanisms of TCM formula compatibility both qualitatively and quantitatively. By reviewing studies that apply mathematical-statistical methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes in chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further study of the working mechanisms and connotations of traditional Chinese medicines.
The Utility of Robust Means in Statistics
ERIC Educational Resources Information Center
Goodwyn, Fara
2012-01-01
Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…
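As a minimal illustration of the point this abstract makes (not code from the paper), the sketch below contrasts the sample mean with two robust location estimates when a single gross outlier is added; the data and values are placeholders.

```python
# A minimal sketch of how a single outlier distorts the sample mean
# while robust location estimates resist it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=50.0, scale=5.0, size=30)   # well-behaved scores
contaminated = np.append(sample, 500.0)             # one gross outlier

print("mean, clean:        ", np.mean(sample))
print("mean, contaminated: ", np.mean(contaminated))    # dragged upward
print("median:             ", np.median(contaminated))  # barely moves
print("20% trimmed mean:   ", stats.trim_mean(contaminated, 0.2))
```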
Line identification studies using traditional techniques and wavelength coincidence statistics
NASA Technical Reports Server (NTRS)
Cowley, Charles R.; Adelman, Saul J.
1990-01-01
Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are also illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
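A hedged sketch of the coincidence-counting idea behind WCS: count observed wavelengths falling within a tolerance of a species' laboratory line list and compare against a Monte Carlo null of randomly placed lines. The tolerance, wavelength range, and null model are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def coincidences(observed, linelist, tol=0.06):
    """Count observed lines lying within tol (Angstrom) of a laboratory line."""
    return sum(np.any(np.abs(linelist - w) <= tol) for w in observed)

def wcs_significance(observed, linelist, lo, hi, tol=0.06, n_trials=2000, seed=1):
    rng = np.random.default_rng(seed)
    hits = coincidences(observed, linelist, tol)
    # Null distribution: the same number of lines placed at random wavelengths
    # over the spectral range [lo, hi].
    null = np.array([coincidences(rng.uniform(lo, hi, size=len(observed)),
                                  linelist, tol)
                     for _ in range(n_trials)])
    z = (hits - null.mean()) / null.std()   # excess over chance coincidences
    return hits, null.mean(), z
```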
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce an elementary statistics course with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as such a measure. We also introduce a new teaching method in the elementary statistics class: unlike the traditional elementary statistics course, we use a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
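To make "simulation-based inference" concrete, here is a minimal two-sample permutation test of the kind such courses typically teach; it is a generic sketch, not the authors' course material.

```python
import numpy as np

def permutation_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference in means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])
    observed = np.mean(group_a) - np.mean(group_b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # reassign group labels at random
        diff = pooled[:len(group_a)].mean() - pooled[len(group_a):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)   # two-sided p-value
```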
ERIC Educational Resources Information Center
Faghihi, Foroozandeh; Rakow, Ernest A.
This study, conducted at the University of Memphis (Tennessee), compared the effects of a self-paced method of instruction on the attitudes and perceptions of students enrolled in an undergraduate statistics course with those of a comparable group of students taking statistics in a traditional lecture setting. The non-traditional course used a…
ERIC Educational Resources Information Center
Phelps, Amy L.; Dostilio, Lina
2008-01-01
The present study addresses the efficacy of using service-learning methods to meet the GAISE guidelines (http://www.amstat.org/education/gaise/GAISECollege.htm) in a second business statistics course and further explores potential advantages of assigning a service-learning (SL) project as compared to the traditional statistics project assignment.…
Bayesian statistics and Monte Carlo methods
NASA Astrophysics Data System (ADS)
Koch, K. R.
2018-03-01
The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived in which the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves computing a considerable number of derivatives, and errors of linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
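A short sketch of the Monte Carlo error-propagation application described above: sampling a measurement vector from its multivariate distribution and passing the samples through a nonlinear transformation yields the expectation and covariance of the result without computing any derivatives. The transformation and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Measurements: a 2-vector with known mean and covariance matrix.
mu = np.array([10.0, 0.5])
cov = np.array([[0.04, 0.01],
                [0.01, 0.02]])

def f(x):
    # A nonlinear transformation (polar-style quantities) as an example.
    r, theta = x[..., 0], x[..., 1]
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=-1)

samples = rng.multivariate_normal(mu, cov, size=100_000)
y = f(samples)

print("E[y]   =", y.mean(axis=0))              # expectation of transformed vector
print("Cov[y] =\n", np.cov(y, rowvar=False))   # no Jacobian/linearization needed
```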
NASA Astrophysics Data System (ADS)
O'Shea, Bethany; Jankowski, Jerzy
2006-12-01
The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
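A minimal sketch of the statistical workflow the abstract describes (standardization, principal components analysis, hierarchical clustering), using placeholder major-ion data rather than the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical major-ion concentrations (rows = samples; columns = ions,
# e.g. Na, Ca, Mg, Cl, HCO3, SO4) -- placeholders, not the paper's data.
X = np.random.default_rng(0).lognormal(mean=2.0, sigma=0.3, size=(40, 6))

Z = StandardScaler().fit_transform(X)          # standardize before PCA/HCA
scores = PCA(n_components=2).fit_transform(Z)  # principal component scores
clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
print(clusters)  # statistically defined water types for hydrochemical interpretation
```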
ERIC Educational Resources Information Center
Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.
The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…
Managing Clustered Data Using Hierarchical Linear Modeling
ERIC Educational Resources Information Center
Warne, Russell T.; Li, Yan; McKyer, E. Lisako J.; Condie, Rachel; Diep, Cassandra S.; Murano, Peter S.
2012-01-01
Researchers in nutrition research often use cluster or multistage sampling to gather participants for their studies. These sampling methods often produce violations of the assumption of data independence that most traditional statistics share. Hierarchical linear modeling is a statistical method that can overcome violations of the independence…
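A hedged sketch of a hierarchical (mixed) linear model with a random intercept per cluster, the standard remedy for the independence violation described above; the school/diet variables are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical clustered data: children nested within schools.
rng = np.random.default_rng(0)
n_schools, n_per = 20, 15
school = np.repeat(np.arange(n_schools), n_per)
school_effect = rng.normal(0, 1.0, n_schools)[school]   # violates independence
diet_score = rng.normal(0, 1, school.size)
outcome = 2.0 + 0.5 * diet_score + school_effect + rng.normal(0, 1, school.size)

df = pd.DataFrame({"outcome": outcome, "diet": diet_score, "school": school})
# A random intercept per school absorbs the within-cluster correlation that
# ordinary regression would ignore.
model = smf.mixedlm("outcome ~ diet", df, groups=df["school"]).fit()
print(model.summary())
```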
ERIC Educational Resources Information Center
Everson, Howard T.; And Others
This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a Bayesian mixed-model approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
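For contrast with the paper's Bayesian mixed model, here is a minimal sketch of the traditional calibration it criticizes: fit a standard curve on known densities, then invert it for unknowns. All numbers are placeholders.

```python
import numpy as np

# Hypothetical standard-curve calibration of a QMM assay (not the paper's
# Bayesian mixed model): signal is assumed linear in log10 pathogen density.
log10_density_std = np.array([1, 2, 3, 4, 5], dtype=float)   # standards
signal_std = np.array([8.1, 14.9, 22.2, 29.0, 36.1])         # measured signal

slope, intercept = np.polyfit(log10_density_std, signal_std, 1)

def estimate_density(signal):
    """Invert the fitted standard curve to estimate pathogen density."""
    return 10 ** ((signal - intercept) / slope)

print(estimate_density(25.0))
```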
Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time
Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.
2017-12-20
In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
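A minimal sketch of one of the traditional methods named above, logistic regression of pass/fail outcomes on time; the data are placeholders, and the Weibull failure-time and RADAR analyses are not shown.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical pass/fail data collected over time (age in years at test).
age = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
passed = np.array([1, 1, 1, 1, 0, 1, 0, 1, 0, 0])

X = sm.add_constant(age)
fit = sm.Logit(passed, X).fit(disp=0)   # models P(pass) as a function of age
# (A Weibull failure-time analysis would instead model time to failure.)

x_new = np.array([[1.0, 6.0]])          # [intercept, age]
print("P(pass at age 6) =", fit.predict(x_new)[0])
```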
NASA Astrophysics Data System (ADS)
Hundley, Stacey A.
In recent years there has been a national call for reform in undergraduate science education. The goal of this reform movement in science education is to develop ways to improve undergraduate student learning with an emphasis on developing more effective teaching practices. Introductory science courses at the college level are generally taught using a traditional lecture format. Recent studies have shown incorporating active learning strategies within the traditional lecture classroom has positive effects on student outcomes. This study focuses on incorporating interactive teaching methods into the traditional lecture classroom to enhance student learning for non-science majors enrolled in introductory geology courses at a private university. Students' experience and instructional preferences regarding introductory geology courses were identified from survey data analysis. The information gained from responses to the questionnaire was utilized to develop an interactive lecture introductory geology course for non-science majors. Student outcomes were examined in introductory geology courses based on two teaching methods: interactive lecture and traditional lecture. There were no significant statistical differences between the groups based on the student outcomes and teaching methods. Incorporating interactive lecture methods did not statistically improve student outcomes when compared to traditional lecture teaching methods. However, the responses to the survey revealed students have a preference for introductory geology courses taught with lecture and instructor-led discussions and students prefer to work independently or in small groups. The results of this study are useful to individuals who teach introductory geology courses and individuals who teach introductory science courses for non-science majors at the college level.
ERIC Educational Resources Information Center
Glass, Gene V.; And Others
Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…
Cooperative Learning in Virtual Environments: The Jigsaw Method in Statistical Courses
ERIC Educational Resources Information Center
Vargas-Vargas, Manuel; Mondejar-Jimenez, Jose; Santamaria, Maria-Letica Meseguer; Alfaro-Navarro, Jose-Luis; Fernandez-Aviles, Gema
2011-01-01
This document sets out a novel teaching methodology as used in subjects with statistical content, traditionally regarded by students as "difficult". In a virtual learning environment, instructional techniques little used in mathematical courses were employed, such as the Jigsaw cooperative learning method, which had to be adapted to the…
Zhang, Xiao-Bo; Qu, Xian-You; Li, Meng; Wang, Hui; Jing, Zhi-Xian; Liu, Xiang; Zhang, Zhi-Wei; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
After the completion of the national and local medicinal resource censuses, a large amount of data on Chinese medicine resources and their distribution will be available for summary. Species richness between regions is a valid indicator for objectively reflecting inter-regional Chinese medicine resources. However, because county areas differ greatly in size, assessing the richness of traditional Chinese medicine resources with the county as the statistical unit biases the regional richness statistics. Statistical methods based on a regular grid can reduce the differences in apparent richness caused by statistical units of different sizes. Taking Chongqing as an example and based on existing survey data, the differences in richness of traditional Chinese medicine resources at different grid scales were compared and analyzed. The results showed that a 30 km grid can be selected, at which the richness of Chinese medicine resources in Chongqing better reflects the objective inter-regional pattern of resource richness in traditional Chinese medicine.
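A hedged sketch of grid-based richness statistics: bin occurrence records into regular cells (here 30 km, the scale the study settled on) and count distinct species per cell. The records and coordinates are invented.

```python
import pandas as pd

# Hypothetical occurrence records: species plus projected x/y coordinates (km).
records = pd.DataFrame({
    "species": ["A", "B", "A", "C", "B", "D"],
    "x": [12.0, 14.5, 61.0, 63.2, 118.9, 121.4],
    "y": [33.0, 35.1, 34.8, 36.0, 30.2, 31.7],
})

cell = 30.0  # grid size in km
records["gx"] = (records["x"] // cell).astype(int)
records["gy"] = (records["y"] // cell).astype(int)

# Species richness per grid cell: number of distinct species recorded in it.
richness = records.groupby(["gx", "gy"])["species"].nunique()
print(richness)
```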
Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data
NASA Astrophysics Data System (ADS)
Reno, B. L.; Brown, M.; Piccoli, P. M.
2007-12-01
Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. Therefore, this method takes into account the full range of data and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviations proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions of all populations, which cannot be accounted for with traditional statistical tools. These three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) for the core domain, 616±7 (2σ) for the intermediate domain and 601±8 (2σ) for the rim domain.
Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
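A small sketch of the resistant summary statistics discussed above (median, interquartile range and normalized median absolute deviation) applied to an invented population of dates; the tanh estimator itself is specified in Kelsey et al. (2003) and is not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical population of monazite dates (Ma) from one compositional domain.
dates = np.array([628, 631, 633, 629, 634, 630, 636, 627, 632, 645.0])  # one outlier

median = np.median(dates)
q1, q3 = np.percentile(dates, [25, 75])
# Normalized median absolute deviation: a resistant spread estimate that is
# consistent with the standard deviation for Gaussian data.
nmad = stats.median_abs_deviation(dates, scale="normal")

print(f"age ~ {median:.0f} Ma, IQR = [{q1:.0f}, {q3:.0f}] Ma, NMAD = {nmad:.1f}")
```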
Wijerathne, Buddhika; Rathnayake, Geetha
2013-01-01
Background Most universities currently practice traditional practical spot tests to evaluate students. However, traditional methods have several disadvantages. Computer-based examination techniques are becoming more popular among medical educators worldwide. Therefore incorporating the computer interface in practical spot testing is a novel concept that may minimize the shortcomings of traditional methods. Assessing students’ attitudes and perspectives is vital in understanding how students perceive the novel method. Methods One hundred and sixty medical students were randomly allocated to either a computer-based spot test (n=80) or a traditional spot test (n=80). The students rated their attitudes and perspectives regarding the spot test method soon after the test. The results were described comparatively. Results Students had higher positive attitudes towards the computer-based practical spot test compared to the traditional spot test. Their recommendations to introduce the novel practical spot test method for future exams and to other universities were statistically significantly higher. Conclusions The computer-based practical spot test is viewed as more acceptable to students than the traditional spot test. PMID:26451213
Ibrahim, Nahla Khamis; Banjar, Shorooq; Al-Ghamdi, Amal; Al-Darmasi, Moroj; Khoja, Abeer; Turkistani, Jamela; Arif, Rwan; Al-Sebyani, Awatif; Musawa, Al-Anoud; Basfar, Wijdan
2014-01-01
Problem-based learning (PBL) is one of the most important educational innovations of the past four decades. The objective of the study was to compare medical students' preference for PBL with their preference for traditional lectures regarding the learning outcomes (e.g., knowledge, attitude, and skills) gained from both methods. A cross-sectional study was conducted among medical students who studied the hybrid curriculum (PBL and traditional lectures) at King Abdulaziz University, Jeddah, in 2011. Data were collected through a pre-constructed, validated, confidentially anonymous, self-administered questionnaire. Students' perceptions toward PBL and traditional lectures were assessed through their responses to 20 statements about both methods of learning using a five-point Likert scale. Descriptive and analytic statistics were performed using SPSS, version 21 (SPSS Inc, Chicago, Ill., USA). Learners preferred PBL to traditional lectures for better linking the knowledge of basic and clinical sciences (t test=10.15, P < .001). However, no statistically significant difference (P > .05) was observed regarding the amount of basic knowledge recalled from both methods. Students preferred PBL to lectures for better learning attitudes, skills, future outcomes, and learning satisfaction (P < .05). PBL motivates students to learn better than lecturing (P < .05). In students' opinion, the mean total skill gained from PBL (47.2 [10.6]) was much higher than that from lectures (33.0 [9.9]), a highly statistically significant difference (t test=20.9, P < .001). Students preferred PBL to traditional lectures for improving most learning outcome domains, especially learning attitudes and skills. Introducing a hybrid PBL curriculum in all Saudi universities is highly recommended.
Pezzuti, L; Nacinovich, R; Oggiano, S; Bomba, M; Ferri, R; La Stella, A; Rossetti, S; Orsini, A
2018-07-01
Individuals with Down syndrome generally show a floor effect on Wechsler scales, manifested by flat profiles with many or all of the weighted scores on the subtests equal to 1. The main aim of the present paper is to use the statistical method of Hessl and the extended statistical method of Orsini, Pezzuti and Hulbert with a sample of individuals with Down syndrome (n = 128; 72 boys and 56 girls) to underline the variability of performance on Wechsler Intelligence Scale for Children-Fourth Edition subtests and indices, highlighting strengths and weaknesses of this population that otherwise appear to be flattened. Based on results using the traditional transformation of raw scores into weighted scores, a very high percentage of subtests with a weighted score of 1 occurred in the Down syndrome sample, with a floor effect and without any statistically significant difference between the four core Wechsler Intelligence Scale for Children-Fourth Edition indices. The results using the traditional transformation confirm a deep cognitive impairment of those with Down syndrome. Conversely, using the new statistical method, it is immediately apparent that the variability of the scores, both on subtests and indices, is wider with respect to the traditional method. Children with Down syndrome show a greater ability in the Verbal Comprehension Index than in the Working Memory Index.
[Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].
Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing
2015-10-01
Roads are typically man-made objects in urban areas. Road extraction from high-resolution images has important applications for urban planning and transportation development. However, due to confusion among spectral characteristics, it is difficult to distinguish roads from other objects by merely using traditional classification methods that depend mainly on spectral information. Edges are an important feature for the identification of linear objects (e.g., roads), and the distribution patterns of edges vary greatly among different objects, so it is crucial to merge edge statistical information with spectral information. In this study, a new method that combines spectral information and edge statistical features is proposed. First, edge detection is conducted using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from an edge statistical model, which measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, an SVM algorithm is used to classify the image, and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional method. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially on high-resolution images.
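A hedged sketch of the final classification step: stack spectral bands with edge-statistics features and train an SVM. Features and labels are random placeholders; the mean-shift edge detection and edge statistical model are not reproduced.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical per-pixel feature vectors: spectral bands plus edge statistics
# (e.g. local edge length and dominant edge angle), with road/non-road labels.
rng = np.random.default_rng(0)
n = 500
spectral = rng.normal(size=(n, 4))       # 4 spectral bands
edge_stats = rng.normal(size=(n, 2))     # edge length / angle features
X = np.hstack([spectral, edge_stats])
y = rng.integers(0, 2, size=n)           # 1 = road, 0 = other (placeholder)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
```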
Aragão, José Aderval; Freire, Marianna Ribeiro de Menezes; Nolasco Farias, Lucas Guimarães; Diniz, Sarah Santana; Sant'anna Aragão, Felipe Matheus; Sant'anna Aragão, Iapunira Catarina; Lima, Tarcisio Brandão; Reis, Francisco Prado
2018-06-01
To compare depressive symptoms among medical students taught using problem-based learning (PBL) and the traditional method. Beck's Depression Inventory was applied to 215 medical students. The prevalence of depression was calculated as the number of individuals with depression divided by the total number in the sample from each course, with 95% confidence intervals. The statistical significance level used was 5% (p ≤ .05). Among the 215 students, 52.1% were male and 47.9% were female; 51.6% were being taught using PBL methodology and 48.4% using traditional methods. The prevalence of depression was 29.73% with PBL and 22.12% with traditional methods. Prevalence was higher among females: 32.8% with PBL and 23.1% with traditional methods. The prevalence of depression with PBL among students up to 21 years of age was 29.4%, and among those over 21 years, 32.1%. With traditional methods, it was 16.7% among students up to 21 years of age and 30.1% among those over 21 years. The prevalence of depression with PBL was highest among students in the second semester and, with traditional methods, in the eighth. Depressive symptoms were highly prevalent among students taught both with PBL and with traditional methods.
Traditional learning and problem-based learning: self-perception of preparedness for internship.
Millan, Laís Pereira Bueno; Semer, Beatriz; Rodrigues, José Mauro da Silva; Gianini, Reinaldo José
2012-01-01
This study aims to evaluate Pontificia Universidade Católica de São Paulo (PUC-SP) medical students' perception of their preparedness for the internship course by comparing students who entered the internship in 2009, who were taught according to the traditional learning method, and those who entered the internship in 2010, who were taught according to the new method, i.e. problem-based learning (PBL). Fifty traditional learning method students answered a standard Likert-scale questionnaire upon entering the internship in 2009. In 2010, the process was repeated with PBL students. The questionnaire was based upon the Preparation for Hospital Practice Questionnaire and was evaluated for applicability by professors from three medical schools in Brazil. The original questions were classified according to the importance these professors attributed to them, and less important questions were removed. Scores obtained from Student's t-test were considered significant with p < 0.05. A statistically significant difference was observed in 16 questions, with traditional learning method students reporting higher average scores. When questions were divided into dimensions, a statistically significant difference appeared in the dimensions "social aspects of health", "medical skills", and "ethical concepts"; traditional learning method students again reported higher scores (p < 0.001 for all dimensions). Higher scores were also reported when the average of the answers to the whole questionnaire was calculated. Traditional learning method students consider themselves to be better prepared for internship activities than PBL students, according to the following three comparative means: analyzing the answers to each question, grouping these answers into dimensions, and calculating the means of answers to the whole questionnaire.
The Web as an educational tool for/in learning/teaching bioinformatics statistics.
Oliver, J; Pisano, M E; Alonso, T; Roca, P
2005-12-01
Statistics provides essential tools in bioinformatics for interpreting the results of a database search or for managing the enormous amounts of information provided by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool, as simple as possible, to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphical representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework in traditional statistical teaching methods, the general consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-paced and continuing education.
[Overview and prospect of syndrome differentiation of hypertension in traditional Chinese medicine].
Yang, Xiao-Chen; Xiong, Xing-Jiang; Wang, Jie
2014-01-01
This article reviews the literature on traditional Chinese medicine syndrome differentiation for hypertension. According to the theory of disease in combination with syndrome, we summarize the syndrome types of hypertension in four respects: national standards, industry standards, teaching standards and personal experience. Meanwhile, in order to provide new methods and approaches for normalized research, we integrate modern testing methods and statistical methods to analyze syndrome differentiation for the treatment of hypertension.
Abbaszadeh, Abbas; Sabeghi, Hakimeh; Borhani, Fariba; Heydari, Abbas
2011-01-01
BACKGROUND: Accurate recording of nursing care reflects care performance and its quality; any failure in documentation can lead to inadequate patient care. Therefore, improving nurses' skills in this field using effective educational methods is of high importance. Since traditional teaching methods are not suitable for communities with rapid knowledge expansion and constant change, e-learning methods can be a viable alternative. To show the importance of e-learning methods for nurses' care reporting skills, this study was performed to compare e-learning methods with traditional instructor-led methods. METHODS: This was a quasi-experimental study aimed at comparing the effect of two teaching methods (e-learning and lecture) on nursing documentation and examining the differences in acquired documentation competency between nurses who participated in e-learning (n = 30) and nurses in a lecture group (n = 31). RESULTS: The results of the present study indicated that statistically there was no significant difference between the two groups. The findings also revealed no statistically significant correlation between the two groups with respect to demographic variables. However, we believe that given the benefits of e-learning over the traditional instructor-led method, and given their equal effect on nurses' documentation competency, e-learning can be a qualified substitute for the traditional instructor-led method. CONCLUSIONS: E-learning as a student-centered method, as well as the lecture method, equally promote nurses' competency in documentation. Therefore, e-learning can be used to facilitate the implementation of nursing educational programs. PMID:22224113
Getting the big picture in community science: methods that capture context.
Luke, Douglas A
2005-06-01
Community science has a rich tradition of using theories and research designs that are consistent with its core value of contextualism. However, a survey of empirical articles published in the American Journal of Community Psychology shows that community scientists utilize a narrow range of statistical tools that are not well suited to assess contextual data. Multilevel modeling, geographic information systems (GIS), social network analysis, and cluster analysis are recommended as useful tools to address contextual questions in community science. An argument for increased methodological consilience is presented, where community scientists are encouraged to adopt statistical methodology that is capable of modeling a greater proportion of the data than is typical with traditional methods.
NASA Astrophysics Data System (ADS)
El Sharif, H.; Teegavarapu, R. S.
2012-12-01
Spatial interpolation methods used for estimation of missing precipitation data at a site seldom check for their ability to preserve site and regional statistics. Such statistics are primarily defined by spatial correlations and other site-to-site statistics in a region. Preservation of site and regional statistics represents a means of assessing the validity of missing precipitation estimates at a site. This study evaluates the efficacy of a fuzzy-logic methodology for infilling missing historical daily precipitation data in preserving site and regional statistics. Rain gauge sites in the state of Kentucky, USA, are used as a case study for evaluation of this newly proposed method in comparison to traditional data infilling techniques. Several error and performance measures will be used to evaluate the methods and trade-offs in accuracy of estimation and preservation of site and regional statistics.
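For context, a minimal sketch of a traditional infilling benchmark (inverse distance weighting) together with the kind of site-statistic check the study emphasizes; this is not the paper's fuzzy-logic method.

```python
import numpy as np

def idw_estimate(target_xy, station_xy, station_obs, power=2.0):
    """Inverse-distance-weighted estimate of a missing daily value at one site.
    A traditional benchmark interpolator, not the fuzzy-logic methodology."""
    d = np.linalg.norm(station_xy - target_xy, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * station_obs) / np.sum(w)

def site_correlation(series_a, series_b):
    """Preservation check: the correlation between the infilled gauge series
    and a neighbouring gauge should match its pre-infilling value."""
    return np.corrcoef(series_a, series_b)[0, 1]
```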
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high-strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
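A minimal sketch of the extreme-value analysis step: fit a two-parameter Weibull distribution to replicate breaking stresses and read off a survival probability at a given applied stress. Values are placeholders, not 7075 data.

```python
import numpy as np
from scipy import stats

# Hypothetical breaking stresses (MPa) of replicate smooth specimens after a
# fixed exposure period; the values are placeholders, not alloy 7075 data.
breaking_stress = np.array([402, 415, 388, 420, 395, 410, 399, 407, 412, 391.0])

# Two-parameter Weibull fit (location fixed at zero), a common extreme-value
# model for strength data.
shape, loc, scale = stats.weibull_min.fit(breaking_stress, floc=0)

# Probability a specimen survives an applied stress of 380 MPa.
print("P(survive 380 MPa) =", stats.weibull_min.sf(380.0, shape, loc, scale))
```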
Learning physics: A comparative analysis between instructional design methods
NASA Astrophysics Data System (ADS)
Mathew, Easow
The purpose of this research was to determine whether there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem-solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), who participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), who participated in the traditional lecture teaching methodology. Both courses were taught by experienced professors with qualifications at the doctoral level. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores and higher differences between pre- and posttest scores) instructional design approaches to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), in the control group female average academic improvement was statistically significantly (p = .04) higher than male average academic improvement (~63%), which may indicate that traditional teaching methods are more effective for females, whereas no significant difference was noted between male and female participants in the experimental group. There was a statistically significant negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance values in either group as delineated by ethnicity.
Pain Perception: Computerized versus Traditional Local Anesthesia in Pediatric Patients.
Mittal, M; Kumar, A; Srivastava, D; Sharma, P; Sharma, S
2015-01-01
Local anesthetic injection is one of the most anxiety-provoking procedures for both children and adult patients in dentistry. A computerized system for slow delivery of local anesthetic has been developed as a possible solution to reduce the pain related to local anesthetic injection. The present study was conducted to evaluate and compare pain perception rates in pediatric patients between a computerized system and traditional methods, both objectively and subjectively. It was a randomized controlled study of one hundred children aged 8-12 years in a healthy physical and mental state, assessed as cooperative, requiring extraction of maxillary primary molars. Children were divided into two groups by random sampling: Group A received buccal and palatal infiltration injections using the Wand, while Group B received buccal and palatal infiltration using a traditional syringe. A Visual Analog Scale (VAS) was used for subjective evaluation of pain perception by the patient. The Sound, Eye, Motor (SEM) scale was used as an objective method, in which the sound, eye and motor reactions of the patient were observed, and heart rate measurement using a pulse oximeter was used as the physiological parameter for objective evaluation. Patients experienced significantly less pain of injection with the computerized method during palatal infiltration, while the reduction in pain during buccal infiltration was not statistically significant. Heart rate increased during both buccal and palatal infiltration with traditional and computerized local anesthesia, but the difference between the traditional and computerized methods was not statistically significant. It was concluded that pain perception was significantly greater during traditional palatal infiltration injection as compared to computerized palatal infiltration, while there was no difference in pain perception during buccal infiltration between the two groups.
Echeto, Luisa F; Sposetti, Venita; Childs, Gail; Aguilar, Maria L; Behar-Horenstein, Linda S; Rueda, Luis; Nimmo, Arthur
2015-09-01
The aim of this study was to evaluate the effectiveness of team-based learning (TBL) methodology on dental students' retention of knowledge regarding removable partial denture (RPD) treatment. The process of learning RPD treatment requires that students first acquire foundational knowledge and then use critical thinking skills to apply that knowledge to a variety of clinical situations. The traditional approach to teaching, characterized by a reliance on lectures, is not the most effective method for learning clinical applications. To address the limitations of that approach, the teaching methodology of the RPD preclinical course at the University of Florida was changed to TBL, which has been shown to motivate student learning and improve clinical performance. A written examination was constructed to compare the impact of TBL with that of traditional teaching regarding students' retention of knowledge and their ability to evaluate, diagnose, and treatment plan a partially edentulous patient with an RPD prosthesis. Students taught using traditional and TBL methods took the same examination. The response rate (those who completed the examination) for the class of 2013 (traditional method) was 94% (79 students of 84); for the class of 2014 (TBL method), it was 95% (78 students of 82). The results showed that students who learned RPD with TBL scored higher on the examination than those who learned RPD with traditional methods. Compared to the students taught with the traditional method, the TBL students' proportion of passing grades was statistically significantly higher (p=0.002), and 23.7% more TBL students passed the examination. The mean score for the TBL class (0.758) compared to the conventional class (0.700) was statistically significant with a large effect size, also demonstrating the practical significance of the findings. The results of the study suggest that TBL methodology is a promising approach to teaching RPD with successful outcomes.
The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade
ERIC Educational Resources Information Center
Koparan, Timur; Güven, Bülent
2014-01-01
This study examines the effect of project based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. Quasi-experimental research model was used in this article. In this context, the statistics were taught with traditional method in the control group and it was taught using project based…
The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning
ERIC Educational Resources Information Center
Koparan, Timur; Güven, Bülent
2014-01-01
This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. Quasi-experimental research model was used in this study. Following this model in the control group the traditional method was applied to teach statistics…
DOT National Transportation Integrated Search
2010-12-01
Recent research suggests that traditional safety evaluation methods may be inadequate in accurately determining the effectiveness of roadway safety measures. In recent years, advanced statistical methods are being utilized in traffic safety studies t...
Wan, Xiaomin; Peng, Liubao; Li, Yuanjian
2015-01-01
Background In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods 1) least squares method, 2) graphical method; and two recently proposed methods by 3) Hoyle and Henley, 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. Methods A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. Results All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more biases were identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty compared with the Hoyle and Henley method. Conclusions The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for a fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method. PMID:25803659
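A hedged sketch of the traditional least squares method reviewed above: fit a Weibull survival function to points digitized from a published curve and compute mean survival from the fitted parameters. The digitized points are invented.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma

# Points read off a published survival curve (hypothetical digitized values).
t = np.array([0, 6, 12, 18, 24, 36], dtype=float)   # months
s = np.array([1.0, 0.80, 0.62, 0.48, 0.37, 0.22])   # survival fraction

def weibull_sf(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return np.exp(-(t / scale) ** shape)

(shape, scale), _ = curve_fit(weibull_sf, t, s, p0=(1.0, 20.0),
                              bounds=([0.1, 1.0], [10.0, 200.0]))

# Mean survival time of a Weibull distribution: scale * Gamma(1 + 1/shape).
print("mean survival (months) =", scale * gamma(1.0 + 1.0 / shape))
```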
Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J
2017-11-24
Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse separate control-hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks.
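A minimal sketch of an EWMA control chart of the kind applied in this study, with textbook-style smoothing factor and control-limit width; the baseline estimation and rates below are placeholder choices, not the authors' calculations.

```python
import numpy as np

def ewma_chart(rates, lam=0.2, L=2.7):
    """EWMA control chart for a sequence of (e.g. monthly) SSI rates.
    Returns the EWMA statistic and upper control limits; a point above its
    limit signals a potential outbreak. lam and L are typical textbook values."""
    rates = np.asarray(rates, dtype=float)
    mu0, sigma0 = rates.mean(), rates.std(ddof=1)   # baseline; in practice use
    z = np.empty_like(rates)                        # an in-control reference period
    z[0] = lam * rates[0] + (1 - lam) * mu0
    for i in range(1, len(rates)):
        z[i] = lam * rates[i] + (1 - lam) * z[i - 1]
    i = np.arange(1, len(rates) + 1)
    ucl = mu0 + L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    return z, ucl

z, ucl = ewma_chart([1.1, 0.9, 1.3, 1.0, 1.2, 2.4, 2.8, 3.1])
print(np.where(z > ucl)[0])   # indices of months signalling above the limit
```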
Pilot study on the feasibility of a computerized speech recognition charting system.
Feldman, C A; Stevens, D
1990-08-01
The objective of this study was to determine the feasibility of developing and using a voice recognition computerized charting system to record dental clinical examination data. More specifically, the study was designed to analyze the time and error differential between the traditional examiner/recorder method (ASSISTANT) and computerized voice recognition method (VOICE). DMFS examinations were performed twice on 20 patients using the traditional ASSISTANT and the VOICE charting system. A statistically significant difference was found when comparing the mean ASSISTANT time of 2.69 min to the VOICE time of 3.72 min (P less than 0.001). No statistically significant difference was found when comparing the mean ASSISTANT recording errors of 0.1 to VOICE recording errors of 0.6 (P = 0.059). 90% of the patients indicated they felt comfortable with the dentist talking to a computer and only 5% of the sample indicated they opposed VOICE. Results from this pilot study indicate that a charting system utilizing voice recognition technology could be considered a viable alternative to traditional examiner/recorder methods of clinical charting.
An enriched multimedia eBook application to facilitate learning of anatomy.
Stirling, Allan; Birt, James
2014-01-01
This pilot study compared the use of an enriched multimedia eBook with traditional methods for teaching the gross anatomy of the heart and great vessels. Seventy-one first-year students from an Australian medical school participated in the study. Students' abilities were examined by pretest, intervention, and post-test measurements. Perceptions and attitudes toward eBook technology were examined by survey questions. Results indicated a strongly positive user experience coupled with increased marks; however, there were no statistically significant results for the eBook method of delivery alone outperforming the traditional anatomy practical session. Results did show a statistically significant difference in the final marks achieved based on the sequencing of the learning modalities. With initial interaction with the multimedia content followed by active experimentation in the anatomy lab, students' performance was improved in the final test. Obtained data support the role of eBook technology in modern anatomy curriculum being a useful adjunct to traditional methods. Further study is needed to investigate the importance of sequencing of teaching interventions.
Uncertainty propagation for statistical impact prediction of space debris
NASA Astrophysics Data System (ADS)
Hoogendoorn, R.; Mooij, E.; Geul, J.
2018-01-01
Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
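To make the traditional MC baseline concrete, the sketch below propagates assumed uncertainties through a deliberately simplified one-dimensional fall-with-drag model (nothing like the paper's six degrees-of-freedom state model or Delta-K data) and summarizes the resulting impact-time distribution; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def impact_time(h0, beta, dt=0.5):
    """Toy 1-D vertical re-entry: gravity plus exponential-atmosphere drag.
    beta is a ballistic coefficient (kg/m^2); this stands in for the paper's
    far richer six-degrees-of-freedom dynamics."""
    g, rho0, H = 9.81, 1.225, 8500.0
    h, v, t = h0, 0.0, 0.0
    while h > 0.0:
        rho = rho0 * np.exp(-h / H)
        a = g - 0.5 * rho * v * abs(v) / beta   # drag opposes the motion
        v += a * dt
        h -= v * dt
        t += dt
    return t

# Monte Carlo over assumed uncertainties in initial altitude and beta
n = 500
h0 = rng.normal(120e3, 2e3, n)
beta = rng.lognormal(np.log(100.0), 0.1, n)
times = np.array([impact_time(h, b) for h, b in zip(h0, beta)])
print("median impact time [s]:", np.median(times))
print("95% interval [s]:", np.round(np.percentile(times, [2.5, 97.5]), 1))
```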
Using the Flipped Classroom to Bridge the Gap to Generation Y
Gillispie, Veronica
2016-01-01
Background: The flipped classroom is a student-centered approach to learning that increases active learning for the student compared to traditional classroom-based instruction. In the flipped classroom model, students are first exposed to the learning material through didactics outside of the classroom, usually in the form of written material, voice-over lectures, or videos. During the formal teaching time, an instructor facilitates student-driven discussion of the material via case scenarios, allowing for complex problem solving, peer interaction, and a deep understanding of the concepts. A successful flipped classroom should have three goals: (1) allow the students to become critical thinkers, (2) fully engage students and instructors, and (3) stimulate the development of a deep understanding of the material. The flipped classroom model includes teaching and learning methods that can appeal to all four generations in the academic environment. Methods: During the 2015 academic year, we implemented the flipped classroom in the obstetrics and gynecology clerkship for the Ochsner Clinical School in New Orleans, LA. Voice-over presentations of the lectures that had been given to students in prior years were recorded and made available to the students through an online classroom. Weekly problem-based learning sessions matched to the subjects of the traditional lectures were held, and the faculty who had previously presented the information in the traditional lecture format facilitated the problem-based learning sessions. The knowledge base of students was evaluated at the end of the rotation via a multiple-choice question examination and the Objective Structured Clinical Examination (OSCE) as had been done in previous years. We compared demographic information and examination scores for traditional teaching and flipped classroom groups of students. The traditional teaching group consisted of students from Rotation 2 and Rotation 3 of the 2014 academic year who received traditional classroom-based instruction. The flipped classroom group consisted of students from Rotation 2 and Rotation 3 of the 2015 academic year who received formal didactics via voice-over presentation and had the weekly problem-based learning sessions. Results: When comparing the students taught by traditional methods to those taught in the flipped classroom model, we saw a statistically significant increase in test scores on the multiple-choice question examination in both the obstetrics and gynecology sections in Rotation 2. While the average score for the flipped classroom group increased in Rotation 3 on the obstetrics section of the multiple-choice question examination, the difference was not statistically significant. Unexpectedly, the average score on the gynecology portion of the multiple-choice question examination decreased among the flipped classroom group compared to the traditional teaching group, and this decrease was statistically significant. For both the obstetrics and the gynecology portions of the OSCE, we saw statistically significant increases in the scores for the flipped classroom group in both Rotation 2 and Rotation 3 compared to the traditional teaching group. With the exception of the gynecology portion of the multiple-choice question examination in Rotation 3, we saw improvement in scores after the implementation of the flipped classroom. Conclusion: The flipped classroom is a feasible and useful alternative to the traditional classroom. 
It is a method that embraces Generation Y's need for active learning in a group setting while maintaining a traditional classroom method for introducing the information. Active learning increases student engagement and can lead to improved retention of material as demonstrated on standard examinations.
de Souza Teixeira, Carla Regina; Kusumota, Luciana; Alves Pereira, Marta Cristiane; Merizio Martins Braga, Fernanda Titareli; Pirani Gaioso, Vanessa; Mara Zamarioli, Cristina; Campos de Carvalho, Emilia
2014-01-01
To compare the level of anxiety and performance of nursing students when performing a clinical simulation assessed through the traditional method, with the presence of an evaluator, and through a filmed assessment, without the presence of an evaluator. Controlled trial with the participation of 20 students from a Brazilian public university, who were randomly assigned to one of two groups: a) assessment through the traditional method with the presence of an evaluator; or b) filmed assessment. The level of anxiety was assessed using the Zung test and performance was measured based on the number of correct answers. Averages of 32 and 27 were obtained on the anxiety scale by the group assessed through the traditional method before and after the simulation, respectively, while the filmed group obtained averages of 33 and 26; the final scores correspond to mild anxiety. Even though there was a statistically significant reduction in the intra-group scores before and after the simulation, there was no difference between the groups. As for the performance assessments in the clinical simulation, the groups obtained similar percentages of correct answers (83% in the traditional assessment and 84% in the filmed assessment) without statistically significant differences. Filming can be used and encouraged as a strategy to assess nursing undergraduate students.
Introductory Guide to the Statistics of Molecular Genetics
ERIC Educational Resources Information Center
Eley, Thalia C.; Rijsdijk, Fruhling
2005-01-01
Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…
A Complex Network Approach to Stylometry
Amancio, Diego Raphael
2015-01-01
Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems have proved useful for creating several language models. Despite the large number of studies devoted to representing texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex network methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents.
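A minimal sketch of the networked representation: build a word-adjacency graph and extract a few topological features that could complement traditional frequency-based features in a downstream classifier. The tokenization, window size and feature set are illustrative choices, not the paper's.

```python
import itertools
import networkx as nx

def topological_features(text, window=2):
    """Build a word-adjacency network and return simple topological features.
    A toy version of the networked text representation; real stylometry
    pipelines add lemmatization, stopword policies and larger feature sets."""
    words = [w.strip('.,;:!?"').lower() for w in text.split()]
    g = nx.Graph()
    for i in range(len(words) - window + 1):
        for a, b in itertools.combinations(words[i:i + window], 2):
            if a != b:
                g.add_edge(a, b)
    return {
        "nodes": g.number_of_nodes(),
        "mean_degree": 2 * g.number_of_edges() / max(g.number_of_nodes(), 1),
        "clustering": nx.average_clustering(g),
    }

print(topological_features("the quick brown fox jumps over the lazy dog"))
```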
ERIC Educational Resources Information Center
Vaughn, Brandon K.
2009-01-01
This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…
Using Information Technology in Teaching of Business Statistics in Nigeria Business School
ERIC Educational Resources Information Center
Hamadu, Dallah; Adeleke, Ismaila; Ehie, Ike
2011-01-01
This paper discusses the use of Microsoft Excel software in the teaching of statistics in the Faculty of Business Administration at the University of Lagos, Nigeria. Problems associated with existing traditional methods are identified and a novel pedagogy using Excel is proposed. The advantages of using this software over other specialized…
NASA Astrophysics Data System (ADS)
Yin, Yanshu; Feng, Wenjie
2017-12-01
In this paper, a location-based multiple point statistics method is developed to model a non-stationary reservoir. The proposed method characterizes the relationship between the sedimentary pattern and the deposit location using the relative central position distance function, which alleviates the requirement that the training image and the simulated grids have the same dimension. The weights in every direction of the distance function can be changed to characterize the reservoir heterogeneity in various directions. The local integral replacements of data events, structured random path, distance tolerance and multi-grid strategy are applied to reproduce the sedimentary patterns and obtain a more realistic result. This method is compared with the traditional Snesim method using a synthesized 3-D training image of Poyang Lake and a reservoir model of Shengli Oilfield in China. The results indicate that the new method can reproduce the non-stationary characteristics better than the traditional method and is more suitable for simulation of delta-front deposits. These results show that the new method is a powerful tool for modelling a reservoir with non-stationary characteristics.
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
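For instance, a percentile-bootstrap confidence interval, the kind of routine such e-tools wrap in a user interface, takes only a few lines; the data here are an invented stand-in for a simulated laboratory measurement.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_ci(sample, stat=np.mean, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap confidence interval for a statistic.
    stat must accept an axis argument (np.mean, np.median, ...)."""
    sample = np.asarray(sample)
    idx = rng.integers(0, len(sample), size=(n_boot, len(sample)))
    boot_stats = stat(sample[idx], axis=1)
    lo, hi = np.percentile(boot_stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# e.g. simulated titration volumes (mL) from a virtual laboratory exercise
data = rng.normal(24.8, 0.3, size=15)
print("95% CI for the mean:", bootstrap_ci(data))
```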
Adaptive interference cancel filter for evoked potential using high-order cumulants.
Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei
2004-01-01
This paper presents evoked potential (EP) processing using an adaptive interference cancel (AIC) filter with second- and higher-order cumulants. In the conventional ensemble-averaging method, experiments must be repeated many times to record the required data. The use of an AIC structure with second-order statistics for processing EPs has proved more efficient than the traditional averaging method, but it is sensitive both to the reference signal statistics and to the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The method was tested on somatosensory EPs corrupted by EEG, using a gradient-type algorithm in the AIC filter. Comparisons of AIC filters based on second-, third- and fourth-order statistics are also presented. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size or reference input.
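For orientation, the sketch below shows the standard second-order LMS interference canceller that the cumulant-based variants build on; in the paper's approach, a gradient on third- or fourth-order cross-cumulants replaces the second-order term in the weight update. Signal names and parameters are illustrative.

```python
import numpy as np

def lms_canceller(primary, reference, n_taps=8, mu=0.01):
    """Adaptive interference canceller (second-order LMS).

    primary  : EP plus interference (e.g. EEG) at the primary input
    reference: signal correlated with the interference only
    Returns the error signal, i.e. the interference-cancelled EP estimate.
    """
    primary = np.asarray(primary, dtype=float)
    reference = np.asarray(reference, dtype=float)
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]   # tapped delay line
        y = w @ x                           # interference estimate
        e = primary[n] - y                  # EP estimate = cancellation error
        w += 2 * mu * e * x                 # second-order LMS weight update
        out[n] = e
    return out
```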
NASA Astrophysics Data System (ADS)
Goodman, Steven N.
1989-11-01
This dissertation explores the use of a mathematical measure of statistical evidence, the log likelihood ratio, in clinical trials. The methods and thinking behind the use of an evidential measure are contrasted with traditional methods of analyzing data, which depend primarily on a p-value as an estimate of the statistical strength of an observed data pattern. It is contended that neither the behavioral dictates of Neyman-Pearson hypothesis testing methods, nor the coherency dictates of Bayesian methods are realistic models on which to base inference. The use of the likelihood alone is applied to four aspects of trial design or conduct: the calculation of sample size, the monitoring of data, testing for the equivalence of two treatments, and meta-analysis, the combining of results from different trials. Finally, a more general model of statistical inference, using belief functions, is used to see if it is possible to separate the assessment of evidence from our background knowledge. It is shown that traditional and Bayesian methods can be modeled as two ends of a continuum of structured background knowledge, methods which summarize evidence at the point of maximum likelihood assuming no structure, and Bayesian methods assuming complete knowledge. Both schools are seen to be missing a concept of ignorance: uncommitted belief. This concept provides the key to understanding the problem of sampling to a foregone conclusion and the role of frequency properties in statistical inference. The conclusion is that statistical evidence cannot be defined independently of background knowledge, and that frequency properties of an estimator are an indirect measure of uncommitted belief. Several likelihood summaries need to be used in clinical trials, with the quantitative disparity between summaries being an indirect measure of our ignorance. This conclusion is linked with parallel ideas in the philosophy of science and cognitive psychology.
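As a toy illustration of the evidential measure, the following computes a log likelihood ratio comparing two simple hypotheses about a trial response rate; the counts and hypothesised rates are invented.

```python
import numpy as np
from scipy.stats import binom

def log10_likelihood_ratio(successes, n, p1, p0):
    """log10 LR comparing two simple hypotheses about a response rate."""
    return (binom.logpmf(successes, n, p1)
            - binom.logpmf(successes, n, p0)) / np.log(10)

# e.g. 14 responders out of 40: evidence for p = 0.45 over p = 0.25
llr = log10_likelihood_ratio(14, 40, 0.45, 0.25)
print(f"log10 LR = {llr:.2f}  (LR = {10**llr:.1f})")
```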
Theoretical precision analysis of RFM localization of satellite remote sensing imagery
NASA Astrophysics Data System (ADS)
Zhang, Jianqing; Xv, Biao
2009-11-01
The traditional method of assessing the precision of the Rational Function Model (RFM) uses a large number of check points: the mean square error is calculated by comparing computed coordinates with known coordinates. This approach rests on probability theory, estimating the mean square error statistically from a large number of samples, so that the estimate can be considered to approach the true value when the sample is large enough. This paper instead approaches the problem from the standpoint of survey adjustment, taking the law of propagation of error as its theoretical basis, and calculates the theoretical precision of RFM localization. Taking SPOT5 three-line-array imagery as experimental data, the results of the traditional method and of the method described in this paper are compared; the comparison confirms that the traditional method is feasible and answers the question of its theoretical precision from the perspective of survey adjustment.
Use of recurrence plots in the analysis of pupil diameter dynamics in narcoleptics
NASA Astrophysics Data System (ADS)
Keegan, Andrew P.; Zbilut, J. P.; Merritt, S. L.; Mercer, P. J.
1993-11-01
Recurrence plots were used to evaluate pupil dynamics of subjects with narcolepsy. Preliminary data indicate that this nonlinear method of analysis may be more useful in revealing underlying deterministic differences than traditional methods like FFT and counting statistics.
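A recurrence plot of a scalar series can be sketched as below (no delay embedding, illustrative threshold); diagonal structure in R is what distinguishes deterministic dynamics from noise in ways FFT-based summaries may miss.

```python
import numpy as np

def recurrence_matrix(x, eps=None):
    """Binary recurrence plot of a scalar series: R[i, j] = 1 when states
    i and j are closer than eps. eps defaults to 10% of the signal range."""
    x = np.asarray(x, dtype=float)
    if eps is None:
        eps = 0.1 * (x.max() - x.min())
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

# e.g. a noisy oscillation standing in for a pupil-diameter record
t = np.linspace(0, 10, 400)
x = np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
R = recurrence_matrix(x)
print("recurrence rate:", R.mean())
```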
Compromise decision support problems for hierarchical design involving uncertainty
NASA Astrophysics Data System (ADS)
Vadde, S.; Allen, J. K.; Mistree, F.
1994-08-01
In this paper an extension to the traditional compromise Decision Support Problem (DSP) formulation is presented. Bayesian statistics is used in the formulation to model uncertainties associated with the information being used. In an earlier paper a compromise DSP that accounts for uncertainty using fuzzy set theory was introduced. The Bayesian Decision Support Problem is described in this paper. The method for hierarchical design is demonstrated by using this formulation to design a portal frame. The results are discussed and comparisons are made with those obtained using the fuzzy DSP. Finally, the efficacy of incorporating Bayesian statistics into the traditional compromise DSP formulation is discussed and some pending research issues are described. Our emphasis in this paper is on the method rather than the results per se.
AUPress: A Comparison of an Open Access University Press with Traditional Presses
ERIC Educational Resources Information Center
McGreal, Rory; Chen, Nian-Shing
2011-01-01
This study is a comparison of AUPress with three other traditional (non-open access) Canadian university presses. The analysis is based on the rankings that are correlated with book sales on Amazon.com and Amazon.ca. Statistical methods include the sampling of the sales ranking of randomly selected books from each press. The results of one-way…
NASA Astrophysics Data System (ADS)
Century, Daisy Nelson
This probing study focused on alternative and traditional assessments and their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by the use of a traditional paper-and-pencil test? What type of science learning stemming from the instruction can best be assessed by the use of alternative assessment? What are the differences in the types of learning outcomes that can be assessed by the use of paper-and-pencil tests and alternative assessment tests? Is there a difference in students' attitude towards learning science when assessment of outcomes is by alternative assessment means compared to traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized. However, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols resulting in a narrative summary style of report. These outcomes were combined in order to produce conclusions. This study revealed that the traditional method yielded more concrete cognitive content learning than did the alternative assessment. The alternative assessment yielded more psychomotor, cooperative learning and critical thinking skills. In both the alternative and the traditional methods the students' attitudes toward science were positive, with no significant differences favoring either group. The quantitative findings of no statistically significant differences suggest that at a minimum there is no loss in the use of alternative assessment methods, in this instance, performance testing. Adding the results from the qualitative analysis to this suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.
NASA Astrophysics Data System (ADS)
Black, Joshua A.; Knowles, Peter J.
2018-06-01
The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.
Effect of Correlated Rotational Noise
NASA Astrophysics Data System (ADS)
Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna
The traditional model of a self-propelled particle (SPP) is one where the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise, rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP with a reorientation process driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics by their ability to accurately capture the particle's response to external aligning fields.
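A minimal simulation sketch, assuming an Ornstein-Uhlenbeck process for the angular noise (one common choice of exponentially correlated colored noise; normalization conventions for D and tau vary across the literature):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_spp(T=200.0, dt=1e-2, v0=1.0, tau=1.0, D=0.5):
    """2-D self-propelled particle whose orientation angle is driven by
    Ornstein-Uhlenbeck noise of correlation time tau; tau -> 0 recovers
    the usual white-noise (rotational diffusion) limit."""
    n = int(T / dt)
    x = np.zeros((n, 2))
    theta, eta = 0.0, 0.0
    for i in range(1, n):
        # OU angular noise: <eta(t) eta(t')> ~ (D / tau) exp(-|t - t'| / tau)
        eta += (-eta / tau) * dt + np.sqrt(2 * D / tau**2 * dt) * rng.normal()
        theta += eta * dt
        x[i] = x[i - 1] + v0 * dt * np.array([np.cos(theta), np.sin(theta)])
    return x

traj = simulate_spp()
print("final displacement:", np.linalg.norm(traj[-1] - traj[0]))
```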
ERIC Educational Resources Information Center
Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert
2011-01-01
The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…
Myths and Misconceptions about Using Qualitative Methods in Assessment
ERIC Educational Resources Information Center
Harper, Shaun R.; Kuh, George D.
2007-01-01
The value of qualitative assessment approaches has been underestimated primarily because they are often juxtaposed against long-standing quantitative traditions and the widely accepted premise that the best research produces generalizable and statistically significant findings. Institutional researchers avoid qualitative methods for at least three…
2013-02-01
…of a bearing must be put into practice. There are many potential methods, the most traditional being the use of statistical time-domain features… accelerate degradation to test multiple bearings to gain statistical relevance and extrapolate results to scale for field conditions. Temperature… as time statistics, frequency estimation to improve the fault frequency detection. For future investigations, one can further explore the…
Morbidity and chronic pain following different techniques of caesarean section: A comparative study.
Belci, D; Di Renzo, G C; Stark, M; Đurić, J; Zoričić, D; Belci, M; Peteh, L L
2015-01-01
Research examining long-term outcomes after childbirth performed with different techniques of caesarean section has been limited and does not provide information on morbidity and neuropathic pain. This study compares two groups of patients ≥ 5 years after the operation: one submitted to the 'Traditional' method using a Pfannenstiel incision and the other submitted to the 'Misgav Ladach' method. We found better long-term postoperative results in the patients treated with the Misgav Ladach method compared with the Traditional method. The results were statistically better regarding the intensity of pain, the presence of neuropathic and chronic pain, and the level of satisfaction with the cosmetic appearance of the scar.
Transport Coefficients from Large Deviation Functions
NASA Astrophysics Data System (ADS)
Gao, Chloe; Limmer, David
2017-10-01
We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. This method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which are scaled cumulant generating functions analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
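For contrast, the traditional Green-Kubo estimate that the authors benchmark against integrates the equilibrium current autocorrelation function. A hedged sketch, with a synthetic AR(1) current standing in for molecular data and coefficient-specific prefactors (e.g. V/(k_B T)) omitted:

```python
import numpy as np

def green_kubo(current, dt, t_max):
    """Green-Kubo estimate: integrate the equilibrium autocorrelation of a
    molecular current up to t_max. The transport coefficient is proportional
    to this integral; the prefactor depends on which coefficient is sought."""
    n_max = int(t_max / dt)
    c = np.array([np.mean(current[:len(current) - k] * current[k:])
                  for k in range(n_max)])
    return np.trapz(c, dx=dt)

# synthetic current with exponentially decaying correlations (AR(1) surrogate)
rng = np.random.default_rng(3)
j = np.zeros(200_000)
for i in range(1, j.size):
    j[i] = 0.99 * j[i - 1] + rng.normal()
print("GK integral:", green_kubo(j, dt=1.0, t_max=500.0))
```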
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. Statistically significant differences were observed for the first GRF peak and similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that the traditional approach can amount to choosing arbitrary metrics and drawing arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data.
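A nonparametric analogue of the waveform-level comparison can be sketched with a pointwise t-statistic and a max-t permutation threshold; Statistical Parametric Mapping proper derives its threshold from random field theory, and the data below are synthetic.

```python
import numpy as np

def waveform_perm_test(A, B, n_perm=2000, seed=0):
    """Pointwise two-sample t-statistic along a gait waveform, with a max-t
    permutation distribution controlling family-wise error over time points."""
    rng = np.random.default_rng(seed)
    X = np.vstack([A, B])
    labels = np.array([0] * len(A) + [1] * len(B))

    def tstat(lab):
        a, b = X[lab == 0], X[lab == 1]
        se = np.sqrt(a.var(ddof=1, axis=0) / len(a) + b.var(ddof=1, axis=0) / len(b))
        return (a.mean(0) - b.mean(0)) / se

    t_obs = tstat(labels)
    t_max = np.array([np.abs(tstat(rng.permutation(labels))).max()
                      for _ in range(n_perm)])
    crit = np.percentile(t_max, 95)          # FWE-corrected threshold
    return t_obs, crit, np.where(np.abs(t_obs) > crit)[0]

# A, B: (subjects x 101 time points) vertical GRF curves, e.g. women vs men
rng = np.random.default_rng(1)
A = rng.normal(0, 1, (20, 101)); A[:, 10:30] += 0.9   # early-stance difference
B = rng.normal(0, 1, (20, 101))
t, crit, sig = waveform_perm_test(A, B)
print("suprathreshold time points:", sig)
```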
Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin
2018-01-01
Background: The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the development of technology in nursing education, the effectiveness of using e-learning methods in the fundamentals of nursing course in the clinical skills laboratory is unclear. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. Materials and Methods: A two-group post-test experimental study was administered from February 2014 to February 2015. Two groups of nursing students who were taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were selected as control group (traditional learning methods only) and experimental group (combining e-learning with traditional learning methods) for two consecutive semesters. Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted through SPSS software version 16. Results: Findings of this study reflected that the mean midterm (t = 2.00, p = 0.04) and final scores (t = 2.50, p = 0.01) of the intervention group (combining e-learning with traditional learning methods) were significantly higher than those of the control group (traditional learning methods). The satisfaction of male students in the intervention group was higher than that of females (t = 2.60, p = 0.01). Conclusions: Based on the findings, this study suggests that combining traditional learning methods with e-learning methods, such as an educational website and interactive online resources, for fundamentals of nursing course instruction can be an effective supplement for improving nursing students' clinical skills.
Han, Sheng-Nan
2014-07-01
Chemometrics is a new branch of chemistry that is widely applied to various fields of analytical chemistry. Chemometrics uses theories and methods from mathematics, statistics, computer science and other related disciplines to optimize the chemical measurement process and to extract the maximum chemical and other information from measurement data on material systems. In recent years, traditional Chinese medicine has attracted widespread attention. In traditional Chinese medicine research, how to interpret the relationship between the various chemical components and efficacy has been a key problem, one that seriously restricts the modernization of Chinese medicine. Because chemometrics brings multivariate analysis methods into chemical research, it has been applied as an effective research tool in composition-activity relationship research on Chinese medicine. This article reviews the applications of chemometrics methods in composition-activity relationship research in recent years. The applications of multivariate statistical analysis methods (such as regression analysis, correlation analysis, principal component analysis, etc.) and artificial neural networks (such as the back propagation artificial neural network, the radial basis function neural network, the support vector machine, etc.) are summarized, including their fundamental principles, research content, and advantages and disadvantages. Finally, the main existing problems are discussed and prospects for future research are proposed.
Cluster mass inference via random field theory.
Zhang, Hui; Nichols, Thomas E; Johnson, Timothy D
2009-01-01
Cluster extent and voxel intensity are two widely used statistics in neuroimaging inference. Cluster extent is sensitive to spatially extended signals while voxel intensity is better for intense but focal signals. In order to leverage strength from both statistics, several nonparametric permutation methods have been proposed to combine the two methods. Simulation studies have shown that of the different cluster permutation methods, the cluster mass statistic is generally the best. However, to date, there is no parametric cluster mass inference available. In this paper, we propose a cluster mass inference method based on random field theory (RFT). We develop this method for Gaussian images, evaluate it on Gaussian and Gaussianized t-statistic images and investigate its statistical properties via simulation studies and real data. Simulation results show that the method is valid under the null hypothesis and demonstrate that it can be more powerful than the cluster extent inference method. Further, analyses with a single subject and a group fMRI dataset demonstrate better power than traditional cluster size inference, and good accuracy relative to a gold-standard permutation test.
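The cluster mass statistic itself is straightforward to compute from a thresholded statistic image; only the test statistic is sketched below, with the RFT-based null distribution left out. The cluster-forming threshold and image are illustrative, and conventions differ on whether mass sums raw heights or the excess above threshold (the latter is used here).

```python
import numpy as np
from scipy import ndimage

def cluster_masses(stat_img, thresh):
    """Cluster mass = sum of suprathreshold statistic height (above the
    cluster-forming threshold) within each connected cluster."""
    excess = np.where(stat_img > thresh, stat_img - thresh, 0.0)
    labels, n = ndimage.label(excess > 0)
    return ndimage.sum(excess, labels, index=range(1, n + 1))

rng = np.random.default_rng(7)
img = rng.normal(size=(64, 64))
img[20:26, 30:38] += 2.5                   # an extended, moderate signal
masses = cluster_masses(img, thresh=2.0)
print("largest cluster masses:", np.round(np.sort(masses)[-3:], 1))
```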
A Selective Overview of Variable Selection in High Dimensional Feature Space
Fan, Jianqing
2010-01-01
High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what role penalty functions play, and what their statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods.
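As one concrete instance of penalized likelihood in the p >> n regime, a LASSO fit with cross-validated penalty recovers a sparse signal where best-subset search is computationally infeasible; the simulated design below is illustrative.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p, s = 100, 1000, 5                     # p >> n, only s true signals
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:s] = 2.0
y = X @ beta + rng.normal(size=n)

# L1-penalized likelihood: feasible where exhaustive subset selection is not
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print("selected variables:", selected[:10], "...", len(selected), "total")
```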
Clinical Trials With Large Numbers of Variables: Important Advantages of Canonical Analysis.
Cleophas, Ton J
2016-01-01
Canonical analysis assesses the combined effects of a set of predictor variables on a set of outcome variables, but it is little used in clinical trials despite the omnipresence of multiple variables. The aim of this study was to assess the performance of canonical analysis as compared with traditional multivariate methods using multivariate analysis of covariance (MANCOVA). As an example, a simulated data file with 12 gene expression levels and 4 drug efficacy scores was used. The correlation coefficient between the 12 predictor and 4 outcome variables was 0.87 (P = 0.0001), meaning that 76% of the variability in the outcome variables was explained by the 12 covariates. Repeated testing after the removal of 5 unimportant predictor and 1 outcome variable produced virtually the same overall result. The MANCOVA identified identical unimportant variables, but it was unable to provide overall statistics. (1) Canonical analysis is remarkable, because it can handle many more variables than traditional multivariate methods such as MANCOVA can. (2) At the same time, it accounts for the relative importance of the separate variables, their interactions and differences in units. (3) Canonical analysis provides overall statistics of the effects of sets of variables, whereas traditional multivariate methods only provide the statistics of the separate variables. (4) Unlike other methods for combining the effects of multiple variables such as factor analysis/partial least squares, canonical analysis is scientifically entirely rigorous. (5) Limitations include that it is less flexible than factor analysis/partial least squares, because only 2 sets of variables are used and because multiple solutions, instead of one, are offered. We do hope that this article will stimulate clinical investigators to start using this remarkable method.
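A hedged sketch of the core computation, using simulated stand-ins for the article's 12 gene expression levels and 4 efficacy scores; the overall test statistics the article emphasises are omitted here.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 200
G = rng.normal(size=(n, 12))               # e.g. 12 gene expression levels
W = rng.normal(size=(n, 4))                # e.g. 4 drug efficacy scores
W[:, 0] += G[:, :3].sum(axis=1)            # plant a shared signal

cca = CCA(n_components=2).fit(G, W)
U, V = cca.transform(G, W)
# first canonical correlation: overall association between the two variable sets
r1 = np.corrcoef(U[:, 0], V[:, 0])[0, 1]
print(f"first canonical correlation: {r1:.2f}")
```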
The effects of modeling instruction on high school physics academic achievement
NASA Astrophysics Data System (ADS)
Wright, Tiffanie L.
The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method to promote academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variables in this study were the instructional methods of teaching. The treatment variable was Modeling Instruction and the control variable was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gains scores on the Force Concepts Inventory (FCI). The participants for this study were 133 students each in both the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the Force Concepts Inventory (FCI) prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups concerning the instructional method. Modeling Instructional methods were found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference between FCI gains scores for gender. Gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though there was not a statistically significant difference, female students' gains scores were higher than male students' gains scores when Modeling Instructional methods of teaching were used. Based on these findings, it is recommended that high school science teachers use Modeling Instructional methods of teaching daily in their classrooms. A recommendation for further research is to expand the Modeling Instructional methods of teaching into different content areas (i.e., reading and language arts) to explore academic achievement gains.
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
ERIC Educational Resources Information Center
Goldfinch, Judy
1996-01-01
A study compared the effectiveness of two methods (medium-size class instruction and large lectures with tutorial sessions) for teaching mathematics and statistics to first-year business students. Students and teachers overwhelmingly preferred the medium-size class method, which produced higher exam scores but had no significant effect on…
Bayesian demography 250 years after Bayes
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.
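A minimal example of the Bayesian mechanics in a limited-data demographic setting: a beta-binomial posterior for a rate, with the prior standing in for information from comparable populations. All numbers are invented.

```python
import numpy as np
from scipy import stats

# Posterior for an infant-mortality-style rate from limited data, combining
# a prior (e.g. rates observed in comparable populations) with the sample.
a0, b0 = 4, 996           # prior: roughly 0.4%, worth ~1000 pseudo-births
deaths, births = 3, 450   # small observed sample

post = stats.beta(a0 + deaths, b0 + births - deaths)
print(f"posterior mean: {post.mean():.4f}")
print("95% credible interval:", np.round(post.interval(0.95), 4))
```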
Tests of Independence for Ordinal Data Using Bootstrap.
ERIC Educational Resources Information Center
Chan, Wai; Yung, Yiu-Fai; Bentler, Peter M.; Tang, Man-Lai
1998-01-01
Two bootstrap tests are proposed to test the independence hypothesis in a two-way cross table. Monte Carlo studies are used to compare the traditional asymptotic test with these bootstrap methods, and the bootstrap methods are found superior in two ways: control of Type I error and statistical power. (SLD)
STS-based education in non-majors college biology
NASA Astrophysics Data System (ADS)
Henderson, Phyllis Lee
The study explored the effect of the science-technology-society (STS) and traditional teaching methods in non-majors biology classes at a community college. It investigated the efficacy of the two methods in developing cognitive abilities at Bloom's first three levels of learning. It compared retention rates in classes taught in the two methods. Changes in student attitude relating to anxiety, fear, and interest in biology were explored. The effect of each method on grade attainment among men and women was investigated. The effect of each method on grade attainment among older and younger students was examined. Results of the study indicated that no significant differences, relating to retention or student attitude, existed in classes taught in the two methods. The study found no significant cognitive gains at Bloom's first three levels in classes taught in the traditional format. In the STS classes no significant gains were uncovered at Bloom's first level of cognition. Statistically significant gains were found in the STS classes at Bloom's second and third levels of cognition. In the classes taught in the traditional format no difference was identified in grade attainment between males and females. In the STS-based classes a small correlational difference between males and females was found with males receiving lower grades than expected. No difference in grade attainment was found between older and younger students taught in the traditional format. In the STS-based classes a small statistically significant difference in grade attainment was uncovered between older and younger students with older students receiving more A's and fewer C's than expected. This study found no difference in the grades of older, female students as compared to all other students in the traditionally taught classes. A weak statistically significant difference was discovered between grade attainment of older, female students and all other students in the STS classes with older, female students earning more A's and fewer C's than expected. It was concluded that among the students examined in this investigation STS teaching methods enhanced cognitive gains at Bloom's second and third levels of cognition. STS also strengthened grade attainment among older students and female students. Recommendations for further study included replication of the study to include a larger sample size, other types of institutions, and other academic disciplines in science. Expansion of the study to Bloom's fourth and fifth levels, use of standardized testing instruments to determine attitude, analysis using qualitative methods of investigation, and refinement of the study to provide a true experimental design were also suggested.
Handbook of Research Methods in Social and Personality Psychology
NASA Astrophysics Data System (ADS)
Reis, Harry T.; Judd, Charles M.
2000-03-01
This volume provides an overview of research methods in contemporary social psychology. Coverage includes conceptual issues in research design, methods of research, and statistical approaches. Because the range of research methods available for social psychology have expanded extensively in the past decade, both traditional and innovative methods are presented. The goal is to introduce new and established researchers alike to new methodological developments in the field.
[Possible sources of Trichinella infection in the indigenous population of Chukotka].
Bukina, L A
2014-01-01
Statistical methods confirmed that the dietary intake of traditionally made meat from marine mammals and polar bear could cause Trichinella infection in the residents of the communities of the Chukotka Peninsula.
Lu, Fletcher; Lemonde, Manon
2013-12-01
The objective of this study was to assess whether online teaching delivery produces student test performance comparable to the traditional face-to-face approach, irrespective of academic aptitude. This study involves a quasi-experimental comparison of student performance in an undergraduate health science statistics course partitioned in two ways. The first partition involves one group of students taught with a traditional face-to-face classroom approach and the other through a completely online instructional approach. The second partition categorized the students by academic aptitude into higher and lower academically performing groups, based on their assignment grades during the course. Controls placed on the study to reduce the possibility of confounding variables were: the same instructor taught both groups, covering the same subject information, using the same assessment methods and delivering over the same period of time. The results of this study indicate that online teaching delivery is as effective as a traditional face-to-face approach in terms of producing comparable student test performance, but only if the student is academically higher performing. For academically lower performing students, the online delivery method produced significantly poorer student test results compared to those lower performing students taught in a traditional face-to-face environment.
NASA Astrophysics Data System (ADS)
Salvato, Steven Walter
The purpose of this study was to analyze questions within the chapters of a nontraditional general chemistry textbook and the four general chemistry textbooks most widely used by Texas community colleges in order to determine if the questions require higher- or lower-order thinking according to Bloom's taxonomy. The study employed quantitative methods. Bloom's taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) was utilized as the main instrument in the study. Additional tools were used to help classify the questions into the proper category of the taxonomy (McBeath, 1992; Metfessel, Michael, & Kirsner, 1969). The top four general chemistry textbooks used in Texas community colleges and Chemistry: A Project of the American Chemical Society (Bell et al., 2005) were analyzed during the fall semester of 2010 in order to categorize the questions within the chapters into one of the six levels of Bloom's taxonomy. Two coders were used to assess reliability. The data were analyzed using descriptive and inferential methods. The descriptive method involved calculation of the frequencies and percentages of coded questions from the books as belonging to the six categories of the taxonomy. Questions were dichotomized into higher- and lower-order thinking questions. The inferential methods involved chi-square tests of association to determine if there were statistically significant differences among the four traditional college general chemistry textbooks in the proportions of higher- and lower-order questions and if there were statistically significant differences between the nontraditional chemistry textbook and the four traditional general chemistry textbooks. Findings indicated statistically significant differences among the four textbooks frequently used in Texas community colleges in the number of higher- and lower-level questions. Statistically significant differences were also found among the four textbooks and the nontraditional textbook. After the analysis of the data, conclusions were drawn, implications for practice were delineated, and recommendations for future research were given.
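The inferential step described, a chi-square test of association between textbook and question level, looks like this with illustrative counts (not the study's data):

```python
from scipy.stats import chi2_contingency

# rows: textbooks; columns: counts of lower- vs higher-order questions
table = [[420, 80],
         [390, 110],
         [450, 60],
         [300, 95]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4g}")
```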
Android malware detection based on evolutionary super-network
NASA Astrophysics Data System (ADS)
Yan, Haisheng; Peng, Lingling
2018-04-01
In the paper, an Android malware detection method based on an evolutionary super-network is proposed in order to improve the precision of Android malware detection. A chi-square statistic is used to select features on the basis of an analysis of Android permissions, and Boolean weighting is used to calculate feature weights. The processed feature vectors form the training and test sets; a hyperedge replacement strategy is used to train the super-network classification model, which then classifies the test-set feature vectors, and the result is compared with traditional classification algorithms. The results show that the detection method proposed in the paper is close to or better than traditional classification algorithms, and that it is an effective means of Android malware detection.
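A hedged sketch of the feature-selection stage, chi-square screening of Boolean permission indicators, using synthetic data; the evolutionary super-network classifier itself is not reproduced here.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

# Rows: apps; columns: Boolean (0/1) indicators of requested permissions.
# Illustrative data; real pipelines extract these from the app manifest.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 40))
y = rng.integers(0, 2, size=500)           # 1 = malware, 0 = benign
X[y == 1, :5] |= rng.integers(0, 2, size=(np.sum(y == 1), 5))  # enrich 5 perms

selector = SelectKBest(chi2, k=10).fit(X, y)
print("kept permission indices:", np.sort(selector.get_support(indices=True)))
```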
Effects of Mobile Learning in Medical Education: A Counterfactual Evaluation.
Briz-Ponce, Laura; Juanes-Méndez, Juan Antonio; García-Peñalvo, Francisco José; Pereira, Anabela
2016-06-01
The aim of this research is to contribute to the general education system by providing new insights and resources. This study performs a quasi-experimental study at the University of Salamanca with 30 students to compare results between using an anatomic app for learning and the formal traditional method conducted by a teacher. The findings of the investigation suggest that the performance of learners using mobile apps is statistically better than that of students using the traditional method. However, mobile devices should be considered an additional tool to complement the teacher's explanation, and it is necessary to overcome different barriers and challenges to adopt these pedagogical methods at university.
Yang, Yang; DeGruttola, Victor
2016-01-01
Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients.
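A simplified sketch of the resampling scheme: standardize each group's residuals by its own sample covariance, pool them, and bootstrap a Bartlett-type statistic. Ordinary (non-robust) moment estimates are used here for brevity, which is precisely where the paper substitutes robust estimators.

```python
import numpy as np

def bartlett_cov(groups):
    """Bartlett/Box-type statistic for homogeneity of covariance matrices."""
    k = len(groups)
    n = np.array([len(g) for g in groups])
    S = [np.cov(g, rowvar=False) for g in groups]
    Sp = sum((n[i] - 1) * S[i] for i in range(k)) / (n.sum() - k)
    return (n.sum() - k) * np.log(np.linalg.det(Sp)) \
         - sum((n[i] - 1) * np.log(np.linalg.det(S[i])) for i in range(k))

def resample_pvalue(groups, n_boot=2000, seed=0):
    """Resample standardized residuals: center each group and whiten it by
    its own sample covariance, so the null holds in the resampling world."""
    rng = np.random.default_rng(seed)
    std = []
    for g in groups:
        r = g - g.mean(axis=0)
        L = np.linalg.cholesky(np.linalg.inv(np.cov(g, rowvar=False)))
        std.append(r @ L)                  # whitened residuals
    pooled = np.vstack(std)
    t_obs = bartlett_cov(groups)
    t_null = [bartlett_cov([pooled[rng.integers(0, len(pooled), len(g))]
                            for g in groups]) for _ in range(n_boot)]
    return np.mean(np.array(t_null) >= t_obs)

rng = np.random.default_rng(1)
g1 = rng.normal(size=(30, 3)); g2 = rng.normal(size=(25, 3)) * 1.5
print("p ~", resample_pvalue([g1, g2]))
```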
First arrival time picking for microseismic data based on DWSW algorithm
NASA Astrophysics Data System (ADS)
Li, Yue; Wang, Yue; Lin, Hongbo; Zhong, Tie
2018-03-01
The first arrival time picking is a crucial step in microseismic data processing. When the signal-to-noise ratio (SNR) is low, however, it is difficult to pick the first arrival time accurately with traditional methods. In this paper, we propose the double-sliding-window (DWSW) method based on the Shapiro-Wilk (SW) test. The DWSW method detects the first arrival time by making full use of the differences in statistical properties between background noise and effective signals. Specifically, we take the moment at which the statistic of our method reaches its maximum as the first arrival time of the microseismic data. Hence, there is no need to select a threshold, which makes the algorithm easier to apply when the SNR of the microseismic data is low. To verify the reliability of the proposed method, a series of experiments is performed on both synthetic and field microseismic data. Our method is compared with the traditional short-term average/long-term average (STA/LTA) method, the Akaike information criterion, and the kurtosis method. Analysis results indicate that the accuracy rate of the proposed method is superior to that of the other three methods when the SNR is as low as -10 dB.
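A rough illustration of the double-window idea (hypothetical: the abstract does not spell out the exact DWSW statistic, so a simple contrast of Shapiro-Wilk statistics between the two windows is assumed here):

```python
import numpy as np
from scipy.stats import shapiro

def dwsw_pick(trace, win=200):
    """Sketch of a double-sliding-window Shapiro-Wilk picker: the left
    window of pure Gaussian noise scores a high W, the right window
    scores lower once signal enters, so the contrast peaks at the
    first arrival."""
    score = np.full(trace.size, -np.inf)
    for t in range(win, trace.size - win):
        w_left = shapiro(trace[t - win:t])[0]    # high for Gaussian noise
        w_right = shapiro(trace[t:t + win])[0]   # lower once signal enters
        score[t] = w_left - w_right
    return int(np.argmax(score))                 # first-arrival sample index
```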
Carvalho, Pedro; Marques, Rui Cunha
2016-02-15
This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of applying Bayesian statistics for inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.
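For context, the classical-SFA baseline is maximum likelihood estimation of the frontier. The sketch below writes the standard normal/half-normal production-frontier log-likelihood (Aigner-Lovell-Schmidt form); it is a generic illustration under stated assumptions, not the authors' cost-function specification:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def sfa_negloglik(params, y, X):
    """Negative log-likelihood of the normal/half-normal stochastic
    production frontier: eps = y - X beta is the composed error v - u,
    sigma^2 = sv^2 + su^2, lambda = su / sv."""
    beta, log_sv, log_su = params[:-2], params[-2], params[-1]
    sv, su = np.exp(log_sv), np.exp(log_su)
    eps = y - X @ beta
    sigma, lam = np.hypot(sv, su), su / sv
    ll = (np.log(2.0) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -np.sum(ll)

# Classical estimation maximises this likelihood, e.g.:
# res = minimize(sfa_negloglik, x0, args=(y, X), method="BFGS")
```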
ERIC Educational Resources Information Center
Ragasa, Carmelita Y.
2008-01-01
The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…
Statistical and Machine Learning forecasting methods: Concerns and ways forward
Makridakis, Spyros; Assimakopoulos, Vassilios
2018-01-01
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
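One of the accuracy measures behind such comparisons, sMAPE, is easy to state in code; a minimal sketch:

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, one of the accuracy
    measures used in the M competitions."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))
```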
Integration of a Community Pharmacy Simulation Program into a Therapeutics Course.
Shin, Jaekyu; Tabatabai, Daryush; Boscardin, Christy; Ferrone, Marcus; Brock, Tina
2018-02-01
Objective. To demonstrate the feasibility of integrating the computer simulation, MyDispense, into a therapeutics course and to measure its effects on student perception and learning. Methods. We conducted a prospective study with an experimental phase and an implementation phase. In the first phase, students were randomized to complete a therapeutics case using MyDispense or traditional paper methods in class. In the second phase, all students completed two therapeutic cases using MyDispense in class with the option to complete four additional outside-of-class cases using MyDispense. Students completed pre- and post-tests in class and three surveys. Results. In the experimental phase, mean test scores increased from pre- to post-test for both MyDispense and traditional paper groups, but the difference between the groups was not statistically significant. Students in the traditional paper group reported statistically significant gains in confidence compared to the MyDispense group. In the implementation phase, mean test scores again increased, however, student perception of the use of MyDispense for therapeutics was negative. Completing the optional outside-of-class cases, however, was positively and significantly correlated with the midterm and final examination scores. Conclusion. Implementation of MyDispense in therapeutics may be feasible and has positive effects (eg, correlation with exam scores, capacity for immediate feedback, and potential for effective self-study). With short-term use and in the absence of assessment methods that also require seeking information from patients, students prefer to learn via traditional paper cases.
ERIC Educational Resources Information Center
Weigold, Arne; Weigold, Ingrid K.; Russell, Elizabeth J.
2013-01-01
Self-report survey-based data collection is increasingly carried out using the Internet, as opposed to the traditional paper-and-pencil method. However, previous research on the equivalence of these methods has yielded inconsistent findings. This may be due to methodological and statistical issues present in much of the literature, such as…
Undergraduate medical student's perceptions on traditional and problem based curricula: pilot study.
Meo, Sultan Ayoub
2014-07-01
To evaluate and compare students' perceptions about teaching and learning, knowledge and skills, outcomes of course materials and their satisfaction in traditional lecture-based versus problem-based learning curricula in two different medical schools. The comparative cross-sectional questionnaire-based study was conducted in the Department of Physiology, College of Medicine, King Saud University, Riyadh, Saudi Arabia, from July 2009 to January 2011. Two undergraduate medical schools were selected; one followed the traditional curriculum, while the other followed the problem-based learning curriculum. Two equal groups of first-year medical students were selected. They were taught respiratory physiology and lung function lab according to their curriculum for a period of two weeks. At the completion of the study period, a five-point Likert scale was used to assess students' perceptions of satisfaction, academic environment, teaching and learning, knowledge and skills, and outcomes of course materials regarding the effectiveness of problem-based learning compared to traditional methods. SPSS 19 was used for statistical analysis. Students following the problem-based learning curriculum obtained marginally higher perception scores (24.10 +/- 3.63) than those following the traditional curriculum (22.67 +/- 3.74). However, the difference in perceptions did not reach statistical significance. Students following the problem-based learning curriculum had more positive perceptions of teaching and learning, knowledge and skills, outcomes of their course materials and satisfaction than students from the traditional medical school. However, the difference between the two groups was not statistically significant.
Walker, Jean T; Martin, Tina M; Haynie, Lisa; Norwood, Anne; White, Jill; Grant, LaVerne
2007-01-01
Accelerated baccalaureate nursing programs are in great demand in the United States. Currently there are 197 such programs, but little research has been conducted on student characteristics and program outcomes. This quantitative study explores preferences of second-degree students and traditional generic students with regard to teaching methods and relationships with faculty. The results indicate that statistically significant differences exist between the two groups of students. Three areas of significance are ability for self-directed learning, expectations of faculty and classroom structure, and obtaining a grade that really matters.
Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong
2013-01-01
For genome-wide association data analysis, two genes in any pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects due not only to the traditional interaction under a nearly independent condition but also to the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods. PMID:23620809
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
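A deliberately simplified sketch of the data-augmentation step (hypothetical names; the paper's full hierarchical model with covariate effects and regularizing terms is omitted): each event's latent cause is drawn from the observer's elicited probabilities reweighted by the current cause-specific hazard estimates:

```python
import numpy as np

def sample_causes(elicited, hazards, rng=None):
    """One data-augmentation draw: 'elicited' is (n_events, n_causes),
    the observer's probabilities per death; 'hazards' holds the current
    cause-specific hazard contributions, broadcastable to that shape."""
    rng = np.random.default_rng(rng)
    w = np.asarray(elicited, float) * np.asarray(hazards, float)
    w /= w.sum(axis=1, keepdims=True)        # posterior cause probabilities
    return np.array([rng.choice(w.shape[1], p=row) for row in w])
```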
Quasi-Static Probabilistic Structural Analyses Process and Criteria
NASA Technical Reports Server (NTRS)
Goldberg, B.; Verderaime, V.
1999-01-01
Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insights and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws that can result in unevenly conservative structural predictions. Assuming non-nal distributions and using statistical data formats throughout prevailing stress deterministic processes lead to a safety factor in statistical format, which integrated into the safety index, provides a safety factor and first order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures consistent with the traditional safety factor methods and NASA Std. 5001 criteria.
Environmental monitoring: data trending using a frequency model.
Caputo, Ross A; Huffman, Anne
2004-01-01
Environmental monitoring programs for the oversight of classified environments have used traditional statistical control charts to monitor trends in microbial recovery. These methodologies work well for environments that yield measurable microbial recoveries. However, today's increased control of microbial content yields numerous instances where the microbial recovery in a sample is zero. As a result, traditional control chart methods cannot be used appropriately. Two methods to monitor the performance of a classified environment where microbial recovery is generally zero are presented. Both methods treat a non-zero microbial recovery as an event and monitor the frequency between events rather than the microbial recovery count. Both methods are shown to be appropriate for use in the described instances.
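A minimal sketch of the frequency-between-events idea (assumptions: a geometric model for the gaps and a 3-sigma-like false-alarm rate; the paper's two specific methods are not reproduced here):

```python
import numpy as np

def short_gap_alarms(gaps, alpha=0.0027):
    """Treat the number of clean samples between non-zero microbial
    recoveries as geometric, and flag gaps so short that they are
    improbable under the historical recovery rate."""
    p = 1.0 / (np.mean(gaps) + 1.0)              # estimated event probability
    cdf = lambda g: 1.0 - (1.0 - p) ** (g + 1)   # P(gap <= g)
    return [g for g in gaps if cdf(g) <= alpha]
```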
Influence of microwave sterilization on the cutting capacity of carbide burs.
Fais, Laiza Maria Grassi; Pinelli, Lígia Antunes Pereira; Adabo, Gelson Luis; Silva, Regina Helena Barbosa Tavares da; Marcelo, Caroline Canhizares; Guaglianoni, Dalton Geraldo
2009-01-01
This study compared the cutting capacity of carbide burs sterilized with microwaves and traditional sterilization methods. Sixty burs were divided into 5 groups according to the sterilization methods: dry heat (G1), autoclave (G2), microwave irradiation (G3), glutaraldehyde (G4) or control - no sterilization (G5). The burs were used to cut glass plates in a cutting machine set for twelve 2.5-min periods and, after each period, they were sterilized (except G5) following the protocol established for each group. The cutting capacity of the burs was determined by a weight-loss method. Data were analyzed statistically by Kruskal-Wallis and Dunn's test. The means of the cutting amount performed by each group after the 12 periods were G1 = 0.2167 +/- 0.0627 g; G2 = 0.2077 +/- 0.0231 g; G3 = 0.1980 +/- 0.0326 g; G4 = 0.1203 +/- 0.0459 g; G5 = 0.2642 +/- 0.0359 g. There were statistically significant differences among the groups (p<0.05); only dry heat sterilization was similar to the control. Sterilization by dry heat was the method that least affected the cutting capacity of the carbide burs and microwave sterilization was not better than traditional sterilization methods.
Performance comparison of LUR and OK in PM2.5 concentration mapping: a multidimensional perspective
Zou, Bin; Luo, Yanqing; Wan, Neng; Zheng, Zhong; Sternberg, Troy; Liao, Yilan
2015-01-01
Methods of Land Use Regression (LUR) modeling and Ordinary Kriging (OK) interpolation have been widely used to offset the shortcomings of PM2.5 data observed at sparse monitoring sites. However, the traditional point-based strategy for evaluating the performance of these methods has remained unchanged, which can lead to unreasonable mapping results. To address this challenge, this study employs 'information entropy', an area-based statistic, along with traditional point-based statistics (e.g. error rate, RMSE) to evaluate the performance of the LUR model and OK interpolation in mapping PM2.5 concentrations in Houston from a multidimensional perspective. The point-based validation reveals significant differences between LUR and OK at different test sites despite similar end-result accuracy (e.g. error rate 6.13% vs. 7.01%). Meanwhile, the area-based validation demonstrates that the PM2.5 concentrations simulated by the LUR model exhibit more detailed variations than those interpolated by the OK method (i.e. information entropy, 7.79 vs. 3.63). Results suggest that LUR modeling can better refine the spatial distribution of PM2.5 concentrations than OK interpolation. The significance of this study primarily lies in promoting the integration of point- and area-based statistics for model performance evaluation in air pollution mapping. PMID:25731103
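The area-based statistic is straightforward to compute; a sketch (the bin count is an assumption, as the abstract does not state the discretization used):

```python
import numpy as np

def information_entropy(surface, bins=64):
    """Shannon entropy of a mapped PM2.5 surface: a map with richer
    spatial detail spreads over more histogram bins and scores higher."""
    counts, _ = np.histogram(np.ravel(surface), bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))
```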
Wilson, Jennifer A; Pegram, Angela H; Battise, Dawn M; Robinson, April M
2017-11-01
To determine if traditional didactic lecture or the jigsaw learning method is more effective to teach the medication therapy management (MTM) core elements in a first year pharmacy course. Traditional didactic lecture and a pre-class reading assignment were used in the fall semester cohort, and the jigsaw method was used in the spring semester cohort. Jigsaw is a cooperative learning strategy requiring students to assume responsibility for learning, and subsequently teaching peers. The students were responsible for reading specific sections of the pre-class reading, and then teaching other students in small groups about their specific reading assignments. To assess potential differences, identical pre- and post-tests were administered before and after the MTM section. Additionally, grade performance on an in-class project and final exam questions were compared, and students were surveyed on perceptions of teaching method used. A total of 45 and 43 students completed both the pre- and post-test in the fall and spring (96% and 93% response rate), respectively. Improvement in post-test scores favored the traditional method (p = 0.001). No statistical differences were noted between groups with grade performance on the in-class project and final exam questions. However, students favored the jigsaw method over traditional lecture and perceived improvements in problem solving skills, listening/communication skills and encouragement of cooperative learning (p = 0.018, 0.025 and 0.031). Although students favored the jigsaw learning method, traditional didactic lecture was more effective for the pre- and post-knowledge test performance. This may indicate that traditional didactic lecture is more effective for more foundational content. Copyright © 2017 Elsevier Inc. All rights reserved.
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
ERIC Educational Resources Information Center
Slezak, Jonathan M.; Faas, Caitlin
2017-01-01
This study implemented the components of interteaching as a probe to teach American Psychological Association (APA) Style to undergraduate university students in a psychology research methods and statistics course. The interteaching method was compared to the traditional lecture-based approach between two sections of the course with the same…
Mansoorian, Mohammad Reza; Hosseiny, Marzeih Sadat; Khosravan, Shahla; Alami, Ali; Alaviani, Mehri
2015-06-01
Despite the benefits of the objective structured assessment of technical skills (OSATS) and its appropriateness for evaluating the clinical abilities of nursing students, few studies are available on the application of this method in nursing education. The purpose of this study was to compare the effect of using OSATS and traditional methods on the students' learning. We also aimed to elicit students' views about these two methods and about the scores they received in a medical emergency course. A quasi-experimental study was performed on 45 first-semester students in nursing and medical emergencies enrolled in a course on fundamentals of practice. The students were selected by a census method and evaluated by both the OSATS and traditional methods. Data collection was performed using checklists prepared based on the 'textbook of nursing procedures checklists' published by the Iranian nursing organization and a questionnaire covering learning rate and students' estimation of their received scores. Descriptive statistics as well as paired t-test and independent samples t-test were used in data analysis. The mean of students' scores in OSATS was significantly higher than their mean score in the traditional method (P = 0.01). Moreover, the mean self-evaluation score after the traditional method was relatively the same as the score the students received in the exam, whereas the mean self-evaluation score after the OSATS was relatively lower than the scores the students received in the OSATS exam. Most students believed that OSATS can evaluate a wider range of students' knowledge and skills than the traditional method. Results of this study indicate the better effect of OSATS on learning and its relative superiority in precise assessment of clinical skills compared with the traditional evaluation method. Therefore, we recommend using this method in the evaluation of students in practical courses.
Unwanted pregnancy and traditional self-induced abortion methods known among women aged 15 to 49.
Sensoy, Nazli; Dogan, Nurhan; Sen, Kubra; Aslan, Halit; Tore-Baser, Ayca
2015-05-01
To determine the traditional methods known and used to terminate an unwanted pregnancy and the fertility characteristics of married women. The descriptive cross-sectional study was conducted in Turkey at Afyonkarahisar Zübeyde Hanim Child and Maternity Hospital's outpatient clinic between December 27, 2010 and January 7, 2011, and comprised married women aged 17 to 49 who presented for an examination. Questions related to socio-demographic and fertility characteristics as well as known and used traditional abortion methods were included in the questionnaire, which was administered through face-to-face interviews. SPSS 18.0 was used for statistical analysis. The median age of the 600 women in the study was 29.5 (range: 17-49) years. Overall, 134 (22.3%) women had experienced an unwanted pregnancy. In 53 (39.6%) cases, the unwanted pregnancy had occurred between the ages of 30 and 39, and 116 (86.6%) women had married when they were between 15 and 24 years old (p<0.008). Pregnancy had been concluded normally in 78 (58.2%) women with an unwanted pregnancy, and 34 (35.8%) preferred the withdrawal method for contraception. Traditional abortion methods were known to 413 (68.8%) women, but only 8 (1.3%) had used any of them. The harms of using a traditional abortion method were known to 464 (77.3%) women. Very few women used traditional abortion methods to terminate pregnancy. Knowing the characteristics of women and their need for family planning should be the first priority for the prevention of unwanted pregnancies.
[Effect and regulation of drying on quality of traditional Chinese medicine pills].
Qi, Ya-Ru; Li, Yuan-Hui; Han, Li; Wu, Zhen-Feng; Yue, Peng-Fei; Wang, Xue-Cheng; Xiong, Yao-Kun; Yang, Ming
2017-06-01
The drying quality of traditional Chinese medicine pills is a focus of current research, because it has a crucial effect on efficacy and on the development of dosage forms. Through literature research and statistical analysis, this paper reviews the current problems in the drying of traditional Chinese medicine pills. Centering on the evaluation system for such pills, it analyzes the characteristics of common drying equipment and processes and their effects on pill quality, discusses the problems in drying equipment, processes, and quality, and puts forward corresponding strategies, hoping to provide new ideas and new methods for improving the quality and quality standards of traditional Chinese medicine pills. Copyright© by the Chinese Pharmaceutical Association.
Automated detection of hospital outbreaks: A systematic review of methods.
Leclère, Brice; Buckeridge, David L; Boëlle, Pierre-Yves; Astagneau, Pascal; Lepelletier, Didier
2017-01-01
Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results.
Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo
2015-01-01
Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The first three FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, although it showed concordant results. Geographical location was the main predictor for the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week, which is not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
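A minimal sketch of the FPCA step via SVD of the centered city-by-day matrix (smoothing of the daily curves and the subsequent regression on covariates are omitted; names are illustrative):

```python
import numpy as np

def fpca_scores(curves, n_comp=3):
    """Functional PCA on weekly drug-load curves: one row per city, one
    column per day. The city scores on the leading components play the
    role of the FPC scores used as regression outcomes."""
    X = np.asarray(curves, float)
    Xc = X - X.mean(axis=0)                       # centre across cities
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_comp] * s[:n_comp]           # city-level FPC scores
    explained = (s**2 / np.sum(s**2))[:n_comp]    # variance explained
    return scores, Vt[:n_comp], explained
```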
Bhardwaj, A; Nagandla, K; Swe, K Mm; Abas, A Bl
2015-01-01
E-learning is the use of Information and Communication Technology (ICT) to provide online education and learning. E-learning has now been integrated into traditional teaching as the concept of 'blended learning', which combines digital learning with existing traditional teaching methods to address the various challenges in the field of medical education. Structured e-learning activities were started in Melaka Manipal Medical College in 2009 via an e-learning platform (MOODLE - Modular Object-Oriented Dynamic Learning Environment). The objective of the present study is to investigate faculty opinions toward the existing e-learning activities, and to analyse the extent of adoption and integration of e-learning into their traditional teaching methods. A cross-sectional study was conducted among faculties of Medicine and Dentistry using pre-tested questionnaires. The data were analyzed using the Statistical Package for the Social Sciences, SPSS, version 16.0. The results of our survey indicate that the majority of our faculty (65.4%) held positive opinions towards e-learning. Among the few who demonstrated reservations, this was attributed to their average level of skills and aptitude in the use of computers, which was statistically significant (p<0.05). Our study brings to light the need for formal training as a prerequisite to support e-learning, enabling a smooth transition of faculty from their traditional teaching methods to a blended approach. Our results are anticipated to strengthen the existing e-learning activities of our college and other universities and to support the adoption of e-learning as a viable teaching and learning strategy.
Web-based vs. traditional classroom instruction in gerontology: a pilot study.
Gallagher, Judith E; Dobrosielski-Vergona, Kathleen A; Wingard, Robin G; Williams, Theresa M
2005-01-01
Numerous studies have documented comparable outcomes from Web-based and traditional classroom instruction. However, there is a paucity of literature comparing these two delivery formats for gerontology courses in dental hygiene curricula. This study examines the effectiveness of alternative methods of course delivery by comparing student profiles and instructional outcomes from a dental hygiene gerontology course offered both on the Web and in a traditional classroom setting. Questionnaires were sent to both groups of students completing the course. The instrument was designed to establish profiles of the participating students. The data collected included familiarity with Web-based instruction, extent of prior computer training, previous interaction with the elderly, and student evaluations of course effectiveness. Traditional instructional outcomes from evaluated course work were compared, as were post-course exam outcomes that assessed retention of course information six months after course completion. The statistical significance of these data was determined using the Statistical Package for the Social Sciences software (SPSS, Inc., version 12.0, Chicago, IL). A comparison of the characteristics of students enrolled in the two course formats revealed marked differences. The Web-based group (n=12) included dental hygiene students (67%) and other health care providers (25%). All participants in the traditional classroom format (n=32) were dental hygiene students. Half of the Web-based respondents were over 25 years of age, and the majority (n=8) had previously taken an online course. The majority of traditional classroom students were 25 years of age or younger (n=21) and had never taken a Web-based course (n=20). Statistically significant differences in instructional outcomes were observed between students enrolled in these two formats. Student retention of course material six months after completion of the course was greater in the Web-based format. Students selecting a Web-based course format demonstrated greater motivation and learning success based on final course grades, completion of assignments, and knowledge retention over time. Age, previous experience with online courses, and selection of teaching mode are factors that may confound course delivery method to influence instructional outcomes in a gerontology course within a dental hygiene curriculum.
Brackstone, G J
1984-01-01
The author presents some general thoughts on the implications of technological change for the 1990 round of censuses and for the statistical use of administrative records. Consideration is also given to alternative methods of obtaining the type of data traditionally collected in a population census, by using these new technologies in association with administrative record systems.
solGS: a web-based tool for genomic selection
USDA-ARS?s Scientific Manuscript database
Genomic selection (GS) promises to improve accuracy in estimating breeding values and genetic gain for quantitative traits compared to traditional breeding methods. Its reliance on high-throughput genome-wide markers and statistical complexity, however, is a serious challenge in data management, ana...
Khobragade, Sujata; Abas, Adinegara Lutfi; Khobragade, Yadneshwar Sudam
2016-01-01
Learning outcomes after traditional teaching methods were compared with problem-based learning (PBL) among fifth-year medical students. Six students each presented using the traditional teaching and PBL methods, respectively. The traditional teaching method involved a PowerPoint (PPT) presentation, and PBL included study of a case scenario and discussion. Both methods were effective in improving the performance of students. Post-teaching, we did not find significant differences in learning outcomes between these two teaching methods. (1) The study was conducted with the intention of finding out which method of learning is more effective: traditional or PBL. (2) To assess the level of knowledge and understanding of anemia/zoonotic diseases as against diabetes/hypertension. All the students posted from February 3, 2014, to March 14, 2014, participated in this study. Six students were asked to prepare and present a lecture (PPT), and the subsequent week another six students were asked to present PBL. The two groups presented different topics. Since it was a pre- and post-test design, the same students were taken as their own controls. To maintain uniformity and to avoid bias due to cultural diversity, language, etc., the same questions were administered. After taking verbal consent, all 34 students were given a pretest on anemia and zoonotic diseases. A lecture (PPT) by six students on the same topics followed, and then a posttest questionnaire. The subsequent week, a pretest was conducted on hypertension and diabetes, followed by case scenario presentation and discussion (PBL) by a different six students and a posttest. The two methods were compared. Analysis was done manually, and the standard error of the mean and Student's t-test were used to determine statistical significance. We found statistically significant improvement in the performance of students after the PPT presentation as well as after PBL. Both methods are equally effective. However, pretest results of students in anemia and zoonotic diseases (Group A) were poor compared to pretest results of students in hypertension and diabetes (Group B). Participation in the presentations did not influence the presenters' performance, as each covered only a small part of the topic, and there were no differences in their marks compared to other students. We did not find significant differences in outcome between PBL and traditional teaching. Performance of students was poor in anemia and zoonotic diseases, which indicates a need for remedial teaching. Assessment may influence retention ability and performance.
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances, in quantitative applications may contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases.
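A minimal sketch of the scaling method as described (the histogram bin count is an assumption):

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=256):
    """Pixelwise ratio of the two images, histogrammed; the modal peak,
    assumed to belong to the background structure, gives the scale
    factor for bringing img_b onto the scale of img_a."""
    mask = img_b != 0
    ratio = img_a[mask] / img_b[mask]
    counts, edges = np.histogram(ratio, bins=bins)
    k = np.argmax(counts)                 # peak assumed to be background
    return 0.5 * (edges[k] + edges[k + 1])
```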
Feng, Yong; Chen, Aiqing
2017-01-01
This study aimed to quantify blood pressure (BP) measurement accuracy and variability with different techniques. Thirty video clips of BP recordings from the BHS training database were converted to Korotkoff sound waveforms. Ten observers without medical training were asked to determine BPs using (a) the traditional manual auscultatory method and (b) a visual auscultation method based on visualizing the Korotkoff sound waveform, repeated three times on different days. The measurement error was calculated against the reference answers, and the measurement variability was calculated from the SD of the three repeats. Statistical analysis showed that, in comparison with the auscultatory method, the visual method significantly reduced overall variability from 2.2 to 1.1 mmHg for SBP and from 1.9 to 0.9 mmHg for DBP (both p < 0.001). It also showed that BP measurement errors were significant for both techniques (all p < 0.01, except DBP from the traditional method). Although significant, the overall mean errors were small (−1.5 and −1.2 mmHg for SBP and −0.7 and 2.6 mmHg for DBP, respectively, from the traditional auscultatory and visual auscultation methods). In conclusion, the visual auscultation method achieved an acceptable degree of BP measurement accuracy, with smaller variability than the traditional auscultatory method. PMID:29423405
Compositional data analysis for physical activity, sedentary time and sleep research.
Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy
2017-01-01
The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
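The key device is expressing the 24-h composition in isometric log-ratio (ilr) coordinates so that standard regression applies; a sketch using pivot coordinates (one common ilr construction; the paper's exact basis may differ):

```python
import numpy as np

def ilr(composition):
    """Pivot ilr coordinates for a daily time-use composition, e.g.
    (sleep, sedentary, light activity, vigorous activity). Assumes
    strictly positive parts; returns D-1 unconstrained coordinates."""
    x = np.asarray(composition, float)
    x = x / x.sum()                       # close the composition to 1
    D = x.size
    z = []
    for i in range(D - 1):
        gm = np.exp(np.mean(np.log(x[i + 1:])))   # geometric mean of the rest
        z.append(np.sqrt((D - i - 1) / (D - i)) * np.log(x[i] / gm))
    return np.array(z)
```

The resulting coordinates can then be entered as ordinary predictors in a multiple linear regression on adiposity, which is the adjustment the abstract describes.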
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
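As a toy illustration of the metamodel idea, the sketch below fits a kriging-style Gaussian-process surrogate to a handful of design points from a stand-in "expensive" function (all names and values are illustrative):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.5 * x            # stand-in "expensive" code
X = np.linspace(0.0, 2.0, 8).reshape(-1, 1)      # small design of experiments
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
gp.fit(X, f(X).ravel())                          # cheap surrogate of f
y_hat, sigma = gp.predict(np.array([[1.3]]), return_std=True)
# y_hat approximates f(1.3); sigma quantifies surrogate uncertainty,
# which is what makes kriging attractive for deterministic codes.
```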
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications
NASA Technical Reports Server (NTRS)
Hughes, William O.; Paez, Thomas L.
2006-01-01
This paper discusses the Bootstrap Method for deriving vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
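A minimal sketch of the bootstrap idea (illustrative only: a P95 level with 50% confidence read off as the median of bootstrap replicates; the paper's exact probability/confidence choices and spectral context are not reproduced):

```python
import numpy as np

def bootstrap_level(data, q=0.95, n_rep=5000, seed=0):
    """Resample the measured levels with replacement, compute the q-th
    quantile of each replicate, and summarise the replicates; no
    distributional assumption is made, unlike Normal Tolerance Limits."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, float)
    reps = [np.quantile(rng.choice(data, size=data.size, replace=True), q)
            for _ in range(n_rep)]
    return np.quantile(reps, 0.50)   # median replicate as the test level
```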
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is rising; existing methods have limitations, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of 3 successful projects and 3 failed projects, are reviewed, with success and failure defined by the owner.
Statistical-Dynamical Seasonal Forecasts of Central-Southwest Asian Winter Precipitation.
NASA Astrophysics Data System (ADS)
Tippett, Michael K.; Goddard, Lisa; Barnston, Anthony G.
2005-06-01
Interannual precipitation variability in central-southwest (CSW) Asia has been associated with East Asian jet stream variability and western Pacific tropical convection. However, atmospheric general circulation models (AGCMs) forced by observed sea surface temperature (SST) poorly simulate the region's interannual precipitation variability. The statistical-dynamical approach uses statistical methods to correct systematic deficiencies in the response of AGCMs to SST forcing. Statistical correction methods linking model-simulated Indo-west Pacific precipitation and observed CSW Asia precipitation result in modest, but statistically significant, cross-validated simulation skill in the northeast part of the domain for the period from 1951 to 1998. The statistical-dynamical method is also applied to recent (winter 1998/99 to 2002/03) multimodel, two-tier December-March precipitation forecasts initiated in October. This period includes 4 yr (winter of 1998/99 to 2001/02) of severe drought. Tercile probability forecasts are produced using ensemble-mean forecasts and forecast error estimates. The statistical-dynamical forecasts show enhanced probability of below-normal precipitation for the four drought years and capture the return to normal conditions in part of the region during the winter of 2002/03. "May Kabul be without gold, but not without snow." (traditional Afghan proverb)
Tu, Li-ping; Chen, Jing-bo; Hu, Xiao-juan; Zhang, Zhi-feng
2016-01-01
Background and Goal. The application of digital image processing techniques and machine learning methods to tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied nowadays. However, it is difficult for the outcomes to generalize because of the lack of color reproducibility and image standardization. Our study aims at the exploration of tongue color classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts were chosen to identify the selected tongue pictures taken by the TDA-1 tongue imaging device in TIFF format through ICC profile correction. We then compare the mean L*a*b* values of different tongue colors and evaluate the effect of tongue color classification by machine learning methods. Results. The L*a*b* values of the five tongue colors are statistically different. The random forest method has a better performance than SVM in classification. The SMOTE algorithm can increase classification accuracy by solving the imbalance of the varied color samples. Conclusions. On the premise of standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible. PMID:28050555
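A sketch of the classification step with stand-in data (the real features would be per-region L*a*b* values; SMOTE comes from the third-party imbalanced-learn package):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from imblearn.over_sampling import SMOTE   # imbalanced-learn package

# Stand-in data: 200 samples of 3 colour features (L*, a*, b*) with
# 5 tongue-colour classes; real features come from corrected images.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = rng.integers(0, 5, 200)

# SMOTE rebalances the rarer colour classes by synthesising minority
# samples, then a random forest is fitted, mirroring the pipeline the
# abstract reports as best-performing.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_bal, y_bal)
```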
On-demand Reporting of Risk-adjusted and Smoothed Rates for Quality Profiling in ACS NSQIP.
Cohen, Mark E; Liu, Yaoming; Huffman, Kristopher M; Ko, Clifford Y; Hall, Bruce L
2016-12-01
Surgical quality improvement depends on hospitals having accurate and timely information about comparative performance. Profiling accuracy is improved by risk adjustment and shrinkage adjustment to stabilize estimates. These adjustments are included in ACS NSQIP reports, where hospital odds ratios (OR) are estimated using hierarchical models built on contemporaneous data. However, the timeliness of feedback remains an issue. We describe an alternative, nonhierarchical approach, which yields risk- and shrinkage-adjusted rates. In contrast to our "Traditional" NSQIP method, this approach uses preexisting equations, built on historical data, which permits hospitals to have near immediate access to profiling results. We compared our traditional method to this new "on-demand" approach with respect to outlier determinations, kappa statistics, and correlations between logged OR and standardized rates, for 12 models (4 surgical groups by 3 outcomes). When both methods used the same contemporaneous data, there were similar numbers of hospital outliers and correlations between logged OR and standardized rates were high. However, larger differences were observed when the effect of contemporaneous versus historical data was added to differences in statistical methodology. The on-demand, nonhierarchical approach provides results similar to the traditional hierarchical method and offers immediacy, an "over-time" perspective, application to a broader range of models and data subsets, and reporting of more easily understood rates. Although the nonhierarchical method results are now available "on-demand" in a web-based application, the hierarchical approach has advantages, which support its continued periodic publication as the gold standard for hospital profiling in the program.
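The shrinkage adjustment mentioned above can be illustrated with a one-line empirical-Bayes-style estimator (a sketch of the general idea, not ACS NSQIP's hierarchical model):

```python
def shrunken_rate(events, n, overall_rate, prior_strength=50.0):
    """Beta-binomial-style shrinkage: a small-volume hospital's observed
    rate events/n is pulled toward the overall rate, stabilising the
    estimate; prior_strength plays the role of a prior sample size."""
    return (events + prior_strength * overall_rate) / (n + prior_strength)
```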
2011-01-01
Background Graduate-entry medicine is a recent development in the UK, intended to expand and broaden access to medical training. After eight years, it is time to evaluate its success in recruitment. Objectives This study aimed to compare the applications and admissions profiles of graduate-entry programmes in the UK to traditional 5- and 6-year courses. Methods Aggregate data on applications and admissions were obtained from the Universities and Colleges Admission Service covering 2003 to 2009. Data were extracted, grouped as appropriate and analysed with the Statistical Package for the Social Sciences. Results Graduate-entry programmes attract 10,000 applications a year. Women form the majority of applicants and admissions to both graduate-entry and traditional medicine programmes. The graduate-entry age profile is older, with applicants typically in their 20s or 30s, compared with 18 or 19 years in traditional programmes. Graduate-entry applications and admissions were higher from white and black UK ethnic communities than in traditional programmes, and lower from southern and Chinese Asian groups. Graduate-entry has few applications or admissions from Scotland or Northern Ireland. Secondary educational achievement is poorer amongst graduate-entry applicants and admissions than in traditional programmes. Conclusions Graduate-entry has succeeded in recruiting substantial additional numbers of older applicants to medicine, among whom white and black groups are better represented and Asian groups more poorly represented than in traditional undergraduate programmes. PMID:21943332
Online Learning and the Development of Counseling Self-Efficacy Beliefs
ERIC Educational Resources Information Center
Watson, Joshua C.
2012-01-01
This study examined the relationship between enrollment in online counseling courses and students' counseling self-efficacy beliefs. Results indicate that students enrolled in online courses report statistically significant higher self-efficacy beliefs than students in traditional face-to-face (FTF) courses. Online instructional method may increase counselor…
Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas
2015-01-01
Background Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600
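The model comparison the study performs can be miniaturized as follows: fit a straight-line and a piecewise-linear (broken-stick) regression to the same outcome and compare AICs. The data, knot location, and variable names below are hypothetical.

```python
# Comparing a linear and a piecewise-linear model of a municipality
# mortality rate by AIC, echoing the study's model comparison.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
transport_time = rng.uniform(0, 120, 434)            # minutes, hypothetical
rate = 50 - 0.1 * np.minimum(transport_time, 60) + rng.normal(0, 3, 434)

X_lin = sm.add_constant(transport_time)
knot = 60.0                                          # assumed breakpoint
X_pw = sm.add_constant(np.column_stack(
    [transport_time, np.maximum(transport_time - knot, 0)]))

fit_lin = sm.OLS(rate, X_lin).fit()
fit_pw = sm.OLS(rate, X_pw).fit()
print("linear AIC:   ", round(fit_lin.aic, 1))
print("piecewise AIC:", round(fit_pw.aic, 1))        # lower AIC preferred
```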
Multiresolution multiscale active mask segmentation of fluorescence microscope images
NASA Astrophysics Data System (ADS)
Srinivasa, Gowri; Fickus, Matthew; Kovačević, Jelena
2009-08-01
We propose an active mask segmentation framework that combines the advantages of statistical modeling, smoothing, speed and flexibility offered by the traditional methods of region-growing, multiscale, multiresolution and active contours respectively. At the crux of this framework is a paradigm shift from evolving contours in the continuous domain to evolving multiple masks in the discrete domain. Thus, the active mask framework is particularly suited to segment digital images. We demonstrate the use of the framework in practice through the segmentation of punctate patterns in fluorescence microscope images. Experiments reveal that statistical modeling helps the multiple masks converge from a random initial configuration to a meaningful one. This obviates the need for an involved initialization procedure germane to most of the traditional methods used to segment fluorescence microscope images. While we provide the mathematical details of the functions used to segment fluorescence microscope images, this is only an instantiation of the active mask framework. We suggest some other instantiations of the framework to segment different types of images.
Zhang, Ying; Xue, Liu-Hua; Chen, Yu-Xia; Huang, Shi-Jing; Pan, Ju-Hua; Wang, Jie
2013-08-01
To standardize the diagnosis and treatment of AIDS cough in traditional Chinese medicine, improve the clinical level of cough treatment for HIV/AIDS, and build traditional Chinese medicine diagnosis and treatment procedures for AIDS cough. Combining clinical practice with English- and Chinese-language literature research and expert consultation, a questionnaire on the traditional Chinese medicine diagnosis and treatment of AIDS cough was formulated, and the questionnaire results were verified statistically using the Delphi method. The questionnaire covered the overview, pathogeny, diagnostic criteria, syndrome differentiation and medication (phlegm-heat obstructing the lung, lung-kidney yin deficiency, and lung-spleen deficiency), moxibustion treatment, and aftercare (diet and mental care). Item averages ranged from 2.93 to 3.00 and full-mark rates from 93.10% to 100%; rank averages (9.91-10.67) and rank sums (287.50-309.50) were the highest values observed; the coefficient of variation was 0.00; the Kendall coefficient of concordance (Kendall's W) was 0.049, which was statistically significant; and the questionnaire reliability (alpha) was 0.788. The concept, etiology and pathogenesis, diagnosis, and syndrome differentiation treatment of AIDS cough were preliminarily standardized and basically recognized by experts in the field, laying a foundation for developing traditional Chinese medicine specifications for the diagnosis and treatment of AIDS cough.
The Impact of Team-Based Learning on Nervous System Examination Knowledge of Nursing Students.
Hemmati Maslakpak, Masomeh; Parizad, Naser; Zareie, Farzad
2015-12-01
Team-based learning is one of the active learning approaches in which independent learning is combined with small-group discussion in class. This study aimed to determine the impact of team-based learning on nursing students' knowledge of nervous system examination. This quasi-experimental study was conducted on 3rd-year nursing students, with 5th-semester students as the intervention group and 6th-semester students as the control group. The team-based learning method and the traditional lecture method were used to teach nervous system examination to the intervention and control groups, respectively. Data were collected with a 40-question test (multiple-choice, matching, gap-filling and descriptive questions) before and after the intervention in both groups. The Individual Readiness Assurance Test (RAT) and Group Readiness Assurance Test (GRAT) were used to collect data in the intervention group. The collected data were analyzed in SPSS ver. 13 using descriptive and inferential statistical tests. In the team-based learning group, the mean (standard deviation) score increased from 13.39 (4.52) before the intervention to 31.07 (3.20) after the intervention, a statistically significant increase. There was also a statistically significant difference between RAT and GRAT scores in the team-based learning group. The team-based learning approach produced much greater and more stable improvement in nursing students' knowledge of nervous system examination than the traditional lecture method; it could therefore be used as an effective educational approach in nursing education.
Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for fluid-particle flows
NASA Astrophysics Data System (ADS)
Kong, Bo; Patel, Ravi G.; Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney O.
2017-11-01
In this work, we study the performance of three simulation techniques for fluid-particle flows: (1) a volume-filtered Euler-Lagrange approach (EL), (2) a quadrature-based moment method using the anisotropic Gaussian closure (AG), and (3) a traditional two-fluid model (TFM). Simulating two problems, particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT), we find that convergence under grid refinement depends on the simulation method and the specific problem, with CIT simulations facing fewer difficulties than HIT. Although EL converges under refinement for both HIT and CIT, its statistical results exhibit dependence on the techniques used to extract statistics for the particle phase. For HIT, converging both Eulerian-Eulerian methods (TFM and AG) poses challenges, while for CIT, AG and EL produce similar results. Overall, all three methods face challenges when trying to extract converged, parameter-independent statistics due to the presence of shocks in the particle phase. This work was supported by the National Science Foundation and the National Energy Technology Laboratory.
E-Assessment within the Bologna Paradigm: Evidence from Portugal
ERIC Educational Resources Information Center
Ferrao, Maria
2010-01-01
The Bologna Declaration brought reforms into higher education that imply changes in teaching methods, didactic materials and textbooks, infrastructures and laboratories, etc. Statistics and mathematics are disciplines that traditionally have the worst success rates, particularly in non-mathematics core curricula courses. This research project,…
School Collective Efficacy and Bullying Behaviour: A Multilevel Study.
Olsson, Gabriella; Låftman, Sara Brolin; Modin, Bitte
2017-12-20
As with other forms of violent behaviour, bullying is the result of multiple influences acting on different societal levels. Yet the majority of studies on bullying focus primarily on the characteristics of individual bullies and bullied. Fewer studies have explored how the characteristics of central contexts in young people's lives are related to bullying behaviour over and above the influence of individual-level characteristics. This study explores how teacher-rated school collective efficacy is related to student-reported bullying behaviour (traditional and cyberbullying victimization and perpetration). A central focus is to explore if school collective efficacy is related similarly to both traditional bullying and cyberbullying. Analyses are based on combined information from two independent data collections conducted in 2016 among 11th grade students (n = 6067) and teachers (n = 1251) in 58 upper secondary schools in Stockholm. The statistical method used is multilevel modelling, estimating two-level binary logistic regression models. The results demonstrate statistically significant between-school differences in all outcomes, except traditional bullying perpetration. Strong school collective efficacy is related to less traditional bullying perpetration and less cyberbullying victimization and perpetration, indicating that collective norm regulation and school social cohesion may contribute to reducing the occurrence of bullying. PMID:29261114
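A two-level binary logistic model of the kind used here can be sketched with statsmodels' Bayesian mixed GLM: a student-level outcome, a school random intercept, and a school-level predictor. The variable names and simulated data below are hypothetical, and this is a sketch of the model class, not the authors' exact specification.

```python
# Sketch of a two-level (random-intercept) logistic regression:
# students nested in schools, with a school-level collective-efficacy
# predictor. Data and variable names are invented.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(2)
n_schools, n_per = 58, 100
school = np.repeat(np.arange(n_schools), n_per)
efficacy = rng.normal(0, 1, n_schools)[school]   # school-level predictor
u = rng.normal(0, 0.5, n_schools)[school]        # school random effect
p = 1 / (1 + np.exp(-(-2.0 - 0.4 * efficacy + u)))
df = pd.DataFrame({"bullied": rng.binomial(1, p),
                   "efficacy": efficacy, "school": school})

model = BinomialBayesMixedGLM.from_formula(
    "bullied ~ efficacy", {"school": "0 + C(school)"}, df)
result = model.fit_vb()                          # variational Bayes fit
print(result.summary())
```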
Hansen, J V; Nelson, R D
1997-01-01
Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern that then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.
Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.
2016-01-01
Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). Conclusion This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832
A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.
Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He
2018-01-01
Financial distress prediction is an important and challenging research topic in the financial field. Many methods now exist for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that artificial intelligence methods predict better than traditional statistical methods. Financial statements are quarterly reports; hence, corporate financial crisis data are seasonal time series, and the attribute data affecting financial distress are nonlinear, nonstationary time series with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages: (i) unlike previous models, it incorporates the concept of time series; (ii) the integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the model can generate rules and mathematical formulas of financial distress, providing references for investors and decision makers. The results show that the proposed method outperforms the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies. PMID:29765399
Properties of the endogenous post-stratified estimator using a random forests model
John Tipton; Jean Opsomer; Gretchen G. Moisen
2012-01-01
Post-stratification is used in survey statistics as a method to improve variance estimates. In traditional post-stratification methods, the variable on which the data is being stratified must be known at the population level. In many cases this is not possible, but it is possible to use a model to predict values using covariates, and then stratify on these predicted...
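A minimal sketch of the endogenous post-stratification idea described here: fit a random forest on the sample, predict for every population unit, form strata from the predictions, and weight sample stratum means by population stratum shares. The data, stratum count, and sample design below are invented for illustration.

```python
# Endogenous post-stratification with a random forest: stratify on
# model predictions rather than a known population variable.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
N, n = 20000, 500
X_pop = rng.normal(size=(N, 4))                  # covariates known population-wide
y_pop = X_pop @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(0, 2, N)

idx = rng.choice(N, n, replace=False)            # simple random sample
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_pop[idx], y_pop[idx])

pred_pop = rf.predict(X_pop)                     # predictions for all units
edges = np.quantile(pred_pop, np.linspace(0, 1, 6))   # 5 strata on predictions
strata_pop = np.clip(np.digitize(pred_pop, edges[1:-1]), 0, 4)
strata_smp = strata_pop[idx]

# Post-stratified mean: population stratum shares times sample stratum means.
ps_mean = sum((strata_pop == h).mean() * y_pop[idx][strata_smp == h].mean()
              for h in range(5))
print("sample mean:", round(y_pop[idx].mean(), 3), " PS mean:", round(ps_mean, 3))
```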
Validation of Milliflex® Quantum for Bioburden Testing of Pharmaceutical Products.
Gordon, Oliver; Goverde, Marcel; Staerk, Alexandra; Roesti, David
2017-01-01
This article reports the validation strategy used to demonstrate that the Milliflex® Quantum yielded non-inferior results to the traditional bioburden method. It was validated according to USP <1223>, European Pharmacopoeia 5.1.6, and Parenteral Drug Association Technical Report No. 33 and comprised the validation parameters robustness, ruggedness, repeatability, specificity, limit of detection and quantification, accuracy, precision, linearity, range, and equivalence in routine operation. For the validation, a combination of pharmacopeial ATCC strains as well as a broad selection of in-house isolates was used. In-house isolates were used in a stressed state. Results were statistically evaluated against the pharmacopeial acceptance criterion of ≥70% recovery compared to the traditional method. Post-hoc power calculations verified the appropriateness of the sample size used to detect such a difference. Furthermore, equivalence tests verified non-inferiority of the rapid method as compared to the traditional method. In conclusion, the rapid bioburden test based on the Milliflex® Quantum was successfully validated as an alternative to the traditional bioburden test. LAY ABSTRACT: Pharmaceutical drug products must fulfill specified quality criteria regarding their microbial content in order to ensure patient safety. Drugs that are delivered into the body via injection, infusion, or implantation must be sterile (i.e., devoid of living microorganisms). Bioburden testing measures the levels of microbes present in the bulk solution of a drug before sterilization, and thus it provides important information for manufacturing a safe product. In general, bioburden testing has to be performed using the methods described in the pharmacopoeias (membrane filtration or plate count). These methods are well established and validated regarding their effectiveness; however, the incubation time required to visually identify microbial colonies is long. Thus, alternative methods that detect microbial contamination faster will improve control over the manufacturing process and speed up product release. Before alternative methods may be used, they must undergo a side-by-side comparison with pharmacopeial methods. In this comparison, referred to as validation, it must be shown in a statistically verified manner that the effectiveness of the alternative method is at least equivalent to that of the pharmacopeial methods. Here we describe the successful validation of an alternative bioburden testing method based on fluorescent staining of growing microorganisms applying the Milliflex® Quantum system by MilliporeSigma. © PDA, Inc. 2017.
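The ≥70% recovery criterion lends itself to a simple one-sided test: compute per-strain recovery ratios of the alternative method against the traditional one and test whether their mean exceeds 0.7. The sketch below uses invented CFU counts; the article's actual equivalence testing is more elaborate.

```python
# One-sided test of the >=70% recovery acceptance limit, on
# hypothetical colony counts for six strains.
import numpy as np
from scipy import stats

alt_counts = np.array([52, 47, 61, 39, 55, 44])   # hypothetical CFU, rapid method
trad_counts = np.array([55, 50, 60, 45, 58, 49])  # hypothetical CFU, traditional
recovery = alt_counts / trad_counts               # per-strain recovery ratio

t, p = stats.ttest_1samp(recovery, popmean=0.70, alternative="greater")
print(f"mean recovery {recovery.mean():.2%}, t = {t:.2f}, one-sided p = {p:.4f}")
```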
Bryant, Fred B
2016-12-01
This paper introduces a special section of the current issue of the Journal of Evaluation in Clinical Practice that includes a set of 6 empirical articles showcasing a versatile, new machine-learning statistical method, known as optimal data (or discriminant) analysis (ODA), specifically designed to produce statistical models that maximize predictive accuracy. As this set of papers clearly illustrates, ODA offers numerous important advantages over traditional statistical methods, advantages that enhance the validity and reproducibility of statistical conclusions in empirical research. This issue of the journal also includes a review of a recently published book that provides a comprehensive introduction to the logic, theory, and application of ODA in empirical research. It is argued that researchers have much to gain by using ODA to analyze their data. © 2016 John Wiley & Sons, Ltd.
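The core of ODA for a single continuous attribute and a binary class is an exhaustive cutpoint search that maximizes classification accuracy. Below is a bare-bones sketch of that search on simulated data; full ODA also handles class weighting and permutation-based p-values, which are omitted here.

```python
# Minimal cutpoint search in the spirit of ODA: try every midpoint
# between sorted values and keep the cut maximizing accuracy.
import numpy as np

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(1.2, 1, 100)])
y = np.repeat([0, 1], 100)

xs = np.sort(x)
cuts = (xs[:-1] + xs[1:]) / 2                 # candidate cutpoints
# For each cut, take the better of the two possible class assignments.
acc = np.array([max(((x > c) == y).mean(), ((x <= c) == y).mean())
                for c in cuts])
best = cuts[acc.argmax()]
print(f"optimal cutpoint {best:.3f}, training accuracy {acc.max():.3f}")
```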
Teaching-learning: stereoscopic 3D versus Traditional methods in Mexico City.
Mendoza Oropeza, Laura; Ortiz Sánchez, Ricardo; Ojeda Villagómez, Raúl
2015-01-01
In the UNAM Faculty of Odontology, a stereoscopic 3D teaching method has grown more common in the last year, which makes it important to know whether students learn better with this strategy. The objective of the study was to determine whether 4th-year students of the bachelor's degree in dentistry learn Orthodontics more effectively with stereoscopic 3D than with the traditional method. First, we selected the course topics to be used for both methods: the traditional method using slide projection, and the stereoscopic 3D method using videos in digital stereo projection (viewed through "passive" polarized 3D glasses). The main topic was supernumerary teeth, including teeth impacted or deviated from their eruption guide. Afterwards we administered an exam of 24 items, validated by expert judgment in Orthodontics teaching. The data for the two educational methods were compared to determine effectiveness using a before-and-after measurement model in SPSS version 20. Results were collected for 9 groups of dentistry undergraduates, 218 students in total, across the 3D and traditional methods. The traditional method yielded a mean of 4.91 (SD 1.48) on the pretest and 6.96 (SD 1.27, SE 0.12) on the posttest; the 3D method yielded a mean of 5.21 (SD 2.00, SE 0.19) on the pretest and 7.82 (SD 0.96, SE 0.09) on the posttest. The analysis of variance between groups gave F = 5.60 (p < 0.0001), and Bartlett's test for equal variances gave 21.06 (p = 0.007). These results show that student learning with 3D improved significantly compared to the traditional teaching method, with a strong association between the two methods. The findings suggest that the stereoscopic 3D method leads to improved student learning compared to traditional teaching.
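The reported analysis style, one-way ANOVA across groups plus Bartlett's test for equal variances, is easy to reproduce on simulated scores drawn to match the abstract's means and SDs; the numbers below are therefore illustrative, not the study's raw data.

```python
# One-way ANOVA plus Bartlett's test on simulated pre/post scores for
# the traditional and 3D groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
trad_pre = rng.normal(4.91, 1.48, 109)
trad_post = rng.normal(6.96, 1.27, 109)
d3_pre = rng.normal(5.21, 2.00, 109)
d3_post = rng.normal(7.82, 0.96, 109)

F, p = stats.f_oneway(trad_pre, trad_post, d3_pre, d3_post)
chi2, p_var = stats.bartlett(trad_pre, trad_post, d3_pre, d3_post)
print(f"ANOVA: F = {F:.2f}, p = {p:.4g}")
print(f"Bartlett: chi2 = {chi2:.2f}, p = {p_var:.4g}")
```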
Applications of quantum entropy to statistics
NASA Astrophysics Data System (ADS)
Silver, R. N.; Martz, H. F.
This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
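For reference, the central quantities are the standard von Neumann entropy of a density matrix and the relative quantum entropy (quantum information divergence) whose negative serves as the penalty functional the abstract refers to:

```latex
% von Neumann entropy of a density matrix \rho, and relative quantum
% entropy between \rho and a prior \sigma (nonnegative, zero iff \rho = \sigma):
S(\rho) = -\operatorname{Tr}\left(\rho \log \rho\right), \qquad
S(\rho \,\|\, \sigma) = \operatorname{Tr}\!\left[\rho \left(\log \rho - \log \sigma\right)\right] \ge 0 .
```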
Approaches to Art Therapy for Cancer Inpatients: Research and Practice Considerations
ERIC Educational Resources Information Center
Nainis, Nancy A.
2008-01-01
Common symptoms reported by cancer patients include pain, fatigue, breathlessness, insomnia, lack of appetite, and anxiety. A study conducted by an interdisciplinary research team (Nainis et al., 2006) demonstrated statistically significant reductions in these cancer symptoms with the use of traditional art therapy methods. The study found a…
Evaluation of Models of the Reading Process.
ERIC Educational Resources Information Center
Balajthy, Ernest
A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…
Influences of environment and disturbance on forest patterns in coastal Oregon watersheds.
Michael C. Wimberly; Thomas A. Spies
2001-01-01
Modern ecology often emphasizes the distinction between traditional theories of stable, environmentally structured communities and a new paradigm of disturbance driven, nonequilibrium dynamics. However, multiple hypotheses for observed vegetation patterns have seldom been explicitly tested. We used multivariate statistics and variation partitioning methods to assess...
Implementation and Use of the Reference Analytics Module of LibAnswers
ERIC Educational Resources Information Center
Flatley, Robert; Jensen, Robert Bruce
2012-01-01
Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…
Aiding Participation and Engagement in a Blended Learning Environment
ERIC Educational Resources Information Center
Alrushiedat, Nimer; Olfman, Lorne
2013-01-01
This research was conducted as a field experiment that explored the potential benefits of anchoring in asynchronous online discussions for business statistics classes required for information systems majors. These classes are usually taught using traditional methods with emphasis on lecturing, knowledge reproduction, and treatment of students as…
Controlled Trial Using Computerized Feedback to Improve Physicians' Diagnostic Judgments.
ERIC Educational Resources Information Center
Poses, Roy M.; And Others
1992-01-01
A study involving 14 experienced physicians investigated the effectiveness of a computer program (providing statistical feedback to teach a clinical diagnostic rule that predicts the probability of streptococcal pharyngitis), in conjunction with traditional lecture and periodic disease-prevalence reports. Results suggest the integrated method is a…
Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L
2003-11-01
The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.
Possibility of reconstruction of dental plaster cast from 3D digital study models
2013-01-01
Objectives To compare traditional plaster casts, digital models and 3D printed copies of dental plaster casts based on various criteria. To determine whether 3D printed copies obtained using the open source RepRap system can replace traditional plaster casts in dental practice. To compare and contrast the qualities of two possible 3D printing options – the open source RepRap system and commercially available 3D printing. Design and settings A method comparison study on 10 dental plaster casts from the Orthodontic department, Department of Stomatology, 2nd Medical Faculty, Charles University Prague, Czech Republic. Material and methods Each of the 10 plaster casts was scanned with an inEos Blue scanner and then printed on a RepRap 3D printer [10 models] and a ProJet HD3000 3D printer [1 model]. Linear measurements between selected points on the dental arches of the upper and lower jaws on the plaster casts and their 3D copies were recorded and statistically analyzed. Results 3D printed copies have many advantages over traditional plaster casts. The precision and accuracy of the RepRap 3D printed copies of plaster casts were confirmed by the statistical analysis. Although commercially available 3D printing can print more details than the RepRap system, it is expensive, and for clinical use it can be replaced by the cheaper prints obtained from the RepRap system. Conclusions Scanning traditional plaster casts to obtain a digital model offers a pragmatic approach. The scans can subsequently be used as a template to print the casts as required. 3D printers can replace traditional plaster casts primarily due to their accuracy and price. PMID:23721330
Ding, Weifu; Zhang, Jiangshe; Leung, Yee
2016-10-01
In this paper, we predict air pollutant concentrations using a feedforward artificial neural network, inspired by the mechanism of the human brain, as a useful alternative to traditional statistical modeling techniques. The network is trained by sparse-response back-propagation, in which only a small number of neurons respond to the specified stimulus simultaneously, providing a high convergence rate for the trained network along with low energy consumption and better generalization. Our method is evaluated on Hong Kong air monitoring station data and corresponding meteorological variables: five air quality parameters gathered at four monitoring stations in Hong Kong over 4 years (2012-2015). Our results show that our training method outperforms both traditional linear regression algorithms and a feedforward artificial neural network trained using traditional back-propagation in terms of prediction precision, effectiveness, and generalization.
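The contrast the paper draws can be seen in miniature by fitting a plain linear regression and a feedforward network to the same nonlinear synthetic data. The sketch below uses ordinary back-propagation (scikit-learn's MLPRegressor), not the paper's sparse-response training, and all data are invented.

```python
# Linear regression vs. a feedforward neural network on synthetic
# pollutant/meteorology-style data with a nonlinear response.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 5))            # e.g. wind, temperature, humidity...
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + rng.normal(0, 0.3, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lin = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
print("linear R^2:", round(r2_score(y_te, lin.predict(X_te)), 3))
print("MLP R^2:   ", round(r2_score(y_te, mlp.predict(X_te)), 3))
```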
Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...
2017-09-01
Here, we present a verification study of three simulation techniques for fluid-particle flows, including an Euler-Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benjamin Langhorst; Thomas M Lillo; Henry S Chu
2014-05-01
A statistics-based ballistic test method is presented for use when comparing multiple groups of test articles of unknown relative ballistic perforation resistance. The method is intended to be more efficient than many traditional methods for research and development testing. To establish the validity of the method, it is employed in this study to compare test groups of known relative ballistic performance. Multiple groups of test articles were perforated using consistent projectiles and impact conditions. Test groups were made of rolled homogeneous armor (RHA) plates and differed in thickness. After perforation, each residual projectile was captured behind the target and its mass was measured. The residual masses measured for each test group were analyzed to provide ballistic performance rankings with associated confidence levels. When compared to traditional V50 methods, the residual mass (RM) method was found to require fewer test events and be more tolerant of variations in impact conditions.
Artificial intelligence in the diagnosis of low back pain.
Mann, N H; Brown, M D
1991-04-01
Computerized methods are used to recognize the characteristics of patient pain drawings. Artificial neural network (ANN) models are compared with expert predictions and traditional statistical classification methods when placing the pain drawings of low back pain patients into one of five clinically significant categories. A discussion is undertaken outlining the differences in these classifiers and the potential benefits of the ANN model as an artificial intelligence technique.
Lee, Christopher C; Im, Mark; Kim, Tae Min; Stapleton, Edward R; Kim, Kyuseok; Suh, Gil Joon; Singer, Adam J; Henry, Mark C
2010-01-01
Current Advanced Cardiac Life Support (ACLS) course instruction involves a 2-day course with traditional lectures and limited team interaction. We wished to explore the advantages of a scenario-based, performance-oriented team instruction (SPOTI) method for teaching core ACLS skills to non-English-speaking international paramedic students. The objective of this study was to determine whether SPOTI improves educational outcomes for the ACLS instruction of Korean paramedic students. Thirty Korean paramedic students were randomly assigned to two groups. One group of 15 students was taught the traditional ACLS course. The other 15 students were instructed using the SPOTI method. Each group was tested using ACLS megacode examinations endorsed by the American Heart Association. All 30 students passed the ACLS megacode examination. In the traditional ACLS study group an average of 85% of the core skills were met; in the SPOTI study group the average was 93%. In particular, the SPOTI group excelled at physical examination skills such as airway opening, assessment of breathing, signs of circulation, and compression rates, and performed with higher marks on rhythm recognition, while the traditional group performed with higher marks at providing proper drug dosages. Overall, students trained with the SPOTI method achieved higher megacode core-compliance scores than students trained with traditional ACLS course instruction, although these differences did not achieve statistical significance due to the small sample size. Copyright 2010 Elsevier Inc. All rights reserved.
Measuring digit lengths with 3D digital stereophotogrammetry: A comparison across methods.
Gremba, Allison; Weinberg, Seth M
2018-05-09
We compared digital 3D stereophotogrammetry to more traditional measurement methods (direct anthropometry and 2D scanning) to capture digit lengths and ratios. The length of the second and fourth digits was measured by each method and the second-to-fourth ratio was calculated. For each digit measurement, intraobserver agreement was calculated for each of the three collection methods. Further, measurements from the three methods were compared directly to one another. Agreement statistics included the intraclass correlation coefficient (ICC) and technical error of measurement (TEM). Intraobserver agreement statistics for the digit length measurements were high for all three methods; ICC values exceeded 0.97 and TEM values were below 1 mm. For digit ratio, intraobserver agreement was also acceptable for all methods, with direct anthropometry exhibiting lower agreement (ICC = 0.87) compared to indirect methods. For the comparison across methods, the overall agreement was high for digit length measurements (ICC values ranging from 0.93 to 0.98; TEM values below 2 mm). For digit ratios, high agreement was observed between the two indirect methods (ICC = 0.93), whereas indirect methods showed lower agreement when compared to direct anthropometry (ICC < 0.75). Digit measurements and derived ratios from 3D stereophotogrammetry showed high intraobserver agreement (similar to more traditional methods) suggesting that landmarks could be placed reliably on 3D hand surface images. While digit length measurements were found to be comparable across all three methods, ratios derived from direct anthropometry tended to be higher than those calculated indirectly from 2D or 3D images. © 2018 Wiley Periodicals, Inc.
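The two agreement statistics used here, TEM and the ICC, are simple to compute from a subjects-by-measurements matrix. Below is a sketch with simulated repeat measurements of one digit; it computes a one-way ICC(1,1), while the paper does not state which ICC variant was used, so treat the choice as an assumption.

```python
# Technical error of measurement (TEM) and a one-way ICC(1,1) from
# simulated repeated digit-length measurements (mm).
import numpy as np

rng = np.random.default_rng(7)
true = rng.normal(70, 5, 30)              # 30 subjects, ~70 mm digits
m1 = true + rng.normal(0, 0.5, 30)        # first measurement
m2 = true + rng.normal(0, 0.5, 30)        # repeat measurement

d = m1 - m2
tem = np.sqrt(np.sum(d**2) / (2 * len(d)))    # TEM = sqrt(sum(d^2) / 2n)

data = np.column_stack([m1, m2])          # subjects x measurements
n, k = data.shape
grand = data.mean()
bms = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # between-subject MS
wms = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
icc = (bms - wms) / (bms + (k - 1) * wms)     # one-way ICC(1,1)
print(f"TEM = {tem:.2f} mm, ICC(1,1) = {icc:.3f}")
```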
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate-channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate-channel analyses that borrow strength between genes are more powerful than log-ratio analyses, with the common correlation analysis the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
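The M/A reparameterization at the heart of the article is one line of arithmetic per channel pair. A sketch on synthetic intensities:

```python
# Per-spot log-ratio M and average log-intensity A from two channel
# intensities (here labelled R and G); intensities are synthetic.
import numpy as np

rng = np.random.default_rng(8)
R = rng.lognormal(mean=8, sigma=1, size=1000)   # channel 1 intensities
G = rng.lognormal(mean=8, sigma=1, size=1000)   # channel 2 intensities

M = np.log2(R) - np.log2(G)                     # within-spot log-ratio
A = 0.5 * (np.log2(R) + np.log2(G))             # within-spot average
# A log-ratio analysis uses M alone; the separate-channel analysis also
# models A, recovering the inter-spot information a ratio discards.
print("mean M:", round(M.mean(), 3), " mean A:", round(A.mean(), 3))
```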
Wang, Qi; Wang, Huaxiang; Cui, Ziqiang; Yang, Chengyi
2012-11-01
Electrical impedance tomography (EIT) calculates the internal conductivity distribution within a body from electrical contact measurements. Image reconstruction for EIT is an inverse problem that is both non-linear and ill-posed. Traditional regularization methods cannot avoid introducing negative values into the solution, and this negativity produces artifacts in reconstructed images in the presence of noise. A statistical method, namely the expectation maximization (EM) method, is used to solve the inverse problem for EIT in this paper. The mathematical model of EIT is transformed into a non-negatively constrained likelihood minimization problem, and the solution is obtained by the gradient projection-reduced Newton (GPRN) iteration method. This paper also discusses strategies for choosing parameters. Simulation and experimental results indicate that reconstructed images of higher quality can be obtained by the EM method than by the traditional Tikhonov and conjugate gradient (CG) methods, even with non-negative processing. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
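The key property of the EM approach, nonnegativity preserved by construction, can be shown with the classic multiplicative (Richardson-Lucy-type) EM update for a linear model with a nonnegative solution. This is a generic sketch of that idea and does not reproduce the paper's gradient projection-reduced Newton solver or its EIT forward model.

```python
# Multiplicative EM update for a nonnegativity-constrained linear
# inverse problem y = A x: positive iterates stay positive.
import numpy as np

rng = np.random.default_rng(9)
A = np.abs(rng.normal(size=(80, 40)))     # nonnegative forward matrix
x_true = np.abs(rng.normal(size=40))
y = A @ x_true + 0.01 * rng.random(80)    # noisy measurements

x = np.ones(40)                           # positive initial guess
norm = A.sum(axis=0)                      # column sums, i.e. A^T 1
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / norm     # EM update keeps x >= 0

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```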
Applying Item Response Theory Methods to Examine the Impact of Different Response Formats
ERIC Educational Resources Information Center
Hohensinn, Christine; Kubinger, Klaus D.
2011-01-01
In aptitude and achievement tests, different response formats are usually used. A fundamental distinction must be made between the class of multiple-choice formats and the constructed response formats. Previous studies have examined the impact of different response formats applying traditional statistical approaches, but these influences can also…
Reliability Estimation for Aggregated Data: Applications for Organizational Research.
ERIC Educational Resources Information Center
Hart, Roland J.; Bradshaw, Stephen C.
This report provides the statistical tools necessary to measure the extent of error that exists in organizational record data and group survey data. It is felt that traditional methods of measuring error are inappropriate or incomplete when applied to organizational groups, especially in studies of organizational change when the same variables are…
A Case-Based Curriculum for Introductory Geology
ERIC Educational Resources Information Center
Goldsmith, David W.
2011-01-01
For the past 5 years I have been teaching my introductory geology class using a case-based method that promotes student engagement and inquiry. This article presents an explanation of how a case-based curriculum differs from a more traditional approach to the material. It also presents a statistical analysis of several years' worth of student…
Adjusting for radiotelemetry error to improve estimates of habitat use.
Scott L. Findholt; Bruce K. Johnson; Lyman L. McDonald; John W. Kern; Alan Ager; Rosemary J. Stussy; Larry D. Bryant
2002-01-01
Animal locations estimated from radiotelemetry have traditionally been treated as error-free when analyzed in relation to habitat variables. Location error lowers the power of statistical tests of habitat selection. We describe a method that incorporates the error surrounding point estimates into measures of environmental variables determined from a geographic...
Methods for Measuring the Influence of Concept Mapping on Student Information Literacy.
ERIC Educational Resources Information Center
Gordon, Carol A.
2002-01-01
Discusses research traditions in education and in information retrieval and explores the theory of expected information which uses formulas derived from the Fano measure and Bayesian statistics. Demonstrates its application in a study on the effects of concept mapping on the search behavior of tenth-grade biology students. (Author/LRW)
Student Attitudes to Learning Business Statistics: Comparison of Online and Traditional Methods
ERIC Educational Resources Information Center
Suanpang, Pannee; Petocz, Peter; Kalceff, Walter
2004-01-01
Worldwide, electronic learning (E-learning) has become an important part of the education agenda in the last decade. The Suan Dusit Rajabhat University (SDRU), Thailand has made significant efforts recently to use Internet technologies to enhance learning opportunities. The results reported here are part of a pioneering study to determine the…
Lahti, Mari; Hätönen, Heli; Välimäki, Maritta
2014-01-01
To review the impact of e-learning on nurses' and nursing students' knowledge, skills and satisfaction related to e-learning. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) to assess the impact of e-learning on nurses' and nursing students' knowledge, skills and satisfaction. Electronic databases including MEDLINE (1948-2010), CINAHL (1981-2010), PsycINFO (1967-2010) and ERIC (1966-2010) were searched in May 2010 and again in December 2010. All RCTs comparing the effectiveness of e-learning with traditional learning methods among nurses were included. Data were extracted on the purpose of the trial, sample, measurements used, index test results and reference standard. An extraction tool developed for Cochrane reviews was used. The methodological quality of eligible trials was assessed. Eleven trials were eligible for inclusion in the analysis, comprising a total of 2491 nurses and nursing students. First, the random-effects estimate for four studies showed some improvement in knowledge associated with e-learning compared to traditional techniques; however, the difference was not statistically significant (p=0.39, MD 0.44, 95% CI -0.57 to 1.46). Second, one study reported a slight impact of e-learning on skills, but the difference was not statistically significant either (p=0.13, MD 0.03, 95% CI -0.09 to 0.69). Third, no results on nurses' or nursing students' satisfaction could be reported, as the statistical data from the three possible studies were not available. Overall, there was no statistical difference between e-learning and traditional learning with respect to nurses' or nursing students' knowledge, skills or satisfaction. E-learning can, however, offer an alternative method of education. In future, more studies following the CONSORT and QUOROM statements are needed to evaluate the effects of these interventions. Copyright © 2013 Elsevier Ltd. All rights reserved.
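The pooled estimates quoted above come from random-effects meta-analysis; a DerSimonian-Laird pooling of mean differences is a few lines of arithmetic. The study-level inputs below are invented for illustration.

```python
# DerSimonian-Laird random-effects pooling of mean differences (MD).
import numpy as np

md = np.array([0.8, 0.1, 0.6, -0.2])      # per-study mean differences
se = np.array([0.5, 0.4, 0.6, 0.45])      # per-study standard errors

w = 1 / se**2                             # fixed-effect weights
mu_fe = np.sum(w * md) / np.sum(w)
Q = np.sum(w * (md - mu_fe) ** 2)         # Cochran's Q heterogeneity statistic
df = len(md) - 1
tau2 = max((Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)), 0.0)

w_re = 1 / (se**2 + tau2)                 # random-effects weights
mu = np.sum(w_re * md) / np.sum(w_re)
se_mu = np.sqrt(1 / np.sum(w_re))
print(f"pooled MD = {mu:.2f}, "
      f"95% CI {mu - 1.96 * se_mu:.2f} to {mu + 1.96 * se_mu:.2f}")
```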
Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan
2016-04-29
Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods - inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape of the moon marks (lunulae) on the nails, and changes in that shape, are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM that lacks support from statistical data. To verify some of these experiential theories with automatic data-processing equipment, this paper proposes equipment that uses image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with a health and mental status questionnaire administered at each test. The equipment has a simple design, a low cost, and an optimized algorithm, and in practice it has proven able to quickly and automatically acquire and preserve the key data about moon marks. In the future, conclusions will likely be drawn from these data, and changes of moon marks related to specific pathological changes will be established with statistical methods.
Inan, U; Gurel, M
2017-02-01
Instrument fracture is a serious concern in endodontic practice. The aim of this study was to investigate the surface quality of new and used rotary nickel-titanium (NiTi) instruments manufactured by the traditional grinding process and by twisting. A total of 16 instruments from two rotary NiTi systems were used in this study: 8 Twisted Files (TF) (SybronEndo, Orange, CA, USA) and 8 Mtwo (VDW, Munich, Germany) instruments. New and used instruments in the 4 experimental groups were evaluated using atomic force microscopy (AFM). New and used instruments were analyzed at 3 points along a 3 mm section at the tip of the instrument. Quantitative measurements of the topographical deviations were recorded. The data were statistically analyzed with the paired-samples t-test and the independent-samples t-test. Mean root mean square (RMS) values for new and used TF 25.06 files were 10.70 ± 2.80 nm and 21.58 ± 6.42 nm, respectively, and the difference between them was statistically significant (P < 0.05). Mean RMS values for new and used Mtwo 25.06 files were 24.16 ± 9.30 nm and 39.15 ± 16.20 nm, respectively; this difference was also statistically significant (P < 0.05). According to the AFM analysis, instruments produced by the twisting method (TF 25.06) had better surface quality than instruments produced by the traditional grinding process (Mtwo 25.06 files).
Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José
2013-11-01
To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) for the format to be kept in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional statistical analysis methods, mainly IRT analysis, are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. IRT analysis demonstrated that item performance parameters should not be evaluated individually but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the crucial parameter for comparison. Individual item performance analysis is worthwhile as a secondary analysis, but drawing final conclusions from it is more difficult, since performance parameters need to be related to one another, as shown by IRT analysis. IRT analysis has therefore proved beneficial for the statistical analysis of EBOD, and the introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
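KR-20, the reliability coefficient on which the EBOD evaluation turns, is straightforward to compute from a 0/1 examinee-by-item matrix. A sketch on simulated responses:

```python
# KR-20 from a simulated 0/1 item-response matrix (examinees x items).
import numpy as np

rng = np.random.default_rng(10)
ability = rng.normal(0, 1, 300)
difficulty = rng.normal(0, 1, 40)
prob = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
X = (rng.random((300, 40)) < prob).astype(int)   # simulated item scores

k = X.shape[1]
pj = X.mean(axis=0)                              # item proportion correct
total_var = X.sum(axis=1).var(ddof=1)            # variance of total scores
kr20 = (k / (k - 1)) * (1 - np.sum(pj * (1 - pj)) / total_var)
print(f"KR-20 = {kr20:.3f}")
```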
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
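A minimal sketch of the control-chart logic described above; for simplicity it estimates sigma with the sample standard deviation, whereas production SPC software typically uses the average moving range:

    import numpy as np

    baseline = np.array([4.1, 3.8, 4.3, 4.0, 4.2, 3.9, 4.1, 4.0])  # pre-intervention data
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    lcl, ucl = mu - 3 * sigma, mu + 3 * sigma   # limits bounding common cause variation

    post = np.array([4.0, 4.4, 4.9, 5.1, 5.3])  # observations after the innovation
    # Points beyond the limits signal special cause variation, i.e. a possible effect
    for value in post:
        print(value, "special cause" if not (lcl <= value <= ucl) else "common cause")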
[Study on commercial specification of atractylodes based on Delphi method].
Wang, Hao; Chen, Li-Xiao; Huang, Lu-Qi; Zhang, Tian-Tian; Li, Ying; Zheng, Yu-Guang
2016-03-01
This research adopted the Delphi method to evaluate the traditional traits of atractylodes and their rank correlation with commercial grade. Using methods of mathematical statistics, the relationship between traditional identification indicators and the grade rank of atractylodes goods was analyzed. The main characteristics affecting atractylodes commodity specifications and grades were found to be oil points of the cross-section, color of the cross-section, color of the surface, grain of the cross-section, texture of the cross-section, and spoilage. The study points out that the original "seventy-six kinds of medicinal materials commodity specification standards" for differentiating atractylodes commodity specifications no longer conforms to the actual market situation, and that corresponding specifications and grades for atractylodes medicinal products need to be formulated. Combining the Delphi-method results with the actual market situation, this study proposes a new draft of atractylodes commodity specifications and grades to serve as the new standard, providing a reference and theoretical basis for its formulation. Copyright© by the Chinese Pharmaceutical Association.
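The rank correlation between a traditional trait score and commercial grade can be quantified with Spearman's rho; a minimal sketch with hypothetical scores (the study's Delphi data are not reproduced here):

    from scipy import stats

    grade      = [1, 1, 2, 2, 3, 3, 4, 4]   # expert-assigned grade (1 = best), hypothetical
    oil_points = [9, 8, 7, 7, 5, 6, 3, 2]   # trait score from traditional inspection, hypothetical

    rho, p = stats.spearmanr(grade, oil_points)
    print(f"rho = {rho:.2f}, p = {p:.3f}")  # strong negative rho: more oil points, better grade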
Potential application of machine learning in health outcomes research and some statistical cautions.
Crown, William H
2015-03-01
Traditional analytic methods are often ill-suited to the evolving world of health care big data characterized by massive volume, complexity, and velocity. In particular, methods are needed that can estimate models efficiently using very large datasets containing healthcare utilization data, clinical data, data from personal devices, and many other sources. Although very large, such datasets can also be quite sparse (e.g., device data may only be available for a small subset of individuals), which creates problems for traditional regression models. Many machine learning methods address such limitations effectively but are still subject to the usual sources of bias that commonly arise in observational studies. Researchers using machine learning methods such as lasso or ridge regression should assess these models using conventional specification tests. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
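A minimal sketch of the two shrinkage estimators named above on a wide, sparse synthetic design; the data are simulated, and, as the abstract cautions, such models still require conventional specification checks:

    import numpy as np
    from sklearn.linear_model import LassoCV, RidgeCV

    rng = np.random.default_rng(0)
    n, p = 1000, 200                       # many features, few true signals
    X = rng.normal(size=(n, p))
    beta = np.zeros(p)
    beta[:5] = [2, -1.5, 1, 0.5, -0.5]
    y = X @ beta + rng.normal(size=n)

    lasso = LassoCV(cv=5).fit(X, y)        # L1 penalty: sets most coefficients to zero
    ridge = RidgeCV(cv=5).fit(X, y)        # L2 penalty: shrinks but keeps all coefficients
    print((lasso.coef_ != 0).sum(), "features kept by lasso")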
Davalos, Angel D; Luben, Thomas J; Herring, Amy H; Sacks, Jason D
2017-02-01
Air pollution epidemiology traditionally focuses on the relationship between individual air pollutants and health outcomes (e.g., mortality). To account for potential copollutant confounding, individual pollutant associations are often estimated by adjusting or controlling for other pollutants in the mixture. Recently, the need to characterize the relationship between health outcomes and the larger multipollutant mixture has been emphasized in an attempt to better protect public health and inform more sustainable air quality management decisions. New and innovative statistical methods to examine multipollutant exposures were identified through a broad literature search, with a specific focus on those statistical approaches currently used in epidemiologic studies of short-term exposures to criteria air pollutants (i.e., particulate matter, carbon monoxide, sulfur dioxide, nitrogen dioxide, and ozone). Five broad classes of statistical approaches were identified for examining associations between short-term multipollutant exposures and health outcomes, specifically additive main effects, effect measure modification, unsupervised dimension reduction, supervised dimension reduction, and nonparametric methods. These approaches are characterized, including their advantages and limitations in different epidemiologic scenarios. By highlighting the characteristics of various studies in which multipollutant statistical methods have been used, this review provides epidemiologists and biostatisticians with a resource to aid in the selection of the optimal statistical method to use when examining multipollutant exposures. Published by Elsevier Inc.
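As one concrete instance of the unsupervised dimension-reduction class listed above, a sketch that reduces a simulated five-pollutant mixture to principal-component scores and regresses a daily count outcome on them (all data are synthetic):

    import numpy as np
    import statsmodels.api as sm
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    # Hypothetical daily concentrations of five correlated pollutants
    latent = rng.normal(size=(365, 2))
    pollutants = latent @ rng.normal(size=(2, 5)) + rng.normal(scale=0.3, size=(365, 5))

    scores = PCA(n_components=2).fit_transform(pollutants)     # mixture components
    deaths = rng.poisson(lam=np.exp(2 + 0.05 * scores[:, 0]))  # synthetic outcome

    fit = sm.GLM(deaths, sm.add_constant(scores), family=sm.families.Poisson()).fit()
    print(fit.params)   # association of the outcome with the mixture, not one pollutant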
A Bayesian approach to the statistical analysis of device preference studies.
Fu, Haoda; Qu, Yongming; Zhu, Baojin; Huster, William
2012-01-01
Drug delivery devices are required to have excellent technical specifications to deliver drugs accurately, and in addition, the devices should provide a satisfactory experience to patients, because this can have a direct effect on drug compliance. To compare patients' experience with two devices, cross-over studies with patient-reported outcomes (PRO) as response variables are often used. Because of the strength of cross-over designs, each subject can directly compare the two devices by using the PRO variables, and variables indicating preference (preferring A, preferring B, or no preference) can be easily derived. Traditionally, methods based on frequentist statistics have been used to analyze such preference data, but the frequentist methods have some limitations. Recently, Bayesian methods have come to be considered acceptable by the US Food and Drug Administration for designing and analyzing device studies. In this paper, we propose a Bayesian statistical method to analyze the data from preference trials. We demonstrate that the new Bayesian estimator enjoys some optimal properties relative to the frequentist estimator. Copyright © 2012 John Wiley & Sons, Ltd.
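The paper's specific estimator is not reproduced in this abstract; as a generic illustration of a Bayesian treatment of three-way preference counts, a conjugate Dirichlet-multinomial sketch with invented data:

    import numpy as np

    counts = np.array([48, 30, 22])       # prefer A, prefer B, no preference (hypothetical)
    prior  = np.array([1.0, 1.0, 1.0])    # uniform Dirichlet prior

    posterior = prior + counts            # conjugacy: the posterior is again Dirichlet
    draws = np.random.default_rng(2).dirichlet(posterior, size=100_000)

    # Posterior probability that device A is preferred more often than device B
    print((draws[:, 0] > draws[:, 1]).mean())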
Gu, Zhan; Qi, Xiuzhong; Zhai, Xiaofeng; Lang, Qingbo; Lu, Jianying; Ma, Changping; Liu, Long; Yue, Xiaoqiang
2015-01-01
Primary liver cancer (PLC) is one of the most common malignant tumors, with high incidence and high mortality. Traditional Chinese medicine (TCM) plays an active role in the treatment of PLC. As the most important part of the TCM system, syndrome differentiation based on the clinical manifestations from the traditional four diagnostic methods has met great challenges and questions owing to the lack of statistical validation support. In this study, we provide evidence for the TCM syndrome differentiation of PLC using latent structural model analysis of clinical data, thus providing a basis for establishing TCM syndrome criteria. We also obtain the common syndromes of PLC and their typical clinical manifestations, respectively.
Reither, Eric N; Olshansky, S Jay; Yang, Yang
2011-08-01
Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.
Bone age maturity assessment using hand-held device
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Gilsanz, Vicente; Liu, Xiaodong; Boechat, M. I.
2004-04-01
Purpose: Assessment of bone maturity is traditionally performed through visual comparison of a hand and wrist radiograph with existing reference images in textbooks. Our goal was to develop a digital index based on idealized hand X-ray images that can be incorporated in a hand-held computer and used for visual assessment of bone age for patients. Material and methods: Due to the large variability in bone maturation among normal subjects, we generated a set of "ideal" images obtained by computer combination of images from our normal reference data sets. Software for hand-held PDA devices was developed for easy navigation through the set of images and visual selection of matching images. A formula based on our statistical analysis provides the standard deviation from normal based on the chronological age of the patient. The accuracy of the program was compared to traditional interpretation by two radiologists in a double-blind reading of 200 normal Caucasian children (100 boys, 100 girls). Results: Strong correlations were present between chronological age and bone age (r > 0.9), with no statistical difference between the digital and traditional assessment methods. Determinations of carpal bone maturity in adolescents were slightly more accurate using the digital system. Users praised the convenience and effectiveness of the digital Palm Index in clinical practice. Conclusion: An idealized digital Palm Bone Age Index provides a convenient and effective alternative to conventional atlases for the assessment of skeletal maturity.
Linear models: permutation methods
Cade, B.S.; Everitt, B.S.; Howell, D.C.
2005-01-01
Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well known that estimates of the mean in a linear model are extremely sensitive to even a single outlying value of the dependent variable, compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution of responses, providing a more complete statistical picture that has relevance to many biological questions [6]...
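A minimal sketch of a permutation test for a linear-model slope of the kind discussed above; the heavy-tailed simulated errors are exactly the setting where normal-theory assumptions are not tenable:

    import numpy as np

    def perm_test_slope(x, y, n_perm=10_000, seed=0):
        # Two-sided permutation p-value for the OLS slope of y on x
        rng = np.random.default_rng(seed)
        slope = np.polyfit(x, y, 1)[0]
        perm = np.array([np.polyfit(x, rng.permutation(y), 1)[0] for _ in range(n_perm)])
        return slope, (np.abs(perm) >= np.abs(slope)).mean()

    rng = np.random.default_rng(3)
    x = rng.normal(size=40)
    y = 0.5 * x + rng.standard_t(df=2, size=40)   # heavy-tailed errors
    print(perm_test_slope(x, y))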
[Rank distributions in community ecology from the statistical viewpoint].
Maksimov, V N
2004-01-01
Traditional statistical methods for defining empirical functions of the abundance distribution (population, biomass, production, etc.) of species in a community are applicable to processing the multivariate data contained in these quantitative indices of communities. In particular, evaluation of the moments of the distribution suffices to condense the data contained in a list of species and their abundances. At the same time, the species should be ranked in the list in ascending rather than descending order of abundance, and the distribution models should be analyzed on the basis of data on abundant species only.
Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines
NASA Astrophysics Data System (ADS)
Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.
2016-12-01
Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
A statistical method for measuring activation of gene regulatory networks.
Esteves, Gustavo H; Reis, Luiz F L
2018-06-13
Gene expression data analysis is of great importance for modern molecular biology, given our ability to measure the expression profiles of thousands of genes, enabling studies rooted in systems biology. In this work, we propose a simple statistical model for measuring the activation of gene regulatory networks, instead of the traditional gene co-expression networks. We present the mathematical construction of a statistical procedure for testing hypotheses regarding gene regulatory network activation. The probability distribution of the test statistic is evaluated by a permutation-based study. To illustrate the functionality of the proposed methodology, we also present a simple example based on a small hypothetical network and the activation measurement of two KEGG networks, both based on gene expression data collected from gastric and esophageal samples. The two KEGG networks were also analyzed for a public dataset, available through NCBI-GEO, presented as Supplementary Material. This method was implemented in an R package that is available at the BioConductor project website under the name maigesPack.
Reconciling statistical and systems science approaches to public health.
Ip, Edward H; Rahmandad, Hazhir; Shoham, David A; Hammond, Ross; Huang, Terry T-K; Wang, Youfa; Mabry, Patricia L
2013-10-01
Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers including clinicians and scientists working in public health are somewhat befuddled by this methodology that at times appears to be radically different from analytic methods, such as statistical modeling, to which the researchers are accustomed. There also appears to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision.
Statistics of Macroturbulence from Flow Equations
NASA Astrophysics Data System (ADS)
Marston, Brad; Iadecola, Thomas; Qi, Wanming
2012-02-01
Probability distribution functions of stochastically-driven and frictionally-damped fluids are governed by a linear framework that resembles quantum many-body theory. Besides the Fokker-Planck approach, there is a closely related Hopf functional method [Ookie Ma and J. B. Marston, J. Stat. Phys. Th. Exp. P10007 (2005)]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we generalize the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)] (also known as the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)]) to find the zero mode. We test the approach using a prototypical model of geophysical and astrophysical flows on a rotating sphere that spontaneously organizes into a coherent jet. Good agreement is found with low-order equal-time statistics accumulated by direct numerical simulation, the traditional method. Different choices for the generators of the continuous transformations, and for closure approximations of the operator algebra, are discussed.
Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping
2016-10-21
Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in the corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, thereby reducing possible selection bias and performance bias in the trial. This study is registered at ClinicalTrials.gov (NCT01138930; registered on 7 June 2010).
Huang, Zhi; Wang, Xin-zhi; Hou, Yue-Zhong
2015-02-01
Making impressions for maxillectomy patients is an essential but difficult task. This study developed a novel method to fabricate individual trays by computer-aided design (CAD) and rapid prototyping (RP) to simplify the process and enhance patient safety. Five unilateral maxillectomy patients were recruited for this study. For each patient, a computed tomography (CT) scan was taken. Based on the 3D surface reconstruction of the target area, an individual tray was manufactured by CAD/RP. With a conventional custom tray as control, two final impressions were made using the different types of tray for each patient. The trays were sectioned, and in each section the thickness of the material was measured at six evenly distributed points. Descriptive statistics and the paired t-test were used to examine differences in impression thickness; SAS 9.3 was used for the statistical analysis. All casts were then optically 3D scanned and compared digitally to evaluate the feasibility of this method. Impressions of all five maxillectomy patients were successfully made with individual trays fabricated by CAD/RP and with traditional trays. The descriptive statistics of the impression thickness measurements showed slightly more uneven results for the traditional trays, but the difference was not statistically significant. A 3D digital comparison showed acceptable discrepancies within 1 mm in the majority of cast areas. The largest difference, of 3 mm, was observed in the buccal wall of the defective areas. Moderate deviations of 1 to 2 mm were detected in the buccal and labial vestibular groove areas. This study confirmed the feasibility of a novel method of fabricating individual trays by CAD/RP. Impressions made with individual trays manufactured using CAD/RP had a uniform thickness, with an acceptable level of accuracy compared to those made through conventional processes. © 2014 by the American College of Prosthodontists.
ERIC Educational Resources Information Center
Henry, Kimberly L.; Muthen, Bengt
2010-01-01
Latent class analysis (LCA) is a statistical method used to identify subtypes of related cases using a set of categorical or continuous observed variables. Traditional LCA assumes that observations are independent. However, multilevel data structures are common in social and behavioral research and alternative strategies are needed. In this…
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
Data mining: sophisticated forms of managed care modeling through artificial intelligence.
Borok, L S
1997-01-01
Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.
Statistics for Time-Series Spatial Data: Applying Survival Analysis to Study Land-Use Change
ERIC Educational Resources Information Center
Wang, Ninghua Nathan
2013-01-01
Traditional spatial analysis and data mining methods fall short of extracting temporal information from data. This inability makes their use difficult to study changes and the associated mechanisms of many geographic phenomena of interest, for example, land-use. On the other hand, the growing availability of land-change data over multiple time…
Meta-analysis of teaching methods: a 50k+ student study
NASA Astrophysics Data System (ADS)
Sayre, Eleanor; Archibeque, Benjamin; Gomez, K. Alison; Heckendorf, Tyrel; Madsen, Adrian M.; McKagan, Sarah B.; Schenk, Edward W.; Shepard, Chase; Sorell, Lane; von Korff, Joshua
2015-04-01
The Force Concept Inventory (FCI) and the Force and Motion Conceptual Evaluation (FMCE) are the two most widely used conceptual tests in introductory mechanics. Because they are so popular, they provide an excellent avenue to compare different teaching methods at different kinds of institutions with varying student populations. We conducted a secondary analysis of all peer-reviewed papers which publish data from US and Canadian colleges and universities. Our data include over fifty thousand students drawn from approximately 100 papers; papers were drawn from Scopus, ERIC, ComPADRE, and journal websites. We augment published data about teaching methods with institutional data such as Carnegie Classification and average SAT scores. We statistically determine the effectiveness of different teaching methods as measured by FCI and FMCE gains and mediated by institutional and course factors. As in the landmark 1998 Hake study, we find that classes using interactive engagement (IE) have significantly larger learning gains than classes using traditional instruction. However, we find a broader distribution of normalized gains within both traditional and IE classes, and the differences between IE and traditional instruction have changed over time and are more context dependent.
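The normalized gain behind such comparisons is Hake's <g> = (post% - pre%) / (100% - pre%); a worked sketch with illustrative class averages:

    def normalized_gain(pre_pct, post_pct):
        # Hake's class-average normalized gain
        return (post_pct - pre_pct) / (100.0 - pre_pct)

    print(normalized_gain(40.0, 55.0))   # 0.25 -- typical of traditional lecture
    print(normalized_gain(40.0, 70.0))   # 0.50 -- typical of interactive engagement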
Effectiveness of Traditional Chinese Acupuncture versus Sham Acupuncture: a Systematic Review
Carlos, Luís; da Cruz, Lóris Aparecida Prado; Leopoldo, Vanessa Cristina; de Campos, Fabrício Ribeiro; de Almeida, Ana Maria; Silveira, Renata Cristina de Campos Pereira
2016-01-01
ABSTRACT Objective: to identify and synthesize the evidence from randomized clinical trials that tested the effectiveness of traditional Chinese acupuncture in relation to sham acupuncture for the treatment of hot flashes in menopausal women with breast cancer. Method: systematic review guided by the recommendations of the Cochrane Collaboration. Citations were searched in the following databases: MEDLINE via PubMed, Web of Science, CENTRAL, CINAHL, and LILACS. A combination of the following keywords was used: breast neoplasm, acupuncture, acupuncture therapy, acupuncture points, placebos, sham treatment, hot flashes, hot flushes, menopause, climacteric, and vasomotor symptoms. Results: a total of 272 studies were identified, five of which were selected and analyzed. Slight superiority of traditional acupuncture compared with sham acupuncture was observed; however, there were no strong statistical associations. Conclusions: the evidence gathered was not sufficient to affirm the effectiveness of traditional acupuncture compared with sham acupuncture. PMID:27533271
Effectiveness of modified seminars as a teaching-learning method in pharmacology
Palappallil, Dhanya Sasidharan; Sushama, Jitha; Ramnath, Sai Nathan
2016-01-01
Context: Student-led seminars (SLS) are adopted as a teaching-learning (T-L) method in pharmacology. Previous studies assessing feedback on T-L methods in pharmacology point out that traditional seminars consistently received poor feedback, as they were not a favorite among students. Aims: This study aimed to obtain feedback on traditional SLS, introduce modified SLS, and compare the modified seminars with the traditional ones. Settings and Design: This was a prospective interventional study conducted over 2 months in medical undergraduates of the fifth semester attending pharmacology seminars at a Government Medical College in South India. Subjects and Methods: A structured questionnaire was used to elicit feedback from participants. The responses were coded on a 5-point Likert scale. Modifications to the seminar sessions, such as role plays, quizzes, tests, group discussions, and patient-oriented problem-solving exercises, were introduced along with SLS. Statistical Analysis Used: The data were analyzed using SPSS version 16. The descriptive data were expressed using frequencies and percentages. The Wilcoxon signed rank test and Friedman test were used to compare traditional with modified seminars. Results: The participants identified interaction as the most important component of a seminar. The majority opined that the teacher should summarize at the end of SLS. Student feedback shows that modified seminars created more interest, enthusiasm, and inspiration to learn the topic when compared to traditional SLS. They also increased peer coordination and group dynamics. Students opined that communication skills and teacher-student interactions were not improved by the modified seminars. Conclusions: Interventions in the form of modified SLS may be adopted to break the monotony of traditional seminars through active participation, peer interaction, and teamwork. PMID:27563587
Modeling urbanization patterns at a global scale with generative adversarial networks
NASA Astrophysics Data System (ADS)
Albert, A. T.; Strano, E.; Gonzalez, M.
2017-12-01
Current demographic projections show that, in the next 30 years, global population growth will mostly take place in developing countries. Coupled with a decrease in density, such population growth could potentially double the land occupied by settlements by 2050. The lack of reliable and globally consistent socio-demographic data, coupled with the limited predictive performance of traditional spatially explicit urban models, calls for developing better predictive methods, calibrated using a globally consistent dataset. Thus, richer models of the spatial interplay between urban built-up land, population distribution and energy use are central to the discussion around the expansion and development of cities, and their impact on the environment in the context of a changing climate. In this talk we discuss methods for, and present an analysis of, urban form, defined as the spatial distribution of macroeconomic quantities that characterize a city, using modern machine learning methods and best-available remote-sensing data for the world's largest 25,000 cities. We first show that these cities may be described by a small set of patterns in radial building density, nighttime luminosity, and population density, which highlight, to first order, differences in development and land use across the world. We observe significant, spatially dependent variance around these typical patterns, which would be difficult to model using traditional statistical methods. We take a first step in addressing this challenge by developing CityGAN, a conditional generative adversarial network model for simulating realistic urban forms. To guide learning and measure the quality of the simulated synthetic cities, we develop a specialized loss function for GAN optimization that incorporates standard spatial statistics used by urban analysis experts. Our framework is a stark departure from both the standard physics-based approaches in the literature (that view urban forms as fractals with scale-free behavior), and the traditional statistical learning approaches (whereby values of individual pixels are modeled as functions of locally defined, hand-engineered features). This is a first-of-its-kind analysis of urban forms using data at a planetary scale.
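CityGAN itself is not published in this abstract; as a small illustration of the kind of spatial statistic such a loss could incorporate, a sketch of an azimuthally averaged radial density profile computed from a raster (the function and data here are hypothetical, not the authors' code):

    import numpy as np

    def radial_profile(raster, center, n_bins=20):
        # Mean raster value (e.g., built-up density) by distance from the city center
        yy, xx = np.indices(raster.shape)
        r = np.hypot(yy - center[0], xx - center[1])
        bins = np.linspace(0, r.max(), n_bins + 1)
        idx = np.digitize(r.ravel(), bins) - 1
        return np.array([raster.ravel()[idx == b].mean() for b in range(n_bins)])

    city = np.random.default_rng(4).random((128, 128))  # stand-in density map
    print(radial_profile(city, center=(64, 64))[:5])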
Bruse, Jan L; McLeod, Kristin; Biglino, Giovanni; Ntsinjana, Hopewell N; Capelli, Claudio; Hsia, Tain-Yen; Sermesant, Maxime; Pennec, Xavier; Taylor, Andrew M; Schievano, Silvia
2016-05-31
Medical image analysis in clinical practice is commonly carried out on 2D image data, without fully exploiting the detailed 3D anatomical information that is provided by modern non-invasive medical imaging techniques. In this paper, a statistical shape analysis method is presented, which enables the extraction of 3D anatomical shape features from cardiovascular magnetic resonance (CMR) image data, with no need for manual landmarking. The method was applied to repaired aortic coarctation arches that present complex shapes, with the aim of capturing shape features as biomarkers of potential functional relevance. The method is presented from the user-perspective and is evaluated by comparing results with traditional morphometric measurements. Steps required to set up the statistical shape modelling analyses, from pre-processing of the CMR images to parameter setting and strategies to account for size differences and outliers, are described in detail. The anatomical mean shape of 20 aortic arches post-aortic coarctation repair (CoA) was computed based on surface models reconstructed from CMR data. By analysing transformations that deform the mean shape towards each of the individual patient's anatomy, shape patterns related to differences in body surface area (BSA) and ejection fraction (EF) were extracted. The resulting shape vectors, describing shape features in 3D, were compared with traditionally measured 2D and 3D morphometric parameters. The computed 3D mean shape was close to population mean values of geometric shape descriptors and visually integrated characteristic shape features associated with our population of CoA shapes. After removing size effects due to differences in body surface area (BSA) between patients, distinct 3D shape features of the aortic arch correlated significantly with EF (r = 0.521, p = .022) and were well in agreement with trends as shown by traditional shape descriptors. The suggested method has the potential to discover previously unknown 3D shape biomarkers from medical imaging data. Thus, it could contribute to improving diagnosis and risk stratification in complex cardiac disease.
Dendritic tree extraction from noisy maximum intensity projection images in C. elegans.
Greenblum, Ayala; Sznitman, Raphael; Fua, Pascal; Arratia, Paulo E; Oren, Meital; Podbilewicz, Benjamin; Sznitman, Josué
2014-06-12
Maximum Intensity Projections (MIP) of neuronal dendritic trees obtained from confocal microscopy are frequently used to study the relationship between tree morphology and mechanosensory function in the model organism C. elegans. Extracting dendritic trees from noisy images remains, however, a strenuous process that has traditionally relied on manual approaches. Here, we focus on automated and reliable 2D segmentations of dendritic trees following a statistical learning framework. Our dendritic tree extraction (DTE) method uses small amounts of labelled training data on MIPs to learn noise models of texture-based features from the responses of tree structures and image background. Our strategy lies in evaluating statistical models of noise that account for both the variability generated from the imaging process and from the aggregation of information in the MIP images. These noise models are then used within a probabilistic, or Bayesian, framework to provide a coarse 2D dendritic tree segmentation. Finally, some post-processing is applied to refine the segmentations and provide skeletonized trees using a morphological thinning process. Following a Leave-One-Out Cross Validation (LOOCV) method for an MIP database with available "ground truth" images, we demonstrate that our approach provides significant improvements in tree-structure segmentations over traditional intensity-based methods. Improvements for MIPs under various imaging conditions are both qualitative and quantitative, as measured from Receiver Operator Characteristic (ROC) curves and the yield and error rates in the final segmentations. In a final step, we demonstrate our DTE approach on previously unseen MIP samples, including the extraction of skeletonized structures, and compare our method to a state-of-the-art dendritic tree tracing software. Overall, our DTE method allows for robust dendritic tree segmentations in noisy MIPs, outperforming traditional intensity-based methods. Such an approach provides a usable segmentation framework, ultimately delivering a speed-up for dendritic tree identification on the user end and a reliable first step towards further morphological characterizations of tree arborization.
Nonparametric entropy estimation using kernel densities.
Lake, Douglas E
2009-01-01
The entropy of experimental data from the biological and medical sciences provides additional information over summary statistics. Calculating entropy involves estimates of probability density functions, which can be effectively accomplished using kernel density methods. Kernel density estimation has been widely studied and a univariate implementation is readily available in MATLAB. The traditional definition of Shannon entropy is part of a larger family of statistics, called Renyi entropy, which are useful in applications that require a measure of the Gaussianity of data. Of particular note is the quadratic entropy which is related to the Friedman-Tukey (FT) index, a widely used measure in the statistical community. One application where quadratic entropy is very useful is the detection of abnormal cardiac rhythms, such as atrial fibrillation (AF). Asymptotic and exact small-sample results for optimal bandwidth and kernel selection to estimate the FT index are presented and lead to improved methods for entropy estimation.
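For a Gaussian kernel of bandwidth h, the integral of f squared that underlies the quadratic (Renyi) entropy has a closed form, (1/n²) Σᵢⱼ N(xᵢ − xⱼ | 0, 2h²); a minimal sketch on a toy RR-interval series:

    import numpy as np

    def quadratic_entropy(x, h):
        # H2 = -log ∫ f(x)^2 dx for a Gaussian KDE with bandwidth h (closed form)
        n = len(x)
        d = x[:, None] - x[None, :]
        kernel = np.exp(-d**2 / (4 * h**2)) / np.sqrt(4 * np.pi * h**2)
        return -np.log(kernel.sum() / n**2)

    rr = np.random.default_rng(5).normal(0.8, 0.05, size=200)  # toy RR intervals (s)
    print(quadratic_entropy(rr, h=0.02))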
A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan
2018-04-01
This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
Searching the Heavens: Astronomy, Computation, Statistics, Data Mining and Philosophy
NASA Astrophysics Data System (ADS)
Glymour, Clark
2012-03-01
Our first and purest science, the mother of scientific methods, sustained by sheer curiosity, searching the heavens we cannot manipulate. From the beginning, astronomy has combined mathematical idealization, technological ingenuity, and indefatigable data collection with procedures to search through assembled data for the processes that govern the cosmos. Astronomers are, and ever have been, data miners, and for that reason astronomical methods (but not astronomical discoveries) have often been despised by statisticians and philosophers. Epithets laced the statistical literature: Ransacking! Data dredging! Double Counting! Statistical disdain was usually directed at social scientists and biologists, rarely if ever at astronomers, but the methodological attitudes and goals that many twentieth-century philosophers and statisticians rejected were creations of the astronomical tradition. The philosophical criticisms were earlier and more direct. In the shadow (or in Alexander Pope's phrasing, the light) cast on nature in the eighteenth century by the Newtonian triumph, David Hume revived arguments from the ancient Greeks to challenge the very possibility of coming to know what causes what. His conclusion was endorsed in the twentieth century by many philosophers who found talk of causation unnecessary or unacceptably metaphysical, and absorbed by many statisticians as a general suspicion of causal claims, except possibly when they are founded on experimental manipulation. And yet in the hands of a mathematician, Thomas Bayes, and another mathematician and philosopher, Richard Price, Hume's essays prompted the development of a new kind of statistics, the kind we now call "Bayesian." The computer and new data acquisition methods have begun to dissolve the antipathy between astronomy, philosophy, and statistics. But the resolution is practical, without much reflection on the arguments or the course of events. So, I offer a largely unoriginal history, substituting rather dry commentary on method for the fuller, livelier history of astronomers' ambitions, politics, and passions. My accounts of various episodes in the astronomical tradition are taken from standard sources, especially Neugebauer (1952), Baum & Sheehan (1997), Crelensten (2006), and Stigler (1990). Methodological commentary is mine, not that of these sources.
BAYESIAN ESTIMATION OF THERMONUCLEAR REACTION RATES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliadis, C.; Anderson, K. S.; Coc, A.
The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p, γ)³He, ³He(³He,2p)⁴He, and ³He(α, γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.
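The authors' full framework (systematic effects, non-Gaussian uncertainties) is not reproduced here; as a minimal sketch of the underlying idea, a random-walk Metropolis fit of a toy linear S-factor to synthetic cross-section data:

    import numpy as np

    rng = np.random.default_rng(6)
    E = np.linspace(0.01, 0.5, 25)                                 # hypothetical energies (MeV)
    data = (0.2 + 0.1 * E) * (1 + 0.05 * rng.normal(size=E.size))  # toy S-factor measurements
    err = 0.05 * data

    def log_post(theta):
        s0, s1 = theta
        if s0 <= 0:                              # flat prior with positivity constraint
            return -np.inf
        return -0.5 * np.sum(((data - (s0 + s1 * E)) / err) ** 2)

    theta, lp, chain = np.array([0.1, 0.0]), -np.inf, []
    for _ in range(20_000):                      # random-walk Metropolis sampler
        prop = theta + 0.01 * rng.normal(size=2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    print(np.mean(chain[5_000:], axis=0))        # posterior means for (S0, S1)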
Collection development using interlibrary loan borrowing and acquisitions statistics.
Byrd, G D; Thomas, D A; Hughes, K E
1982-01-01
Libraries, especially those supporting the sciences, continually face the problem of selecting appropriate new books for their users. Traditional collection development techniques include the use of librarian or user subject specialists, user recommendations, and approval plans. These methods of selection, however, are most effective in large libraries and do not systematically correlate new book purchases with the actual demands of the users served. This paper describes a statistical method for determining subject strengths and weaknesses in a library book collection in relation to user demand. Using interlibrary loan borrowing and book acquisition statistics gathered for one fiscal year from three health sciences libraries, the authors developed a way to graph the broad and narrow subject fields of strength and potential weakness in a book collection. This method has the advantages of simplicity, speed of implementation, and clarity. It can also be used over a period of time to verify the success or failure of a collection development program. Finally, the method has potential as a tool for use by two or more libraries seeking to improve cooperative collection development in a network or consortium. PMID:7059712
Effect of simulation on the ability of first year nursing students to learn vital signs.
Eyikara, Evrim; Baykara, Zehra Göçmen
2018-01-01
The acquisition of cognitive, affective, and psychomotor knowledge and skills is required in nursing, made possible via interactive teaching methods such as simulation. This study was conducted to identify the impact of simulation on first-year nursing students' ability to learn vital signs. A convenience sample of 90 first-year nursing students enrolled at a university in Ankara in 2014-2015 was used. Ninety students enrolled in lessons on the "Fundamentals of Nursing" were identified using a simple random sampling method. The students were taught vital signs theory via traditional methods. They were divided into experimental group 1, experimental group 2, and a control group, of 30 students each. Students in experimental group 1 attended simulation sessions and those in experimental group 2 attended laboratory sessions followed by simulation. The control group was taught via traditional methods and attended only the laboratory sessions. The students' cognitive knowledge acquisition was evaluated using a knowledge test administered before and after the lessons. The ability to measure vital signs in adults (healthy individuals and patients) was evaluated using a skill checklist. No statistically significant difference was observed between the groups in terms of average pre-test knowledge scores (p>0.050). Groups exposed to simulation obtained statistically significantly higher post-test knowledge scores than the control group (p<0.050). The groups exposed to simulation were also statistically significantly more successful than the control group at measuring vital signs in healthy adults and patients (p<0.050). Simulation had a positive effect on the ability of nursing students to measure vital signs. Thus, simulation should be included in the mainstream curriculum in order to effectively impart nursing knowledge and skills. Copyright © 2017 Elsevier Ltd. All rights reserved.
Montgomery, Eric; Gao, Chen; de Luca, Julie; Bower, Jessie; Attwood, Kristropher; Ylagan, Lourdes
2014-12-01
The Cellient(®) cell block system has become available as an alternative, partially automated method to create cell blocks in cytology. We sought to demonstrate a validation method for immunohistochemical (IHC) staining on the Cellient cell block system (CCB) in comparison with the formalin-fixed, paraffin-embedded traditional cell block (TCB). Immunohistochemical staining was performed using 31 antibodies on 38 patient samples, for a total of 326 slides. Split samples were processed using both methods, following the Cellient(®) manufacturer's recommendations for the Cellient cell block (CCB) and the Histogel method for preparing the traditional cell block (TCB). Interpretation was performed by three pathologists and two cytotechnologists. Immunohistochemical stains were scored as 0/1+ (negative) or 2/3+ (positive). Inter-rater agreement for each antibody was evaluated for CCB and TCB, as well as intra-rater agreement between TCB and CCB across observers. Interobserver staining concordance for the TCB was obtained with statistical significance (P < 0.05) for 24 of 31 antibodies. Interobserver staining concordance for the CCB was obtained with statistical significance for 27 of 31 antibodies. Intra-observer staining concordance between TCB and CCB was obtained with statistical significance for 24 of 31 antibodies tested. In conclusion, immunohistochemical stains on cytologic specimens processed by the Cellient system are reliable and concordant with stains performed on the same split samples processed via a formalin-fixed, paraffin-embedded (FFPE) block. The Cellient system is a welcome adjunct to the cytology workflow, producing cell block material of sufficient quality to allow the use of routine IHC. © 2014 Wiley Periodicals, Inc.
Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere
2011-01-01
Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004
ERIC Educational Resources Information Center
Gundlach, Ellen; Richards, K. Andrew R.; Nelson, David; Levesque-Bristol, Chantal
2015-01-01
Web-augmented traditional lecture, fully online, and flipped sections, all taught by the same instructor with the same course schedule, assignments, and exams in the same semester, were compared with regards to student attitudes; statistical reasoning; performance on common exams, homework, and projects; and perceptions of the course and…
An Intelligent Model for Pairs Trading Using Genetic Algorithms.
Huang, Chien-Feng; Hsu, Chi-Jen; Chen, Chi-Chung; Chang, Bao Rong; Li, Chen-An
2015-01-01
Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice.
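The statistical baseline such GA models are usually compared against is a regression-based spread strategy; a minimal sketch with a synthetic cointegrated pair (the window and thresholds are illustrative choices, not the paper's):

    import numpy as np

    def pair_signal(pa, pb, window=60, z_entry=2.0, z_exit=0.5):
        # OLS hedge ratio, then trade on the spread's z-score over a recent window
        beta = np.polyfit(pb, pa, 1)[0]
        spread = pa - beta * pb
        z = (spread[-1] - spread[-window:].mean()) / spread[-window:].std(ddof=1)
        if z > z_entry:
            return "short A / long B"
        if z < -z_entry:
            return "long A / short B"
        return "flat" if abs(z) < z_exit else "hold"

    rng = np.random.default_rng(7)
    pb = np.cumsum(rng.normal(size=500)) + 100   # synthetic price series
    pa = 1.5 * pb + rng.normal(scale=2.0, size=500)
    print(pair_signal(pa, pb))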
Lu, Pei; Xia, Jun; Li, Zhicheng; Xiong, Jing; Yang, Jian; Zhou, Shoujun; Wang, Lei; Chen, Mingyang; Wang, Cheng
2016-11-08
Accurate segmentation of blood vessels plays an important role in the computer-aided diagnosis and interventional treatment of vascular diseases. Statistical methods are an important component of effective vessel segmentation; however, several limitations degrade the segmentation result, namely dependence on the image modality, uneven contrast media, bias field, and overlapping intensity distributions of object and background. In addition, the mixture models of the statistical methods are constructed relying on the characteristics of the image histograms. Thus, it is challenging for traditional methods to remain applicable to vessel segmentation across multi-modality angiographic images. To overcome these limitations, a flexible segmentation method with a fixed mixture model has been proposed for various angiography modalities. Our method mainly consists of three parts. Firstly, a multi-scale filtering algorithm was applied to the original images to enhance vessels and suppress noise; as a result, the filtered data acquired new statistical characteristics. Secondly, a mixture model formed by three probability distributions (two exponential distributions and one Gaussian distribution) was built to fit the histogram curve of the filtered data, with the expectation maximization (EM) algorithm used for parameter estimation. Finally, three-dimensional (3D) Markov random fields (MRFs) were employed to improve the accuracy of pixel-wise classification and posterior probability estimation. To quantitatively evaluate the performance of the proposed method, two phantoms simulating blood vessels with different tubular structures and noise were devised. Meanwhile, four clinical angiographic data sets from different human organs were used to qualitatively validate the method. To further test the performance, comparison tests between the proposed method and traditional ones were conducted on two different brain magnetic resonance angiography (MRA) data sets. The results on the phantoms were satisfactory: the noise was greatly suppressed, the percentages of misclassified voxels (the segmentation error ratios) were no more than 0.3%, and the Dice similarity coefficients (DSCs) were above 94%. In the opinion of clinical vascular specialists, the vessels in the various data sets were extracted with high accuracy, since complete vessel trees were extracted while few non-vessel and background voxels were falsely classified as vessel. In the comparison experiments, the proposed method showed its superiority in accuracy and robustness for extracting vascular structures from multi-modality angiographic images with complicated background noise. The experimental results demonstrated that the proposed method is applicable to various angiographic data. The main reason is that the constructed mixture probability model can uniformly classify vessel objects from the multi-scale filtered data of various angiography images. The advantages of the proposed method lie in the following aspects: firstly, it can extract vessels from angiograms of poor quality, since the multi-scale filtering algorithm improves vessel intensity under circumstances such as uneven contrast media and bias field; secondly, it performs well in extracting vessels from multi-modality angiographic images despite various types of noise; and thirdly, it achieves better accuracy and robustness than traditional methods. Overall, these traits suggest that the proposed method holds significant potential for clinical application.
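A minimal sketch of the EM updates for the three-component mixture described above (two exponentials plus one Gaussian), fitted to synthetic filtered intensities; the multi-scale filtering and 3D MRF refinement steps are omitted:

    import numpy as np
    from scipy import stats

    def em_exp_exp_gauss(x, n_iter=200):
        w = np.array([0.4, 0.4, 0.2])                 # mixing weights
        lam = np.array([1.0, 0.2])                    # exponential rates (background/noise)
        mu, sd = x.mean() + x.std(), x.std()          # Gaussian (vessel) parameters
        for _ in range(n_iter):
            pdf = np.column_stack([stats.expon.pdf(x, scale=1/lam[0]),
                                   stats.expon.pdf(x, scale=1/lam[1]),
                                   stats.norm.pdf(x, mu, sd)])
            resp = w * pdf
            resp /= resp.sum(axis=1, keepdims=True)   # E-step: responsibilities
            w = resp.mean(axis=0)                     # M-step: closed-form updates
            lam = resp[:, :2].sum(axis=0) / (resp[:, :2] * x[:, None]).sum(axis=0)
            mu = (resp[:, 2] * x).sum() / resp[:, 2].sum()
            sd = np.sqrt((resp[:, 2] * (x - mu) ** 2).sum() / resp[:, 2].sum())
        return w, lam, mu, sd

    rng = np.random.default_rng(8)
    x = np.concatenate([rng.exponential(1.0, 4000), rng.exponential(5.0, 4000),
                        rng.normal(12.0, 1.5, 2000)])  # synthetic intensities
    print(em_exp_exp_gauss(x))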
Gleason, Shaun E; McNair, Bryan; Kiser, Tyree H; Franson, Kari L
Non-traditional learning (NTL), including aspects of self-directed learning (SDL), may address self-awareness development needs. Many factors can impact the successful implementation of NTL. Our objectives were to share our multi-year experience with modifications aimed at improving NTL sessions in a traditional curriculum, and to improve understanding of the applied implementation variables (some of which were based on successful SDL implementation components) that impact NTL. We delivered a single lesson in a traditional-delivery curriculum once annually for five years, varying delivery annually in response to student learning and reaction-to-learning results. At year 5, we compared student learning and reaction-to-learning with the applied implementation factors using logistic regression. Higher instructor involvement and higher overall NTL levels predicted correct exam responses (p=0.0007 and p<0.0001, respectively). Exam responses were statistically equivalent between the most traditional and the highest overall NTL deliveries. Students rated instructor presentation skills and teaching methods higher when greater instructor involvement (p<0.0001, both) and lower overall NTL levels (p<0.0001, both) were used. Students perceived that teaching methods were most effective when lower student involvement and higher technology levels (p<0.0001, both) were used. When implementing NTL sessions as a single lesson in a traditional-delivery curriculum, instructor involvement appears essential, while the impact of student involvement and educational technology levels varies. Copyright © 2017 Elsevier Inc. All rights reserved.
Strom, Suzanne L; Anderson, Craig L; Yang, Luanna; Canales, Cecilia; Amin, Alpesh; Lotfipour, Shahram; McCoy, C Eric; Osborn, Megan Boysen; Langdorf, Mark I
2015-11-01
Traditional Advanced Cardiac Life Support (ACLS) courses are evaluated using written multiple-choice tests. High-fidelity simulation is a widely used adjunct to didactic content, and has been used in many specialties as a training resource as well as an evaluative tool. There are no data to our knowledge that compare simulation examination scores with written test scores for ACLS courses. To compare and correlate a novel high-fidelity simulation-based evaluation with traditional written testing for senior medical students in an ACLS course. We performed a prospective cohort study to determine the correlation between simulation-based evaluation and traditional written testing in a medical school simulation center. Students were tested on a standard acute coronary syndrome/ventricular fibrillation cardiac arrest scenario. Our primary outcome measure was correlation of exam results for 19 volunteer fourth-year medical students after a 32-hour ACLS-based Resuscitation Boot Camp course. Our secondary outcome was comparison of simulation-based vs. written outcome scores. The composite average score on the written evaluation was substantially higher (93.6%) than the simulation performance score (81.3%, absolute difference 12.3%, 95% CI [10.6-14.0%], p<0.00005). We found a statistically significant moderate correlation between simulation scenario test performance and traditional written testing (Pearson r=0.48, p=0.04), validating the new evaluation method. Simulation-based ACLS evaluation methods correlate with traditional written testing and demonstrate resuscitation knowledge and skills. Simulation may be a more discriminating and challenging testing method, as students scored higher on written evaluation methods compared to simulation.
Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview
Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee
2013-01-01
Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and in the conduct of clinical studies. Several reliability studies have been conducted in Western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates, different study designs, and statistical analysis is given for future studies in Ayurveda. PMID:23930037
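Agreement between two raters is typically summarized with a chance-corrected index such as Cohen's kappa, one of the reliability estimates introduced above. A minimal sketch (not from the paper), assuming two hypothetical raters assign the same three prakriti classes, coded 0-2, to ten patients:

```python
# Inter-rater reliability via Cohen's kappa -- a minimal sketch.
# The ratings below are invented; classes 0, 1, 2 stand for three
# hypothetical prakriti categories assigned by two raters.
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 2, 1, 0, 2, 2, 1, 0, 1]
rater_b = [0, 1, 2, 0, 0, 2, 1, 1, 0, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance level
```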
Gagnon, Marie-Pierre; Gagnon, Johanne; Desmartis, Marie; Njoya, Merlin
2013-01-01
This study aimed to assess the effectiveness of a blended-teaching intervention using Internet-based tutorials coupled with traditional lectures in an introduction to research undergraduate nursing course. Effects of the intervention were compared with conventional, face-to-face classroom teaching on three outcomes: knowledge, satisfaction, and self-learning readiness. A two-group, randomized, controlled design was used, involving 112 participants. Descriptive statistics and analysis of covariance (ANCOVA) were performed. The teaching method was found to have no direct impact on knowledge acquisition, satisfaction, and self-learning readiness. However, motivation and teaching method had an interaction effect on knowledge acquisition by students. Among less motivated students, those in the intervention group performed better than those who received traditional training. These findings suggest that this blended-teaching method could better suit some students, depending on their degree of motivation and level of self-directed learning readiness.
Van Hemelen, Geert; Van Genechten, Maarten; Renier, Lieven; Desmedt, Maria; Verbruggen, Elric; Nadjmi, Nasser
2015-07-01
Throughout the history of computing, shortening the gap between the physical world and the digital world behind the screen has been a constant goal. Recent advances in three-dimensional (3D) virtual surgery programs have reduced this gap significantly. Although 3D-assisted surgery is now widely available for orthognathic surgery, one might still argue whether a 3D virtual planning approach is a better alternative to a conventional two-dimensional (2D) planning technique. The purpose of this study was to compare the accuracy of a traditional 2D technique and a 3D computer-aided prediction method. A double-blind, randomised, prospective study was performed to compare the prediction accuracy of a traditional 2D planning technique versus a 3D computer-aided planning approach. The accuracy of the hard and soft tissue profile predictions using both planning methods was investigated. There was a statistically significant difference between 2D and 3D soft tissue planning (p < 0.05). The statistically significant differences found between the 2D and 3D plans and the actual soft tissue outcome were not accompanied by a statistically significant difference between the two methods. The 3D planning approach provides more accurate soft tissue planning. However, 2D orthognathic planning is comparable to 3D planning when it comes to hard tissue planning. This study provides relevant results for choosing between 3D and 2D planning in clinical practice. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Frequentist Model Averaging in Structural Equation Modelling.
Jin, Shaobo; Ankargren, Sebastian
2018-06-04
Model selection from a set of candidate models plays an important role in many structural equation modelling applications. However, traditional model selection methods introduce extra randomness that is not accounted for by post-model selection inference. In the current study, we propose a model averaging technique within the frequentist statistical framework. Instead of selecting an optimal model, the contributions of all candidate models are acknowledged. Valid confidence intervals and a [Formula: see text] test statistic are proposed. A simulation study shows that the proposed method is able to produce a robust mean-squared error, a better coverage probability, and a better goodness-of-fit test compared to model selection. It is an interesting compromise between model selection and the full model.
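The abstract does not spell out the weighting scheme, so the sketch below illustrates the general idea of frequentist model averaging with smoothed AIC weights over three hypothetical candidate models; the authors' estimator differs in detail:

```python
# Frequentist model averaging with smoothed AIC weights -- a generic
# illustration of the averaging idea, not the authors' estimator.
import numpy as np

aic = np.array([102.3, 100.1, 105.8])   # hypothetical AICs of candidate models
theta = np.array([0.42, 0.47, 0.35])    # estimates of the same target parameter

delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                # each model contributes, none is "selected"

theta_avg = np.sum(weights * theta)
print(f"weights: {weights.round(3)}, averaged estimate: {theta_avg:.3f}")
```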
Fritscher, Karl; Schuler, Benedikt; Link, Thomas; Eckstein, Felix; Suhm, Norbert; Hänni, Markus; Hengg, Clemens; Schubert, Rainer
2008-01-01
Fractures of the proximal femur are one of the principal causes of mortality among elderly persons. Traditional determination of femoral fracture risk relies on measuring bone mineral density (BMD). However, BMD alone is not sufficient to predict bone failure load for an individual patient, and additional parameters have to be determined for this purpose. This work presents an approach that uses statistical models of appearance to identify relevant regions and parameters for the prediction of biomechanical properties of the proximal femur. By using Support Vector Regression, the proposed model-based approach is capable of predicting two different biomechanical parameters accurately and fully automatically in two different testing scenarios.
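A hedged sketch of the regression step, with synthetic appearance-model mode scores standing in for the real features and an invented biomechanical target; the appearance-model construction itself is not reproduced:

```python
# Support Vector Regression on appearance-model features -- a sketch.
# Features and target values are synthetic placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))   # 40 femurs, 5 appearance-model mode scores
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.2, size=40)  # failure load

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
print("predicted failure load (arbitrary units):", model.predict(X[:1]).round(2))
```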
Zhao, Jing-Xin; Su, Xiu-Yun; Xiao, Ruo-Xiu; Zhao, Zhe; Zhang, Li-Hai; Zhang, Li-Cheng; Tang, Pei-Fu
2016-11-01
We established a mathematical method to precisely calculate the radiographic anteversion (RA) and radiographic inclination (RI) angles of the acetabular cup based on anterior-posterior (AP) pelvic radiographs after total hip arthroplasty. Using Mathematica software, a mathematical model for an oblique cone was established to simulate how AP pelvic radiographs are obtained and to address the relationship between the two-dimensional and three-dimensional geometry of the opening circle of the cup. In this model, the vertex was the X-ray beam source, and the generatrix was the ellipse in radiographs projected from the opening circle of the acetabular cup. Using this model, we established a series of mathematical formulas to reveal the differences between the true RA and RI cup angles and the measurement results achieved using traditional methods on AP pelvic radiographs, and to precisely calculate the RA and RI cup angles based on post-operative AP pelvic radiographs. Statistical analysis indicated that traditional measurement methods should be used with caution when calculating the RA and RI cup angles from AP pelvic radiographs. The entire calculation process can be performed by an orthopedic surgeon with mathematical knowledge of basic matrix and vector equations. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
High correlations between MRI brain volume measurements based on NeuroQuant® and FreeSurfer.
Ross, David E; Ochs, Alfred L; Tate, David F; Tokac, Umit; Seabaugh, John; Abildskov, Tracy J; Bigler, Erin D
2018-05-30
NeuroQuant® (NQ) and FreeSurfer (FS) are commonly used computer-automated programs for measuring MRI brain volume. Previously they were reported to have high intermethod reliabilities but often large intermethod effect size differences. We hypothesized that linear transformations could be used to reduce the large effect sizes. This study was an extension of our previously reported work. We performed NQ and FS brain volume measurements on 60 subjects (including normal controls, patients with traumatic brain injury, and patients with Alzheimer's disease). We used two statistical approaches in parallel to develop methods for transforming FS volumes into NQ volumes: traditional linear regression and Bayesian linear regression. For both methods, we used regression analyses to develop linear transformations of the FS volumes to make them more similar to the NQ volumes. The FS-to-NQ transformations based on traditional linear regression resulted in effect sizes which were small to moderate. The transformations based on Bayesian linear regression resulted in all effect sizes being trivially small. To our knowledge, this is the first report describing a method for transforming FS to NQ data so as to achieve high reliability and low effect size differences. Machine learning methods like Bayesian regression may be more useful than traditional methods. Copyright © 2018 Elsevier B.V. All rights reserved.
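Per region, the transformation amounts to fitting a slope and intercept. A minimal sketch with invented volume pairs, using ordinary least squares where the paper also used Bayesian regression:

```python
# FS-to-NQ linear transformation -- a sketch with invented volumes.
import numpy as np

fs = np.array([4.10, 3.85, 4.52, 3.60, 4.30])   # hypothetical FS volumes (% ICV)
nq = np.array([4.35, 4.02, 4.80, 3.81, 4.55])   # matching hypothetical NQ volumes

slope, intercept = np.polyfit(fs, nq, deg=1)     # least-squares fit
fs_on_nq_scale = slope * fs + intercept          # transformed FS values
print(f"NQ ~= {slope:.2f} * FS + {intercept:.2f}")
```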
Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo
2009-01-01
This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450
Defining the best quality-control systems by design and inspection.
Hinckley, C M
1997-05-01
Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.
Hernández-Morera, Pablo; Castaño-González, Irene; Travieso-González, Carlos M.; Mompeó-Corredera, Blanca; Ortega-Santana, Francisco
2016-01-01
Purpose: To develop a digital image processing method to quantify structural components (smooth muscle fibers and extracellular matrix) in the vessel wall stained with Masson’s trichrome, and a statistical method suitable for small sample sizes to analyze the results previously obtained. Methods: The quantification method comprises two stages. The pre-processing stage improves tissue image appearance and the vessel wall area is delimited. In the feature extraction stage, the vessel wall components are segmented by grouping pixels with a similar color. The area of each component is calculated by normalizing the number of pixels of each group by the vessel wall area. Statistical analyses are implemented by permutation tests, based on resampling without replacement from the set of the observed data to obtain a sampling distribution of an estimator. The implementation can be parallelized on a multicore machine to reduce execution time. Results: The methods have been tested on 48 vessel wall samples of the internal saphenous vein stained with Masson’s trichrome. The results show that the segmented areas are consistent with the perception of a team of doctors and demonstrate good correlation between the expert judgments and the measured parameters for evaluating vessel wall changes. Conclusion: The proposed methodology offers a powerful tool to quantify some components of the vessel wall. It is more objective, sensitive and accurate than the biochemical and qualitative methods traditionally used. The permutation tests are suitable statistical techniques to analyze the numerical measurements obtained when the underlying assumptions of the other statistical techniques are not met. PMID:26761643
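The resampling scheme is easy to state in code. A hedged sketch of a two-group permutation test on invented area fractions (not the paper's data or implementation):

```python
# Two-group permutation test by resampling labels without replacement.
# Area-fraction values are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
group_a = np.array([0.31, 0.28, 0.35, 0.30, 0.27])
group_b = np.array([0.24, 0.26, 0.22, 0.25, 0.23])

observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

n_perm, exceed = 10_000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)            # relabel without replacement
    exceed += abs(perm[:5].mean() - perm[5:].mean()) >= abs(observed)

print(f"two-sided permutation p-value: {exceed / n_perm:.4f}")
```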
Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.
2017-01-01
Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I² statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results: Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I² statistics and prediction intervals for c-statistics. Conclusion: This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
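A rough sketch of the leave-one-hospital-out loop on synthetic data; the hospital-specific c-statistics are pooled here with a plain mean, whereas the study used random-effects meta-analysis:

```python
# Leave-one-hospital-out validation -- a sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
hospitals = [rng.normal(size=(200, 3)) for _ in range(10)]
outcomes = [(h[:, 0] + rng.normal(size=200) > 0).astype(int) for h in hospitals]

c_stats = []
for i in range(len(hospitals)):
    X_train = np.vstack([h for j, h in enumerate(hospitals) if j != i])
    y_train = np.concatenate([y for j, y in enumerate(outcomes) if j != i])
    model = LogisticRegression().fit(X_train, y_train)
    p_val = model.predict_proba(hospitals[i])[:, 1]
    c_stats.append(roc_auc_score(outcomes[i], p_val))

print(f"hospital-specific c-statistics, pooled by simple mean: {np.mean(c_stats):.3f}")
```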
Pugh, Aaron L.
2014-01-01
Users of streamflow information often require streamflow statistics and basin characteristics at various locations along a stream. The USGS periodically calculates and publishes streamflow statistics and basin characteristics for streamflow-gaging stations and partial-record stations, but these data commonly are scattered among many reports that may or may not be readily available to the public. The USGS also provides and periodically updates regional analyses of streamflow statistics that include regression equations and other prediction methods for estimating statistics for ungaged and unregulated streams across the State. Use of these regional predictions for a stream can be complex and often requires the user to determine a number of basin characteristics that may require interpretation. Basin characteristics may include drainage area, classifiers for physical properties, climatic characteristics, and other inputs. Obtaining these input values for gaged and ungaged locations has traditionally been time-consuming and subjective, and can lead to inconsistent results.
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Concerns about confidentiality, data ownership, representativeness, and other issues make it difficult to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E)·Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
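A hedged sketch of the core computation: form efficiency-weighted Cq values and run an ordinary unpaired test in the log scale. The efficiency and Cq replicates are invented:

```python
# Common Base Method core idea -- efficiency-weighted Cq values,
# analyzed in log scale. All numbers are invented.
import numpy as np
from scipy import stats

E = 1.95                                   # assumed amplification efficiency
cq_control = np.array([21.3, 21.8, 20.9])  # replicate Cq values, control group
cq_treated = np.array([19.1, 19.5, 18.8])  # replicate Cq values, treated group

w_control = np.log10(E) * cq_control       # efficiency-weighted Cq values
w_treated = np.log10(E) * cq_treated

# Unpaired analysis carried out in the log scale, as the method prescribes
t_stat, p_value = stats.ttest_ind(w_control, w_treated)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```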
A statistical evaluation of non-ergodic variogram estimators
Curriero, F.C.; Hohn, M.E.; Liebhold, A.M.; Lele, S.R.
2002-01-01
Geostatistics is a set of statistical techniques that is increasingly used to characterize spatial dependence in spatially referenced ecological data. A common feature of geostatistics is predicting values at unsampled locations from nearby samples using the kriging algorithm. Modeling spatial dependence in sampled data is necessary before kriging and is usually accomplished with the variogram and its traditional estimator. Other types of estimators, known as non-ergodic estimators, have been used in ecological applications. Non-ergodic estimators were originally suggested as a method of choice when sampled data are preferentially located and exhibit a skewed frequency distribution. Preferentially located samples can occur, for example, when areas with high values are sampled more intensely than other areas. In earlier studies the visual appearance of variograms from traditional and non-ergodic estimators was compared. Here we evaluate the estimators' relative performance in prediction. We also show algebraically that a non-ergodic version of the variogram is equivalent to the traditional variogram estimator. Simulations, designed to investigate the effects of data skewness and preferential sampling on variogram estimation and kriging, showed that the traditional variogram estimator outperforms the non-ergodic estimators under these conditions. We also analyzed data on carabid beetle abundance, which exhibited large-scale spatial variability (trend) and a skewed frequency distribution. Detrending data followed by robust estimation of the residual variogram is demonstrated to be a successful alternative to the non-ergodic approach.
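For reference, the traditional estimator computes gamma(h) as half the mean squared increment between samples separated by lag h. A minimal sketch on a synthetic one-dimensional transect, not the authors' simulation code:

```python
# Traditional (Matheron-type) empirical variogram on a 1-D transect.
import numpy as np

rng = np.random.default_rng(7)
z = np.cumsum(rng.normal(size=200))     # synthetic spatially correlated series

def empirical_variogram(z, max_lag):
    lags = np.arange(1, max_lag + 1)
    gamma = [0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags]
    return lags, np.array(gamma)

lags, gamma = empirical_variogram(z, 20)
print(gamma[:5].round(2))               # semivariance should grow with lag
```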
A comparison of two microscale laboratory reporting methods in a secondary chemistry classroom
NASA Astrophysics Data System (ADS)
Martinez, Lance Michael
This study attempted to determine if there was a difference between the laboratory achievement of students who used a modified reporting method and those who used traditional laboratory reporting. The study also determined the relationships between laboratory performance scores and the independent variables: score on the Group Assessment of Logical Thinking (GALT) test, chronological age in months, gender, and ethnicity for each of the treatment groups. The study was conducted using 113 high school students who were enrolled in first-year general chemistry classes at Pueblo South High School in Colorado. The research design used was the quasi-experimental Nonequivalent Control Group Design. The statistical treatment consisted of multiple regression analysis and analysis of covariance. Based on the GALT, students in the two groups were generally in the concrete and transitional stages of the Piagetian cognitive levels. The findings of the study revealed that the traditional and the modified methods of laboratory reporting did not have any effect on the laboratory performance outcome of the subjects. However, the students who used the traditional method of reporting showed a higher laboratory performance score when evaluation was conducted using the New Standards rubric recommended by the state. Multiple regression analysis revealed that there was a significant relationship between the criterion variable (student laboratory performance outcome) of individuals who employed traditional laboratory reporting methods and the composite set of predictor variables. On the contrary, there was no significant relationship between the criterion variable of individuals who employed modified laboratory reporting methods and the composite set of predictor variables.
Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba
2012-01-01
Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to estimate whether a variable is directly relevant or its association is only mediated. With frequentist methods, one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2–1.8); p = 3×10⁻⁴). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong relevance based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035
Statistical Estimation of Heterogeneities: A New Frontier in Well Testing
NASA Astrophysics Data System (ADS)
Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.
2001-12-01
Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.
Jamali, Jamshid; Ayatollahi, Seyyed Mohammad Taghi
2015-01-01
Background: Nurses constitute the majority of health care providers. Their mental health can affect the quality of services and patients’ satisfaction. The General Health Questionnaire (GHQ-12) is a general screening tool used to detect mental disorders. The scoring method and thresholds for this questionnaire are debatable, and the cut-off points can vary from sample to sample. This study was conducted to estimate the prevalence of mental disorders among Iranian nurses using the GHQ-12 and to compare Latent Class Analysis (LCA) and K-means clustering with the traditional scoring method. Methodology: A cross-sectional study was carried out in the Fars and Bushehr provinces of southern Iran in 2014. Participants were 771 Iranian nurses, who filled out the GHQ-12 questionnaire. The traditional scoring method, LCA and K-means were used to estimate the prevalence of mental disorder among Iranian nurses. Cohen’s kappa statistic was applied to assess the agreement of LCA and K-means with the traditional scoring method of the GHQ-12. Results: The proportions of nurses with mental disorder identified by the scoring method, LCA and K-means were 36.3% (n=280), 32.2% (n=248), and 26.5% (n=204), respectively. LCA and logistic regression revealed that the prevalence of mental disorder in females was significantly higher than in males. Conclusion: Mental disorder in nurses was at a medium level compared to other people living in Iran. There was little difference between the prevalence of mental disorder estimated by the scoring method, K-means and LCA. Given the advantages of LCA over K-means and the differing results of the scoring method, we suggest LCA for classifying Iranian nurses according to their mental health outcomes using the GHQ-12 questionnaire. PMID:26622202
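A hedged sketch of comparing cluster-based caseness with cutoff-based scoring; the synthetic item data, the 0-3 item coding, and the caseness cutoff of 4 are assumptions for illustration (the abstract itself notes that cut-offs vary by sample):

```python
# GHQ-12: traditional cutoff scoring vs. K-means clustering -- a sketch.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
items = rng.integers(0, 4, size=(771, 12))       # 12 items, assumed 0-3 coding

# Traditional GHQ scoring: recode 0/1 -> 0 and 2/3 -> 1, sum, then threshold
ghq_score = (items >= 2).astype(int).sum(axis=1)
traditional_case = (ghq_score >= 4).astype(int)  # assumed cutoff, sample-dependent

kmeans_label = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(items)
raw_agreement = (kmeans_label == traditional_case).mean()
print(f"raw agreement (cluster labels are arbitrary): {raw_agreement:.2f}")
```

In practice the cluster labels would first be aligned with caseness, and agreement summarized with Cohen's kappa as the authors did.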
ERIC Educational Resources Information Center
Pant, Mohan Dev
2011-01-01
The Burr families (Type III and Type XII) of distributions are traditionally used in the context of statistical modeling and for simulating non-normal distributions with moment-based parameters (e.g., Skew and Kurtosis). In educational and psychological studies, the Burr families of distributions can be used to simulate extremely asymmetrical and…
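Simulating from the Burr Type XII family is straightforward with standard libraries. A small sketch with arbitrarily chosen shape parameters (picked so that the first four moments exist):

```python
# Draw a non-normal sample from the Burr Type XII family and check
# its sample skew and kurtosis. Shape parameters are arbitrary.
from scipy import stats

c, d = 5.0, 2.0
sample = stats.burr12.rvs(c, d, size=100_000, random_state=0)
print("skew:", stats.skew(sample).round(2),
      "| excess kurtosis:", stats.kurtosis(sample).round(2))
```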
ERIC Educational Resources Information Center
Knight, Jennifer L.
This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…
ERIC Educational Resources Information Center
Stansbury, Jessica A.; Wheeler, Evangeline A.; Buckingham, Justin T.
2014-01-01
Technological advancements and growing dependence on media outlets as sources of information compete for the attention of individuals born in a rapidly expanding digital age. As a result, educators using traditional, nondigital teaching methods struggle with keeping students engaged in the classroom. The present study assessed the extent to which…
Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation
NASA Technical Reports Server (NTRS)
He, Yuning; Lee, Herbert K. H.; Davies, Misty D.
2012-01-01
Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which (a) the behavior is approximately linear and (b) humans are in the loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller is changing during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to have an uncertainty estimate for the difference between the behaviors of the model and system under test.
Bayes in biological anthropology.
Konigsberg, Lyle W; Frankenberg, Susan R
2013-12-01
In this article, we both contend and illustrate that biological anthropologists, particularly in the Americas, often think like Bayesians but act like frequentists when it comes to analyzing a wide variety of data. In other words, while our research goals and perspectives are rooted in probabilistic thinking and rest on prior knowledge, we often proceed to use statistical hypothesis tests and confidence interval methods unrelated (or tenuously related) to the research questions of interest. We advocate for applying Bayesian analyses to a number of different bioanthropological questions, especially since many of the programming and computational challenges to doing so have been overcome in the past two decades. To facilitate such applications, this article explains Bayesian principles and concepts, and provides concrete examples of Bayesian computer simulations and statistics that address questions relevant to biological anthropology, focusing particularly on bioarchaeology and forensic anthropology. It also simultaneously reviews the use of Bayesian methods and inference within the discipline to date. This article is intended to act as a primer on Bayesian methods and inference in biological anthropology, explaining the relationships of various methods to likelihoods or probabilities and to classical statistical models. Our contention is not that traditional frequentist statistics should be rejected outright, but that there are many situations where biological anthropology is better served by taking a Bayesian approach. To this end it is hoped that the examples provided in this article will assist researchers in choosing from among the broad array of statistical methods currently available. Copyright © 2013 Wiley Periodicals, Inc.
Wu, Jianning; Wu, Bin
2015-01-01
The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that intrinsic changes in the dynamical system of gait are associated with different statistical distributions of gait variables between the left and right lower limbs; that is, discriminating small differences in similarity between the lower limbs amounts to recognizing their differing probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experiment results showed that the proposed method could capture more intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that were missed by the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672
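In this framing, asymmetry shows up as the separability of left-limb and right-limb feature vectors. A hedged sketch on synthetic kinetic features, where cross-validated accuracy near 0.5 indicates symmetry and values well above it indicate asymmetry:

```python
# SVM-based gait symmetry check -- synthetic features stand in for the
# real force-platform variables.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
left = rng.normal(loc=0.00, size=(60, 4))    # 60 steps, 4 kinetic features
right = rng.normal(loc=0.15, size=(60, 4))   # slight distributional shift

X = np.vstack([left, right])
y = np.array([0] * 60 + [1] * 60)            # limb-side labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"left/right separability (CV accuracy): {acc:.2f}")
```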
Solomon, Daniel H.; Kremer, Joel; Curtis, Jeffrey R; Hochberg, Marc C.; Reed, George; Tsao, Peter; Farkouh, Michael E.; Setoguchi, Soko; Greenberg, Jeffrey D.
2010-01-01
Background: Cardiovascular (CV) disease has a major impact on patients with rheumatoid arthritis (RA); however, the relative contributions of traditional CV risk factors and markers of RA severity are unclear. We examined the relative importance of traditional CV risk factors and RA markers in predicting CV events. Methods: A prospective longitudinal cohort study was conducted in the setting of the CORRONA registry in the United States. Baseline data from subjects with RA enrolled in the CORRONA registry were examined to determine predictors of CV outcomes, including myocardial infarction (MI), stroke or transient ischemic attack (TIA). Possible predictors were of two types: traditional CV risk factors and markers of RA severity. The discriminatory value of these variables was assessed by calculating the area under the receiver operating characteristic curve (c-statistic) in logistic regression. We then assessed the incidence rate for CV events among subjects with an increasing number of traditional CV risk factors and/or RA severity markers. Results: The cohort consisted of 10,156 patients with RA followed for a median of 22 months. We observed 76 primary CV events during follow-up, for a composite event rate of 3.98 (95% CI 3.08–4.88) per 1,000 patient-years. The c-statistic improved from 0.57 for models with only CV risk factors to 0.67 for models with CV risk factors plus age and gender. The c-statistic improved further to 0.71 when markers of RA severity were also added. The incidence rate for CV events was 0 (95% CI 0–5.98) for persons without any CV risk factors or markers of RA severity, while in the group with two or more CV risk factors and three or more markers of RA severity the incidence was 7.47 (95% CI 4.21–10.73) per 1,000 person-years. Conclusions: Traditional CV risk factors and markers of RA severity both contribute to models predicting CV events. Increasing numbers of both types of factors are associated with greater risk. PMID:20444756
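A hedged sketch of the incremental-value comparison: the c-statistic of a logistic model with traditional CV risk factors alone versus one with RA-severity markers added. All predictors and outcomes are simulated:

```python
# Incremental discrimination from added RA-severity markers -- a sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 5000
cv_risk = rng.normal(size=(n, 3))        # stand-ins for traditional risk factors
ra_markers = rng.normal(size=(n, 3))     # stand-ins for RA-severity markers
logit = 0.4 * cv_risk[:, 0] + 0.4 * ra_markers[:, 0] - 3.0
events = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

both = np.hstack([cv_risk, ra_markers])
base = LogisticRegression().fit(cv_risk, events)
full = LogisticRegression().fit(both, events)

print("c-stat, CV factors only:",
      round(roc_auc_score(events, base.predict_proba(cv_risk)[:, 1]), 3))
print("c-stat, plus RA markers:",
      round(roc_auc_score(events, full.predict_proba(both)[:, 1]), 3))
```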
Full information acquisition in scanning probe microscopy and spectroscopy
Jesse, Stephen; Belianinov, Alex; Kalinin, Sergei V.; Somnath, Suhas
2017-04-04
Apparatus and methods are described for scanning probe microscopy and spectroscopy based on acquisition of full probe response. The full probe response contains valuable information about the probe-sample interaction that is lost in traditional scanning probe microscopy and spectroscopy methods. The full probe response is analyzed post data acquisition using fast Fourier transform and adaptive filtering, as well as multivariate analysis. The full response data is further compressed to retain only statistically significant components before being permanently stored.
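As a loose illustration of the post-acquisition step (the instrument pipeline itself is not shown here), the sketch below takes the FFT of a synthetic probe response and reads off its dominant spectral component; the sampling rate and signal are invented:

```python
# FFT analysis of a full probe response -- synthetic signal.
import numpy as np

fs = 1_000_000                           # assumed 1 MHz sampling rate
t = np.arange(0, 0.002, 1 / fs)
rng = np.random.default_rng(0)
response = np.sin(2 * np.pi * 70_000 * t) + 0.3 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(response))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"dominant component: {freqs[spectrum.argmax()] / 1e3:.1f} kHz")
```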
Latham, Daniel T; Hill, Grant M; Petray, Clayre K
2013-04-01
The purpose of this study was to assess whether a treadmill mile is an acceptable FitnessGram Test substitute for the traditional one-mile run for middle school boys and girls. Peak heart rate and perceived physical exertion of the participants were also measured to assess students' effort. 48 boys and 40 girls participated, with approximately 85% classified as Hispanic. Boys' mean time for the traditional one-mile run, as well as peak heart rate and perceived exertion, were statistically significantly faster and higher, respectively, than for the treadmill mile. Girls' treadmill mile times were not statistically significantly different from the traditional one-mile run. There were no statistically significant differences for girls' peak heart rate or perceived exertion. The results suggest that providing middle school students a choice of completing the FitnessGram mile run in either the traditional one-mile or treadmill one-mile format may positively affect performance.
Event-based internet biosurveillance: relation to epidemiological observation
2012-01-01
Background: The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time- and labor-intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods: We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 – January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results: Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant (p < 0.05) correlations with WHO case count data; the sum of these 5 categories was highly correlated with WHO case data (r = 0.81, p < 0.0001), with expected differences observed among the 8 regions. Argus reported first confirmed cases on the same day as WHO for 21 of the first 64 countries reporting cases, and 1 to 16 days (average 1.5 days) ahead of WHO for 42 of those countries. Conclusion: Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events. PMID:22709988
Markov Logic Networks in the Analysis of Genetic Data
Sakhanenko, Nikita A.
2010-01-01
Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249
Lotfy, Hayam Mahmoud; Salem, Hesham; Abdelkawy, Mohammad; Samir, Ahmed
2015-04-05
Five spectrophotometric methods were successfully developed and validated for the determination of betamethasone valerate and fusidic acid in their binary mixture. These methods are the isoabsorptive point method combined with the first derivative (ISO Point-D1); the recently developed and well-established ratio difference (RD) and constant center coupled with spectrum subtraction (CC) methods; and derivative ratio (1DD) and mean centering of ratio spectra (MCR). A new enrichment technique, called the spectrum addition technique, was used instead of the traditional spiking technique. The proposed spectrophotometric procedures do not require any separation steps. Accuracy, precision and linearity ranges of the proposed methods were determined, and specificity was assessed by analyzing synthetic mixtures of both drugs. The methods were applied to the drugs' pharmaceutical formulation, and the results obtained were statistically compared to those of the official methods. The statistical comparison showed no significant difference between the proposed methods and the official ones regarding both accuracy and precision. Copyright © 2015 Elsevier B.V. All rights reserved.
DisArticle: a web server for SVM-based discrimination of articles on traditional medicine.
Kim, Sang-Kyun; Nam, SeJin; Kim, SangHyun
2017-01-28
Much research has been done in Northeast Asia to show the efficacy of traditional medicine. While MEDLINE contains many biomedical articles including those on traditional medicine, it does not categorize those articles by specific research area. The aim of this study was to provide a method that searches for articles only on traditional medicine in Northeast Asia, including traditional Chinese medicine, from among the articles in MEDLINE. This research established an SVM-based classifier model to identify articles on traditional medicine. The TAK + HM classifier, trained with the features of title, abstract, keywords, herbal data, and MeSH, has a precision of 0.954 and a recall of 0.902. In particular, the feature of herbal data significantly increased the performance of the classifier. Using the TAK + HM classifier, a total of about 108,000 articles were identified as articles on traditional medicine from among all articles in MEDLINE. We also built a web server called DisArticle (http://informatics.kiom.re.kr/disarticle), in which users can search for the articles and obtain statistical data. Because much evidence-based research on traditional medicine has been published in recent years, it has become necessary to search for articles on traditional medicine exclusively in literature databases. DisArticle can help users to search for and analyze research trends in traditional medicine.
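The TAK + HM feature set and training corpus are not reproduced here; as a generic, hedged sketch of an SVM article classifier of this kind, using TF-IDF features over a tiny invented corpus:

```python
# SVM text classifier sketch: 1 = traditional medicine, 0 = other.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = [
    "Acupuncture and herbal formula for chronic pain",
    "Ginseng decoction effects in a randomized trial",
    "Statin therapy outcomes in cardiovascular cohorts",
    "Gene expression profiling of tumor samples",
]
labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(docs, labels)
print(clf.predict(["Herbal medicine for asthma in children"]))  # expect [1]
```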
Dynamic laser speckle analyzed considering inhomogeneities in the biological sample
NASA Astrophysics Data System (ADS)
Braga, Roberto A.; González-Peña, Rolando J.; Viana, Dimitri Campos; Rivera, Fernando Pujaico
2017-04-01
The dynamic laser speckle phenomenon offers a contactless and nondestructive way to monitor biological changes, which are quantified by second-order statistics applied to images over time using a secondary matrix known as the time history of the speckle pattern (THSP). To limit computation time, the traditional way to build the THSP restricts the data to a single line or column. Our hypothesis is that this spatial restriction of the information could compromise the results, particularly when undesirable and unexpected optical inhomogeneities occur, such as in cell culture media. We tested a spatially random approach to collecting the points that form a THSP. Cells in a culture medium and drying paint, representing samples with different degrees of homogeneity, were tested, and a comparison with the traditional method was carried out. An alternative random selection based on a Gaussian distribution around a desired position was also presented. The results showed that the traditional protocol presented higher variation than the outcomes using the random method. The higher the inhomogeneity of the activity map, the higher the efficiency of the proposed method using random points. The Gaussian distribution proved to be useful when there was a well-defined area to monitor.
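A minimal sketch of building a THSP from randomly selected pixels of a synthetic speckle stack; the stack dimensions and point count are invented for illustration:

```python
# THSP from random pixel selection -- synthetic speckle image stack.
import numpy as np

rng = np.random.default_rng(11)
stack = rng.integers(0, 256, size=(100, 128, 128))  # (time, rows, cols)
n_points = 128

ys = rng.integers(0, 128, size=n_points)   # uniform random coordinates; a
xs = rng.integers(0, 128, size=n_points)   # Gaussian draw around a point of
                                           # interest is the variant tested
thsp = stack[:, ys, xs].T                  # rows: points, cols: time
print(thsp.shape)                          # (128, 100)
```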
Surveying Europe's Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA.
Vörös, Judit; Márton, Orsolya; Schmidt, Benedikt R; Gál, Júlia Tünde; Jelić, Dušan
2017-01-01
In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence.
Tanoue, Naomi
2007-10-01
For any kind of research, the research design is the most important element. The design is used to structure the research and to show how all of the major parts of the research project fit together. Researchers should begin only after planning the research design: what is the main theme, what are the background and references, what data are needed, and what analysis is needed. It may seem a roundabout route, but, in fact, it is a shortcut. The research methods must be appropriate to the objectives of the study. For hypothesis-testing research, the traditional style of research, a design grounded in statistics is undoubtedly necessary, since such research essentially tests a hypothesis against data using statistical theory. For the clinical trial, the clinical version of hypothesis-testing research, the statistical method must be specified in the trial plan. This report describes the basis of research design for a prosthodontics study.
Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.
2013-01-01
Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890
Current Trends in Modeling Research for Turbulent Aerodynamic Flows
NASA Technical Reports Server (NTRS)
Gatski, Thomas B.; Rumsey, Christopher L.; Manceau, Remi
2007-01-01
The engineering tools of choice for the computation of practical engineering flows have begun to migrate from those based on the traditional Reynolds-averaged Navier-Stokes approach to methodologies capable, in theory if not in practice, of accurately predicting some instantaneous scales of motion in the flow. The migration has largely been driven by both the success of Reynolds-averaged methods over a wide variety of flows and the inherent limitations of the method itself. Practitioners, emboldened by their ability to predict a wide variety of statistically steady, equilibrium turbulent flows, have now turned their attention to flow control and non-equilibrium flows, such as separation control. This review gives some current priorities in traditional Reynolds-averaged modeling research as well as some methodologies being applied to a new class of turbulent flow control problems.
2014-01-01
Background: Every social grouping in the world has its own cultural practices and beliefs which guide its members on how they should live or behave. Harmful traditional practices that affect children include female genital mutilation, milk-teeth extraction, food taboos, uvula cutting, keeping babies from exposure to the sun, and feeding fresh butter to newborn babies. The objective of this study was to assess factors associated with harmful traditional practices among children less than 5 years of age in Axum town, North Ethiopia. Methods: A community-based cross-sectional study was conducted with 752 participants who were selected using multi-stage sampling; simple random sampling was used to select ketenas from all kebelles of Axum town. After proportional allocation of the sample size, systematic random sampling was used to select the study participants. Data were collected using an interviewer-administered Tigrigna-version questionnaire, then entered and analyzed using SPSS version 16. Descriptive statistics were calculated and logistic regression was used to analyze the data. Results: Of the total sample, 50.7% of the children were female, the mean age of children was 26.28 months, and the majority of mothers had no formal education. About 87.8% of mothers had performed at least one traditional practice on their children; uvula cutting was practiced on 86.9% of children, followed by milk-teeth extraction (12.5%) and eyebrow incision (2.4%). Fear of swelling, pus and rupture of the uvula was the main reason for performing uvula cutting. Conclusion: The factors associated with harmful traditional practices were educational status, occupation, and religion of the mothers, and harmful traditional practices performed on the mothers. PMID:24952584
ERIC Educational Resources Information Center
Bradford, Jennifer; Mowder, Denise; Bohte, Joy
2016-01-01
The current project conducted an assessment of specific, directed use of student-centered teaching techniques in a criminal justice and criminology research methods and statistics class. The project sought to ascertain to what extent these techniques improved or impacted student learning and engagement in this traditionally difficult course.…
ERIC Educational Resources Information Center
de Jong, N.; Verstegen, D. M. L.; Tan, F. E. S.; O'Connor, S. J.
2013-01-01
This case-study compared traditional, face-to-face classroom-based teaching with asynchronous online learning and teaching methods in two sets of students undertaking a problem-based learning module in the multilevel and exploratory factor analysis of longitudinal data as part of a Masters degree in Public Health at Maastricht University. Students…
ERIC Educational Resources Information Center
Weltman, David; Whiteside, Mary
2010-01-01
This research shows that active learning is not universally effective and, in fact, may inhibit learning for certain types of students. The results of this study show that as increased levels of active learning are utilized, student test scores decrease for those with a high grade point average. In contrast, test scores increase as active learning…
Liu, Wensong; Yang, Jie; Zhao, Jinqi; Shi, Hongtao; Yang, Le
2018-02-12
Traditional unsupervised change detection methods based on the pixel level can only detect changes between two acquisition times from the same sensor, and the results are easily affected by speckle noise. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of the time-series PolSAR data is calculated by omnibus test statistics, and difference images between any two images at different times are acquired by the R_j test statistic. Secondly, the difference images are segmented with a Generalized Statistical Region Merging (GSRM) algorithm, which suppresses the effect of speckle noise. A Generalized Gaussian Mixture Model (GGMM) is then used to obtain the time-series change detection maps in the final step of the proposed method. To verify the effectiveness of the proposed method, we carried out change detection experiments using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can not only detect time-series changes from different sensors, but also better suppress the influence of speckle noise and improve the overall accuracy and Kappa coefficient.
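The final step above assigns pixels to change or no-change classes by fitting a mixture model to the difference values. A minimal sketch of that idea, using a standard two-component Gaussian mixture from scikit-learn as a simplified stand-in for the paper's Generalized Gaussian Mixture Model, on a synthetic difference image:

```python
# Two-component mixture segmentation of a (pre-computed) difference image.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic difference image: mostly small values, one changed patch.
diff = np.abs(rng.normal(0.0, 1.0, (128, 128)))
diff[40:60, 40:60] += 5.0

x = diff.reshape(-1, 1)
gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
labels = gmm.predict(x).reshape(diff.shape)

# Treat the component with the larger mean as the "change" class.
change_component = int(np.argmax(gmm.means_.ravel()))
change_map = labels == change_component
print(f"flagged {change_map.mean():.1%} of pixels as changed")
```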
Applications of "Integrated Data Viewer'' (IDV) in the classroom
NASA Astrophysics Data System (ADS)
Nogueira, R.; Cutrim, E. M.
2006-06-01
Conventionally, weather products used in synoptic meteorology reduce phenomena occurring in four dimensions to a 2-dimensional form. This constitutes a roadblock for non-atmospheric-science majors who need to take meteorology as a non-mathematical course complementary to their major programs. This research examines the use of the Integrated Data Viewer (IDV) as a teaching tool, as it allows a 4-dimensional representation of weather products. IDV was tested in the teaching of synoptic meteorology, weather analysis, and weather map interpretation to non-science students in the laboratory sessions of an introductory meteorology class at Western Michigan University. Student exam scores were compared according to the laboratory teaching technique, i.e., traditional lab manual versus IDV, for short- and long-term learning. Results of the statistical analysis show that in Fall 2004 students in the IDV-based lab sessions retained learning. However, in Spring 2005 the exam scores did not show a difference in short-term learning between the IDV-based and manual-based lab groups (exam taken one week after the lab exercise). Testing long-term learning, with seven weeks between the two exams in Spring 2005, showed no statistically significant difference between IDV-based and manual-based group scores, although the IDV group obtained an exam score average slightly higher than the manual group. Statistical testing of the principal hypothesis in this study leads to the conclusion that the IDV-based method did not prove to be a better teaching tool than the traditional paper-based method. Future studies could potentially find significant differences in the effectiveness of the manual and IDV methods under more controlled conditions; in particular, students in the control group should not be exposed to weather analysis using IDV during lecture.
NASA Astrophysics Data System (ADS)
Machicoane, Nathanaël; López-Caballero, Miguel; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain
2017-10-01
We present a method to improve the accuracy of velocity measurements for fluid flow, or for particles immersed in it, based on a multi-time-step approach that allows cancellation of noise in the velocity measurements. Improved velocity statistics, a critical element in turbulent flow measurements, can be computed from the combination of the velocity moments computed using standard particle tracking velocimetry (PTV) or particle image velocimetry (PIV) techniques for data sets collected over different values of the time interval between images. This method produces Eulerian velocity fields and Lagrangian velocity statistics with much lower noise levels than standard PIV or PTV measurements, without the need for filtering or windowing. Particle displacement is computed for multiple different time-step values between frames in a canonical experiment of homogeneous isotropic turbulence. The second-order velocity structure function of the flow is computed with the new method and compared to results from traditional measurement techniques in the literature. Increased accuracy is also demonstrated by comparing the dissipation rate of turbulent kinetic energy measured from this function against previously validated measurements.
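The core of such multi-time-step noise cancellation can be illustrated with a simple model: uncorrelated position noise of variance sigma_x^2 inflates the measured velocity variance by 2*sigma_x^2/dt^2, so fitting the measured variance against 1/dt^2 and reading off the intercept recovers a noise-free estimate. A sketch under these assumptions, on synthetic data; the authors' estimator for full moments and structure functions is more elaborate:

```python
# Noise-cancelled velocity variance from measurements at several dt.
import numpy as np

rng = np.random.default_rng(1)
true_var = 4.0          # true velocity variance (m^2/s^2), assumed
sigma_x = 1e-3          # position noise std per frame (m), assumed
dts = np.array([1e-3, 2e-3, 4e-3, 8e-3])  # inter-frame time steps (s)

var_meas = []
for dt in dts:
    v_true = rng.normal(0.0, np.sqrt(true_var), 100_000)
    # displacement = true displacement + independent noise on both frames
    dx = (v_true * dt + rng.normal(0.0, sigma_x, v_true.size)
                      - rng.normal(0.0, sigma_x, v_true.size))
    var_meas.append(np.var(dx / dt))

# Linear fit: var_meas = intercept + slope * (1/dt^2)
slope, intercept = np.polyfit(1.0 / dts**2, var_meas, 1)
print(f"raw variance at smallest dt: {var_meas[0]:.3f}")
print(f"noise-cancelled estimate:    {intercept:.3f} (true {true_var})")
```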
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for the analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrap test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and that the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
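To give a flavor of the simple (unweighted) average treatment effect idea, here is a minimal sketch: averaging per-study log odds ratios with a 0.5 continuity correction for zero cells. The counts are invented, and the paper's exact estimator and corrections may differ in detail:

```python
# Unweighted pooled log odds ratio for rare binary events.
import numpy as np

# Per-study 2x2 counts: (events_trt, n_trt, events_ctl, n_ctl)
studies = [
    (1, 120, 0, 118),
    (0, 250, 2, 246),
    (3, 400, 1, 395),
]

log_ors = []
for a, n1, c, n0 in studies:
    b, d = n1 - a, n0 - c
    # Continuity correction when any cell is zero.
    if min(a, b, c, d) == 0:
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    log_ors.append(np.log((a * d) / (b * c)))

theta_hat = np.mean(log_ors)  # unweighted average log odds ratio
print(f"pooled OR estimate: {np.exp(theta_hat):.3f}")
```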
The Problem Solving Method in Teaching Physics in Elementary School
NASA Astrophysics Data System (ADS)
Jandrić, Gordana Hajduković; Obadović, Dušanka Ž.; Stojanović, Maja
2010-01-01
Most teachers ask whether there is a "best" way to teach. The most effective teaching method depends on the specific goals of the course and the needs of the students. An investigation was carried out to compare the effect on overall achievement of teaching selected physics topics using the problem-solving method with teaching the same material by the traditional method. The investigation was performed as a pedagogical experiment of the parallel-groups type with a randomly chosen sample of students attending grade eight. The control and experimental groups were equalized on the relevant pedagogical parameters. The obtained results were treated statistically. The comparison showed a significant difference in the speed of acquiring knowledge, with problem-solving teaching being advantageous over the traditional method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés
Monte Carlo simulation of gamma spectroscopy systems is common practice these days; the most popular software packages for this purpose are the MCNP and Geant4 codes. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it has only been demonstrated experimentally for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to study other geometries is to simulate the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (self-attenuation), so these results are only preliminary. The MC simulation is carried out using the FLUKA code, and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The results show total agreement between the absolute efficiencies determined by the two methods, with a relative bias of less than 1% in all cases.
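The traditional FEP-based efficiency in the comparison above reduces to dividing net peak counts by the number of photons emitted during the acquisition. A minimal sketch, with illustrative numbers that are not taken from the paper:

```python
# Absolute FEP efficiency = detected FEP counts / emitted photons.
def fep_absolute_efficiency(net_peak_counts: float,
                            activity_bq: float,
                            branching_ratio: float,
                            live_time_s: float) -> float:
    emitted = activity_bq * branching_ratio * live_time_s
    return net_peak_counts / emitted

# Example: a Cs-137 source (661.65 keV line, branching ratio ~0.851).
eff = fep_absolute_efficiency(net_peak_counts=12_500,
                              activity_bq=5_000,
                              branching_ratio=0.851,
                              live_time_s=600)
print(f"absolute FEP efficiency: {eff:.4e}")
```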
Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art
Redies, Christoph; Brachmann, Anselm
2017-01-01
Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings); (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle); and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole); 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time. PMID:29118692
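As an illustration of how one of these measures can be computed, here is a hedged sketch of first-order edge-orientation entropy: gradient orientations at strong edges are histogrammed and the Shannon entropy of the histogram is taken. The bin count and edge-strength threshold are arbitrary choices here, not the authors' protocol:

```python
# First-order entropy of edge orientations for a grayscale image.
import numpy as np
from scipy import ndimage

def edge_orientation_entropy(image: np.ndarray, n_bins: int = 48) -> float:
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    magnitude = np.hypot(gx, gy)
    angle = np.arctan2(gy, gx) % np.pi  # orientation, not direction
    # Keep only strong edges (top 10% of gradient magnitude).
    mask = magnitude > np.percentile(magnitude, 90)
    hist, _ = np.histogram(angle[mask], bins=n_bins, range=(0, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())  # entropy in bits

rng = np.random.default_rng(0)
print(edge_orientation_entropy(rng.random((256, 256))))
```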
Statistically Modeling I-V Characteristics of CNT-FET with LASSO
NASA Astrophysics Data System (ADS)
Ma, Dongsheng; Ye, Zuochang; Wang, Yan
2017-08-01
With the advent of the Internet of Things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally, we build compact models for transistors on the basis of physics, but physical models are expensive to develop and need a long time to adjust for non-ideal effects. When the intended application of a novel device is not yet certain or the manufacturing process is not mature, deriving generalized, accurate physical models is very strenuous, whereas statistical modeling is a potential alternative because of its data-oriented nature and fast implementation. In this paper, a classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data, together with the simulation results, shows that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to other devices.
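A minimal sketch of the LASSO-based I-V modeling idea: regress drain current on polynomial features of the terminal voltages with an L1 penalty, which prunes unneeded terms. The synthetic "device" below stands in for measured CNT-FET data, and the polynomial degree and penalty strength are arbitrary choices:

```python
# LASSO regression of drain current on polynomial voltage features.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
vgs = rng.uniform(0.0, 1.2, 500)
vds = rng.uniform(0.0, 1.2, 500)
# Toy square-law-like current in microamps, standing in for measurements.
ids = 100 * np.maximum(vgs - 0.4, 0.0) ** 2 * np.tanh(3.0 * vds)
ids += rng.normal(0.0, 0.1, ids.size)

X = np.column_stack([vgs, vds])
model = make_pipeline(
    PolynomialFeatures(degree=5, include_bias=False),
    StandardScaler(),
    Lasso(alpha=0.01, max_iter=50_000),
)
model.fit(X, ids)
print("train R^2:", round(model.score(X, ids), 4))
print("nonzero terms:", int((model[-1].coef_ != 0).sum()))
```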
Flow Equation Approach to the Statistics of Nonlinear Dynamical Systems
NASA Astrophysics Data System (ADS)
Marston, J. B.; Hastings, M. B.
2005-03-01
The probability distribution function of nonlinear dynamical systems is governed by a linear framework that resembles quantum many-body theory, in which stochastic forcing and/or averaging over initial conditions play the role of a non-zero ℏ. Besides the well-known Fokker-Planck approach, there is a related Hopf functional method [Uriel Frisch, Turbulence: The Legacy of A. N. Kolmogorov (Cambridge University Press, 1995), chapter 9.5]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we investigate the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)], also known as the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)], suitably generalized to the diagonalization of non-Hermitian matrices. Comparison to the more traditional cumulant expansion method is illustrated with low-dimensional attractors. The treatment of high-dimensional dynamical systems is also discussed.
Sound source measurement by using a passive sound insulation and a statistical approach
NASA Astrophysics Data System (ADS)
Dragonetti, Raffaele; Di Filippo, Sabato; Mercogliano, Francesco; Romano, Rosario A.
2015-10-01
This paper describes a measurement technique developed by the authors that allows acoustic measurements to be carried out inside noisy environments while reducing background noise effects. The proposed method is based on the integration of a traditional passive noise insulation system with a statistical approach, the latter applied to signals picked up by the usual sensors (microphones and accelerometers) equipping the passive sound insulation system. The statistical approach improves, at low frequency, on the sound insulation provided by the passive system alone. The measurement technique has been validated by means of numerical simulations and measurements carried out inside a real noisy environment. For the case studies reported here, an average improvement of about 10 dB has been obtained in a frequency range up to about 250 Hz. Considerations on the lowest sound pressure level that can be measured with the proposed method, and on the measurement error related to its application, are reported as well.
A comparison of high-frequency cross-correlation measures
NASA Astrophysics Data System (ADS)
Precup, Ovidiu V.; Iori, Giulia
2004-12-01
On a high-frequency scale, financial time series are not homogeneous, so standard correlation measures cannot be applied directly to the raw data. There are two ways to deal with this problem. The time series can be homogenized through an interpolation method, either linear or previous-tick (An Introduction to High-Frequency Finance, Academic Press, NY, 2001), and the Pearson correlation statistic then computed. Alternatively, methods that can handle raw non-synchronous time series have recently been developed (Int. J. Theor. Appl. Finance 6(1) (2003) 87; J. Empirical Finance 4 (1997) 259). This paper compares two traditional methods that use interpolation with an alternative method applied directly to the actual time series.
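A sketch of the interpolation route on synthetic ticks: homogenize two non-synchronous price series onto a regular grid by previous-tick interpolation, then compute the Pearson correlation of the resulting returns:

```python
# Previous-tick homogenization followed by Pearson correlation of returns.
import numpy as np

def previous_tick(times, values, grid):
    """Sample an irregular series on a grid using the last known value."""
    idx = np.searchsorted(times, grid, side="right") - 1
    return values[np.clip(idx, 0, len(values) - 1)]  # clamp before 1st tick

rng = np.random.default_rng(0)
fine_t = np.arange(0.0, 3600.0)                      # 1 s underlying grid
W = np.cumsum(rng.normal(0.0, 0.05, fine_t.size))    # shared price factor

t1 = np.sort(rng.uniform(0, 3600, 800))              # irregular tick times
t2 = np.sort(rng.uniform(0, 3600, 500))
p1 = 100 + np.interp(t1, fine_t, W) + rng.normal(0, 0.02, t1.size)
p2 = 50 + 0.5 * np.interp(t2, fine_t, W) + rng.normal(0, 0.02, t2.size)

grid = np.arange(0.0, 3600.0, 60.0)                  # 1-minute sampling
r1 = np.diff(np.log(previous_tick(t1, p1, grid)))
r2 = np.diff(np.log(previous_tick(t2, p2, grid)))
print("Pearson correlation of returns:",
      round(float(np.corrcoef(r1, r2)[0, 1]), 3))
```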
Quantification of heterogeneity observed in medical images.
Brooks, Frank J; Grigsby, Perry W
2013-03-02
There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.
Milic, Natasa M; Trajkovic, Goran Z; Bukumiric, Zoran M; Cirkovic, Andja; Nikolic, Ivan M; Milin, Jelena S; Milic, Nikola V; Savic, Marko D; Corac, Aleksandar M; Marinkovic, Jelena M; Stanisavljevic, Dejana M
2016-01-01
Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended learning alongside on-site (i.e., face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third-year undergraduate medical students attending the Faculty of Medicine, University of Belgrade, who passed the final exam of the obligatory introductory statistics course (440 of 545) during 2013-14. Students' statistics achievement was stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning group were higher than for the on-site group for both the final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and the knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023), with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test score were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology-assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.
A Comparison of Student Understanding of Seasons Using Inquiry and Didactic Teaching Methods
NASA Astrophysics Data System (ADS)
Ashcraft, Paul G.
2006-02-01
Student performance on open-ended questions concerning the seasons in a university physical science content course was examined to note differences between classes that experienced inquiry using a 5-E lesson-planning model and those that experienced the same content in a traditional, didactic lesson. The class examined is a required content course for elementary education majors, and understanding the seasons is part of the elementary science standards of the university's state. The two self-selected groups of students showed no statistically significant differences in pre-test scores, while there were statistically significant differences between the groups' post-test scores, with those who participated in inquiry-based activities scoring higher. There were no statistically significant differences between the pre-test and post-test for the students who experienced didactic teaching, while there were statistically significant improvements for the students who experienced the 5-E lesson.
ERIC Educational Resources Information Center
Arena, Dylan A.; Schwartz, Daniel L.
2014-01-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…
ERIC Educational Resources Information Center
Alpay, Nimet; Ratvasky, Pamela; Koehler, Natalya; LeVally, Carolyn; Washington, Tawana
2017-01-01
This case study investigated the impact of the Statistical Concepts course redesign on the retention, performance, and satisfaction of non-traditional undergraduate students. The redesign used a systematic approach and has been yielding positive impacts over 5 trimesters. Student attrition rates on average decreased by 12% and the number of…
Course Format Effects on Learning Outcomes in an Introductory Statistics Course
ERIC Educational Resources Information Center
Sami, Fary
2011-01-01
The purpose of this study was to determine if course format significantly impacted student learning and course completion rates in an introductory statistics course taught at Harford Community College. In addition to the traditional lecture format, the College offers an online, and a hybrid (blend of traditional and online) version of this class.…
Traditional Nurse Triage vs. Physician Tele-Presence in a Pediatric Emergency Department
Marconi, Greg P.; Chang, Todd; Pham, Phung K.; Grajower, Daniel N.; Nager, Alan L.
2014-01-01
Objectives To compare traditional nurse triage (TNT) in a Pediatric Emergency Department (PED) with physician tele-presence (PTP). Methods Prospective, 2×2 crossover study with random assignment using a sample of walk-in patients seeking care in a PED at a large, tertiary care children's hospital, from May 2012 to January 2013. Outcomes of triage times, documentation errors, triage scores, and survey responses were compared between TNT and PTP. The accuracy of PTP ordering of blood and urine tests, throat cultures, and radiologic imaging was also compared against that of the actual treating PED physicians. Results Paired-samples t-tests showed a statistically significant difference in triage time between TNT and PTP (p = 0.03), but no significant difference in documentation errors (p = 0.10). Triage scores from TNT were 71% accurate, compared with 95% for PTP. Both parents and children gave PTP favorable scores, and the majority indicated they would prefer PTP again at their next PED visit. PTP diagnostic ordering was comparable to that of the actual PED physicians, showing no statistical differences. Conclusions Utilizing physician tele-presence technology to remotely perform triage is a feasible alternative to traditional nurse triage, with no clinically significant differences in time, triage scores, errors, or patient and parent satisfaction. PMID:24445223
NASA Astrophysics Data System (ADS)
Ma, Nan; Jena, Debdeep
2015-03-01
In this work, the consequences of the high band-edge density of states for the carrier statistics and quantum capacitance in transition metal dichalcogenide two-dimensional semiconductor devices are explored. The study questions the validity of commonly used expressions for extracting carrier densities and field-effect mobilities from the transfer characteristics of transistors with such channel materials. By comparison to experimental data, a new method for the accurate extraction of carrier densities and mobilities is outlined. The work thus highlights a fundamental difference between these materials and traditional semiconductors that must be considered in future experimental measurements.
Using data warehousing and OLAP in public health care.
Hristovski, D; Rogac, M; Markota, M
2000-01-01
The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general, and then our own experience with these technologies gained during the implementation of a data warehouse of outpatient data at the national level. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP, or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches. PMID:11079907
Modelling the effect of structural QSAR parameters on skin penetration using genetic programming
NASA Astrophysics Data System (ADS)
Chung, K. K.; Do, D. Q.
2010-09-01
In order to model relationships between chemical structures and biological effects in quantitative structure-activity relationship (QSAR) data, an alternative artificial intelligence technique, genetic programming (GP), was investigated and compared to the traditional statistical method. GP, whose primary advantage is that it generates explicit mathematical equations, was employed to model QSAR data and to identify the most important molecular descriptors. The models predicted by GP agreed with the statistical results, and the most predictive GP models were significantly improved over the statistical models when compared using ANOVA. Artificial intelligence techniques have recently been applied widely to analyse QSAR data; with its capability of generating mathematical equations, GP can be considered an effective and efficient method for modelling QSAR data.
Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan
2017-09-01
In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aimed at identifying the faulty variables that contribute most to a detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable; such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Quasi-Monte Carlo Methods Applied to Tau-Leaping in Stochastic Biological Systems.
Beentjes, Casper H L; Baker, Ruth E
2018-05-25
Quasi-Monte Carlo methods have proven to be effective extensions of traditional Monte Carlo methods in, amongst others, problems of quadrature and the sample path simulation of stochastic differential equations. By replacing the random number input stream in a simulation procedure by a low-discrepancy number input stream, variance reductions of several orders have been observed in financial applications. Analysis of stochastic effects in well-mixed chemical reaction networks often relies on sample path simulation using Monte Carlo methods, even though these methods suffer from the typical slow O(N^{-1/2}) convergence rate as a function of the number of sample paths N. This paper investigates the combination of (randomised) quasi-Monte Carlo methods with an efficient sample path simulation procedure, namely τ-leaping. We show that this combination is often more effective than traditional Monte Carlo simulation in terms of the decay of statistical errors. The observed convergence rate behaviour is, however, non-trivial due to the discrete nature of the models of chemical reactions. We explain how this affects the performance of quasi-Monte Carlo methods by looking at a test problem in standard quadrature.
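A hedged sketch of the combination being investigated, on a toy birth-death network: τ-leaping driven either by pseudo-random uniforms (plain Monte Carlo) or by a scrambled Sobol sequence (randomised QMC), with each uniform mapped to a Poisson leap count through the inverse CDF. The rates, τ, and horizon below are made up:

```python
# Tau-leaping for X -> X+1 at rate k1 and X -> X-1 at rate k2*X,
# driven by either pseudo-random or scrambled-Sobol uniforms.
import numpy as np
from scipy.stats import poisson, qmc

k1, k2, tau, n_steps, x0 = 10.0, 0.1, 0.1, 50, 20

def tau_leap_paths(u: np.ndarray) -> np.ndarray:
    """Final states from a (n_paths, 2*n_steps) array of uniforms."""
    x = np.full(u.shape[0], float(x0))
    for s in range(n_steps):
        births = poisson.ppf(u[:, 2 * s], k1 * tau)
        deaths = poisson.ppf(u[:, 2 * s + 1], k2 * np.maximum(x, 0) * tau)
        x = np.maximum(x + births - deaths, 0.0)
    return x

rng = np.random.default_rng(0)
n = 2**12
u_mc = rng.random((n, 2 * n_steps))                      # plain Monte Carlo
sobol = qmc.Sobol(d=2 * n_steps, scramble=True, seed=0)  # randomised QMC
u_qmc = sobol.random(n)

print("MC  mean X(T):", tau_leap_paths(u_mc).mean())
print("QMC mean X(T):", tau_leap_paths(u_qmc).mean())
```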
Ferreira, Sandro S.; Krinski, Kleverton; Alves, Ragami C.; Benites, Mariana L.; Redkva, Paulo E.; Elsangedy, Hassan M.; Buzzachera, Cosme F.; Souza-Junior, Tácito P.; da Silva, Sergio G.
2014-01-01
The rating of perceived exertion (RPE) is the ability to detect and interpret organic sensations while performing exercise. This method has been used to measure the level of effort felt during weight-training at a given intensity. The purpose of this investigation was to compare session RPE values with traditional RPE measurements for different weight-training muscle actions, performed together or separately. Fourteen women with no former weight-training experience were recruited for the investigation. All participants completed five sessions of exercise: familiarization, maximum force, concentric-only (CONC), eccentric-only (ECC), and dynamic (DYN = CONC + ECC). Traditional RPE was measured after each series of exercises, and session RPE was measured 30 min after the end of the training session. The statistical analyses used were the paired t-test, one-way analysis of variance, and repeated-measures analysis of variance. No significant differences were found between traditional RPE and session RPE for the DYN, CONC, and ECC exercises. This investigation demonstrated that session RPE is similar to traditional RPE for weight-training involving concentric, eccentric, or dynamic muscle exercises, and that it can be used to prescribe and monitor weight-training sessions in older subjects. PMID:24834354
Phaser crystallographic software.
McCoy, Airlie J; Grosse-Kunstleve, Ralf W; Adams, Paul D; Winn, Martyn D; Storoni, Laurent C; Read, Randy J
2007-08-01
Phaser is a program for phasing macromolecular crystal structures by both molecular replacement and experimental phasing methods. The novel phasing algorithms implemented in Phaser have been developed using maximum likelihood and multivariate statistics. For molecular replacement, the new algorithms have proved to be significantly better than traditional methods in discriminating correct solutions from noise, and for single-wavelength anomalous dispersion experimental phasing, the new algorithms, which account for correlations between F(+) and F(-), give better phases (lower mean phase error with respect to the phases given by the refined structure) than those that use mean F and anomalous differences DeltaF. One of the design concepts of Phaser was that it be capable of a high degree of automation. To this end, Phaser (written in C++) can be called directly from Python, although it can also be called using traditional CCP4 keyword-style input. Phaser is a platform for future development of improved phasing methods and their release, including source code, to the crystallographic community.
Design of Clinical Support Systems Using Integrated Genetic Algorithm and Support Vector Machine
NASA Astrophysics Data System (ADS)
Chen, Yung-Fu; Huang, Yung-Fa; Jiang, Xiaoyi; Hsu, Yuan-Nian; Lin, Hsuan-Hung
Clinical decision support systems (CDSS) provide knowledge and specific information for clinicians to enhance diagnostic efficiency and improve healthcare quality. An appropriate CDSS can greatly enhance patient safety, improve healthcare quality, and increase cost-effectiveness. The support vector machine (SVM) is believed to be superior to traditional statistical and neural network classifiers; however, classification performance depends critically on finding a suitable combination of SVM parameters. A genetic algorithm (GA) can find an optimal solution within an acceptable time, and is faster than a greedy algorithm with an exhaustive search strategy. By taking advantage of the GA in quickly selecting salient features and adjusting SVM parameters, a method using integrated GA and SVM (IGS), which differs from the traditional approach in which the GA is used only for feature selection and the SVM only for classification, was used to design CDSSs for predicting successful ventilation weaning, diagnosing patients with severe obstructive sleep apnea, and discriminating different cell types from Pap smears. The results show that IGS outperforms methods using the SVM alone or a linear discriminator.
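A generic toy version of the integrated GA/SVM idea (not the authors' IGS): one genome encodes both a feature-selection mask and the SVM hyperparameters, and cross-validated accuracy is the fitness. Population size, mutation scale, and parameter ranges are arbitrary choices:

```python
# GA jointly tuning an SVM feature mask and (C, gamma) hyperparameters.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def fitness(genome):
    mask = genome[:-2] > 0.5          # first 20 genes: feature bits
    if not mask.any():
        return 0.0
    C = 10.0 ** (3 * genome[-2] - 1)      # C in [0.1, 100]
    gamma = 10.0 ** (3 * genome[-1] - 3)  # gamma in [0.001, 1]
    clf = SVC(C=C, gamma=gamma)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((20, 22))  # 20 feature genes + C gene + gamma gene
for generation in range(15):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-10:]]           # keep best half
    parents = elite[rng.integers(0, 10, (10, 2))]   # random pairs
    cut = rng.integers(1, 21)                       # one-point crossover
    children = np.concatenate([parents[:, 0, :cut],
                               parents[:, 1, cut:]], axis=1)
    children += rng.normal(0, 0.1, children.shape)  # mutation
    pop = np.concatenate([elite, np.clip(children, 0, 1)])

best = pop[np.argmax([fitness(g) for g in pop])]
print("best CV accuracy:", round(fitness(best), 3))
```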
Keedy, Alexander W; Durack, Jeremy C; Sandhu, Parmbir; Chen, Eric M; O'Sullivan, Patricia S; Breiman, Richard S
2011-01-01
This study was designed to determine whether an interactive three-dimensional presentation depicting liver and biliary anatomy is more effective for teaching medical students than a traditional textbook-format presentation of the same material. Forty-six medical students volunteered for participation in this study. Baseline demographic information, spatial ability, and knowledge of relevant anatomy were measured. Participants were randomized into two groups and presented with either a computer-based interactive learning module comprised of animations and still images highlighting various anatomical structures (3D group), or a computer-based text document containing the same images and text without animation or interactive features (2D group). Following each teaching module, students completed a satisfaction survey and a nine-item anatomic knowledge post-test. The 3D group scored higher on the post-test than the 2D group, with mean scores of 74% and 64%, respectively; however, when baseline differences in pretest scores were accounted for, this difference was not statistically significant (P = 0.33). Spatial ability did not statistically significantly correlate with post-test scores for either the 3D group or the 2D group. In the post-test satisfaction survey, the 3D group expressed a statistically significantly higher overall satisfaction rating compared to students in the 2D control group (4.5 versus 3.7 out of 5, P = 0.02). While the interactive 3D multimedia module received higher satisfaction ratings from students, it neither enhanced nor inhibited learning of complex hepatobiliary anatomy compared to an informationally equivalent traditional textbook-style approach. Copyright © 2011 American Association of Anatomists.
Virtual and stereoscopic anatomy: when virtual reality meets medical education.
de Faria, Jose Weber Vieira; Teixeira, Manoel Jacobsen; de Moura Sousa Júnior, Leonardo; Otoch, Jose Pinhata; Figueiredo, Eberval Gadelha
2016-11-01
OBJECTIVE The authors sought to construct, implement, and evaluate an interactive and stereoscopic resource for teaching neuroanatomy, accessible from personal computers. METHODS Forty fresh brains (80 hemispheres) were dissected. Images of areas of interest were captured using a manual turntable and processed and stored in a 5337-image database. Pedagogic evaluation was performed in 84 graduate medical students, divided into 3 groups: 1 (conventional method), 2 (interactive nonstereoscopic), and 3 (interactive and stereoscopic). The method was evaluated through a written theory test and a lab practicum. RESULTS Groups 2 and 3 showed the highest mean scores in pedagogic evaluations and differed significantly from Group 1 (p < 0.05). Group 2 did not differ statistically from Group 3 (p > 0.05). Size effects, measured as differences in scores before and after lectures, indicate the effectiveness of the method. ANOVA results showed significant difference (p < 0.05) between groups, and the Tukey test showed statistical differences between Group 1 and the other 2 groups (p < 0.05). No statistical differences between Groups 2 and 3 were found in the practicum. However, there were significant differences when Groups 2 and 3 were compared with Group 1 (p < 0.05). CONCLUSIONS The authors conclude that this method promoted further improvement in knowledge for students and fostered significantly higher learning when compared with traditional teaching resources.
Statistical reporting inconsistencies in experimental philosophy
Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B.; Sprenger, Jan
2018-01-01
Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220
Esthetic perception of orthodontic appliances by Brazilian children and adolescents
Kuhlman, Deise Caldas; de Lima, Tatiana Araújo; Duplat, Candice Belchior; Capelli, Jonas
2016-01-01
ABSTRACT Objective: The objective of the present study was to understand how children and adolescents perceive the esthetic attractiveness of a variety of orthodontic appliances, and to analyze preferences according to patients' age, sex, and socioeconomic status. Methods: A photograph album consisting of eight photographs of different orthodontic appliances and clear tray aligners placed in a consenting adult with a pleasing smile was used. A sample of children and adolescents aged between 8 and 17 years old (n = 276) was asked to rate each image for its attractiveness on a visual analog scale. Comparisons of appliance attractiveness were performed by means of nonparametric statistics with Friedman's test followed by Dunn's multiple comparison post-hoc test. Correlations between appliances and individuals' socioeconomic status, age, sex, and esthetic perception were assessed by means of Spearman's correlation analysis. Results: Attractiveness ratings of orthodontic appliances varied nonsignificantly for children in the following hierarchy: traditional metallic brackets with green elastomeric ligatures > traditional metallic brackets with gray elastomeric ligatures > sapphire esthetic brackets; and for adolescents as follows: sapphire esthetic brackets > clear aligner without attachments > traditional metallic brackets with green elastomeric ligatures. The correlation between individuals' socioeconomic status and esthetic perception was negative and statistically significant for appliances such as golden orthodontic brackets and traditional metallic brackets with green elastomeric ligatures. Conclusion: Metal appliances were considered very attractive, whereas aligners were classified as less attractive by children and adolescents. The correlation between esthetic perception and socioeconomic status revealed that individuals with a higher socioeconomic level judged esthetics as the most attractive attribute; for those with higher economic status, golden orthodontic brackets and traditional metallic brackets with green elastomeric ligatures were assessed as the worst esthetic option. PMID:27901230
Eliminating traditional reference services in an academic health sciences library: a case study
Schulte, Stephanie J
2011-01-01
Question: How were traditional librarian reference desk services successfully eliminated at one health sciences library? Setting: The analysis was done at an academic health sciences library at a major research university. Method: A gap analysis was performed, evaluating changes in the first eleven months through analysis of reference transaction and instructional session data. Main Results: Substantial increases were seen in the overall number of specialized reference transactions and those conducted by librarians lasting more than thirty minutes. The number of reference transactions overall increased after implementing the new model. Several new small-scale instructional initiatives began, though perhaps not directly related to the new model. Conclusion: Traditional reference desk services were eliminated at one academic health sciences library without negative impact on reference and instructional statistics. Eliminating ties to the confines of the physical library due to staffing reference desk hours removed one significant barrier to a more proactive liaison program. PMID:22022221
Estimation of the vortex length scale and intensity from two-dimensional samples
NASA Technical Reports Server (NTRS)
Reuss, D. L.; Cheng, W. P.
1992-01-01
A method is proposed for estimating flow features that influence flame wrinkling in reciprocating internal combustion engines, where traditional statistical measures of turbulence are suspect. Candidate methods were tested in a computed channel flow, where traditional turbulence measures are valid and performance can be rationally evaluated. Two concepts are tested. First, spatial filtering is applied to the two-dimensional velocity distribution and is found to reveal structures corresponding to the vorticity field; decreasing the spatial-frequency cutoff of the filter locally changes the character and size of the flow structures revealed. Second, the vortex length scale and intensity are estimated by computing the ensemble-average velocity distribution conditionally sampled on the vorticity peaks. The resulting conditionally sampled "average vortex" has a peak velocity less than half the rms velocity and a size approximately equal to the two-point-correlation integral length scale.
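A sketch of the second concept on a synthetic smooth field: locate local vorticity maxima, then ensemble-average velocity windows centered on them to form the "average vortex". The window size and peak threshold are arbitrary choices here, not the paper's values:

```python
# Conditional averaging of velocity windows around vorticity peaks.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

rng = np.random.default_rng(0)
u = gaussian_filter(rng.normal(size=(256, 256)), 4)  # smooth toy velocity
v = gaussian_filter(rng.normal(size=(256, 256)), 4)

# 2D vorticity: dv/dx - du/dy (x = axis 1, y = axis 0).
omega = np.gradient(v, axis=1) - np.gradient(u, axis=0)

# Local vorticity maxima above one standard deviation.
peaks = (omega == maximum_filter(omega, size=15)) & (omega > omega.std())
w = 16  # half-width of the averaging window
ys, xs = np.where(peaks)

windows = [np.stack([u[y-w:y+w, x-w:x+w], v[y-w:y+w, x-w:x+w]])
           for y, x in zip(ys, xs)
           if w <= y < u.shape[0] - w and w <= x < u.shape[1] - w]
avg_vortex = np.mean(windows, axis=0)  # ensemble-averaged (u, v) patch
print("peaks used:", len(windows),
      "| average-vortex shape:", avg_vortex.shape)
```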
Evaluating the use of simulation with beginning nursing students.
Alfes, Celeste M
2011-02-01
The purpose of this quasi-experimental study was to evaluate and compare the effectiveness of simulation versus a traditional skills laboratory method in promoting self-confidence and satisfaction with learning among beginning nursing students. A single convenience sample of 63 first-semester baccalaureate nursing students learning effective comfort care measures was recruited to compare the two teaching methods. Students participating in the simulation experience were statistically more confident than students in the traditional group. There was a slight, nonsignificant difference in satisfaction with learning between the two groups. Bivariate analysis revealed a significant positive relationship between self-confidence and satisfaction. Students in both groups reported higher levels of self-confidence following the learning experiences. The findings may influence the development of simulation experiences for beginning nursing students and encourage the implementation of simulation as a strand from beginning to end of nursing curricula. Copyright 2011, SLACK Incorporated.
Scalable privacy-preserving data sharing methodology for genome-wide association studies.
Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline
2014-08-01
The protection of the privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, provides a rigorous definition of privacy with meaningful guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium, and compare the methods with the differentially private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
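For intuition, here is the generic Laplace mechanism that differentially private releases of this kind build on: add Laplace noise scaled by sensitivity/ε to the statistic. The sensitivity bound for GWAS χ²-statistics derived in the cited work is more involved; the sensitivity and statistic values below are stand-ins:

```python
# Generic epsilon-differentially-private release of a single statistic.
import numpy as np

def laplace_release(value: float, sensitivity: float, epsilon: float,
                    rng: np.random.Generator) -> float:
    scale = sensitivity / epsilon  # Laplace noise scale
    return value + rng.laplace(0.0, scale)

rng = np.random.default_rng(0)
chi2_stat = 7.3          # illustrative chi-square value for one SNP
private = laplace_release(chi2_stat, sensitivity=1.0, epsilon=0.5, rng=rng)
print(f"released: {private:.3f} (true {chi2_stat})")
```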
Learning style-based teaching harvests a superior comprehension of respiratory physiology.
Anbarasi, M; Rajkumar, G; Krishnakumar, S; Rajendran, P; Venkatesan, R; Dinesh, T; Mohan, J; Venkidusamy, S
2015-09-01
Students entering medical college generally show vast diversity in their school education, and it becomes the responsibility of teachers to motivate students and meet the needs of all diversities. One such measure is teaching students in their own preferred learning style. The present study aimed to implement a learning style-based teaching-learning program for medical students and to reveal its significance and utility. Learning styles of students were assessed online using the visual-auditory-kinesthetic (VAK) learning style self-assessment questionnaire. When respiratory physiology was taught, students were divided into three groups, namely, visual (n = 34), auditory (n = 44), and kinesthetic (n = 28), based on their learning style. A fourth group (the traditional group; n = 40) was formed by choosing students randomly from the above three groups. The visual, auditory, and kinesthetic groups were taught following the appropriate teaching-learning strategies, while the traditional group was taught via the routine didactic lecture method. The effectiveness of this intervention was evaluated by a pretest and two posttests, posttest 1 immediately after the intervention and posttest 2 after a month. In posttest 1, one-way ANOVA showed a statistically significant difference (p = 0.005); post hoc analysis showed significance between the kinesthetic and traditional groups (p = 0.002). One-way ANOVA also showed a significant difference in posttest 2 scores (p < 0.0001), with post hoc analysis showing significance between each learning style-based group and the traditional group [visual vs. traditional (p = 0.002), auditory vs. traditional (p = 0.03), and kinesthetic vs. traditional (p = 0.001)]. This study emphasizes that teaching methods tailored to students' style of learning definitely improve their understanding, performance, and retrieval of the subject. Copyright © 2015 The American Physiological Society.
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach which show increased sensitivity and improved statistical power to detect a group-level effects. We also provide an open source implementation. PMID:24614060
Polynomial probability distribution estimation using the method of moments
Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper
2017-01-01
We suggest a procedure for estimating Nth-degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. To show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, and Weibull, as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this is a comparatively simple procedure that can be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular, this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, to show an advanced application of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949
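A minimal sketch of the method-of-moments construction described above: on an interval [a, b], choose polynomial coefficients so that the first N moments of p(x) = sum_j c_j x^j match the sample moments, which reduces to a linear solve. This is a generic illustration, not the paper's exact algorithm:

```python
# Polynomial PDF approximation by moment matching. For p(x) = sum c_j x^j
# on [a, b], the k-th moment is sum_j c_j (b^{k+j+1} - a^{k+j+1})/(k+j+1),
# so matching m_0..m_{N-1} (with m_0 = 1 for normalization) is a linear
# system in the coefficients c_j.
import numpy as np

def polynomial_pdf_coeffs(moments, a, b):
    """Coefficients of a degree-(N-1) polynomial PDF on [a, b]."""
    n = len(moments)
    A = np.empty((n, n))
    for k in range(n):
        for j in range(n):
            p = k + j + 1
            A[k, j] = (b**p - a**p) / p
    return np.linalg.solve(A, moments)

rng = np.random.default_rng(0)
x = rng.normal(0.5, 0.15, 100_000)           # sample from a known PDF
x = x[(x >= 0) & (x <= 1)]                   # restrict support to [0, 1]
moments = [np.mean(x**k) for k in range(6)]  # m_0..m_5 (m_0 = 1)
c = polynomial_pdf_coeffs(moments, 0.0, 1.0)
print("polynomial coefficients:", np.round(c, 3))
```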
Wang, Jie; Zeng, Hao-Long; Du, Hongying; Liu, Zeyuan; Cheng, Ji; Liu, Taotao; Hu, Ting; Kamal, Ghulam Mustafa; Li, Xihai; Liu, Huili; Xu, Fuqiang
2018-03-01
Metabolomics generates a profile of small molecules from cellular/tissue metabolism, which can directly reflect the mechanisms of complex networks of biochemical reactions. Traditional metabolomics methods, such as OPLS-DA and PLS-DA, are mainly used for binary class discrimination, yet multiple groups are often involved in a biological system, especially in brain research: multiple brain regions are involved in studies of brain metabolic dysfunctions such as alcoholism and Alzheimer's disease. In the current study, 10 different brain regions were used for comparative studies between alcohol-preferring and non-preferring rats, and between male and female rats. As many classes are involved (ten regions and four types of animals), traditional metabolomics methods are no longer efficient for showing differentiation. Here, a novel strategy based on the decision tree algorithm was employed to construct classification models that screen out the major characteristics of the ten brain regions at the same time. Subsequently, this method was also used to select the major effective brain regions related to alcohol preference and gender difference. Compared with traditional multivariate statistical methods, the decision tree could construct acceptable and understandable classification models for multi-class data analysis. The approach could therefore also be applied to other metabolomics studies involving multi-class data. Copyright © 2017 Elsevier B.V. All rights reserved.
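A minimal sketch of multi-class discrimination with a decision tree, the kind of model contrasted above with binary methods such as PLS-DA. Synthetic intensity profiles stand in for metabolomics data, and the fitted tree can be printed as human-readable rules:

```python
# Multi-class decision tree on synthetic "metabolite profiles" from
# several brain regions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n_regions, n_samples, n_metabolites = 10, 40, 25
X = np.vstack([rng.normal(loc=r * 0.5, scale=1.0,
                          size=(n_samples, n_metabolites))
               for r in range(n_regions)])
y = np.repeat(np.arange(n_regions), n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", round(tree.score(X_te, y_te), 3))
# Unlike most multivariate models, the fitted tree is human-readable:
print(export_text(tree, max_depth=2))
```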
Fatusić, Zlatan; Hudić, Igor
2009-02-01
To evaluate the incidence of peritoneal adhesions as a post-operative complication after caesarean section following the Misgav Ladach method and compare it with peritoneal adhesions following traditional caesarean section methods (Pfannenstiel-Dörffler, low midline laparotomy-Dörffler). The analysis is retrospective and is based on medical documentation of the Clinic for Gynecology and Obstetrics, University Clinical Centre, Tuzla, Bosnia and Herzegovina (data from 1 January 2001 to 31 December 2005). We analysed previous caesarean sections by caesarean section method (200 by the Misgav Ladach method, 100 by the Pfannenstiel-Dörffler method and 100 by low midline laparotomy-Dörffler). Adhesion scores were assigned using a previously validated scoring system. We found a statistically significant difference (p < 0.05) in the incidence of peritoneal adhesions at second and third caesarean section between the Misgav Ladach method and the Pfannenstiel-Dörffler and low midline laparotomy-Dörffler methods. The difference in incidence of peritoneal adhesions between the low midline laparotomy-Dörffler and Pfannenstiel-Dörffler methods was not statistically significant (p > 0.05). The mean pelvic adhesion score was statistically lower in the Misgav Ladach group (0.43 +/- 0.79) than in the Pfannenstiel-Dörffler (0.71 +/- 1.27) and low midline laparotomy-Dörffler groups (0.99 +/- 1.49) (p < 0.05). Our study showed that the Misgav Ladach method of caesarean section results in a lower incidence of peritoneal adhesions as a post-operative complication of previous caesarean section.
NASA Astrophysics Data System (ADS)
Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha
2015-01-01
Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy-to-use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.
Surveying Europe’s Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA
Márton, Orsolya; Schmidt, Benedikt R.; Gál, Júlia Tünde; Jelić, Dušan
2017-01-01
In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence. PMID:28129383
Mean-Reverting Portfolio With Budget Constraint
NASA Astrophysics Data System (ADS)
Zhao, Ziping; Palomar, Daniel P.
2018-05-01
This paper considers the mean-reverting portfolio design problem arising from statistical arbitrage in the financial markets. We first propose a general problem formulation aimed at finding a portfolio of underlying component assets by optimizing a mean-reversion criterion characterizing the mean-reversion strength, taking into consideration the variance of the portfolio and an investment budget constraint. Then several specific problems are considered based on the general formulation, and efficient algorithms are proposed. Numerical results on both synthetic and market data show that our proposed mean-reverting portfolio design methods can generate consistent profits and outperform the traditional design methods and the benchmark methods in the literature.
Choi, Eunyoung; Lindquist, Ruth; Song, Yeoungsuk
2014-01-01
Problem-based learning (PBL) is a method widely used in nursing education to develop students' critical thinking skills to solve practice problems independently. Although PBL has been used in nursing education in Korea for nearly a decade, few studies have examined its effects on Korean nursing students' learning outcomes, and few Korean studies have examined relationships among these outcomes. The objectives of this study are to examine outcome abilities including critical thinking, problem-solving, and self-directed learning of nursing students receiving PBL vs. traditional lecture, and to examine correlations among these outcome abilities. A quasi-experimental non-equivalent group pretest-posttest design was used. First-year nursing students (N=90) were recruited from two different junior colleges in two cities (GY and GJ) in South Korea. Of the two selected educational programs, one used traditional lecture methods, while the other used PBL methods. Standardized self-administered questionnaires of critical thinking, problem-solving, and self-directed learning abilities were administered before and at 16 weeks (after instruction). Learning outcomes were significantly positively correlated; however, outcomes were not statistically different between groups. Students in the PBL group improved across all abilities measured, while student scores in the traditional lecture group decreased in problem-solving and self-directed learning. Critical thinking was positively associated with problem-solving and self-directed learning (r=.71 and r=.50, respectively, p<.001); problem-solving was positively associated with self-directed learning (r=.75, p<.001). Learning outcomes of PBL were not significantly different from traditional lecture in this small, underpowered study, despite positive trends. Larger studies are recommended to study effects of PBL on critical student abilities. Copyright © 2013 Elsevier Ltd. All rights reserved.
Traditional and Atypical Presentations of Anxiety in Youth with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Kerns, Connor Morrow; Kendall, Philip C.; Berry, Leandra; Souders, Margaret C.; Franklin, Martin E.; Schultz, Robert T.; Miller, Judith; Herrington, John
2014-01-01
We assessed anxiety consistent (i.e., "traditional") and inconsistent (i.e., "atypical") with Diagnostic and Statistical Manual (DSM) definitions in autism spectrum disorder (ASD). Differential relationships between traditional anxiety, atypical anxiety, child characteristics, anxiety predictors and ASD symptomology were…
Fabrication of Josephson Junction without shadow evaporation
NASA Astrophysics Data System (ADS)
Wu, Xian; Ku, Hsiangsheng; Long, Junling; Pappas, David
We developed a new method of fabricating Josephson junctions (Al/AlOx/Al) without shadow evaporation. Statistics from room-temperature junction resistance and measurements of qubits are presented. Unlike the traditional "Dolan bridge" technique, this method requires two individual lithographies and straight evaporations of Al. An argon RF plasma is used to remove native AlOx after the first evaporation, followed by oxidation and a second Al evaporation. Junction resistance measured at room temperature shows a linear dependence on P_ox (oxidation pressure) and on √t_ox (t_ox being the oxidation time), and is inversely proportional to junction area. We have seen 100% yield of qubits made with this method. This method is promising because it eliminates angle dependence during junction fabrication and facilitates large-scale qubit fabrication.
Improving Maintenance Data Collection Via Point-of-Maintenance (POMX) Implementation
2006-03-01
accurate documentation, (3) identifying and correcting the root causes for poor data integrity, and (4) educating the unit on the critical need for data ... the validity of the results. The data in this study were analyzed using the SAS JMP 6.0 statistical software package. The results for the tests ... traditional keyboard data entry methods at a computer terminal. These terminals are typically located in the aircraft maintenance unit (AMU) facility, away
Computing Science and Statistics. Volume 24. Graphics and Visualization
1993-03-01
the dough, turbulent fluid flow, the time between drips of water from a faucet, Brownian motion ... behavior changes radically when the population growth ... cookie which clearly is appropriate as after-dinner fun ... "the discrete parameter analogue of continuous parameter time series analysis". I strongly ... methods. One problem that statisticians traditionally seem to ... Your fortune cookie of the night reads: "You have good friends who will come to your aid in"
Hageman, W J; Arrindell, W A
1999-12-01
Based on a secondary analysis of the Jacobson and Truax [Jacobson, N.S. & Truax, P. (1991). A statistical approach to defining meaningful change in psychotherapy research. Journal of Consulting and Clinical Psychology, 59, 12-19.] data using both their own traditional approach and the refined method advanced by Hageman and Arrindell [Hageman, W.J.J.M., & Arrindell, W.A. (1999). Establishing clinically significant change: increment of precision and the distinction between individual and group level of analysis. Behaviour Research and Therapy, 37, 1169-1193.], McGlinchey and Jacobson [McGlinchey, J.B., & Jacobson, N.S. (1999). Clinically significant but impractical? A response to Hageman and Arrindell. Behaviour Research and Therapy, 37, 1211-1217.] reported practically identical findings on reliable and clinically significant change across the two approaches. This led McGlinchey and Jacobson to conclude that there is little practical gain in utilizing the refined method over the traditional approach. Close inspection of the data used by McGlinchey and Jacobson, however, revealed a serious mistake with respect to the value of the standard error of measurement that was employed in their calculations. When the proper index value was utilised, further re-analysis by the present authors disclosed clear differences (i.e. different classifications of Ss) across the two approaches. Importantly, these differences followed exactly the same pattern as depicted in Table 2 in Hageman and Arrindell (1999). The theoretical advantages of the refined method, i.e. enhanced precision, appropriate distinction between analysis at the individual and group levels, and maximal comparability of findings across studies, exceed those of the traditional method. Application of the refined method may be carried out within approximately half an hour, which not only supports its practical manageability, but also challenges the suggestion of McGlinchey and Jacobson (1999) that the relevant method would be too complex (impractical) for the average scientist. The reader is offered the opportunity of obtaining an SPSS setup in the form of an ASCII text file by means of which the relevant calculations can be carried out. The ways in which the valuable commentaries by Hsu [Hsu, L.M. (1999). A comparison of three methods of identifying reliable and clinically significant client changes: commentary on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1195-1202.] and Speer [Speer, D.C. (1999). What is the role of two-wave designs in clinical research? Comment on Hageman and Arrindell. Behaviour Research and Therapy, 37, 1203-1210.] contribute to a better understanding of the technical/statistical backgrounds of the traditional and refined methods were also discussed.
SADEGHI, ROYA; SEDAGHAT, MOHAMMAD MEHDI; SHA AHMADI, FARAMARZ
2014-01-01
Introduction: Blended learning, a new approach in educational planning, is defined as applying more than one method, strategy, technique or medium in education. Today, due to the development of Internet network infrastructure and the access of most students, the Internet can be utilized along with traditional and conventional methods of training. The aim of this study was to compare the students' learning and satisfaction under a combination of lecture and e-learning versus conventional lecture methods. Methods: This quasi-experimental study was conducted among the sophomore students of the Public Health School, Tehran University of Medical Science, in 2012-2013. Four classes of the school were randomly selected and divided into two groups. Education in two classes (45 students) was in the form of the lecture method, and in the other two classes (48 students) it was a blended method combining e-learning and lecture methods. The students' knowledge about tuberculosis in the two groups was collected and measured by using pre- and post-tests. This step was done by sending self-reported electronic questionnaires to the students' email addresses through Google Document software. At the end of the educational programs, students' satisfaction and comments about the two methods were also collected by questionnaires. Statistical tests such as descriptive methods, paired t-test, independent t-test and ANOVA were done through the SPSS 14 software, and p≤0.05 was considered as a significant difference. Results: The mean scores of the lecture and blended groups were 13.18±1.37 and 13.35±1.36, respectively; the difference between the pre-test scores of the two groups was not statistically significant (p=0.535). Knowledge scores increased in both groups after training, and the mean and standard deviation of knowledge scores of the lecture and blended groups were 16.51±0.69 and 16.18±1.06, respectively. The difference between the post-test scores of the two groups was not statistically significant (p=0.112). Students' satisfaction with the blended learning method was higher than with the lecture method. Conclusion: The results revealed that the blended method is effective in increasing the students' learning rate. E-learning can be used to teach some courses and may also be advantageous from an economic perspective. Since the majority of students in the country's universities of medical sciences have access to the Internet and an email address, e-learning could be used as a supplement to traditional teaching methods or sometimes as an educational alternative, because this method of teaching increases the students' knowledge, satisfaction and attention. PMID:25512938
Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang
2016-03-01
In the practice and principles of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that using this method combined with statistical analysis may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.
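A minimal sketch of how the two DL parameters named above might be computed from a photon-count decay curve. The entropy definition used here (Shannon entropy of the normalized curve) is our assumption about what the abstract calls statistical entropy, not necessarily the paper's exact estimator.

```python
import numpy as np

def dl_parameters(counts):
    """Mean intensity and Shannon entropy of a delayed-luminescence decay.
    `counts` holds the photon count per time bin. Normalizing the curve to
    a probability vector before taking entropy is our assumption."""
    counts = np.asarray(counts, dtype=float)
    mean_intensity = counts.mean()
    p = counts / counts.sum()
    p = p[p > 0]                        # avoid log(0)
    entropy = -(p * np.log(p)).sum()
    return mean_intensity, entropy

# Example: a hyperbolic-like decay, a typical shape for DL signals
t = np.arange(1, 101)
print(dl_parameters(1000.0 / t))
```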
Radiomic analysis in prediction of Human Papilloma Virus status.
Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu
2017-12-01
Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status has been tested through invasive laboratory tests. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. Quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked the highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV-positive and HPV-negative subjects suggested that HPV-positive patients usually have smaller and simpler tumors.
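The pipeline named above (feature extraction, statistical feature selection, prediction) can be sketched generically as below. The data, feature count, and the particular selector/classifier pair (univariate F-test plus penalized logistic regression) are illustrative stand-ins, not the challenge-winning model.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in data: rows = patients, columns = radiomic features
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))         # e.g. shape/texture/intensity features
y = rng.integers(0, 2, size=120)        # HPV status (0 = negative, 1 = positive)

# Univariate statistical screening followed by a classifier -- one common
# pattern for this kind of task, evaluated by cross-validated AUC.
model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=20),
                      LogisticRegression(max_iter=1000))
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```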
Statistical Optimality in Multipartite Ranking and Ordinal Regression.
Uematsu, Kazuki; Lee, Yoonkyung
2015-05-01
Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions including exponential loss, the optimal ranking function can be represented as a ratio of the weighted conditional probability of upper categories to that of lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods such as the proportional odds model in statistics with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with a simulation study and real data analysis.
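A small numerical rendering of the stated form of the optimal ranking function, assuming the conditional category probabilities and the cost-derived weights are already available; the weight values below are illustrative, since the paper derives them from the specific loss.

```python
import numpy as np

def ranking_score(p, up_weights, down_weights):
    """Score one instance as the ratio of cost-weighted conditional
    probability of upper categories to that of lower categories.
    p[j] = P(Y = j | x) for ordinal categories j = 0..K-1 (low to high).
    With two categories and unit weights this reduces to the bipartite
    likelihood ratio P(Y=1|x) / P(Y=0|x)."""
    p = np.asarray(p, dtype=float)
    return np.dot(up_weights, p) / np.dot(down_weights, p)

# Three ordered categories; weight mass on the top vs. bottom categories.
p = [0.2, 0.3, 0.5]
print(ranking_score(p, up_weights=[0.0, 0.5, 1.0], down_weights=[1.0, 0.5, 0.0]))
```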
Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.
2008-01-01
Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR-constructed variables have been evaluated with a naïve Bayes classifier that is combined with 10-fold cross-validation to obtain an estimate of predictive accuracy or generalizability of epistasis models. Traditionally, we have used permutation testing to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250
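A hedged sketch of the EVD shortcut: fit a Gumbel distribution to a small number of permutation statistics and read the p-value from its tail, instead of running ~1000 permutations. The choice of 20 permutations and the Gumbel family follow the spirit of the abstract; the authors' exact procedure may differ in detail.

```python
import numpy as np
from scipy.stats import gumbel_r

def evd_pvalue(observed_stat, perm_stats):
    """Approximate a permutation p-value by fitting an extreme value
    (Gumbel) distribution to a *small* set of permutation statistics.
    `perm_stats` might hold only a few dozen values (e.g. best MDR testing
    accuracies under shuffled labels) -- the motivation for the speedup."""
    loc, scale = gumbel_r.fit(perm_stats)
    return gumbel_r.sf(observed_stat, loc=loc, scale=scale)

rng = np.random.default_rng(1)
null = rng.gumbel(loc=0.55, scale=0.02, size=20)   # only 20 permutations
print(evd_pvalue(0.65, null))
```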
Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions
NASA Astrophysics Data System (ADS)
Chen, Nan; Majda, Andrew J.
2018-02-01
Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. In particular, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method requires only O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
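The hybrid idea, reduced to one dimension in each subspace for illustration: a nonparametric kernel density estimate for the resolved variable, and a parametric mixture of conditional Gaussians, averaged over a small ensemble, for the remaining variable. All names and the 1-D reduction are ours; the paper's algorithm operates on high-dimensional conditional Gaussian systems with closed-form conditional statistics.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def hybrid_pdf(u_samples, cond_means, cond_stds, u_grid, x_grid):
    """Sketch of the hybrid strategy:
    - kernel density estimate for the observed/low-dimensional variable u,
      built from a small ensemble `u_samples`;
    - parametric Gaussian mixture for the unresolved variable x, averaging
      the conditional Gaussians N(cond_means[l], cond_stds[l]**2) carried
      by each ensemble member l."""
    p_u = gaussian_kde(u_samples)(u_grid)
    p_x = np.mean([norm.pdf(x_grid, m, s)
                   for m, s in zip(cond_means, cond_stds)], axis=0)
    return p_u, p_x

rng = np.random.default_rng(2)
u = rng.standard_t(df=3, size=100)            # fat-tailed data, ~100 members
p_u, p_x = hybrid_pdf(u, rng.normal(size=100), 1 + rng.random(100),
                      np.linspace(-5, 5, 11), np.linspace(-6, 6, 11))
print(p_u.round(3), p_x.round(3))
```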
Stand-Biased Versus Seated Classrooms and Childhood Obesity: A Randomized Experiment in Texas
Wendel, Monica L.; Zhao, Hongwei; Jeffrey, Christina
2016-01-01
Objectives. To measure changes in body mass index (BMI) percentiles among third- and fourth-grade students in stand-biased classrooms and traditional seated classrooms in 3 Texas elementary schools. Methods. Research staff recorded the height and weight of 380 students in 24 classrooms across the 3 schools at the beginning (2011–2012) and end (2012–2013) of the 2-year study. Results. After adjustment for grade, race/ethnicity, and gender, there was a statistically significant decrease in BMI percentile in the group that used stand-biased desks for 2 consecutive years relative to the group that used standard desks during both years. Mean BMI increased by 0.1 and 0.4 kilograms per meter squared in the treatment and control groups, respectively. The between-group difference in BMI percentile change was 5.24 (SE = 2.50; P = .037). No other covariates had a statistically significant impact on BMI percentile changes. Conclusions. Changing a classroom to a stand-biased environment had a significant effect on students’ BMI percentile, indicating the need to redesign traditional classroom environments. PMID:27552276
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy as they are unable to predict the extreme points, i.e. the pollution maximum and minimum cut-offs cannot be determined using such an approach. Also, such methods are an inefficient approach to better output forecasting. With the advancement in technology and research, an alternative to the above traditional methods has been proposed, i.e. the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting of air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence methods can be used for the forecasting of air pollutants over an urban area.
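The four evaluation measures listed above have standard definitions in air-quality model evaluation, sketched below; the authors' exact conventions could differ in detail.

```python
import numpy as np

def forecast_scores(obs, pred):
    """Correlation R, normalized mean square error, fractional bias, and
    index of agreement for paired observed/predicted concentrations."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    r = np.corrcoef(obs, pred)[0, 1]
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    fb = 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    ioa = 1 - np.sum((pred - obs) ** 2) / np.sum(
        (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}

print(forecast_scores([60, 80, 120, 90], [65, 75, 110, 95]))
```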
The difference between a dynamic and mechanical approach to stroke treatment.
Helgason, Cathy M
2007-06-01
The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.
Sun, Chenjing; Qi, Xiaokun
2018-01-01
Lumbar puncture (LP) is an essential part of adult neurology residency training. Technical as well as nontechnical training is needed. However, current assessment tools mostly focus on the technical aspects of LP. We propose a training method, problem- and simulator-based learning (PSBL), for LP residency training to develop the overall skills of neurology residents. We enrolled 60 neurology postgraduate-year-1 residents from our standardized residency training center and randomly divided them into 2 groups: a traditional teaching group and a PSBL group. After training, we assessed the extent to which the residents were ready to perform LP and tracked successful LPs performed by the residents. We then asked residents to complete questionnaires about the training models. Performance scores and the results of questionnaires were compared between the 2 groups. Students and faculty concluded that PSBL provided a more effective learning experience than the traditional teaching model. Although no statistical difference was found in the pretest, posttest, and improvement rate scores between the 2 groups, based on questionnaire scores and the number of successful LPs after training, the PSBL group showed a statistically significant improvement compared with the traditional group. Findings indicated that nontechnical elements, such as planning before the procedure and controlling uncertainties during the procedure, are more crucial than technical elements. Compared with the traditional teaching model, PSBL for LP training can develop overall procedural skills, including technical and nontechnical elements, improving performance. Residents in the PSBL group were more confident and effective in performing LP. Copyright © 2017 Elsevier Inc. All rights reserved.
Isufi, Almira; Plotino, Gianluca; Grande, Nicola Maria; Ioppolo, Pietro; Testarelli, Luca; Bedini, Rossella; Al-Sudani, Dina; Gambarini, Gianluca
2016-01-01
Aim: To determine and compare the fracture resistance of endodontically treated teeth restored with a bulk fill flowable material (SDR) and a traditional resin composite. Methods: Thirty maxillary and 30 mandibular first molars were selected based on similar dimensions. After cleaning, shaping and filling of the root canals and adhesive procedures, specimens were assigned to 3 subgroups for each tooth type (n=10): Group A: control group, including intact teeth; Group B: access cavities were restored with a traditional resin composite (EsthetX; Dentsply-Italy, Rome, Italy); Group C: access cavities were restored with a bulk fill flowable composite (SDR; Dentsply-Italy), except for a 1.5 mm layer of the occlusal surface that was restored with the same resin composite as Group B. The specimens were subjected to compressive force in a static material-testing machine until fracture occurred; the maximum fracture load of the specimens was measured (N) and the type of fracture was recorded as favorable or unfavorable. Data were statistically analyzed with one-way analysis of variance (ANOVA) and Bonferroni tests (P<0.05). Results: No statistically significant differences were found among groups (P>0.05). Fracture resistance of endodontically treated teeth restored with a traditional resin composite and with a bulk fill flowable composite (SDR) was similar in both maxillary and mandibular molars and showed no significant decrease in fracture resistance compared to intact specimens. Conclusions: No significant difference was observed in the mechanical fracture resistance of endodontically treated molars restored with traditional resin composite restorations compared to bulk fill flowable composite restorations. PMID:27486505
Waugh, Mark H.; Hopwood, Christopher J.; Krueger, Robert F.; Morey, Leslie C.; Pincus, Aaron L.; Wright, Aidan G. C.
2016-01-01
The Diagnostic and Statistical Manual of Mental Disorders Fifth Edition (DSM-5) Section III Alternative Model for Personality Disorders (AMPD; APA, 2013) represents an innovative system for simultaneous psychiatric classification and psychological assessment of personality disorders (PD). The AMPD combines major paradigms of personality assessment and provides an original, heuristic, flexible, and practical framework that enriches clinical thinking and practice. Origins, emerging research, and clinical application of the AMPD for diagnosis and psychological assessment are reviewed. The AMPD integrates assessment and research traditions, facilitates case conceptualization, is easy to learn and use, and assists in providing patient feedback. New as well as existing tests and psychometric methods may be used to operationalize the AMPD for clinical assessments. PMID:28450760
Problematizing Statistical Literacy: An Intersection of Critical and Statistical Literacies
ERIC Educational Resources Information Center
Weiland, Travis
2017-01-01
In this paper, I problematize traditional notions of statistical literacy by juxtaposing it with critical literacy. At the school level statistical literacy is vitally important for students who are preparing to become citizens in modern societies that are increasingly shaped and driven by data based arguments. The teaching of statistics, which is…
Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li
2017-10-01
To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. This established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
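A bare-bones sketch of a PCA-based MSPC chart of the kind described: the model is built from in-control ("normal batch") data, and each new observation is scored by Hotelling T² (distance within the model plane) and a squared-residual statistic of the DModX/SPE type (distance from the plane). This illustrates the principle only, not the software used in the study.

```python
import numpy as np

def mspc_reference(X, n_comp):
    """Build a PCA model from in-control data X (rows = observations,
    columns = NIR/process variables); return charting parameters."""
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                        # loadings
    lam = (S[:n_comp] ** 2) / (len(X) - 1)   # score variances
    return mu, sd, P, lam

def chart_stats(x_new, mu, sd, P, lam):
    """Hotelling T^2 and SPE-type residual for one new observation."""
    z = (x_new - mu) / sd
    t = z @ P
    T2 = np.sum(t ** 2 / lam)
    resid = z - P @ t
    return T2, resid @ resid

rng = np.random.default_rng(3)
X_ok = rng.normal(size=(50, 20))             # pooled in-control observations
model = mspc_reference(X_ok, n_comp=3)
print(chart_stats(rng.normal(size=20), *model))
```

In practice, control limits for both statistics would be set from the reference batches, and points exceeding them flag a deviating extraction run.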
Ten-Doménech, Isabel; Beltrán-Iturat, Eduardo; Herrero-Martínez, José Manuel; Sancho-Llopis, Juan Vicente; Simó-Alfonso, Ernesto Francisco
2015-06-24
In this work, a method for the separation of triacylglycerols (TAGs) present in human milk and from other mammalian species by reversed-phase high-performance liquid chromatography using a core-shell particle packed column with UV and evaporative light-scattering detectors is described. Under optimal conditions, a mobile phase containing acetonitrile/n-pentanol at 10 °C gave an excellent resolution among more than 50 TAG peaks. A small-scale method for fat extraction in these milks (particularly of interest for human milk samples) using minimal amounts of sample and reagents was also developed. The proposed extraction protocol and the traditional method were compared, giving similar results, with respect to the total fat and relative TAG contents. Finally, a statistical study based on linear discriminant analysis on the TAG composition of different types of milks (human, cow, sheep, and goat) was carried out to differentiate the samples according to their mammalian origin.
Bovet, Alexandre; Morone, Flaviano; Makse, Hernán A
2018-06-06
Measuring and forecasting opinion trends from real-time social media is a long-standing goal of big-data analytics. Despite the large amount of work addressing this question, there has been no clear validation of online social media opinion trends against traditional surveys. Here we develop a method to infer the opinion of Twitter users by using a combination of statistical physics of complex networks and machine learning based on hashtag co-occurrence to build an in-domain training set of the order of a million tweets. We validate our method in the context of the 2016 US Presidential Election by comparing the Twitter opinion trend with the New York Times National Polling Average, representing an aggregate of hundreds of independent traditional polls. The Twitter opinion trend follows the aggregated NYT polls with remarkable accuracy. We investigate the dynamics of the social network formed by the interactions among millions of Twitter supporters and infer the support of each user for the presidential candidates. Our analytics unleash the power of Twitter to uncover social trends, from elections and brands to political movements, at a fraction of the cost of traditional surveys.
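One way to sketch the in-domain training-set construction: grow a candidate hashtag set by co-occurrence with a few seed hashtags, then label tweets by the expanded sets. The count threshold and the single expansion round are our simplifications of the approach the abstract outlines.

```python
from collections import Counter

def expand_seeds(tweets_hashtags, seeds, min_count=5):
    """Grow a seed hashtag set by one round of co-occurrence.
    `tweets_hashtags` is a list of hashtag sets, one per tweet; a hashtag
    joins the set if it co-occurs with a seed in at least `min_count`
    tweets. Labeled tweets can then serve as classifier training data."""
    cooc = Counter()
    for tags in tweets_hashtags:
        if tags & seeds:                       # tweet touches a seed hashtag
            for t in tags - seeds:
                cooc[t] += 1
    return seeds | {t for t, c in cooc.items() if c >= min_count}

seeds = {"#maga"}
tweets = [{"#maga", "#trump2016"}] * 6 + [{"#imwithher"}] * 6
print(expand_seeds(tweets, seeds))             # adds '#trump2016' only
```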
NASA Astrophysics Data System (ADS)
Haynes, James Christopher
Scope and Method of Study. The purpose of this study was to determine if a science-enhanced curriculum produced by the Center for Agricultural and Environmental Research and Training (CAERT) taught in a secondary level animal science or horticulture course would improve students' understanding of selected scientific principles significantly, when compared to students who were instructed using a traditional curriculum. A secondary purpose was to determine the effect that the science-enhanced CAERT curriculum would have on students' agricultural knowledge when compared to students who were instructed using a traditional curriculum. The design of the study was ex post facto, causal comparative because no random assignment of the treatment group occurred. Findings and Conclusions. No statistically significant difference was found between the treatment and comparison groups regarding science achievement. However, the mean score of the treatment group was slightly larger than the comparison group indicating a slightly higher achievement level; a "Small" effect size (d = .16) for this difference was calculated. It was determined that a statistically significant difference (p < .05) existed in agriculture competency scores in animal science (p = .001) and horticulture (p = .000) as a result of the treatment. Moreover, this was considered to be a "very large" effect (d = 1.18) in animal science and a "large" effect (d = .92) in horticulture. When considering student achievement in science, this study found that the use of the science-enhanced CAERT curriculum did not result in a statistically significant increase (p < .05) in student performance as determined by the TerraNova3 science proficiency examination. However, students who were instructed using the CAERT curriculum scored better overall than those who were instructed using a "traditional" curriculum.
Upton, Gabrielle A; Tinley, Paul; Al-Aubaidy, Hayder; Crawford, Rachel
This pilot study aimed to investigate and compare the perceived pain relief effectiveness of two different modes of TENS in people with painful diabetic neuropathy (PDN). A cross-over study was conducted at Charles Sturt University, Orange. Five participants with PDN were assessed with a McGill Pain Questionnaire before and after each of the two TENS treatments. Participants were randomly allocated to Traditional TENS (80 Hz, 200 ms) or Acupuncture-like TENS (2 Hz, 200 ms) and the treatments were applied daily for 30 min over ten days. Following a seven-day washout period, the alternate mode of TENS was carried out using the same method. Wilcoxon Signed Rank tests were used to statistically analyse the results. All five participants reported personally meaningful pain relief during one or both of the TENS treatments. The Wilcoxon signed rank testing showed no statistical significance, p=1, likely due to the small sample size. Acupuncture-like TENS had a large effect size (z=-1.625, r=0.514), whilst Traditional TENS produced a medium effect size (z=-1.214, r=0.384). No adverse effects were reported. Acupuncture-like TENS may be more effective for PDN than Traditional TENS. A larger scale replication of this pilot study is warranted. Copyright © 2016 Diabetes India. Published by Elsevier Ltd. All rights reserved.
Teaching Statistics Online Using "Excel"
ERIC Educational Resources Information Center
Jerome, Lawrence
2011-01-01
As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…
Fienen, Michael N.; Selbig, William R.
2012-01-01
A new sample collection system was developed to improve the representation of sediment entrained in urban storm water by integrating water quality samples from the entire water column. The depth-integrated sampler arm (DISA) was able to mitigate sediment stratification bias in storm water, thereby improving the characterization of suspended-sediment concentration and particle size distribution at three independent study locations. Use of the DISA decreased variability, which improved statistical regression to predict particle size distribution using surrogate environmental parameters, such as precipitation depth and intensity. The performance of this statistical modeling technique was compared to results using traditional fixed-point sampling methods and was found to perform better. When environmental parameters can be used to predict particle size distributions, environmental managers have more options when characterizing concentrations, loads, and particle size distributions in urban runoff.
Automated sampling assessment for molecular simulations using the effective sample size
Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
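A minimal sketch of the population-variance route to the ESS: if the frames of a run were statistically independent, state populations would fluctuate binomially across independent runs, so an effective sample size can be read off from the observed variance. The median-over-states combination rule is our simplification of the paper's analysis.

```python
import numpy as np

def effective_sample_size(pop_estimates):
    """ESS from the variance of state populations across independent runs.
    `pop_estimates` has shape (n_runs, n_states): each row is the fraction
    of one run's frames found in each physical state. For independent
    frames, var(p_hat) = p(1-p)/N, so N_eff = p(1-p)/var for each state;
    combining by the median over states is our simplification."""
    P = np.asarray(pop_estimates, dtype=float)
    p = P.mean(0)
    v = P.var(0, ddof=1)
    mask = (p > 0) & (p < 1) & (v > 0)
    return np.median(p[mask] * (1 - p[mask]) / v[mask])

rng = np.random.default_rng(4)
# 10 runs; each estimates 3 state populations from 200 frames. The frames
# here are synthetic and independent, so the ESS comes out near 200; for
# correlated MD frames it would be much smaller.
runs = rng.multinomial(200, [0.5, 0.3, 0.2], size=10) / 200
print(effective_sample_size(runs))
```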
Confounding in statistical mediation analysis: What it is and how to address it.
Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P
2017-11-01
Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
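The traditional regression approach with adjustment for a measured confounder can be sketched in a few lines; variable names and the simulated data below are illustrative, not from the counseling dataset the abstract mentions.

```python
import numpy as np

def mediated_effect(x, m, y, c):
    """Traditional regression estimate of the mediated (indirect) effect
    a*b, adjusting both equations for a measured confounder c of the
    mediator-outcome relation. x = randomized treatment, m = mediator,
    y = outcome; all 1-D arrays of equal length."""
    ones = np.ones_like(x, dtype=float)
    # Mediator model: m ~ 1 + x + c  -> slope on x is 'a'
    a = np.linalg.lstsq(np.column_stack([ones, x, c]), m, rcond=None)[0][1]
    # Outcome model: y ~ 1 + x + m + c -> slope on m is 'b'
    b = np.linalg.lstsq(np.column_stack([ones, x, m, c]), y, rcond=None)[0][2]
    return a * b

rng = np.random.default_rng(5)
n = 500
x = rng.integers(0, 2, n).astype(float)       # randomized intervention
c = rng.normal(size=n)                        # confounder of m -> y
m = 0.5 * x + 0.8 * c + rng.normal(size=n)
y = 0.4 * m + 0.3 * c + rng.normal(size=n)
print(mediated_effect(x, m, y, c))            # near the true 0.5 * 0.4 = 0.2
```

Omitting c from the outcome model in this example would bias the estimate of b upward, which is exactly the confounding problem the abstract warns about.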
Brady, Amie M.G.; Bushon, Rebecca N.; Plona, Meg B.
2009-01-01
The Cuyahoga River within Cuyahoga Valley National Park (CVNP) in Ohio is often impaired for recreational use because of elevated concentrations of bacteria, which are indicators of fecal contamination. During the recreational seasons (May through August) of 2004 through 2007, samples were collected at two river sites, one upstream of and one centrally located within CVNP. Bacterial concentrations and turbidity were determined, and streamflow at time of sampling and rainfall amounts over the 24 hours prior to sampling were ascertained. Statistical models to predict Escherichia coli (E. coli) concentrations were developed for each site (with data from 2004 through 2006) and tested during an independent year (2007). At Jaite, a sampling site near the center of CVNP, the predictive model performed better than the traditional method of determining the current day's water quality using the previous day's E. coli concentration. During 2007, the Jaite model, based on turbidity, produced more correct responses (81 percent) and fewer false negatives (3.2 percent) than the traditional method (68 and 26 percent, respectively). At Old Portage, a sampling site just upstream from CVNP, a predictive model with turbidity and rainfall as explanatory variables did not perform as well as the traditional method. The Jaite model was used to estimate water quality at three other sites in the park; although it did not perform as well as the traditional method, it performed well, yielding between 68 and 91 percent correct responses. Further research would be necessary to determine whether using the Jaite model to predict recreational water quality elsewhere on the river would provide accurate results.
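A hedged sketch contrasting the two approaches compared above: a turbidity-based regression versus the persistence ("previous day") method, scored on how often each correctly calls exceedances of a recreational threshold. The threshold value and log-log model form are typical choices for such models, not necessarily the study's exact specification.

```python
import numpy as np

def compare_models(ecoli, turbidity, threshold=235.0):
    """Fit log10(E. coli) ~ log10(turbidity) and compare exceedance calls
    against the persistence method (yesterday's value predicts today)."""
    y = np.log10(ecoli)
    X = np.column_stack([np.ones_like(y), np.log10(turbidity)])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    pred_reg = 10 ** (X @ beta)
    pred_persist = np.r_[ecoli[0], ecoli[:-1]]      # previous day's value
    truth = ecoli > threshold
    for name, pred in [("regression", pred_reg), ("persistence", pred_persist)]:
        print(name, np.mean((pred > threshold) == truth))

rng = np.random.default_rng(6)
turb = 10 ** rng.normal(1.2, 0.5, 120)              # synthetic daily data
ec = 10 ** (0.5 + 1.0 * np.log10(turb) + rng.normal(0, 0.3, 120))
compare_models(ec, turb)
```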
NASA Astrophysics Data System (ADS)
He, Xiulan; Sonnenborg, Torben O.; Jørgensen, Flemming; Jensen, Karsten H.
2017-03-01
Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach that has the flexibility to model non-stationary systems directly was developed for multiple-point statistics simulation (MPS). The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data, for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (case study in Denmark). The results show that, based on fragmented information of the formation boundaries, the MPS partition method is able to simulate a non-stationary system including valley structures embedded in a layered Miocene sequence in a single run. Besides, statistical information retrieved from the AEM data improved the simulation of the geology significantly, especially for the deep-seated buried valley sediments where borehole information is sparse.
Garrido-Acosta, Osvaldo; Meza-Toledo, Sergio Enrique; Anguiano-Robledo, Liliana; Valencia-Hernández, Ignacio; Chamorro-Cevallos, Germán
2014-01-01
We determined the median effective dose (ED50) values for the anticonvulsants phenobarbital and sodium valproate using a modification of Lorke's method. This modification allowed appropriate statistical analysis and the use of a smaller number of mice per compound tested. The anticonvulsant activities of phenobarbital and sodium valproate were evaluated in male CD1 mice by maximal electroshock (MES) and intraperitoneal administration of pentylenetetrazole (PTZ). The anticonvulsant ED50 values were obtained through modifications of Lorke's method that involved changes in the selection of the three first doses in the initial test and the fourth dose in the second test. Furthermore, a test was added to evaluate the ED50 calculated by the modified Lorke's method, allowing statistical analysis of the data and determination of the confidence limits for ED50. The ED50 for phenobarbital against MES- and PTZ-induced seizures was 16.3 mg/kg and 12.7 mg/kg, respectively. The sodium valproate values were 261.2 mg/kg and 159.7 mg/kg, respectively. These results are similar to those found using the traditional methods of finding ED50, suggesting that the modifications made to Lorke's method generate equal results using fewer mice while increasing confidence in the statistical analysis. This adaptation of Lorke's method can be used to determine the median lethal dose (LD50) or ED50 for compounds with other pharmacological activities. Copyright © 2014 Elsevier Inc. All rights reserved.
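A generic sketch of the kind of quantal dose-response analysis such a modified design enables: an ED50 with approximate confidence limits from a fitted log-logistic curve. The doses and response fractions are hypothetical, and this is not the authors' exact computation.

```python
import numpy as np
from scipy.optimize import curve_fit

def ed50_from_quantal(doses, fraction_protected):
    """Fit a two-parameter log-logistic curve to quantal dose-response
    data and return ED50 with an approximate 95% CI from the fit
    covariance (a standard logit-style analysis, offered as a sketch)."""
    logd = np.log10(doses)
    f = lambda x, ld50, slope: 1.0 / (1.0 + 10 ** (slope * (ld50 - x)))
    (ld50, slope), cov = curve_fit(f, logd, fraction_protected,
                                   p0=[np.median(logd), 2.0])
    se = np.sqrt(cov[0, 0])
    return 10 ** ld50, (10 ** (ld50 - 1.96 * se), 10 ** (ld50 + 1.96 * se))

doses = np.array([5.0, 10.0, 20.0, 40.0])        # mg/kg, hypothetical
protected = np.array([0.1, 0.4, 0.7, 0.95])      # fraction of mice protected
print(ed50_from_quantal(doses, protected))
```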
Gaudin, Véronique Laberge; Receveur, Olivier; Walz, Leah; Girard, Félix; Potvin, Louise
2014-01-01
The Aboriginal nations of Canada have higher incidences of chronic diseases, coinciding with profound changes in their environment, lifestyle and diet. Traditional foods can protect against the risks of chronic disease. However, their consumption is in decline, and little is known about the complex mechanisms underlying this trend. The objective was to identify the factors involved in traditional food consumption by Cree Aboriginal people living in 3 communities in northern Quebec, Canada. Design: A mixed-methods explanatory design, including focus group interviews to interpret the results of logistic regression. This study includes a secondary data analysis of a cross-sectional survey of 3 Cree communities (n=374) and 4 focus group interviews (n=23). In the first, quantitative phase of the study, data were collected using a food-frequency questionnaire along with a structured questionnaire. Subsequently, the focus group interviews helped explain and build on the results of the logistic regressions. People who consume traditional food 3 days or more weekly were more likely to be 40 years old and over, to walk 30 minutes or more per day, not to have completed their schooling, to live in Mistissini and to be a hunter (p<0.05 for all comparisons). The focus group participants provided explanations for the quantitative analysis results or complemented them. For example, although no statistical association was found, focus group participants believed that employment acts as both a facilitator and a barrier to traditional food consumption, rendering the effect undetectable. In addition, focus group participants suggested that traditional food consumption is the result of multiple interconnected influences, including individual, family, community and environmental influences, rather than a single factor. This study sheds light on a number of factors that are unique to traditional foods, factors that have been understudied to date. Efforts to promote and maintain traditional food consumption could improve the overall health and wellbeing of Cree communities.
Factors influencing the use of antenatal care in rural West Sumatra, Indonesia
2012-01-01
Background: Every year, nearly half a million women and girls needlessly die as a result of complications during pregnancy, childbirth or the 6 weeks following delivery. Almost all (99%) of these deaths occur in developing countries. The study aim was to describe the factors related to low visits for antenatal care (ANC) services among pregnant women in Indonesia. Method: A total of 145 of 200 married women of reproductive age who were pregnant or had experienced birth responded to the questionnaire about their ANC visits. We developed a questionnaire containing 35 items in four sections. Sections one and two covered the women's socio-demographics, section three covered basic knowledge of pregnancy, and section four contained two subsections: preferences about midwives and preferences about Traditional Birth Attendants (TBAs), with the second subsection covering traditional beliefs. Data were collected using a convenience sampling strategy during July and August 2010 from 10 villages in Tanjung Emas. Multiple regression analysis was used for preference for types of providers. Results: Three-quarters of respondents (77.9%) received ANC more than four times; the other 22.1% received ANC less than four times. 59.4% received ANC visits during pregnancy, which was statistically significant compared to multiparous women (p = 0.001). Women who were encouraged by their family to receive ANC had statistically significantly higher traditional belief scores compared to those who encouraged themselves (p = 0.003). Preference for TBAs was most strongly affected by traditional beliefs (p < 0.001). On the contrary, preference for midwives was negatively correlated with traditional beliefs (p < 0.001). Conclusions: Parity was the factor influencing women's receiving less than the recommended four ANC visits during pregnancy. Women who were encouraged by their family to get ANC services had higher traditional belief scores than women who encouraged themselves. Moreover, traditional beliefs, together with lower family income, had the greater influence on preferring TBAs, with the opposite trend for preferring midwives. Increased attention needs to be given to the women; it is also very important to explore women's perceptions about the health services they received. PMID:22353252
One output function: a misconception of students studying digital systems - a case study
NASA Astrophysics Data System (ADS)
Trotskovsky, E.; Sabag, N.
2015-05-01
Background: Learning processes are usually characterized by students' misunderstandings and misconceptions. Engineering educators intend to help their students overcome their misconceptions and achieve a correct understanding of the concept. This paper describes a misconception in digital systems held by many students who believe that combinational logic circuits should have only one output. Purpose: The current study aims to investigate the roots of the misconception about the one-output function and the pedagogical methods that can help students overcome it. Sample: Three hundred and eighty-one students in the Departments of Electrical and Electronics and Mechanical Engineering at an academic engineering college, who learned the same topics of a digital combinational system, participated in the research. Design and method: In the initial research stage, students were taught according to the traditional method - first to design a one-output combinational logic system, and then to implement a system with a number of output functions. In the main stage, an experimental group was taught using a new method whereby they were shown how to implement a system with several output functions prior to learning about one-output systems. A control group was taught using the traditional method. In the replication stage (the third stage), an experimental group was taught using the new method. A mixed research methodology was used to examine the results of the new learning method. Results: Quantitative research showed that the new teaching approach resulted in a statistically significant decrease in student errors, and qualitative research revealed students' erroneous thinking patterns. Conclusions: It can be assumed that the traditional teaching method generates an incorrect mental model of the one-output function among students. The new pedagogical approach prevented the creation of an erroneous mental model and helped students develop the correct conceptual understanding.
Gebrekirstos, Kahsu; Abebe, Mesfin; Fantahun, Atsede
2014-06-21
Every social grouping in the world has its own cultural practices and beliefs which guide its members on how they should live or behave. Harmful traditional practices that affect children include female genital mutilation, milk teeth extraction, food taboos, uvula cutting, keeping babies out of exposure to the sun, and feeding fresh butter to newborn babies. The objective of this study was to assess factors associated with harmful traditional practices among children less than 5 years of age in Axum town, North Ethiopia. A community-based cross-sectional study was conducted with 752 participants selected using multi-stage sampling: simple random sampling was used to select ketenas from all kebelles of Axum town, and after proportional allocation of the sample size, systematic random sampling was used to identify the study participants. Data were collected using an interviewer-administered Tigrigna-version questionnaire, then entered and analyzed using SPSS version 16. Descriptive statistics were calculated and logistic regressions were used to analyze the data. Of the total sample, 50.7% of the children were female, the mean age of the children was 26.28 months, and the majority of mothers had no formal education. About 87.8% of mothers had performed at least one traditional practice on their children; uvula cutting was practiced on 86.9% of children, followed by milk teeth extraction (12.5%) and eyebrow incision (2.4%). Fear of swelling, pus and rupture of the uvula was the main reason for performing uvula cutting. The factors associated with harmful traditional practices were educational status, occupation, religion of mothers and harmful traditional practices performed on the mothers.
Golding, Maryanne; Pearce, Wendy; Seymour, John; Cooper, Alison; Ching, Teresa; Dillon, Harvey
2007-02-01
Finding ways to evaluate the success of hearing aid fittings in young infants has increased in importance with the implementation of hearing screening programs. Cortical auditory evoked potentials (CAEPs) can be recorded in infants and provide evidence for speech detection at the cortical level. The validity of this technique as a tool for hearing aid evaluation, however, needs to be demonstrated. The present study examined the relationship between the presence/absence of CAEPs to speech stimuli and the outcomes of a parental questionnaire in young infants who were fitted with hearing aids. The presence/absence of responses was determined by an experienced examiner as well as by a statistical measure, Hotelling's T². A statistically significant correlation between CAEPs and questionnaire scores was found using the examiner's grading (rs = 0.45) and using the statistical grading (rs = 0.41), and there was reasonably good agreement between traditional response detection methods and the statistical analysis.
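As a rough illustration of the statistical grading step, the sketch below applies a one-sample Hotelling's T² test to a hypothetical epochs-by-features matrix (for example, voltages at a few post-stimulus latencies); the feature layout, alpha level and function name are assumptions of this sketch, not the study's actual pipeline.

```python
import numpy as np
from scipy import stats

def hotelling_t2_present(epochs, alpha=0.05):
    """One-sample Hotelling's T^2: does the mean evoked response across
    epochs differ from zero? `epochs` is (n_epochs, n_features); requires
    n_epochs > n_features."""
    n, p = epochs.shape
    xbar = epochs.mean(axis=0)
    S = np.cov(epochs, rowvar=False)            # sample covariance (p x p)
    t2 = n * xbar @ np.linalg.solve(S, xbar)    # T^2 statistic
    f_stat = t2 * (n - p) / (p * (n - 1))       # exact F transformation
    p_value = stats.f.sf(f_stat, p, n - p)
    return t2, p_value, p_value < alpha         # True -> response "present"
```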
Environmental Health Practice: Statistically Based Performance Measurement
Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.
2007-01-01
Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
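A minimal sketch of the kind of test reported above: Fisher's exact test on baseline versus post-intervention compliance counts per category, with a plain Bonferroni adjustment (the paper used a modified adjustment; every count below is invented for illustration).

```python
from scipy.stats import fisher_exact

# (compliant, noncompliant) facility counts at baseline vs. post-intervention
categories = {
    "occupational health & safety": ([45, 37], [68, 14]),
    "air pollution control":        ([50, 32], [70, 12]),
    "hazardous waste management":   ([40, 42], [60, 22]),
    "wastewater discharge":         ([55, 27], [72, 10]),
}
m = len(categories)                    # number of comparisons
for name, (baseline, post) in categories.items():
    _, p = fisher_exact([baseline, post])
    p_adj = min(p * m, 1.0)            # plain Bonferroni adjustment
    print(f"{name}: p = {p:.4f}, adjusted p = {p_adj:.4f}")
```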
Unbiased estimators for spatial distribution functions of classical fluids
NASA Astrophysics Data System (ADS)
Adib, Artur B.; Jarzynski, Christopher
2005-01-01
We use a statistical-mechanical identity closely related to the familiar virial theorem to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
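The abstract does not reproduce the estimators' derivation, so for orientation the sketch below shows only the traditional histogram-based g(r) baseline that the paper improves on, for a homogeneous fluid in a cubic periodic box; the box geometry and binning are illustrative assumptions.

```python
import numpy as np

def g_of_r_histogram(positions, box_length, n_bins=100):
    """Traditional histogram estimator of the pair correlation g(r) for a
    homogeneous fluid of n particles in a cubic periodic box.
    positions: (n, 3) array of coordinates."""
    n = len(positions)
    rho = n / box_length**3
    r_max = box_length / 2
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)   # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    expected = rho * shell * n / 2                   # ideal-gas pair counts
    r_mid = 0.5 * (edges[:-1] + edges[1:])
    return r_mid, counts / expected
```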
NASA Astrophysics Data System (ADS)
Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna
2016-11-01
Differentiation of written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet was registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups in a non-invasive manner.
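Of the three statistical methods named, the spectral angle mapper is the most compact to state. A minimal sketch, assuming pixel spectra are rows of an array and reference ink spectra are available (array and function names are illustrative):

```python
import numpy as np

def spectral_angles(pixels, reference):
    """Angle (radians) between each pixel spectrum and one reference ink
    spectrum; smaller angles mean more similar inks.
    pixels: (n_pixels, n_bands), reference: (n_bands,)."""
    num = pixels @ reference
    den = np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference)
    return np.arccos(np.clip(num / den, -1.0, 1.0))

def classify_inks(pixels, references):
    """Assign each pixel to the ink whose reference gives the smallest angle."""
    angles = np.stack([spectral_angles(pixels, r) for r in references], axis=1)
    return angles.argmin(axis=1)
```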
Perception-based road hazard identification with Internet support.
Tarko, Andrew P; DeSalle, Brian R
2003-01-01
One of the most important tasks faced by highway agencies is identifying road hazards. Agencies use crash statistics to detect road intersections and segments where the frequency of crashes is excessive. With the crash-based method, a dangerous intersection or segment can be pointed out only after a sufficient number of crashes occur. A more proactive method is needed, and motorist complaints may be able to assist agencies in detecting road hazards before crashes occur. This paper investigates the quality of safety information reported by motorists and the effectiveness of hazard identification based on motorist reports, which were collected with an experimental Internet website. It demonstrates that the intersections pointed out by motorists tended to have more crashes than other intersections. The safety information collected through the website was comparable to 2-3 months of crash data. It was concluded that although the Internet-based method could not substitute for the traditional crash-based methods, its joint use with crash statistics might be useful in detecting new hazards where crash data had been collected for a short time.
Recent development of risk-prediction models for incident hypertension: An updated systematic review
Xiao, Lei; Liu, Ya; Wang, Zuoguang; Li, Chuang; Jin, Yongxin; Zhao, Qiong
2017-01-01
Background Hypertension is a leading global health threat and a major cardiovascular disease. Since clinical interventions are effective in delaying the disease progression from prehypertension to hypertension, diagnostic prediction models to identify patient populations at high risk for hypertension are imperative. Methods Both PubMed and Embase databases were searched for eligible reports of either prediction models or risk scores for hypertension. The study data were collected, including risk factors, statistical methods, characteristics of study design and participants, performance measurement, etc. Results From the searched literature, 26 studies reporting 48 prediction models were selected. Among them, 20 reports studied models established using traditional risk factors, such as body mass index (BMI), age, smoking, blood pressure (BP) level, parental history of hypertension, and biochemical factors, whereas 6 reports used a genetic risk score (GRS) as the prediction factor. AUC ranged from 0.64 to 0.97, and the C-statistic ranged from 60% to 90%. Conclusions Traditional models are still the predominant risk prediction models for hypertension, but recently, more models have begun to incorporate genetic factors as part of their predictors. However, these genetic predictors need to be well selected. The currently reported models have acceptable to good discrimination and calibration ability, but whether the models can be applied in clinical practice still needs more validation and adjustment. PMID:29084293
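For readers unfamiliar with the discrimination measure reported above, here is a minimal sketch of the C-statistic (equivalently the AUC) computed from predicted risks via its rank-based definition; the variable names are illustrative:

```python
import numpy as np

def c_statistic(y_true, risk_score):
    """C-statistic / AUC: the probability that a randomly chosen case has a
    higher predicted risk than a randomly chosen non-case (ties count 1/2).
    y_true: 0/1 array of outcomes; risk_score: model-predicted risks."""
    pos = risk_score[y_true == 1]
    neg = risk_score[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```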
A comparison of traditional and engaging lecture methods in a large, professional-level course.
Miller, Cynthia J; McNear, Jacquee; Metz, Michael J
2013-12-01
In engaging lectures, also referred to as broken or interactive lectures, students are given short periods of lecture followed by "breaks" that can consist of 1-min papers, problem sets, brainstorming sessions, or open discussion. While many studies have shown positive effects when engaging lectures are used in undergraduate settings, the literature surrounding use of the learning technique for professional students is inconclusive. The novelty of this study design allowed a direct comparison of engaging physiology lectures versus didactic lecture formats in the same cohort of 120 first-year School of Dentistry DMD students. All students were taught five physiological systems using traditional lecture methods and six physiological systems using engaging lecture methods. The use of engaging lectures led to a statistically significant higher average on unit exams compared with traditional didactic lectures (8.6% higher, P < 0.05). Furthermore, students demonstrated an improved long-term retention of information via higher scores on the comprehensive final exam (22.9% higher in engaging lecture sections, P < 0.05). Many qualitative improvements were also indicated via student surveys and evaluations, including an increased perceived effectiveness of lectures, decrease in distractions during lecture, and increased confidence with the material. The development of engaging lecture activities requires a significant amount of instructor preparation and limits the time available to provide traditional lectures. However, the positive results of this study suggest the need for a restructuring of the physiology curriculum to incorporate more engaging lectures to improve both the qualitative experiences and performance levels of professional students.
Syndromic surveillance of influenza activity in Sweden: an evaluation of three tools.
Ma, T; Englund, H; Bjelkmar, P; Wallensten, A; Hulth, A
2015-08-01
An evaluation was conducted to determine which syndromic surveillance tools complement traditional surveillance by serving as earlier indicators of influenza activity in Sweden. Web queries, medical hotline statistics, and school absenteeism data were evaluated against two traditional surveillance tools. Cross-correlation calculations utilized aggregated weekly data for all-age, nationwide activity for four influenza seasons, from 2009/2010 to 2012/2013. The surveillance tool indicative of earlier influenza activity, by way of statistical and visual evidence, was identified. The web query algorithm and medical hotline statistics performed as well as each other and as the traditional surveillance tools. School absenteeism data were not a reliable resource for influenza surveillance. Overall, the syndromic surveillance tools did not perform with enough consistency, either in season lead or in earlier timing of the peak week, to be considered early indicators. They do, however, capture incident cases before they have formally entered the primary healthcare system.
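A minimal sketch of the lead/lag cross-correlation step on aggregated weekly series, assuming two equal-length NumPy arrays; the sign convention (negative lag = candidate leads) is a choice of this sketch:

```python
import numpy as np

def lead_lag_correlations(candidate, reference, max_lag=6):
    """Pearson correlation between a syndromic series (e.g., weekly web-query
    counts) and a reference surveillance series at weekly lags. A peak at a
    negative lag suggests the candidate series leads the reference."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            pair = candidate[:lag], reference[-lag:]
        elif lag > 0:
            pair = candidate[lag:], reference[:-lag]
        else:
            pair = candidate, reference
        out[lag] = np.corrcoef(pair[0], pair[1])[0, 1]
    return out
```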
Statistical tools for transgene copy number estimation based on real-time PCR.
Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal
2007-11-01
As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. In the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and integrated quality control in real-time PCR-based transgene copy number determination. These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise, and proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, they can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
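A minimal sketch of the first design (an external calibration curve, then a ΔCt comparison between a control event and a putative event); the dilution data are invented and the copy-ratio step assumes roughly 100% amplification efficiency, so this is far simpler than the authors' quality-control-integrated models:

```python
import numpy as np
from scipy import stats

# External calibration: Ct vs. log10(template amount) for serial dilutions
# (hypothetical data for illustration)
log_template = np.log10([1e1, 1e2, 1e3, 1e4, 1e5])
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

fit = stats.linregress(log_template, ct)     # simple linear regression
efficiency = 10 ** (-1 / fit.slope) - 1      # amplification efficiency

# Copy estimate for a putative event relative to a single-copy control:
# each doubling of template lowers Ct by ~1 cycle at ~100% efficiency.
ct_control, ct_sample = 24.1, 23.1
copy_ratio = 2 ** (ct_control - ct_sample)
print(f"slope = {fit.slope:.2f}, E = {efficiency:.2f}, ratio ~ {copy_ratio:.1f}")
```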
Evaluating the decision accuracy and speed of clinical data visualizations.
Pieczkiewicz, David S; Finkelstein, Stanley M
2010-01-01
Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available, free, and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.
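A minimal sketch of a linear mixed model for decision latency in an MRMC-style layout, using simulated data; unlike a full MRMC analysis, only the reader (not the case) is modeled as a random effect, and all numbers are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: each reader interprets each case under both
# displays (0 = baseline, 1 = new visualization); latency is in seconds.
rng = np.random.default_rng(1)
rows = [(r, c, d, 20 + 2 * d + rng.normal(0, 3))
        for r in range(8) for c in range(30) for d in (0, 1)]
df = pd.DataFrame(rows, columns=["reader", "case", "display", "latency"])

# Linear mixed model: fixed display effect, random reader intercepts.
# (A full MRMC analysis would also model crossed random case effects.)
fit = smf.mixedlm("latency ~ display", df, groups=df["reader"]).fit()
print(fit.summary())
```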
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software not only is vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
2014-01-01
Background The UK Clinical Aptitude Test (UKCAT) was designed to address issues identified with traditional methods of selection. This study aims to examine the predictive validity of the UKCAT and compare this to traditional selection methods in the senior years of medical school. This was a follow-up study of two cohorts of students from two medical schools who had previously taken part in a study examining the predictive validity of the UKCAT in first year. Methods The sample consisted of 4th and 5th Year students who commenced their studies at the University of Aberdeen or University of Dundee medical schools in 2007. Data collected were: demographics (gender and age group), UKCAT scores; Universities and Colleges Admissions Service (UCAS) form scores; admission interview scores; Year 4 and 5 degree examination scores. Pearson’s correlations were used to examine the relationships between admissions variables, examination scores, gender and age group, and to select variables for multiple linear regression analysis to predict examination scores. Results Ninety-nine and 89 students at Aberdeen medical school from Years 4 and 5 respectively, and 51 Year 4 students in Dundee, were included in the analysis. Neither UCAS form nor interview scores were statistically significant predictors of examination performance. Conversely, the UKCAT yielded statistically significant validity coefficients between .24 and .36 in four of five assessments investigated. Multiple regression analysis showed the UKCAT made a statistically significant unique contribution to variance in examination performance in the senior years. Conclusions Results suggest the UKCAT appears to predict performance better in the later years of medical school compared to earlier years and provides modest supportive evidence for the UKCAT’s role in student selection within these institutions. Further research is needed to assess the predictive validity of the UKCAT against professional and behavioural outcomes as the cohort commences working life. PMID:24762134
Wagner, David G; Russell, Donna K; Benson, Jenna M; Schneider, Ashley E; Hoda, Rana S; Bonfiglio, Thomas A
2011-10-01
Traditional cell block (TCB) sections serve as an important diagnostic adjunct to cytologic smears but are also used today as a reliable preparation for immunohistochemical (IHC) studies. There are many ways to prepare a cell block and the methods continue to be revised. In this study, we compare the TCB with the Cellient™ automated cell block system. Thirty-five cell blocks were obtained from 16 benign and 19 malignant nongynecologic cytology specimens at a large university teaching hospital and prepared according to TCB and Cellient protocols. Cell block sections from both methods were compared for possible differences in various morphologic features and immunohistochemical staining patterns. In the 16 benign cases, no significant morphologic differences were found between the TCB and Cellient cell block sections. For the 19 malignant cases, some noticeable differences in the nuclear chromatin and cellularity were identified, although statistical significance was not attained. Immunohistochemical or special stains were performed on 89% of the malignant cases (17/19). Inadequate cellularity precluded full evaluation in 23% of Cellient cell block IHC preparations (4/17). Of the malignant cases with adequate cellularity (13/17), the immunohistochemical staining patterns from the different methods were identical in 53% of cases. The traditional and Cellient cell block sections showed similar morphologic and immunohistochemical staining patterns. The only significant difference between the two methods concerned the lower overall cell block cellularity identified during immunohistochemical staining in the Cellient cell block sections. Copyright © 2010 Wiley-Liss, Inc.
Barikani, Ameneh; Beheshti, Akram; Javadi, Maryam; Yasi, Marzieh
2015-08-01
Orientation of the public and physicians to complementary and alternative medicine (CAM) is one of the most prominent symbols of structural changes in the health service system. The aim of this study was to determine the knowledge, attitude, and practice of general practitioners regarding complementary and alternative medicine. This cross-sectional study was conducted in Qazvin, Iran in 2013. A self-administered questionnaire was used for collecting data, comprising four parts: population information, physicians' attitude and knowledge, methods of getting information, and their practice. A total of 228 physicians in Qazvin comprised the population of the study according to the report of the deputy of treatment of Qazvin University of Medical Sciences. A total of 150 physicians were selected randomly, and the SPSS statistical program was used to enter the questionnaire data. Results were analyzed with descriptive statistics and statistical analysis. Sixty percent of all responders were male. About sixty percent (59.4%) of participating practitioners had worked less than 10 years, and 96.4 percent had a positive attitude towards complementary and alternative medicine. Knowledge about traditional medicine was good in 11 percent of practitioners, while 36.3% and 52.7% had average and little information, respectively. 17.9% of practitioners offered their patients complementary and alternative medicine for treatment. Although there was little knowledge among practitioners about traditional medicine and complementary approaches, a significant percentage of them had attitudes above the lower limit.
On prognostic models, artificial intelligence and censored observations.
Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A
2001-03-01
The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics, and few, if any, current modelling approaches based on statistics or artificial intelligence can produce models that display all these characteristics. The inability of modelling techniques to provide truly useful models has led to interest in these models being purely academic in nature. This in turn has resulted in only a very small percentage of the models that have been developed being deployed in practice. On the other hand, new modelling paradigms are being proposed continuously within the machine learning and statistical community, with claims, often based on inadequate evaluation, made about their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation-centric approach to their development is essential. In this paper we present such an evaluation-centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcomes of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
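The abstract does not give Ck-NN's formulas, so the sketch below illustrates only the basic shape of an enhanced k-NN: a feature-weighted distance stands in for the statistically enhanced metric, and distance-weighted voting stands in for the evidence-theory combination; the handling of censored observations is omitted entirely.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=5, feature_weights=None):
    """Distance-weighted k-NN classification. `feature_weights` is a crude
    stand-in for a statistically enhanced distance metric."""
    w = np.ones(X_train.shape[1]) if feature_weights is None else feature_weights
    d = np.sqrt((((X_train - x) * w) ** 2).sum(axis=1))
    idx = np.argsort(d)[:k]                  # the k retrieved exemplars
    vote_weight = 1.0 / (d[idx] + 1e-9)      # closer exemplars count more
    labels = y_train[idx]
    scores = {c: vote_weight[labels == c].sum() for c in np.unique(labels)}
    return max(scores, key=scores.get)       # highest weighted vote wins
```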
He, Wei; Xie, Yanming; Wang, Yongyan
2011-12-01
Post-marketing re-evaluation of Chinese herbs can well reflect the characteristics of Chinese medicine, yet it is the most easily overlooked component of clinical re-evaluation. Since little attention has been paid to it, study of clinical trial design methods has been neglected, making it difficult to improve the effectiveness and safety of traditional Chinese medicine. Therefore, more attention should be paid to clinical trial design methods for re-evaluation with respect to TCM syndrome, such as the type of research program design, scales for collecting Chinese medical information, and statistical analysis methods, so as to improve the current status of clinical trial design for the post-marketing re-evaluation of Chinese herbs.
A peaking-regulation-balance-based method for wind & PV power integrated accommodation
NASA Astrophysics Data System (ADS)
Zhang, Jinfang; Li, Nan; Liu, Jun
2018-02-01
The rapid development of China's new energy, now and in the future, should focus on the coordination of wind and PV power. Based on the analysis of system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for comprehensive integrated accommodation analysis of wind and PV power is put forward. Wind power installed capacity is determined first from the electric power balance during the night peak load period of a typical day; PV power installed capacity is then derived from the midday peak load hours. This effectively resolves the uncertainty that arises when traditional methods must determine the wind and solar capacity mix simultaneously. The simulation results validate the effectiveness of the proposed method.
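A minimal numeric sketch of the sequencing idea: wind capacity is sized at the night peak (when PV contributes nothing), then PV capacity at the midday peak. Every load figure and capacity-credit factor below is hypothetical, not taken from the paper:

```python
# Step 1: size wind from the night peak balance, where PV output is zero.
night_peak_load = 32.0       # GW, typical-day night peak (hypothetical)
conventional_night = 20.0    # GW, conventional generation available at night
wind_credit = 0.15           # fraction of wind capacity credited at night peak
wind_capacity = (night_peak_load - conventional_night) / wind_credit

# Step 2: size PV from the midday peak, now that wind capacity is fixed.
midday_peak_load = 30.0      # GW (hypothetical)
conventional_midday = 18.0   # GW
wind_midday = 0.20 * wind_capacity   # expected wind output at midday
pv_credit = 0.60             # PV output fraction at the midday peak
pv_capacity = (midday_peak_load - conventional_midday - wind_midday) / pv_credit

print(f"wind ~ {wind_capacity:.1f} GW, PV ~ {pv_capacity:.1f} GW")
```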
Application of meta-analysis methods for identifying proteomic expression level differences.
Amess, Bob; Kluge, Wolfgang; Schwarz, Emanuel; Haenisch, Frieder; Alsaif, Murtada; Yolken, Robert H; Leweke, F Markus; Guest, Paul C; Bahn, Sabine
2013-07-01
We present new statistical approaches for identification of proteins with expression levels that are significantly changed when applying meta-analysis to two or more independent experiments. We showed that the Euclidean distance measure has reduced risk of false positives compared to the rank product method. Our Ψ-ranking method has advantages over the traditional fold-change approach by incorporating both the fold-change direction as well as the p-value. In addition, the second novel method, Π-ranking, considers the ratio of the fold-change and thus integrates all three parameters. We further improved the latter by introducing our third technique, Σ-ranking, which combines all three parameters in a balanced nonparametric approach. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
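The abstract names but does not define the Ψ-ranking, so the sketch below conveys only its stated spirit, blending fold-change direction with the p-value; the signed −log10(p) score and the summing across experiments are assumptions of this sketch:

```python
import numpy as np

# hypothetical per-protein results from two independent experiments
log_fc1, p1 = np.array([1.2, -0.8, 0.1]), np.array([0.001, 0.04, 0.60])
log_fc2, p2 = np.array([0.9, -1.1, -0.2]), np.array([0.010, 0.02, 0.45])

def signed_score(log_fc, p):
    """Blend fold-change direction with significance: the sign of the fold
    change times -log10 of the p-value."""
    return np.sign(log_fc) * -np.log10(p)

# meta-score: proteins changed consistently in both experiments rank highest
meta = signed_score(log_fc1, p1) + signed_score(log_fc2, p2)
ranking = np.argsort(-np.abs(meta))   # protein indices, strongest change first
```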
DBS-LC-MS/MS assay for caffeine: validation and neonatal application.
Bruschettini, Matteo; Barco, Sebastiano; Romantsik, Olga; Risso, Francesco; Gennai, Iulian; Chinea, Benito; Ramenghi, Luca A; Tripodi, Gino; Cangemi, Giuliana
2016-09-01
DBS might be an appropriate microsampling technique for therapeutic drug monitoring of caffeine in infants. Nevertheless, its application presents several issues that still limit its use. This paper describes a validated DBS-LC-MS/MS method for caffeine. The results of the method validation showed an hematocrit dependence. In the analysis of 96 paired plasma and DBS clinical samples, caffeine levels measured in DBS were statistically significantly lower than in plasma but the observed differences were independent from hematocrit. These results clearly showed the need for extensive validation with real-life samples for DBS-based methods. DBS-LC-MS/MS can be considered to be a good alternative to traditional methods for therapeutic drug monitoring or PK studies in preterm infants.
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
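A minimal sketch of the Bayesian point-estimation example as a decision problem: the loss function determines the optimal point estimate, shown here for a beta-binomial model with invented data.

```python
from scipy import stats

# Posterior for a detection probability: uniform Beta(1, 1) prior and
# 7 detections in 20 surveys (hypothetical data).
successes, trials = 7, 20
posterior = stats.beta(1 + successes, 1 + trials - successes)

# The Bayes decision depends on the loss function:
# squared-error loss -> posterior mean; absolute loss -> posterior median.
estimate_squared_loss = posterior.mean()
estimate_absolute_loss = posterior.median()
print(estimate_squared_loss, estimate_absolute_loss)
```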
Paramedic student performance: comparison of online with on-campus lecture delivery methods.
Hubble, Michael W; Richards, Michael E
2006-01-01
Colleges and universities are experiencing increasing demand for online courses in many healthcare disciplines, including emergency medical services (EMS). Development and implementation of online paramedic courses with the quality of education experienced in the traditional classroom setting is essential in order to maintain the integrity of the educational process. Currently, there is conflicting evidence of whether a significant difference exists in student performance between online and traditional nursing and allied health courses. However, there are no published investigations of the effectiveness of online learning by paramedic students. We hypothesized that the performance of paramedic students enrolled in an online, undergraduate, research methods course is equivalent to the performance of students enrolled in the same course provided in a traditional classroom environment. Academic performance, learning styles, and course satisfaction surveys were compared between two groups of students. The course content was identical for both courses and taught by the same instructor during the same semester. The primary difference between the traditional course and the online course was the method of lecture delivery. Lectures for the on-campus students were provided live in a traditional classroom setting using PowerPoint slides. Lectures for the online students were provided using the same PowerPoint slides with prerecorded streaming audio and video. A convenience sample of 23 online and 10 traditional students participated in this study. With the exception of two learning domains, the two groups of students exhibited similar learning styles as assessed using the Grasha-Riechmann Student Learning Style Scales instrument. The online students scored significantly lower in the competitive and dependent dimensions than did the on-campus students. Academic performance was similar between the two groups. The online students devoted slightly more time to the course than did the campus students, although this difference did not reach statistical significance. In general, the online students believed the online audio lectures were more effective than the traditional live lectures. Distance learning technology appears to be an effective mechanism for extending didactic paramedic education off-campus, and may be particularly beneficial to areas that lack paramedic training programs or adequate numbers of qualified instructors.
Initial study of Schroedinger eigenmaps for spectral target detection
NASA Astrophysics Data System (ADS)
Dorado-Munoz, Leidy P.; Messinger, David W.
2016-08-01
Spectral target detection refers to the process of searching for a specific material with a known spectrum over a large area containing materials with different spectral signatures. Traditional target detection methods in hyperspectral imagery (HSI) require assuming the data fit some statistical or geometric model and, based on that model, estimating parameters to define a hypothesis test in which one class (i.e., the target class) is chosen over the others (i.e., the background class). Nonlinear manifold learning methods such as Laplacian eigenmaps (LE) have extensively shown their potential in HSI processing, specifically in classification and segmentation. Recently, Schroedinger eigenmaps (SE), which is built upon LE, has been introduced as a semisupervised classification method. In SE, the former Laplacian operator is replaced by the Schroedinger operator. The Schroedinger operator includes, by definition, a potential term V that steers the transformation in certain directions, improving the separability between classes. In this regard, we propose a methodology for target detection that is not based on the traditional schemes and does not need the estimation of statistical or geometric parameters. This method is based on SE, where the potential term V is used to include prior knowledge about the target class and steer the transformation in directions where the target location in the new space is known and the separability between target and background is augmented. An initial study of how SE can be used in a target detection scheme for HSI is shown here. In-scene pixel and spectral signature detection approaches are presented. The HSI data used comprise various target panels for testing simultaneous detection of multiple objects with different complexities.
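A minimal dense sketch of the SE construction, assuming a Gaussian-affinity graph on the pixel spectra and a diagonal potential supported on labeled target pixels; all parameter values and the function name are illustrative, and a practical implementation would use sparse k-nearest-neighbor graphs.

```python
import numpy as np

def schroedinger_eigenmaps(X, target_idx, n_dims=10, sigma=1.0, alpha=100.0):
    """X: (n_pixels, n_bands) spectra; target_idx: indices of known target
    pixels. Builds L + alpha * diag(V) and returns embedding coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))          # Gaussian affinities
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian
    V = np.zeros(len(X))
    V[target_idx] = 1.0                       # potential on labeled targets
    S = L + alpha * np.diag(V)                # Schroedinger operator
    _, vecs = np.linalg.eigh(S)
    return vecs[:, 1:n_dims + 1]              # low-order eigenvectors
```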
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, κ = 0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, κ = -0.028), and between the DELTA and embedded feature-selection approaches was 88.1% (349 of 396, κ = -0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
Weigold, Arne; Weigold, Ingrid K; Russell, Elizabeth J
2013-03-01
Self-report survey-based data collection is increasingly carried out using the Internet, as opposed to the traditional paper-and-pencil method. However, previous research on the equivalence of these methods has yielded inconsistent findings. This may be due to methodological and statistical issues present in much of the literature, such as nonequivalent samples in different conditions due to recruitment, participant self-selection to conditions, and data collection procedures, as well as incomplete or inappropriate statistical procedures for examining equivalence. We conducted 2 studies examining the equivalence of paper-and-pencil and Internet data collection that accounted for these issues. In both studies, we used measures of personality, social desirability, and computer self-efficacy, and, in Study 2, we used personal growth initiative to assess quantitative equivalence (i.e., mean equivalence), qualitative equivalence (i.e., internal consistency and intercorrelations), and auxiliary equivalence (i.e., response rates, missing data, completion time, and comfort completing questionnaires using paper-and-pencil and the Internet). Study 1 investigated the effects of completing surveys via paper-and-pencil or the Internet in both traditional (i.e., lab) and natural (i.e., take-home) settings. Results indicated equivalence across conditions, except for auxiliary equivalence aspects of missing data and completion time. Study 2 examined mailed paper-and-pencil and Internet surveys without contact between experimenter and participants. Results indicated equivalence between conditions, except for auxiliary equivalence aspects of response rate for providing an address and completion time. Overall, the findings show that paper-and-pencil and Internet data collection methods are generally equivalent, particularly for quantitative and qualitative equivalence, with nonequivalence only for some aspects of auxiliary equivalence. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Catto, James W F; Linkens, Derek A; Abbod, Maysam F; Chen, Minyou; Burton, Julian L; Feeley, Kenneth M; Hamdy, Freddie C
2003-09-15
New techniques for the prediction of tumor behavior are needed, because statistical analysis has poor accuracy and is not applicable to the individual. Artificial intelligence (AI) may provide these suitable methods. Whereas artificial neural networks (ANN), the best-studied form of AI, have been used successfully, their hidden networks remain an obstacle to their acceptance. Neuro-fuzzy modeling (NFM), another AI method, has a transparent functional layer and is without many of the drawbacks of ANN. We have compared the predictive accuracies of NFM, ANN, and traditional statistical methods for the behavior of bladder cancer. Experimental molecular biomarkers, including p53 and the mismatch repair proteins, and conventional clinicopathological data were studied in a cohort of 109 patients with bladder cancer. For all three of the methods, models were produced to predict the presence and timing of a tumor relapse. Both methods of AI predicted relapse with an accuracy ranging from 88% to 95%. This was superior to statistical methods (71-77%; P < 0.0006). NFM appeared better than ANN at predicting the timing of relapse (P = 0.073). The use of AI can accurately predict cancer behavior. NFM has a similar or superior predictive accuracy to ANN. However, unlike the impenetrable "black-box" of a neural network, the rules of NFM are transparent, enabling validation from clinical knowledge and the manipulation of input variables to allow exploratory predictions. This technique could be used widely in a variety of areas of medicine.
Utilization of blended learning to teach preclinical endodontics.
Maresca, Cristina; Barrero, Carlos; Duggan, Dereck; Platin, Enrique; Rivera, Eric; Hannum, Wallace; Petrola, Frank
2014-08-01
Blended learning (BL) is the integration of classroom learning with an online environment. The purpose of this study was to determine whether dental students who experienced BL in a preclinical endodontic course demonstrated better manual skills, conceptual knowledge, and learning experience compared to those experiencing traditional learning. All eighty-one students (100 percent) in a preclinical endodontics course agreed to participate and were assigned to either the traditional or BL group. A root canal procedure was used to determine the level of manual skills gained by each group. Pre- and post-intervention quizzes were given to all students to evaluate conceptual knowledge gained, and the students' perspectives on the methods were evaluated with a survey. The BL group scored better than the traditional group on the manual skills exercise at a statistically significant level (p=0.0067). There were no differences in the post-intervention quiz scores between the two groups, and the students' opinions were positive regarding BL. With BL, the students were able to learn and demonstrate dental skills at a high level.
Statistical complexity measure of pseudorandom bit generators
NASA Astrophysics Data System (ADS)
González, C. M.; Larrondo, H. A.; Rosso, O. A.
2005-08-01
Pseudorandom number generators (PRNG) are extensively used in Monte Carlo simulations, gambling machines and cryptography as substitutes for ideal random number generators (RNG). Each application imposes different statistical requirements on PRNGs. As L’Ecuyer clearly states, “the main goal for Monte Carlo methods is to reproduce the statistical properties on which these methods are based whereas for gambling machines and cryptology, observing the sequence of output values for some time should provide no practical advantage for predicting the forthcoming numbers better than by just guessing at random”. In accordance with these different applications, several statistical test suites have been developed to analyze the sequences generated by PRNGs. In a recent paper a new statistical complexity measure [Phys. Lett. A 311 (2003) 126] was defined. Here we propose this measure as a randomness quantifier for PRNGs. The test is applied to three very well known and widely tested PRNGs available in the literature, all of them based on mathematical algorithms. A further PRNG, based on the Lorenz 3D chaotic dynamical system, is also analyzed. PRNGs based on chaos may be considered as a model for physical noise sources, and important new results have recently been reported. All the design steps of this PRNG are described, and each stage increases the PRNG randomness using different strategies. It is shown that the MPR statistical complexity measure is capable of quantifying this randomness improvement. The PRNG based on the chaotic 3D Lorenz dynamical system is also evaluated using traditional digital signal processing tools for comparison.
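A sketch of an MPR-style statistical complexity computed from a histogram of output symbols or patterns: normalized Shannon entropy times a normalized Jensen-Shannon disequilibrium against the uniform distribution; the exact normalization used in the cited work may differ.

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a probability vector (zero bins ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mpr_complexity(hist):
    """MPR-style complexity of a symbol/pattern histogram from PRNG output:
    normalized entropy H times normalized Jensen-Shannon disequilibrium."""
    p = hist / hist.sum()
    n = len(p)
    u = np.full(n, 1.0 / n)
    h = shannon(p) / np.log(n)                          # entropy in [0, 1]
    js = shannon(0.5 * (p + u)) - 0.5 * (shannon(p) + shannon(u))
    delta = np.zeros(n); delta[0] = 1.0                 # most-ordered case
    js_max = shannon(0.5 * (delta + u)) - 0.5 * shannon(u)
    return h * js / js_max

# An ideal RNG gives a flat histogram: h -> 1 but js -> 0, so complexity -> 0.
```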
ASCS online fault detection and isolation based on an improved MPCA
NASA Astrophysics Data System (ADS)
Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan
2014-09-01
Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Then, fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling (T²) statistic are realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation of the subspace based on the T² statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single subspaces to multiple subspaces and increase the correct rate of fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method to reduce the required storage capacity and improve the robustness of the principal component model, and establishes the relationship between the state variables and fault detection indicators for fault isolation.
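A minimal PCA-monitoring sketch of the two detection statistics named above, computed for a single new sample against a model fitted to normal-operation data; control limits, batch unfolding and the kernel-density subspace construction are omitted.

```python
import numpy as np

def pca_monitoring_stats(X_train, x_new, n_pc=3):
    """Hotelling T^2 and SPE (Q) statistics for one new sample, from a PCA
    model of normal-operation data. X_train: (n_samples, n_vars)."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_pc].T                                 # loadings (n_vars x n_pc)
    lam = (s[:n_pc] ** 2) / (len(X_train) - 1)      # PC score variances
    xc = x_new - mu
    t = P.T @ xc                                    # scores of the new sample
    t2 = np.sum(t**2 / lam)                         # Hotelling T^2
    resid = xc - P @ t                              # residual part
    spe = resid @ resid                             # SPE / Q statistic
    contrib = resid**2                              # per-variable contributions
    return t2, spe, contrib                         # contrib -> fault isolation
```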
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
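A minimal sketch tying the closed-form Weibull failure probability to a Monte Carlo evaluation of a limit state g = strength − stress; the Weibull modulus, scale and stress distribution below are hypothetical values, not the article's worked example.

```python
import numpy as np

# Closed-form two-parameter Weibull failure probability at a fixed stress.
m, sigma0 = 10.0, 350.0                     # Weibull modulus, scale (MPa)
sigma = 250.0                               # applied stress (MPa)
pf_weibull = 1 - np.exp(-(sigma / sigma0) ** m)

# Monte Carlo evaluation of the limit state g = strength - stress, with
# scatter in both strength and applied stress.
rng = np.random.default_rng(0)
n = 200_000
strength = sigma0 * rng.weibull(m, n)       # Weibull-distributed strength
stress = rng.normal(250.0, 15.0, n)         # uncertain applied stress
pf_mc = np.mean(strength - stress < 0)      # P(limit state violated)
print(pf_weibull, pf_mc)
```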
Rapid measurement of protein osmotic second virial coefficients by self-interaction chromatography.
Tessier, Peter M; Lenhoff, Abraham M; Sandler, Stanley I
2002-01-01
Weak protein interactions are often characterized in terms of the osmotic second virial coefficient (B(22)), which has been shown to correlate with protein phase behavior, such as crystallization. Traditional methods for measuring B(22), such as static light scattering, are too expensive in terms of both time and protein to allow extensive exploration of the effects of solution conditions on B(22). In this work we have measured protein interactions using self-interaction chromatography, in which protein is immobilized on chromatographic particles and the retention of the same protein is measured in isocratic elution. The relative retention of the protein reflects the average protein interactions, which we have related to the second virial coefficient via statistical mechanics. We obtain quantitative agreement between virial coefficients measured by self-interaction chromatography and traditional characterization methods for both lysozyme and chymotrypsinogen over a wide range of pH and ionic strengths, yet self-interaction chromatography requires at least an order of magnitude less time and protein than other methods. The method thus holds significant promise for the characterization of protein interactions requiring only commonly available laboratory equipment, little specialized expertise, and relatively small investments of both time and protein. PMID:11867474
A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.
Yu, Qingzhao; Zhu, Lin; Zhu, Han
2017-11-01
Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently allocate newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
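One plausible reading of the minimum-variance randomization rule, for a difference-in-means statistic, is Neyman allocation; a sketch follows (the paper's actual algorithms also compute critical values and power, which are omitted here).

```python
def optimal_allocation(sd1, sd2):
    """Randomization rate to arm 1 that minimizes Var(xbar1 - xbar2) for a
    fixed total sample size (Neyman allocation). In a sequential design the
    standard deviations would be replaced by their current posterior
    estimates as data accrue."""
    return sd1 / (sd1 + sd2)

# Example: arm 1 responses are twice as noisy, so it receives 2/3 of patients.
rate_arm1 = optimal_allocation(sd1=2.0, sd2=1.0)
print(rate_arm1)   # 0.666...
```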
Comparison of a multispectral vision system and a colorimeter for the assessment of meat color.
Trinderup, Camilla H; Dahl, Anders; Jensen, Kirsten; Carstensen, Jens Michael; Conradsen, Knut
2015-04-01
The color assessment ability of a multispectral vision system is investigated in a comparison study with color measurements from a traditional colorimeter. The experiment involves fresh and processed meat samples. Meat is a complex material, heterogeneous with varying scattering and reflectance properties, so several factors can influence the instrumental assessment of meat color. In order to assess whether two methods are equivalent, the variation due to these factors must be taken into account. A statistical analysis was conducted and showed that on a calibration sheet the two instruments are equally capable of measuring color. Moreover, the vision system provides a more color-rich assessment of fresh meat samples with a glossier surface than the colorimeter. Careful studies of the different sources of variation enable an assessment of the order of magnitude of the between-method variability while accounting for the other sources of variation, leading to the conclusion that color assessment using a multispectral vision system is superior to traditional colorimeter assessments. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Trecia Markes, Cecelia
2006-03-01
With a three-year FIPSE grant, it has been possible at the University of Nebraska at Kearney (UNK) to develop and implement activity-based introductory physics at the algebra level. It has generally been recognized that students enter physics classes with misconceptions about motion and force. Many of these misconceptions persist after instruction. Pretest and posttest responses on the ``Force and Motion Conceptual Evaluation'' (FMCE) are analyzed to determine the effectiveness of the activity-based method of instruction relative to the traditional (lecture/lab) method of instruction. Data were analyzed to determine the following: student understanding at the beginning of the course, student understanding at the end of the course, how student understanding is related to the type of class taken, and student understanding based on gender and type of class. Some of the tests used are the t-test, the chi-squared test, and analysis of variance. The results of these tests will be presented, and their implications will be discussed.
A new statistical methodology predicting chip failure probability considering electromigration
NASA Astrophysics Data System (ADS)
Sun, Ted
In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM, and the EM phenomena occurring in different materials, are also presented. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations, from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool, in estimating the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. Both analyses, and the comparison between the results with and without the temperature map, are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model accounts for scaling through the traditional Black equation and four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature, instead of a temperature map across the chip, is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability checking steps used in the IC design industry. Furthermore, the important concepts of scripting automation, used to integrate the diversified EDA tools in this research work, are described in detail with several examples, and my completed scripts are included in the appendix for reference. Hopefully, this structure will give readers a thorough understanding of my research work, from the automation of EDA tools to the statistical data generation, from the nature of EM to the statistical model construction, and the comparisons between the traditional EM analysis and the statistical EM analysis approaches.
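A sketch of the per-segment EM evaluation implied by combining a temperature map with the traditional Black equation; the prefactor, current-density exponent and activation energy below are hypothetical fit values, not the thesis's calibrated model.

```python
import numpy as np

K_B = 8.617e-5                          # Boltzmann constant, eV/K

def black_mttf(j, t_kelvin, a=1.0, n=2.0, ea=0.9):
    """Black's equation for EM median time to failure:
    MTTF = A * J^-n * exp(Ea / (k T)). a, n, ea are hypothetical fit values."""
    return a * j**(-n) * np.exp(ea / (K_B * t_kelvin))

# With a temperature map, each wire segment uses its local temperature; a
# single worst-case chip temperature would overestimate EM risk for the
# cooler segments (all numbers hypothetical).
temps = np.array([358.0, 343.0, 365.0])        # K, per-segment local temps
currents = np.array([1.2e6, 0.8e6, 1.5e6])     # A/cm^2, per-segment densities
mttf_per_segment = black_mttf(currents, temps)
print(mttf_per_segment)
```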
ERIC Educational Resources Information Center
Chan, Shiau Wei; Ismail, Zaleha
2014-01-01
The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment, where more attention is paid to core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the three significant types of statistical…
Ensor, Joie; Riley, Richard D.
2016-01-01
Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
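As a concrete illustration of the two-stage approach described above, here is a minimal sketch with simulated IPD; the per-study model and the DerSimonian-Laird random-effects pooling shown are standard textbook choices, not the tutorial's specific analyses:

```python
# Stage 1: fit each study separately; stage 2: pool the per-study
# estimates with an inverse-variance random-effects meta-analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
estimates, variances = [], []
for _ in range(6):                          # six hypothetical studies
    x = rng.integers(0, 2, 200)             # treatment indicator
    y = 0.3 * x + rng.normal(0, 1, 200)     # continuous outcome
    fit = sm.OLS(y, sm.add_constant(x)).fit()
    estimates.append(fit.params[1])         # per-study treatment effect
    variances.append(fit.bse[1] ** 2)

b, v = np.array(estimates), np.array(variances)
w = 1 / v                                   # fixed-effect weights
q = np.sum(w * (b - np.sum(w * b) / w.sum()) ** 2)
tau2 = max(0, (q - (len(b) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (v + tau2)                       # random-effects weights
pooled = np.sum(w_re * b) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(pooled, pooled - 1.96 * se, pooled + 1.96 * se)
```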
Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education
Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.
2015-01-01
Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
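To give a flavor of the biomedical Bayes'-theorem applications described, here is a small worked example of the positive-predictive-value calculation such a course typically centers on; the prevalence and test characteristics are invented for illustration:

```python
# Bayes' theorem for a diagnostic test: P(disease | positive test).
prevalence = 0.01        # P(disease), assumed
sensitivity = 0.95       # P(test+ | disease), assumed
specificity = 0.90       # P(test- | no disease), assumed

p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_pos
print(f"P(disease | test+) = {ppv:.3f}")   # ~0.088: most positives are false
```

The counterintuitive result, that a positive result on an accurate test still most likely indicates no disease when prevalence is low, is exactly the kind of documented misconception such a course targets.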
Chen, Song-Lin; Chen, Cong; Zhu, Hui; Li, Jing; Pang, Yan
2016-01-01
Cancer-related anorexia syndrome (CACS) is one of the main causes of death at present, as well as a syndrome that seriously harms patients' quality of life, treatment effect and survival time. In current clinical research there are few reports of empirical traditional Chinese medicine (TCM) prescriptions and patent prescriptions for treating CACS, and prescription rules are rarely analyzed in a systematic manner; because these hidden rules have not been mined, innovative discoveries about clinical medication are difficult. In this paper, the grey screening method combined with multivariate statistical methods was used to build a "CACS prescriptions database". Based on the database, 359 prescriptions were selected in total, the frequency of herbs in the prescriptions was determined, and commonly combined drugs were evolved into 4 new prescriptions for different syndromes. Prescriptions of TCM for treating CACS gave priority to benefiting qi and strengthening the spleen, while also laying emphasis on replenishing kidney essence, dispersing stagnated liver-qi and dispersing lung-qi. Moreover, the interdependence and mutual promotion of yin and yang should be taken into account to reflect TCM's holism and its theory of treatment based on syndrome differentiation. The grey screening method, as a valuable TCM research-supporting method, can be used to analyze prescription rules both subjectively and objectively; the new prescriptions can provide a reference for the clinical use of TCM in treating CACS and for drug development. Copyright © by the Chinese Pharmaceutical Association.
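At its simplest, the frequency step of such prescription mining reduces to counting herbs and herb pairs across prescriptions. The sketch below uses hypothetical herb names, not the study's database:

```python
# Toy illustration of herb-frequency and pair-frequency statistics.
from collections import Counter
from itertools import combinations

prescriptions = [                       # invented example prescriptions
    ["Astragalus", "Atractylodes", "Poria"],
    ["Astragalus", "Poria", "Tangerine peel"],
    ["Atractylodes", "Poria", "Licorice"],
]

herb_freq = Counter(h for p in prescriptions for h in p)
pair_freq = Counter(pair for p in prescriptions
                    for pair in combinations(sorted(p), 2))
print(herb_freq.most_common(3))         # most frequent single herbs
print(pair_freq.most_common(3))         # most frequent herb pairs
```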
Q methodology: a new way of assessing employee satisfaction.
Chinnis, A S; Summers, D E; Doerr, C; Paulson, D J; Davis, S M
2001-05-01
As yet another nursing shortage faces the country, the issue of the satisfaction of nurses again becomes of critical concern to nursing managers in the interest of staff retention. The authors describe the use of the statistical technique Q methodology to assess the needs of nurses and other medical staff at a level one, tertiary care emergency department in the United States. Using the Q method, the authors were able to identify different, unique viewpoints concerning employee needs among the study population, as well as commonly shared views. This level of detail, not obtainable using more traditional statistical techniques, can aid in the design of more effective strategies aimed at fulfilling the needs of an organization's staff to increase their satisfaction.
Comparative analysis of profitability of honey production using traditional and box hives.
Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J
2017-07-01
Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analyses of the profitability and productivity of box and traditional hives are not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our study results clearly showed the importance of adopting box hives for better productivity in the beekeeping subsector.
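As an illustration of the Cobb-Douglas step, a production function of the form y = A · feed^b1 · labor^b2 · medication^b3 can be fitted by ordinary least squares after taking logarithms. The sketch below uses simulated inputs, not the survey data, and the variable names are placeholders:

```python
# Log-linear fit of a Cobb-Douglas production function.
import numpy as np

rng = np.random.default_rng(2)
feed, labor, med = (rng.lognormal(0, 0.3, 182) for _ in range(3))
y = 2.0 * feed**0.4 * labor**0.3 * med**0.1 * rng.lognormal(0, 0.1, 182)

X = np.column_stack([np.ones(182), np.log(feed), np.log(labor), np.log(med)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
print("ln(A):", coef[0], "elasticities:", coef[1:])  # ~0.4, 0.3, 0.1
```

The fitted exponents are output elasticities, which is why such models are a natural tool for attributing productivity differences to individual inputs like feeding and medication.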
A weighted U-statistic for genetic association analyses of sequencing data.
Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing
2014-12-01
With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, high-dimensional sequencing data pose a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model or phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed the commonly used sequence kernel association test (SKAT) when the underlying assumptions were violated (e.g., when the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very-low-density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
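To show the shape of such a statistic, here is a generic, illustrative weighted U-statistic, a pairwise phenotype kernel weighted by genotype similarity. It is not the WU-SEQ weighting itself, and all data and kernel choices are invented:

```python
# Generic weighted U-statistic over subject pairs (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
G = rng.binomial(1, 0.02, (100, 50))     # rare-variant genotype matrix
y = rng.standard_t(df=3, size=100)       # heavy-tailed phenotype

ranks = y.argsort().argsort() + 1        # rank-based, distribution-free
u, norm = 0.0, 0.0
for i in range(100):
    for j in range(i + 1, 100):
        w = G[i] @ G[j]                  # genotype-similarity weight
        h = 1 - abs(ranks[i] - ranks[j]) / 100.0   # phenotype kernel
        u += w * h
        norm += w
print(u / max(norm, 1))                  # large values suggest association
# Significance would be assessed by permuting phenotypes and recomputing.
```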
A statistical approach for validating eSOTER and digital soil maps in front of traditional soil maps
NASA Astrophysics Data System (ADS)
Bock, Michael; Baritz, Rainer; Köthe, Rüdiger; Melms, Stephan; Günther, Susann
2015-04-01
During the European research project eSOTER, three different Digital Soil Maps (DSM) were developed for the pilot area Chemnitz 1:250,000 (FP7 eSOTER project, grant agreement no. 211578). The core task of the project was to revise the SOTER method for the interpretation of soil and terrain data. One working hypothesis was that eSOTER does not only provide terrain data with typical soil profiles, but that the new products actually perform like a conceptual soil map. The three eSOTER maps for the pilot area differed considerably in spatial representation and content of soil classes. In this study we compare the three eSOTER maps against existing reconnaissance soil maps, keeping in mind that traditional soil maps have many subjective issues and intended bias, such as the overestimation and emphasis of certain features. Hence, a true validation of the proper representation of modeled soil maps is hardly possible; rather, a statistical comparison between modeled and empirical approaches is possible. If eSOTER data represent conceptual soil maps, then different eSOTER, DSM and conventional maps from various sources and different regions could be harmonized towards consistent new data sets for large areas, including the whole European continent. One of the eSOTER maps was developed closely following the traditional SOTER method: terrain classification data (derived from the SRTM DEM) were combined with lithology data (a re-interpreted geological map); the corresponding terrain units were then extended with soil information, using a very dense regional soil profile data set to define soil mapping units based on a statistical grouping of terrain units. The second map is a pure DSM map using continuous terrain parameters instead of terrain classification; radiospectrometric data were used to supplement parent material information from geology maps, and the Random Forest classification method was used. The third approach predicts soil diagnostic properties from covariates, similar to DSM practice; in addition, multi-temporal MODIS data were used, and the resulting soil map, a map of soil reference groups (classified according to WRB), is the product of these diagnostic layers. Because the third approach was applied to a larger test area in central Europe and, compared to the first two approaches, worked with coarser input data, comparability is only partly fulfilled. To evaluate the usability of the three eSOTER maps, and to compare them, traditional soil maps at 1:200,000 and 1:50,000 were used as reference data sets. Three statistical methods were applied: (i) in a moving window, the distribution of the soil classes of each DSM product was compared to that of the reference soil maps by calculating the corrected coefficient of contingency; (ii) the predictive power of each eSOTER map was determined; and (iii) the degree of consistency was derived. The latter is based on weighting the match of occurring class combinations via expert knowledge and recalculating the proportions of map appearance with these weights. To re-check the validation results, a field study by local soil experts was conducted. The results show clearly that the first eSOTER approach, based on terrain classification and reinterpreted parent material information, has the greatest similarity with traditional soil maps. The spatial differentiation offered by such an approach is well suited to serve as a conceptual soil map.
Therefore, eSOTER can be a tool for soil mappers to generate conceptual soil maps in a faster and more consistent way. This conclusion is valid at least for overview scales such as 1:250,000.
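A minimal sketch of the corrected contingency coefficient used in step (i): cross-tabulate the two maps' classes inside a window, compute Pearson's C from the chi-squared statistic, and divide by its maximum attainable value. The window contents below are invented:

```python
# Corrected contingency coefficient between two class maps in a window.
import numpy as np
from scipy import stats

map_a = np.array(["A", "A", "B", "B", "C", "A", "C", "B"])  # DSM classes
map_b = np.array(["A", "B", "B", "B", "C", "A", "C", "C"])  # reference classes

classes_a, classes_b = np.unique(map_a), np.unique(map_b)
table = np.array([[np.sum((map_a == ca) & (map_b == cb))
                   for cb in classes_b] for ca in classes_a])

chi2 = stats.chi2_contingency(table)[0]
n = table.sum()
c = np.sqrt(chi2 / (chi2 + n))           # Pearson's contingency coefficient
k = min(table.shape)
c_corr = c / np.sqrt((k - 1) / k)        # corrected coefficient, in [0, 1]
print(c_corr)
```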
A bootstrapping method for development of Treebank
NASA Astrophysics Data System (ADS)
Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.
2017-01-01
Using statistical approaches alongside the traditional methods of natural language processing (NLP) can significantly improve both the quality and the performance of several NLP tasks. The effective use of these approaches depends on the availability of informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex and linguistically rich elementary structure called the supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method could automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated by the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].
Level set method for image segmentation based on moment competition
NASA Astrophysics Data System (ADS)
Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai
2015-05-01
We propose a level set method for image segmentation which introduces moment competition and weakly supervised information into the construction of the energy functional. Different from region-based level set methods, which use force competition, moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (the weakly supervised information) on the image. The intensity differences between the three points and the unlabeled pixels are then used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour towards the object boundary. In our method, the force arm takes full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and the weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods to initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method in segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.
Høyer, Ellen; Jahnsen, Reidun; Stanghelle, Johan Kvalvik; Strand, Liv Inger
2012-01-01
Treadmill training with body weight support (TTBWS) for relearning walking ability after brain damage is an approach under current investigation. Evidence for the efficacy of this method beyond traditional training is lacking, especially in patients needing walking assistance after stroke. The objective of this study was to investigate change in walking and transfer abilities, comparing TTBWS with traditional walking training. A single-blinded, randomized controlled trial was conducted. Sixty patients referred for multi-disciplinary primary rehabilitation were assigned to one of two intervention groups; one received 30 sessions of TTBWS plus traditional training, the other traditional training alone. Daily training was 1 hour. Outcome measures were the Functional Ambulation Categories (FAC), the walking, transfer and stairs items of the Functional Independence Measure (FIM), and the 10-m and 6-min walk tests. Substantial improvements in walking and transfer were shown within both groups after 5 and 11 weeks of intervention. Overall, no statistically significant differences were found between the groups, but 12 of 17 physical measures tended to show improvements in favour of the treadmill approach. Both training strategies provided significant improvements in the tested activities, suggesting that similar outcomes can be obtained in the two modalities by systematic, intensive and goal-directed training.
Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J
2015-03-01
Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between training groups. This pilot study of a deployed mixed-provider disaster response team suggests that a novel portable US simulator may provide equivalent skills training in comparison to traditional live instructor and model training. Further studies with a larger sample size and other measures of short- and long-term clinical performance are warranted.
Shi, Zhao-feng; Song, Tie-bing; Xie, Juan; Yan, Yi-quan
2017-01-01
Background Atopic dermatitis (AD) has become a common skin disease that requires systematic and comprehensive treatment to achieve adequate clinical control. Traditional Chinese medicines and related treatments have shown clinical effects for AD in many studies, but systematic reviews and meta-analyses of them are lacking. Objective A systematic review and meta-analysis based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was conducted to evaluate the efficacy and safety of traditional Chinese medicines and related treatments for AD. Methods Randomized controlled trials (RCTs) were searched according to standardized searching rules in eight medical databases from inception up to December 2016, and a total of 24 articles with 1,618 patients were enrolled in this meta-analysis. Results The results revealed that traditional Chinese medicines and related treatments did not show statistically significant differences from the control group in clinical effectiveness, SCORAD amelioration, or SSRI amelioration for AD. However, EASI amelioration with traditional Chinese medicines and related treatments was superior to that of the control group. Conclusion Conclusions about the efficacy and safety of traditional Chinese medicines and related treatments for AD therapy must be drawn cautiously. More standard, multicenter, double-blind RCTs of traditional Chinese medicines and related treatments for AD are required to provide more clinical evidence in the future. PMID:28713436
A simple method of equine limb force vector analysis and its potential applications.
Hobbs, Sarah Jane; Robinson, Mark A; Clayton, Hilary M
2018-01-01
Ground reaction forces (GRF) measured during equine gait analysis are typically evaluated by analyzing discrete values obtained from continuous force-time data for the vertical, longitudinal and transverse GRF components. This paper describes a simple, temporo-spatial method of displaying and analyzing sagittal plane GRF vectors. In addition, the application of statistical parametric mapping (SPM) is introduced to analyse differences between contralateral fore and hindlimb force-time curves throughout the stance phase. The overall aim of the study was to demonstrate alternative methods of evaluating functional (a)symmetry within horses. GRF and kinematic data were collected from 10 horses trotting over a series of four force plates (120 Hz). The kinematic data were used to determine clean hoof contacts. The stance phase of each hoof was determined using a 50 N threshold. Vertical and longitudinal GRF for each stance phase were plotted both as force-time curves and as force vector diagrams, in which vectors originating at the centre of pressure on the force plate were drawn at intervals of 8.3 ms for the duration of stance. Visual evaluation was facilitated by overlaying the vector diagrams for different limbs. Summary vectors representing the magnitude (VecMag) and direction (VecAng) of the mean force over the entire stance phase were superimposed on the force vector diagram. Typical measurements extracted from the force-time curves (peak forces, impulses) were compared with VecMag and VecAng using partial correlation (controlling for speed). Paired-samples t-tests (left vs. right diagonal pair and high vs. low vertical force diagonal pair comparisons) were performed on discrete and vector variables using traditional methods, and Hotelling's T² tests were performed on normalized stance phase data using SPM. Evidence from the traditional statistical tests suggested that VecMag is more influenced by the vertical force and impulse, whereas VecAng is more influenced by the longitudinal force and impulse. When used to evaluate mean data from the group of ten sound horses, SPM did not identify differences between the left and right contralateral limb pairs or between limb pairs classified according to directional asymmetry. When evaluating a single horse, three periods were identified during which differences in the forces between the left and right forelimbs exceeded the critical threshold (p < .01). Traditional statistical analysis of 2D GRF peak values, summary vector variables and visual evaluation of force vector diagrams gave harmonious results, and both methods identified the same inter-limb asymmetries. As alpha was more tightly controlled using SPM, significance was found only in the individual horse, although the T² plots followed the same trends as the discrete analysis for the group. The techniques of force vector analysis and SPM hold promise for investigations of sidedness and asymmetry in horses.
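The summary-vector idea reduces to averaging the longitudinal (Fx) and vertical (Fz) GRF over stance and reporting the mean vector's magnitude and direction. The sketch below uses synthetic force traces, not the study's data, and the simple sine shapes are only stand-ins for real GRF curves:

```python
# VecMag / VecAng of the mean sagittal-plane GRF over one stance phase.
import numpy as np

t = np.linspace(0, 0.3, 36)                     # ~stance at trot, 120 Hz
fz = 6000 * np.sin(np.pi * t / 0.3)             # vertical GRF, N (synthetic)
fx = 400 * np.sin(2 * np.pi * t / 0.3)          # braking then propulsion, N

vec_mag = np.hypot(fx.mean(), fz.mean())        # VecMag, N
vec_ang = np.degrees(np.arctan2(fx.mean(), fz.mean()))  # VecAng from vertical
print(vec_mag, vec_ang)
```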
Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan
2017-12-27
Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, for example, by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematically testing experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
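One ingredient of this approach, running the identical pipeline on simulated null data, can be sketched as follows; the decoder, data dimensions, and number of simulations are arbitrary illustrative choices, not the paper's worked examples:

```python
# Same-analysis null check: run the identical decoding pipeline on data
# with no signal and verify that accuracies center on chance.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(4)
accs = []
for _ in range(100):                       # 100 simulated null datasets
    X = rng.normal(size=(40, 200))         # features with no signal
    y = np.repeat([0, 1], 20)              # labels unrelated to X
    accs.append(cross_val_score(LinearSVC(), X, y, cv=5).mean())

print(np.mean(accs))                       # should be close to 0.5;
# a systematic shift would flag a pipeline problem, e.g. a mismatch
# between the design's counterbalancing and the cross-validation folds.
```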
Demodulation of messages received with low signal to noise ratio
NASA Astrophysics Data System (ADS)
Marguinaud, A.; Quignon, T.; Romann, B.
The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-lock loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computational savings compared with conventional realizations. Nominal operation has been verified down to a signal-to-noise ratio of -3 dB on a QPSK demodulator.
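A toy sketch of a phase-domain hypothesis test for QPSK, deciding each symbol by the nearest constellation phase, is given below. It does not reproduce the paper's estimators or loop structure, and all parameters are illustrative:

```python
# Nearest-phase decision for QPSK at low SNR.
import numpy as np

rng = np.random.default_rng(5)
symbols = rng.integers(0, 4, 1000)
phases = np.pi / 4 + symbols * np.pi / 2          # QPSK constellation
tx = np.exp(1j * phases)

snr_db = -3.0                                     # the verified operating point
noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)     # per-dimension noise std
rx = tx + noise_std * (rng.normal(size=1000) + 1j * rng.normal(size=1000))

# Decision on the phase signal: nearest of the four hypothesis phases.
est = np.round((np.angle(rx) - np.pi / 4) / (np.pi / 2)) % 4
print("symbol error rate:", np.mean(est != symbols))
```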
1992-02-01
configuration. We have spent the last year observing two firms as they experimented with modular manufacturing. The following report will track the progress of the transitions as they moved through the year. Incorporated into the analysis is the statistical interpretation of data collected from each firm, as … during the year. FEBRUARY: The most noticeable change this month was the introduction of the new ergonomic chairs for the operators. Previously the
Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R
2002-07-01
Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques; instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods, hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework, a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the cost-effectiveness problem to be formulated within a standard regression-type framework. We provide an example with empirical data to illustrate how a regression-type framework can enhance the net-benefit method. We go on to suggest that the practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention. Copyright 2002 John Wiley & Sons, Ltd.
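The core of the net-benefit reformulation is that each patient's (cost, effect) pair is converted into a single net benefit at a chosen willingness-to-pay, which can then be regressed on treatment. The sketch below uses simulated trial data and an arbitrary willingness-to-pay value:

```python
# Net-benefit regression: NB_i = lambda * effect_i - cost_i, regressed
# on a randomised treatment indicator.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
treat = rng.integers(0, 2, 300)                    # randomised arm indicator
effect = 0.05 * treat + rng.normal(0.7, 0.2, 300)  # e.g. QALYs (simulated)
cost = 1500 * treat + rng.normal(5000, 1000, 300)  # simulated costs

lam = 50000                                        # willingness to pay per unit effect
nb = lam * effect - cost                           # individual net benefit

fit = sm.OLS(nb, sm.add_constant(treat)).fit()
print(fit.params[1], fit.conf_int()[1])            # incremental net benefit, 95% CI
```

Baseline covariates can simply be appended to the design matrix, which is exactly the flexibility (adjusting for imperfect randomisation, probing subgroups) that the abstract highlights.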
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Data Mining: Going beyond Traditional Statistics
ERIC Educational Resources Information Center
Zhao, Chun-Mei; Luan, Jing
2006-01-01
The authors provide an overview of data mining, giving special attention to the relationship between data mining and statistics to unravel some misunderstandings about the two techniques. (Contains 1 figure.)
Spatial modelling of disease using data- and knowledge-driven approaches.
Stevens, Kim B; Pfeiffer, Dirk U
2011-09-01
The purpose of spatial modelling in animal and public health is three-fold: describing existing spatial patterns of risk, attempting to understand the biological mechanisms that lead to disease occurrence and predicting what will happen in the medium to long-term future (temporal prediction) or in different geographical areas (spatial prediction). Traditional methods for temporal and spatial predictions include general and generalized linear models (GLM), generalized additive models (GAM) and Bayesian estimation methods. However, such models require both disease presence and absence data which are not always easy to obtain. Novel spatial modelling methods such as maximum entropy (MAXENT) and the genetic algorithm for rule set production (GARP) require only disease presence data and have been used extensively in the fields of ecology and conservation, to model species distribution and habitat suitability. Other methods, such as multicriteria decision analysis (MCDA), use knowledge of the causal factors of disease occurrence to identify areas potentially suitable for disease. In addition to their less restrictive data requirements, some of these novel methods have been shown to outperform traditional statistical methods in predictive ability (Elith et al., 2006). This review paper provides details of some of these novel methods for mapping disease distribution, highlights their advantages and limitations, and identifies studies which have used the methods to model various aspects of disease distribution. Copyright © 2011. Published by Elsevier Ltd.
Application of Statistics in Engineering Technology Programs
ERIC Educational Resources Information Center
Zhan, Wei; Fink, Rainer; Fang, Alex
2010-01-01
Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…
Natural-Annotation-based Unsupervised Construction of Korean-Chinese Domain Dictionary
NASA Astrophysics Data System (ADS)
Liu, Wuying; Wang, Lin
2018-03-01
Large-scale bilingual parallel resources are significant for statistical learning and deep learning in natural language processing. This paper addresses the automatic construction of a Korean-Chinese domain dictionary, and presents a novel unsupervised construction method based on natural annotation in the raw corpus. We first extract all Korean-Chinese word pairs from Korean texts according to natural annotations, then transform the traditional Chinese characters into simplified ones, and finally distill a bilingual domain dictionary by retrieving the simplified Chinese words in an external Chinese domain dictionary. The experimental results show that our method can automatically and efficiently build multiple Korean-Chinese domain dictionaries.
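A schematic sketch of such a pipeline is given below. The regex convention (a Korean word directly followed by a parenthesised Chinese gloss), the tiny character table, and the domain dictionary are all hypothetical; in practice a full converter such as OpenCC would replace the toy table:

```python
# Toy natural-annotation extraction pipeline for Korean-Chinese pairs.
import re

TRAD2SIMP = {"醫": "医", "學": "学", "統": "统", "計": "计"}  # demo table only
DOMAIN_DICT = {"医学", "统计"}                                # assumed resource

def simplify(text):
    return "".join(TRAD2SIMP.get(ch, ch) for ch in text)

def extract_pairs(korean_text):
    # natural annotation: a Hangul word immediately followed by "(Hanja)"
    pattern = re.compile(r"([\uac00-\ud7a3]+)\(([\u4e00-\u9fff]+)\)")
    for ko, zh in pattern.findall(korean_text):
        zh = simplify(zh)               # traditional -> simplified step
        if zh in DOMAIN_DICT:           # distillation step
            yield ko, zh

text = "의학(醫學)과 통계(統計)는 ..."
print(dict(extract_pairs(text)))        # {'의학': '医学', '통계': '统计'}
```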
Lake bed classification using acoustic data
Yin, Karen K.; Li, Xing; Bonde, John; Richards, Carl; Cholwek, Gary
1998-01-01
As part of our effort to identify lake bed surficial substrates using remote sensing data, this work designs pattern classifiers by multivariate statistical methods. The probability distribution of the preprocessed acoustic signal is analyzed first. A confidence-region approach is then adopted to improve the design of the existing classifier. A technique for further isolation is proposed which minimizes the expected loss from misclassification. The devices constructed are applicable to real-time lake bed categorization. A minimax approach is suggested to treat the more general case where the a priori probability distribution of the substrate types is unknown. A comparison of the suggested methods with traditional likelihood ratio tests is discussed.
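The minimum-expected-loss rule mentioned above can be sketched generically: choose the class minimizing posterior expected loss given class-conditional densities, priors, and a loss matrix. The classes, densities, and costs below are invented for illustration, not the paper's models:

```python
# Bayes minimum-expected-loss classification of a 1-D acoustic feature.
import numpy as np
from scipy import stats

classes = ["sand", "silt", "rock"]
priors = np.array([0.5, 0.3, 0.2])                # assumed priors
means, stds = np.array([1.0, 2.0, 3.5]), np.array([0.4, 0.5, 0.6])
# loss[i, j]: cost of deciding class j when the truth is class i
loss = np.array([[0, 1, 2],
                 [1, 0, 1],
                 [2, 1, 0]], dtype=float)

def classify(x):
    like = stats.norm.pdf(x, means, stds)         # class-conditional densities
    post = priors * like
    post /= post.sum()                            # posterior over classes
    risk = post @ loss                            # expected loss per decision
    return classes[int(np.argmin(risk))]

print(classify(1.8))
# With unknown priors, a minimax design would instead pick the rule
# minimizing the worst-case risk over all possible priors.
```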
A Comparison of the Achievement of Statistics Students Enrolled in Online and Face-to-Face Settings
ERIC Educational Resources Information Center
Christmann, Edwin P.
2017-01-01
This study compared the achievement of male and female students who were enrolled in an online univariate statistics course to students enrolled in a traditional face-to-face univariate statistics course. The subjects, 47 graduate students enrolled in univariate statistics classes at a public, comprehensive university, were randomly assigned to…
Comparative evaluation of traditional and self-priming hydrophilic resin
Singla, Ruchi; Bogra, Poonam; Singal, Bhawana
2012-01-01
Background: The purpose of this study was to compare the microleakage of a traditional composite (Charisma/Gluma Comfort Bond) and a self-priming resin (Embrace Wetbond). Materials and Methods: Standardized Class V cavities, partly in enamel and partly in cementum, were prepared in 20 extracted human premolars. Teeth were divided into two groups: Group 1 was restored with Charisma/Gluma Comfort Bond and Group 2 with Embrace Wetbond. The specimens were stored in distilled water at room temperature for 24 h and then subjected to 200 thermocycles at 5°C and 55°C with a 1-min dwell time. After thermocycling, teeth were immersed in a 0.2% solution of methylene blue dye for 24 h. Teeth were then sectioned vertically, approximately midway through the facial and lingual surfaces, using a diamond saw blade. Microleakage was evaluated at the enamel and cementum surfaces using a 10× stereomicroscope. The statistical analysis was performed using the Wilcoxon signed-rank test. Results: Wetbond showed less microleakage at occlusal and gingival margins compared with Charisma/Gluma Comfort Bond, and the results were statistically significant (P < 0.05). Conclusion: Class V cavities restored with Embrace Wetbond, with fewer steps and fewer materials, offer greater protection against microleakage at the tooth-restorative interface. PMID:22876008
Huang, P; He, J; Zhang, Y M
2017-05-30
Objective: To apply a mobile patient-management application to the education and follow-up of patients after total knee arthroplasty, and to evaluate the clinical outcomes. Methods: A total of 150 patients who underwent total knee arthroplasty in the orthopaedics department of our hospital from May to October 2016 were chosen and randomly divided into two groups. On the basis of traditional education, the observation group additionally received app-based education, guidance on functional exercise and follow-up, while the control group received traditional face-to-face and telephone education. The activity, compliance and satisfaction scores of the two groups were observed. Results: Finally, 132 patients were included in the study. The postoperative ranges of motion of the two groups at 2 months were (110.83±6.83)° and (105.45±7.53)°, respectively, a statistically significant difference (P<0.05); at 3 months they were (110±6.33)° and (103.26±7.57)°, respectively, also a statistically significant difference (P<0.05). Compliance and satisfaction scores in the observation group were significantly better than in the control group (P<0.05). Conclusion: Combining traditional face-to-face education with the mobile application improves the effects of functional training, compliance, and hospital-discharge satisfaction, while shortening education time and increasing education efficiency; it is worth wide clinical application.
Beckstead, D Joel; Lambert, Michael J; DuBose, Anthony P; Linehan, Marsha
2015-12-01
This pilot study examined pre- to post-treatment change in patients at a substance use residential treatment center that incorporated Dialectical Behavior Therapy (DBT) with specific cultural, traditional and spiritual practices for American Indian/Alaska Native adolescents. Specifically, the cultural, spiritual and traditional practices were incorporated while still maintaining fidelity to the evidence-based treatment (DBT). A total of 229 adolescents participated in the study and were given the Youth Outcome Questionnaire-Self-Report version at pre-treatment and post-treatment, and the total scores were compared. The results showed that 96% of adolescents were either "recovered" or "improved" according to clinically significant change criteria. Additionally, the differences between the group's pre-test and post-test scores were statistically significant using a paired-samples t-test. Finally, the effect size, calculated using Cohen's criteria, was found to be large. The results are discussed in terms of the implications of integrating western and traditional methods of care in addressing substance use disorders and other mental health disorders among American Indian/Alaska Native adolescents. Published by Elsevier Ltd.
He, Lan-Juan; Zhu, Xiang-Dong
2016-06-01
To analyze the regularities of prescriptions for diarrhoea in "A Guide to Clinical Practice with Medical Records" (Ye Tianshi) based on the traditional Chinese medicine inheritance support system (V2.5), and to provide a reference for the further research and development of new traditional Chinese medicines for treating diarrhoea. The traditional Chinese medicine inheritance support system was used to build a prescription database of Chinese medicines for diarrhoea. The software's integrated data mining methods were used to analyze the prescriptions in the database according to the "four natures", "five flavors" and "meridians", yielding frequency statistics, syndrome distribution, prescription regularities and new prescription analysis. An analysis of 94 prescriptions for diarrhoea determined the frequencies of medicines in the prescriptions and the commonly combined medicine pairs and combinations, and derived 13 new prescriptions. This study indicated that the prescriptions for diarrhoea in "A Guide to Clinical Practice with Medical Records" mostly aim at eliminating dampness and tonifying deficiency, with neutral drug properties and sweet, bitter or hot flavors, reflecting the treatment principle of "activating spleen-energy and resolving dampness". Copyright © by the Chinese Pharmaceutical Association.
Aerobic conditioning for team sport athletes.
Stone, Nicholas M; Kilding, Andrew E
2009-01-01
Team sport athletes require a high level of aerobic fitness in order to generate and maintain power output during repeated high-intensity efforts and to recover. Research to date suggests that these components can be increased by regularly performing aerobic conditioning. Traditional aerobic conditioning, with minimal changes of direction and no skill component, has been demonstrated to effectively increase aerobic function in team sport players within a 4- to 10-week period. More importantly, traditional aerobic conditioning methods have been shown to increase team sport performance substantially. Many team sports require the upkeep of both aerobic fitness and sport-specific skills during a lengthy competitive season. Classic team sport training, by contrast, has been shown to evoke only marginal changes in aerobic fitness. In recent years, aerobic conditioning methods have been designed that achieve intensities adequate to induce improvements in aerobic fitness whilst incorporating movement-specific and skill-specific tasks, e.g. small-sided games and dribbling circuits. Such 'sport-specific' conditioning methods have been demonstrated to promote increases in aerobic fitness, though careful consideration of player skill levels, current fitness, player numbers, field dimensions, game rules and the availability of player encouragement is required. Whilst different conditioning methods appear equivalent in their ability to improve fitness, whether sport-specific conditioning is superior to other methods at improving actual game performance statistics requires further research.
[Effectiveness of different maintenance methods for codonopsis radix].
Shi, Yan-Bin; Wang, Yu-Ping; Li, Yan; Liu, Cheng-Song; Li, Hui-Li; Zhang, Xiao-Yun; Li, Shou-Tang
2014-05-01
To observe the effectiveness of different maintenance methods, including vacuum-packing, storage together with tobacco, storage together with fennel, ethanol steam and sulfur fumigation, for protecting Codonopsis Radix against mildew and insect damage, and to analyze the polysaccharide and flavonoid contents of the Codonopsis Radix tested in this study, in order to identify scientific maintenance methods that could replace traditional sulfur fumigation. For all methods except sulfur fumigation, naturally air-dried Codonopsis Radix was used to investigate the maintenance effectiveness. Mildew was observed by visual inspection, and the contents of polysaccharides and flavonoids were determined by ultraviolet-visible spectrophotometry. A comprehensive evaluation was made based on the results of the different maintenance methods. Low-temperature vacuum-packing, ambient-temperature vacuum-packing and sulfur fumigation could keep Codonopsis Radix free from mildew and insect damage for one year, although the ambient-temperature vacuum packages showed swelling; ethanol steam could keep Codonopsis Radix free from mildew and insects for over half a year; storage together with tobacco or fennel had no maintenance effect. The differences in polysaccharide and flavonoid contents among all tested samples were not statistically significant. Low-temperature vacuum-packing can replace traditional sulfur fumigation, and it can maintain the quality of Codonopsis Radix to a certain extent.
Irrigated areas of India derived using MODIS 500 m time series for the years 2001-2003
Dheeravath, V.; Thenkabail, P.S.; Chandrakantha, G.; Noojipady, P.; Reddy, G.P.O.; Biradar, C.M.; Gumma, M.K.; Velpuri, M.
2010-01-01
The overarching goal of this research was to develop methods and protocols for mapping irrigated areas using a Moderate Resolution Imaging Spectroradiometer (MODIS) 500 m time series, to generate irrigated area statistics, and to compare these with ground- and census-based statistics. The primary mega-file data-cube (MFDC), comparable to a hyper-spectral data cube, used in this study consisted of 952 bands of data in a single file, derived from MODIS 500 m, 7-band reflectance data acquired every 8 days during 2001-2003. The methods consisted of (a) segmenting the 952-band MFDC based not only on elevation-precipitation-temperature zones but also on major and minor irrigated command area boundaries obtained from India's Central Board of Irrigation and Power (CBIP), (b) developing a large ideal spectral data bank (ISDB) of irrigated areas for India, (c) adopting quantitative spectral matching techniques (SMTs) such as the spectral correlation similarity (SCS) R²-value, (d) establishing a comprehensive set of protocols for class identification and labeling, and (e) comparing the results with the National Census data of India and with field-plot data gathered during this project to determine accuracies, uncertainties and errors. The study produced irrigated-area maps and statistics for India at the national and subnational (e.g., state, district) levels based on MODIS data from 2001-2003. The Total Area Available for Irrigation (TAAI) and the Annualized Irrigated Area (AIA) were 113 and 147 million hectares (Mha), respectively. The TAAI does not consider the intensity of irrigation, and its nearest equivalent is the net irrigated area in the Indian national statistics. The AIA considers the intensity of irrigation and is the equivalent of the "irrigated potential utilized" (IPU) reported by India's Ministry of Water Resources (MoWR). The field-plot data collected during this project showed that the accuracy of the TAAI classes was 88%, with a 12% error of omission and a 32% error of commission. Comparison between the AIA and IPU produced an R²-value of 0.84; however, AIA was consistently higher than IPU. The causes of the differences lie both in the traditional approaches and in remote sensing. The sources of uncertainty unique to the traditional approaches were (a) inadequate accounting of minor irrigation (groundwater, small reservoirs and tanks), (b) the unwillingness of individual Indian states to share irrigated-area statistics because of their stakes, (c) the absence of comprehensive statistical analyses of the reported data, and (d) the subjectivity of the observation-based data collection process. The sources of uncertainty unique to the remote sensing approaches were (a) irrigated-area fraction estimation and the related sub-pixel area computations and (b) the resolution of the imagery. The sources of uncertainty common to both traditional and remote sensing approaches were definitions and methodological issues. © 2009 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
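The quantitative matching step can be sketched as computing the spectral correlation similarity, the squared Pearson correlation, between a pixel's multi-band time-series signature and each ideal class signature in the data bank. The signatures below are invented stand-ins, not the ISDB:

```python
# Spectral correlation similarity (SCS R^2) matching of one pixel.
import numpy as np

rng = np.random.default_rng(7)
ideal_bank = {"irrigated_double_crop": rng.normal(0.4, 0.1, 952),
              "rainfed_single_crop": rng.normal(0.3, 0.1, 952)}
pixel = ideal_bank["irrigated_double_crop"] + rng.normal(0, 0.02, 952)

def scs_r2(a, b):
    return np.corrcoef(a, b)[0, 1] ** 2   # squared Pearson correlation

best = max(ideal_bank, key=lambda k: scs_r2(pixel, ideal_bank[k]))
print(best, {k: round(scs_r2(pixel, v), 3) for k, v in ideal_bank.items()})
```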
A powerful score-based test statistic for detecting gene-gene co-association.
Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun
2016-01-29
The genetic variants identified by genome-wide association studies (GWAS) can account for only a small proportion of the total heritability of complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one possible explanation for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, owing not only to the traditional interaction under nearly independent conditions but also to the correlation between genes. Generally, genes tend to work collaboratively within a specific pathway or network contributing to the disease, and specific disease-associated loci will often be highly correlated (e.g., single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we proposed a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than existing methods, including single-SNP-based and principal component analysis (PCA)-based logistic regression models, statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ²) statistic. A real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.
2013-01-01
Background Although traditional medicine (TM) in South Korea is included in the national health care system, it is considered complementary and alternative medicine (CAM) rather than mainstream medicine. The lack of statistical data regarding the usage of and adverse experiences with traditional Korean medicine (TKM) therefore makes it difficult to understand the current status of TM. In this study, we aimed to report usage patterns of and adverse experiences with TKM among consumers in South Korea. Methods A total of 2000 consumers participated in the survey on usage and adverse experiences in 2008. Among the 2,000 participants, 915 (45.8%) had taken herbal medicine or received traditional medicinal therapies; these individuals were further surveyed on the internet or in an interview. Results The usage rate was higher among women and among patients in their 30s. Of the total TKM usage, acupuncture accounted for 36.7% and herbal medicine for 13.4%. Regarding the frequency of use, 73.8% of patients reported using TM fewer than 5 times in 1 year. Of the 915 respondents, 8.2% had some type of adverse experience resulting from TKM. Adverse experiences were primarily caused by acupuncture and herbal medicines, and they primarily involved diseases of the digestive system and skin. The incidence of adverse experiences was less than 3.7% for acupuncture and 3.8% for herbal medicine. Overall, the incidence rate of adverse experiences with TKM for the entire population was 0.04 per 10,000 individuals. Conclusions The medical usage of TKM and the occurrence of adverse events should be surveyed periodically, and the statistical trends should be analysed. The disparity between the survey results for traditional herbal medicines and medical practices and those from the national pharmacovigilance system or academic reports of adverse experiences should be examined, and the national pharmacovigilance system must be improved to compensate for these disparities. Policies and regulations are required to enhance the reporting of adverse experiences, not only for herbal medicines but also for traditional medicinal therapies. PMID:24289266
2012-01-01
Background The increasing gender equality during the 20th century, mainly in the Nordic countries, represents a major social change. A well-established theory is that this may affect the mental health patterns of women and men. This study aimed at examining associations of childhood and adulthood gendered life with mental ill-health symptoms. Methods A follow-up study of a cohort of all school leavers in a medium-sized industrial town in northern Sweden was performed from age 16 to age 42. Of those in the original cohort still alive, 94% (n = 1007) participated during the whole period. Gendered life was divided into three stages according to whether they were traditional or non-traditional (the latter includes equal): childhood (mother's paid work position), adulthood at age 30 (ideology and childcare), and adulthood at age 42 (partnership and childcare). Mental ill-health was measured by self-reported anxious symptoms ("frequent nervousness") and depressive symptoms ("frequent sadness") at age 42. The statistical method was logistic regression analysis, finally adjusted for earlier mental ill-health symptoms and social confounding factors. Results Generally, parents' gendered life was not decisive for a person's own gendered life, and adulthood gender position ruled out the impact of childhood gender experience on self-reported mental ill-health. For women, non-traditional gender ideology at age 30 was associated with a decreased risk of anxious symptoms (76% for traditional childhood, 78% for non-traditional childhood). For men, non-traditional childcare at age 42 was associated with a decreased risk of depressive symptoms (84% for traditional childhood, 78% for non-traditional childhood). A contradictory indication was that women non-traditional in childcare at age 30 had a threefold increased risk of anxious symptoms at age 42, but only when they had experienced a traditional childhood. Conclusion Adulthood gender equality is generally good for self-reported mental health regardless of whether one opposes or continues one's gendered history. However, the childcare findings indicate a differentiated picture; men seem to benefit, in depressive symptoms, from embracing this traditionally female duty, while women suffer anxious symptoms from departing from it, if their mother did not. PMID:22747800
Corrò, M; Saleh-Mohamed-Lamin, S; Jatri-Hamdi, S; Slut-Ahmed, B; Mohamed-Lejlifa, S; Di Lello, S; Rossi, D; Broglia, A; Vivas-Alegre, L
2012-10-01
The aim of this study was to investigate the hygiene performance of a camel (Camelus dromedarius) slaughtering process as carried out with the traditional method in the Sahrawi refugee camps located in southwestern Algeria. The camel slaughtering process in this region differs significantly from that carried out in commercial abattoirs: slaughtering is performed outdoors in desert areas, and dehiding of the carcass is approached via the dorsoventral route rather than the classic ventrodorsal route. Samples were taken from 10 camel carcasses in three different areas: the hide, the carcass meat immediately after dehiding, and the meat after final cutting. Enterobacteriaceae counts (EC) were enumerated employing conventional laboratory techniques. Carcass meat samples had EC below the detection limit more frequently when the hide samples from the same carcass also had EC below the detection limit. Because of the low number of trials, a calculation of the statistical significance of the results was not possible. Further experimental research is needed in order to validate the results presented in this study. A comparison of the microbiological hygiene performance between dorsal dehiding and traditional ventral dehiding of slaughtered animals could serve to test the hypothesis of a potential positive impact of the dorsal dehiding method on carcass meat hygiene.
Wang, Jianmiao; Xu, Yongjian; Liu, Xiansheng; Xiong, Weining; Xie, Jungang; Zhao, Jianping
2016-01-01
Problem-based learning (PBL) has been extensively applied as an experimental educational method in Chinese medical schools over the past decade. A meta-analysis was performed to assess the effectiveness of PBL on students’ learning outcomes in physical diagnostics education. Related databases were searched for eligible studies evaluating the effects of PBL compared to traditional teaching on students’ knowledge and/or skill scores of physical diagnostics. Standardized mean difference (SMD) with 95% confidence interval (CI) was estimated. Thirteen studies with a total of 2086 medical students were included in this meta-analysis. All of these studies provided usable data on knowledge scores, and the pooled analysis showed a significant difference in favor of PBL compared to the traditional teaching (SMD = 0.76, 95%CI = 0.33–1.19). Ten studies provided usable data on skill scores, and a significant difference in favor of PBL was also observed (SMD = 1.46, 95%CI = 0.89–2.02). Statistically similar results were obtained in the sensitivity analysis, and there was no significant evidence of publication bias. These results suggested that PBL in physical diagnostics education in China appeared to be more effective than traditional teaching method in improving knowledge and skills. PMID:27808158
Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.
Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M
2014-12-01
In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, generalizing the traditional whisker plot or boxplot to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting, and fluid dynamics.
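The data-depth idea that the curve boxplot generalizes can be illustrated in a few lines. The following Python sketch computes a simplified modified band depth for a synthetic 1D functional ensemble and ranks the curves by depth; the paper's method extends this kind of rank statistic to 2D and 3D curves, which this sketch does not attempt.

```python
import numpy as np
from itertools import combinations

def modified_band_depth(curves):
    """Modified band depth: bands are defined by pairs of curves."""
    n, _ = curves.shape
    depth = np.zeros(n)
    for i, j in combinations(range(n), 2):
        lo = np.minimum(curves[i], curves[j])
        hi = np.maximum(curves[i], curves[j])
        # fraction of the domain where each curve lies inside the band
        depth += ((curves >= lo) & (curves <= hi)).mean(axis=1)
    return depth / (n * (n - 1) / 2)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
ensemble = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal((20, 100))
ranks = np.argsort(modified_band_depth(ensemble))[::-1]
print("deepest (median) curve:", ranks[0])   # the centre of a curve boxplot
```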
Does money matter in inflation forecasting?
NASA Astrophysics Data System (ADS)
Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.
2010-11-01
This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists’ long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
2011-01-01
Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework, along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that could result from the traditional approach of comparing a haplotype against the remaining ones, and they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature, and a comparison against meta-analyses that use single nucleotide polymorphisms, suggests that meta-analyses of haplotypes include approximately half as many studies and produce significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata, and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
Discontinuous Galerkin Methods for Turbulence Simulation
NASA Technical Reports Server (NTRS)
Collis, S. Scott
2002-01-01
A discontinuous Galerkin (DG) method is formulated, implemented, and tested for simulation of compressible turbulent flows. The method is applied to turbulent channel flow at low Reynolds number, where it is found to successfully predict low-order statistics with fewer degrees of freedom than traditional numerical methods. This reduction is achieved by utilizing local hp-refinement such that the computational grid is refined simultaneously in all three spatial coordinates with decreasing distance from the wall. Another advantage of DG is that Dirichlet boundary conditions can be enforced weakly through integrals of the numerical fluxes. Both for a model advection-diffusion problem and for turbulent channel flow, weak enforcement of wall boundaries is found to improve results at low resolution. Such weak boundary conditions may play a pivotal role in wall modeling for large-eddy simulation.
Infrared face recognition based on LBP histogram and KW feature selection
NASA Astrophysics Data System (ADS)
Xie, Zhihua
2014-07-01
The conventional local binary pattern (LBP) histogram feature still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract locally robust features from infrared face images, LBP is chosen to capture the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to select the LBP patterns suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
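A minimal sketch of the pipeline the abstract describes, assuming scikit-image and SciPy are available: LBP histograms per image block, then a Kruskal-Wallis test per histogram bin to keep only the discriminative micro-patterns. The images and grouping below are random stand-ins, not infrared face data.

```python
import numpy as np
from scipy.stats import kruskal
from skimage.feature import local_binary_pattern

def lbp_histogram(image, P=8, R=1):
    """Normalized histogram of uniform LBP codes for one image block."""
    codes = local_binary_pattern(image, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=np.arange(P + 3), density=True)
    return hist  # P + 2 bins for the 'uniform' method

# Random stand-ins: 3 subjects x 5 image blocks each (not real data)
rng = np.random.default_rng(1)
groups = [[lbp_histogram((255 * rng.random((32, 32))).astype(np.uint8))
           for _ in range(5)] for _ in range(3)]

# Keep the LBP bins whose values differ across subjects (Kruskal-Wallis)
selected = []
for b in range(len(groups[0][0])):
    stat, p = kruskal(*[[h[b] for h in g] for g in groups])
    if p < 0.05:
        selected.append(b)
print("selected LBP bins:", selected)
```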
Statistical Literacy Social Media Project for the Masses
ERIC Educational Resources Information Center
Gundlach, Ellen; Maybee, Clarence; O'Shea, Kevin
2015-01-01
This article examines a social media assignment used to teach and practice statistical literacy with over 400 students each semester in large-lecture traditional, fully online, and flipped sections of an introductory-level statistics course. Following the social media assignment, students completed a survey on how they approached the assignment.…
Teaching Probabilities and Statistics to Preschool Children
ERIC Educational Resources Information Center
Pange, Jenny
2003-01-01
This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
A simple method of equine limb force vector analysis and its potential applications
Robinson, Mark A.; Clayton, Hilary M.
2018-01-01
Background Ground reaction forces (GRF) measured during equine gait analysis are typically evaluated by analyzing discrete values obtained from continuous force-time data for the vertical, longitudinal and transverse GRF components. This paper describes a simple, temporo-spatial method of displaying and analyzing sagittal plane GRF vectors. In addition, the application of statistical parametric mapping (SPM) is introduced to analyze differences between contralateral fore and hindlimb force-time curves throughout the stance phase. The overall aim of the study was to demonstrate alternative methods of evaluating functional (a)symmetry within horses. Methods GRF and kinematic data were collected from 10 horses trotting over a series of four force plates (120 Hz). The kinematic data were used to determine clean hoof contacts. The stance phase of each hoof was determined using a 50 N threshold. Vertical and longitudinal GRF for each stance phase were plotted both as force-time curves and as force vector diagrams in which vectors originating at the centre of pressure on the force plate were drawn at intervals of 8.3 ms for the duration of stance. Visual evaluation was facilitated by overlaying the vector diagrams for different limbs. Summary vectors representing the magnitude (VecMag) and direction (VecAng) of the mean force over the entire stance phase were superimposed on the force vector diagram. Typical measurements extracted from the force-time curves (peak forces, impulses) were compared with VecMag and VecAng using partial correlation (controlling for speed). Paired samples t-tests (left v. right diagonal pair comparison and high v. low vertical force diagonal pair comparison) were performed on discrete and vector variables using traditional methods, and Hotelling’s T2 tests were performed on normalized stance phase data using SPM. Results Evidence from traditional statistical tests suggested that VecMag is more influenced by the vertical force and impulse, whereas VecAng is more influenced by the longitudinal force and impulse. When used to evaluate mean data from the group of ten sound horses, SPM did not identify differences between the left and right contralateral limb pairs or between limb pairs classified according to directional asymmetry. When evaluating a single horse, three periods were identified during which differences in the forces between the left and right forelimbs exceeded the critical threshold (p < .01). Discussion Traditional statistical analysis of 2D GRF peak values and summary vector variables and visual evaluation of force vector diagrams gave concordant results, and both methods identified the same inter-limb asymmetries. As alpha was more tightly controlled using SPM, significance was found only in the individual horse, although the T2 plots followed the same trends as the discrete analysis for the group. Conclusions The techniques of force vector analysis and SPM hold promise for investigations of sidedness and asymmetry in horses. PMID:29492341
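The summary-vector variables are straightforward to compute from force-time data. A small Python sketch follows, using synthetic stance-phase forces; the sign and angle conventions (0 degrees = vertical) are assumptions for illustration, and the paper's exact definitions of VecMag and VecAng may differ.

```python
import numpy as np

def summary_vector(fz, fx):
    """Magnitude and direction of the mean stance-phase GRF vector.

    fz, fx: vertical and longitudinal force-time arrays (N) over stance.
    """
    mean_fz, mean_fx = np.mean(fz), np.mean(fx)
    vec_mag = np.hypot(mean_fx, mean_fz)                 # VecMag analogue
    vec_ang = np.degrees(np.arctan2(mean_fx, mean_fz))   # VecAng; 0 deg = vertical
    return vec_mag, vec_ang

# Hypothetical stance-phase forces sampled at regular intervals
t = np.linspace(0, np.pi, 40)
fz = 6000 * np.sin(t)              # vertical force: single hump over stance
fx = 400 * np.sin(2 * t - np.pi)   # longitudinal: braking then propulsion
print(summary_vector(fz, fx))
```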
Liu, Huiling; Xia, Bingbing; Yi, Dehui
2016-01-01
We propose a new feature extraction method for liver pathological images based on multispatial mapping and statistical properties. For liver pathological images with hematoxylin-eosin staining, the R and B channels reflect the sensitivity of the images better, while the entropy space and Local Binary Pattern (LBP) space reflect the texture features better. To obtain more comprehensive information, we map liver pathological images to the entropy space, LBP space, R space, and B space. Traditional Higher Order Local Autocorrelation Coefficients (HLAC) cannot reflect the overall information of the image, so we propose an average-corrected HLAC feature. We calculate the statistical properties and the average gray value of the pathological images and then update each pixel value to the absolute value of the difference between its gray value and the average gray value, which is more sensitive to gray-value changes in pathological images. Lastly, the HLAC template is used to calculate the features of the updated image. The experimental results show that the improved multispatial-mapping features give better classification performance for liver cancer. PMID:27022407
A Classification of Remote Sensing Image Based on Improved Compound Kernels of Svm
NASA Astrophysics Data System (ADS)
Zhao, Jianing; Gao, Wanlin; Liu, Zili; Mou, Guifen; Lu, Lin; Yu, Lina
The accuracy of remote sensing (RS) classification based on SVM, which is developed from statistical learning theory, is high even with a small number of training samples, which makes SVM-based RS classification attractive. The traditional RS classification method combines visual interpretation with computer classification; the SVM-based method improves the accuracy of RS classification considerably while saving much of the labor and time spent interpreting images and collecting training samples. Kernel functions play an important part in the SVM algorithm. The proposed method uses an improved compound kernel function and therefore achieves higher classification accuracy on RS images. Moreover, the compound kernel improves the generalization and learning ability of the classifier.
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
NASA Astrophysics Data System (ADS)
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
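The step this method builds on, reading a recurrence matrix as the adjacency matrix of a complex network, can be sketched simply. The Python example below builds a recurrence network for the logistic map and reports a degree statistic; the paper's contribution, condensing the recurrence plot before this step, is not reproduced here.

```python
import numpy as np

def recurrence_network_degree(x, eps):
    """Build a recurrence matrix from a scalar series and return node degrees.

    Interpreting the recurrence matrix as the adjacency matrix of a complex
    network is the standard first step of recurrence-network analysis.
    """
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])   # pairwise distances between states
    adj = (dist < eps).astype(int)
    np.fill_diagonal(adj, 0)                 # no self-loops
    return adj.sum(axis=1)                   # degree of each state

# Logistic map in the chaotic regime (r = 4)
x, r = [0.4], 4.0
for _ in range(499):
    x.append(r * x[-1] * (1 - x[-1]))
deg = recurrence_network_degree(x, eps=0.05)
print("mean degree:", deg.mean())
```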
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
Li, Qingling; Ma, Qiang; Li, Dan; Liu, Nana; Yang, Jiahui; Sun, Chun; Cheng, Cheng; Jia, Xuezhao; Wang, Jing; Zeng, Yonglei
2018-03-12
To statistically analyze the support given by the National Natural Science Foundation of China (NSFC) from 2007 to 2016, in the field of acupuncture and moxibustion, to the national universities and colleges of traditional Chinese medicine under the General Program (GP) and the National Science Fund for Young Scholars (NSFYS). Differences between GP and NSFYS were compared in view of five aspects, namely funding, supporting units, key words, methods, and disorders and signal pathways, and the following characteristics were summarized. ① Funding increased from 2007 through 2013 and declined from 2013 through 2016; over the recent ten years it fluctuated but generally trended upward. ② In some TCM universities, projects in the same research direction were approved continuously for more than 3 years, making research continuity a hot topic. ③ Regarding the therapeutic methods, acupuncture was the chief therapy; electroacupuncture, moxibustion and acupoints were involved as well. ④ The disorders involved in the research were cerebral ischemia, myocardial ischemia and reperfusion injury, suggesting that ischemic disorders predominate in this research. ⑤ Signal pathways occupied the main research index system, including cell proliferation, metabolism, immunity, apoptosis and autophagy. Research on the other aspects was less common.
Yang, Peilan; Wang, Jie; Wu, Yingen; Zi, Suna; Tang, Jie; Wang, Zhenwei
2018-01-01
Purpose To compare the efficacy of individualized herbal decoction with standard decoction for patients with stable bronchiectasis through N-of-1 trials. Methods We conducted single-center N-of-1 trials in 17 patients with stable bronchiectasis. Each N-of-1 trial contained three cycles, and each cycle comprised two 4-week intervention periods: individualized decoction and fixed decoction (control). The primary outcome was patient self-reported symptom scores on a 1–7 point Likert scale. Secondary outcomes were 24-hour sputum volume and CAT scores. Results Among 14 completed trials, five showed that the individualized decoction was statistically better than the control decoction on symptom scores (P < 0.05), but the difference was not clinically significant. The group data of all the trials showed that individualized decoction was superior to control decoction on symptom scores (2.13 ± 0.58 versus 2.30 ± 0.65, P = 0.002, mean difference and 95% CI: 0.18 (0.10, 0.25)), 24 h sputum volume (P = 0.009), and CAT scores (9.69 ± 4.89 versus 11.64 ± 5.59, P = 0.013, mean difference and 95% CI: 1.95 (1.04, 2.86)), but again not clinically significantly so. Conclusion Optimizing the combined analysis of individual and group data and improving the statistical models may contribute to establishing a method of evaluating clinical efficacy in line with the characteristics of individualized diagnosis and treatment in traditional Chinese medicine. PMID:29552084
Le, Linh Cu; Vu, Lan T H
2012-10-01
Globally, population surveys on HIV/AIDS and other sensitive topics have been using audio computer-assisted self interview for many years. This interview technique, however, is still new to Vietnam and little is known about its application and impact in general population surveys. One plausible hypothesis is that residents of Vietnam interviewed using this technique may provide a higher response rate and be more willing to reveal their true behaviors than if interviewed with traditional methods. This study aims to compare audio computer-assisted self interview with traditional face-to-face personal interview and self-administered interview with regard to rates of refusal and affirmative responses to questions on sensitive topics related to HIV/AIDS. In June 2010, a randomized study was conducted in three cities (Ha Noi, Da Nang and Can Tho), using a sample of 4049 residents aged 15 to 49 years. Respondents were randomly assigned to one of three interviewing methods: audio computer-assisted self interview, personal face-to-face interview, and self-administered paper interview. Instead of providing answers directly to interviewer questions as with traditional methods, audio computer-assisted self-interview respondents read the questions displayed on a laptop screen, while listening to the questions through audio headphones, then entered responses using a laptop keyboard. A MySQL database was used for data management, and the SPSS statistical package version 18 was used for data analysis with bivariate and multivariate statistical techniques. Rates of high-risk behaviors and mean values of continuous variables were compared across the three data collection methods. Audio computer-assisted self interview showed advantages over the comparison techniques, achieving lower refusal rates and eliciting higher reported prevalence of some sensitive and risk behaviors (perhaps an indication of more truthful answers). Premarital sex was reported by 20.4% in the audio computer-assisted self-interview group, versus 11.4% in the face-to-face group and 11.1% in the self-administered paper questionnaire group. The pattern was consistent for both male and female respondents and in both urban and rural settings. Men in the audio computer-assisted self-interview group also reported higher levels of high-risk sexual behavior, such as sex with sex workers and a higher average number of sexual partners, than did women in the same group. Importantly, item refusal rates on sensitive topics tended to be lower with audio computer-assisted self interview than with the other two methods. Combined with existing data from other countries and previous studies in Vietnam, these findings suggest that researchers should consider using audio computer-assisted self interview for future studies of sensitive and stigmatized topics, especially for men.
NASA Astrophysics Data System (ADS)
Dukes, Michael Dickey
The objective of this research is to compare problem-based learning and lecture as methods to teach whole-systems design to engineering students. A case study, Appendix A, exemplifying successful whole-systems design was developed and written by the author in partnership with the Rocky Mountain Institute. Concepts to be tested were then determined, and a questionnaire was developed to test students' preconceptions. A control group of students was taught using traditional lecture methods, and an experimental group of students was taught using problem-based learning methods. After several weeks, the students were given the same questionnaire as prior to the instruction, and the data were analyzed to determine whether the teaching methods were effective in correcting misconceptions. A statistically significant change in the students' preconceptions was observed in both groups on the topic of cost related to the design process. There was no statistically significant change in the students' preconceptions concerning the design process, technical ability within five years, and the possibility of drastic efficiency gains with current technologies. However, the results were inconclusive in determining whether problem-based learning is more effective than lecture as a method for teaching the concept of whole-systems design, or vice versa.
Current application of chemometrics in traditional Chinese herbal medicine research.
Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke
2016-07-15
Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometrics provides useful tools for the quality control of TCHMs, harnessing mathematics, statistics and other methods to extract the maximum information from data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. In this work, the application of chemometric techniques in the classification of TCHMs based on their efficacy and usage is introduced, and recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.
Smith, Aaron Douglas; Lockman, Nur Ain; Holtzapple, Mark T
2011-06-01
Nutrients are essential for microbial growth and metabolism in mixed-culture acid fermentations. Understanding the influence of nutrient feeding strategies on fermentation performance is necessary for optimization. For a four-bottle fermentation train, five nutrient contacting patterns (single-point nutrient addition to fermentors F1, F2, F3, and F4 and multi-point parallel addition) were investigated. Compared to the traditional nutrient contacting method (all nutrients fed to F1), the near-optimal feeding strategies improved exit yield, culture yield, process yield, exit acetate-equivalent yield, conversion, and total acid productivity by approximately 31%, 39%, 46%, 31%, 100%, and 19%, respectively. There was no statistical improvement in total acid concentration. The traditional nutrient feeding strategy had the highest selectivity and acetate-equivalent selectivity. Total acid productivity depends on carbon-nitrogen ratio.
Traditional Practices of Mothers in the Postpartum Period: Evidence from Turkey.
Altuntuğ, Kamile; Anık, Yeşim; Ege, Emel
2018-03-01
In various cultures, the postpartum period is a sensitive time and various traditional practices are applied to protect the health of the mother and the baby. The aim of this study was to determine traditional practices of mother care in the postpartum period in Konya City, Turkey. The research was a descriptive, cross-sectional study carried out among 291 women in the first 8 weeks of the postpartum period who visited family health centers from June 1 to December 1, 2015. The data were collected using questionnaires. Statistical analysis of the data was done with SPSS version 22.0, and descriptive statistics were used to analyze the data. Based on the results, 84.5% of women applied a traditional mother care practice during the postpartum period. The most popular were practices for increasing breast milk (97.9%), preventing incubus "albasması" (81.8%), getting rid of incubus (74.9%), and preventing postpartum bleeding (14.1%). The findings of the study show that traditional practices for mother care in the period after birth are common. In order to provide better health services, it is important for health professionals to understand the traditional beliefs and practices of the individuals, families, and society that they serve.
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
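The link between loss functions and Bayes estimators noted above is easy to demonstrate numerically: each choice of loss turns the same posterior into a different point estimate. A Python sketch with a hypothetical posterior sample follows; the 3:1 cost ratio is an arbitrary illustrative choice.

```python
import numpy as np

# Posterior draws for an unknown parameter (hypothetical skewed posterior)
rng = np.random.default_rng(2)
posterior = rng.gamma(shape=2.0, scale=1.5, size=100_000)

# The Bayes estimator minimizes posterior expected loss:
#   squared-error loss  -> posterior mean
#   absolute-error loss -> posterior median
print("estimate under squared-error loss:", posterior.mean())
print("estimate under absolute-error loss:", np.median(posterior))

# Asymmetric linear loss, underestimation 3x as costly as overestimation:
# the Bayes estimator is the 3/(3+1) = 0.75 posterior quantile
print("estimate under 3:1 asymmetric loss:", np.quantile(posterior, 0.75))
```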
Immersed boundary method for Boltzmann model kinetic equations
NASA Astrophysics Data System (ADS)
Pekardan, Cem; Chigullapalli, Sruti; Sun, Lin; Alexeenko, Alina
2012-11-01
Three different immersed boundary method (IBM) formulations are presented for Boltzmann model kinetic equations such as the Bhatnagar-Gross-Krook (BGK) and ellipsoidal statistical Bhatnagar-Gross-Krook (ESBGK) model equations. A 1D unsteady IBM solution for a moving piston is compared with DSMC results, and 2D quasi-steady microscale gas damping solutions are verified by a conformal finite volume method solver. Transient analysis for a sinusoidally moving beam is also carried out for different pressure conditions (1 atm, 0.1 atm and 0.01 atm) corresponding to Kn = 0.05, 0.5 and 5. The interrelaxation method (Method 2) is shown to provide faster convergence than the traditional interpolation scheme used in continuum IBM formulations. Unsteady damping in the rarefied regime is characterized by a significant phase lag which is not captured by quasi-steady approximations.
MacKenzie, Todd A.; Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James
2014-01-01
The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259
Statistical Evaluation and Improvement of Methods for Combining Random and Harmonic Loads
NASA Technical Reports Server (NTRS)
Brown, A. M.; McGhee, D. S.
2003-01-01
Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This Technical Publication examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica to calculate the combined value corresponding to any desired percentile is then presented, along with a curve fit to this value. Another Excel macro that calculates the combination using Monte Carlo simulation is shown. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Additionally, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can substantially lower the design loading without losing any of the identified structural reliability.
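The Monte Carlo combination described here reduces to sampling the two load components and reading off a percentile of the combined distribution. A Python sketch with hypothetical amplitudes follows; the choice of the 99.87th percentile (roughly the one-sided 3-sigma level) is an assumption for illustration. On such inputs the percentile-based value falls below the traditional peak-plus-3-sigma sum, consistent with the publication's point that consistent percentiles can lower design loads.

```python
import numpy as np

def combined_load_percentile(sine_amp, random_rms, percentile=99.87, n=1_000_000):
    """Monte Carlo combination of a harmonic and a Gaussian random load.

    The harmonic phase is uniform, so the sine component follows its usual
    arcsine distribution; the chosen percentile of |sum| is the design load.
    """
    rng = np.random.default_rng(4)
    sine = sine_amp * np.sin(rng.uniform(0.0, 2.0 * np.pi, n))
    random = random_rms * rng.standard_normal(n)
    return np.percentile(np.abs(sine + random), percentile)

A, sigma = 100.0, 50.0   # hypothetical harmonic amplitude and random rms
print("traditional peak + 3 sigma:", A + 3.0 * sigma)
print("Monte Carlo 99.87th pct   :", combined_load_percentile(A, sigma))
```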
Statistical Comparison and Improvement of Methods for Combining Random and Harmonic Loads
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; McGhee, David S.
2004-01-01
Structures in many environments experience both random and harmonic excitation. A variety of closed-form techniques has been used in the aerospace industry to combine the loads resulting from the two sources. The resulting combined loads are then used to design for both yield/ultimate strength and high-cycle fatigue capability. This paper examines the cumulative distribution function (CDF) percentiles obtained using each method by integrating the joint probability density function of the sine and random components. A new Microsoft Excel spreadsheet macro that links with the software program Mathematica is then used to calculate the combined value corresponding to any desired percentile, along with a curve fit to this value. Another Excel macro is used to calculate the combination using a Monte Carlo simulation. Unlike the traditional techniques, these methods quantify the calculated load value with a consistent percentile. Using either of the presented methods can be extremely valuable in probabilistic design, which requires a statistical characterization of the loading. Also, since the CDF at high probability levels is very flat, the design value is extremely sensitive to the predetermined percentile; therefore, applying the new techniques can lower the design loading substantially without losing any of the identified structural reliability.
Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.
2011-01-01
Summary Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However, when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
Hamill, Daniel; Buscombe, Daniel; Wheaton, Joseph M
2018-01-01
Side scan sonar in low-cost 'fishfinder' systems has become popular in aquatic ecology and sedimentology for imaging submerged riverbed sediment at coverages and resolutions sufficient to relate bed texture to grain size. Traditional methods of mapping bed texture (i.e. physical samples) are relatively high-cost and low in spatial coverage compared to sonar, which can continuously image several kilometers of channel in a few hours. Towards the goal of automating the classification of bed habitat features, we investigate relationships between substrates and statistical descriptors of bed textures in side scan sonar echograms of alluvial deposits. We develop a method for automated segmentation of bed textures into two to five grain-size classes. Second-order texture statistics are used in conjunction with a Gaussian Mixture Model to classify the heterogeneous bed into small homogeneous patches of sand, gravel, and boulders with average accuracies of 80%, 49%, and 61%, respectively. Reach-averaged proportions of these sediment types were within 3% of those from similar maps derived from multibeam sonar.
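A minimal sketch of the texture-classification idea, assuming scikit-image and scikit-learn are available: second-order (gray-level co-occurrence) statistics per patch, then a Gaussian Mixture Model to cluster patches into a few texture classes. The patches below are synthetic stand-ins for echogram tiles, and the three GLCM properties used are an arbitrary illustrative choice, not the paper's feature set.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.mixture import GaussianMixture

def texture_features(patch, levels=32):
    """Second-order (GLCM) texture statistics for one image patch in [0, 1)."""
    q = (patch * (levels - 1)).astype(np.uint8)          # quantize gray levels
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy")]

# Synthetic patches with three different texture statistics (not sonar data)
rng = np.random.default_rng(5)
patches = [rng.random((16, 16)) ** k for k in (1, 2, 4) for _ in range(30)]
X = np.array([texture_features(p) for p in patches])

# Unsupervised segmentation of patches into homogeneous texture classes
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)
print(np.bincount(labels))
```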
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while allowing almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are advanced.
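A continuous wavelet transform of this kind takes only a few lines with PyWavelets. The signal below is a hypothetical stand-in (a slowly drifting oscillation plus noise, not observatory data), and the Morlet wavelet and scale range are arbitrary illustrative choices.

```python
import numpy as np
import pywt

# Hypothetical 1 Hz record: an oscillation with slowly drifting period + noise
rng = np.random.default_rng(7)
t = np.arange(3600.0)
signal = np.sin(2 * np.pi * t / (50 + 0.01 * t)) + 0.3 * rng.standard_normal(t.size)

# Continuous wavelet transform: amplitude and frequency tracked through time,
# unlike a single global Fourier spectrum
scales = np.arange(8, 256)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0)
amplitude = np.abs(coeffs)                        # time-scale amplitude map
dominant = freqs[np.argmax(amplitude, axis=0)]    # dominant frequency vs. time
print(dominant[:5])
```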
Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T
2011-11-21
We implemented the simplified Monte Carlo (SMC) method on a graphics processing unit (GPU) architecture under the compute-unified device architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied to four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to computation time and discrepancy. In the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC was 12.30-16.00 times faster than the CPU-based SMC, and its computation time per beam arrangement for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning.
Reconstruction of a Real World Social Network using the Potts Model and Loopy Belief Propagation.
Bisconti, Cristian; Corallo, Angelo; Fortunato, Laura; Gentile, Antonio A; Massafra, Andrea; Pellè, Piergiuseppe
2015-01-01
The scope of this paper is to test the adoption of a statistical model derived from condensed matter physics for the reconstruction of the structure of a social network. The inverse Potts model, traditionally applied to recursive observations of quantum states in an ensemble of particles, is here addressed to observations of members' states in an organization and their (anti)correlations, thus inferring interactions as links among the members. Adopting proper (Bethe) approximations, such an inverse problem is shown to be tractable. Within an operational framework, this network-reconstruction method is tested for a small real-world social network, the Italian parliament. In this case study, it is easy to track the statuses of parliament members, using (co)sponsorships of law proposals as the initial dataset. In previous studies of similar activity-based networks, the graph structure was inferred directly from activity co-occurrences; here we compare our statistical reconstruction with such standard methods, outlining discrepancies and advantages. PMID:26617539
NASA Astrophysics Data System (ADS)
Koparan, Timur
2016-02-01
In this study, the effect of dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test-post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes about descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the intervention for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, a significant difference in favour of the experimental group was found in terms of both achievement and attitudes; the prospective teachers had an affirmative approach to the use of dynamic software and saw it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
Zhou, Xiangrong; Xu, Rui; Hara, Takeshi; Hirano, Yasushi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Kido, Shoji; Fujita, Hiroshi
2014-07-01
The shapes of the inner organs are important information for medical image analysis. Statistical shape modeling provides a way of quantifying and measuring shape variations of the inner organs in different patients. In this study, we developed a universal scheme that can be used for building statistical shape models for different inner organs efficiently. This scheme combines traditional point distribution modeling with a group-wise optimization method based on a measure called minimum description length to provide a practical means for 3D organ shape modeling. In experiments, the proposed scheme was applied to the building of five statistical shape models for hearts, livers, spleens, and right and left kidneys by use of 50 cases of 3D torso CT images. The performance of these models was evaluated by three measures: model compactness, model generalization, and model specificity. The experimental results showed that the constructed shape models have good "compactness" and satisfactory "generalization" performance for different organ shape representations; however, the "specificity" of these models should be improved in the future.
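The point-distribution core of such a statistical shape model is a principal component analysis of corresponded landmark coordinates. The Python sketch below shows only that core on random stand-in data; it omits the landmark correspondence and the group-wise minimum-description-length optimization that the paper's scheme adds, as well as Procrustes alignment.

```python
import numpy as np

def point_distribution_model(shapes, var_kept=0.95):
    """Classic PDM: mean shape plus principal modes of landmark variation.

    shapes: (n_samples, n_landmarks * 3) matrix of corresponded landmarks.
    """
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    var = s**2 / (len(shapes) - 1)                 # variance per mode
    k = np.searchsorted(np.cumsum(var) / var.sum(), var_kept) + 1
    return mean, Vt[:k], var[:k]                   # mean, modes, mode variances

# Random stand-ins for 50 corresponded 3D organ surfaces (100 landmarks each)
rng = np.random.default_rng(6)
shapes = rng.standard_normal((50, 300)) @ np.diag(np.linspace(3, 0.1, 300))
mean, modes, var = point_distribution_model(shapes)
new_shape = mean + 2.0 * np.sqrt(var[0]) * modes[0]   # +2 SD along mode 1
print("modes kept for 95% variance:", len(var))
```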
Recent advances in statistical energy analysis
NASA Technical Reports Server (NTRS)
Heron, K. H.
1992-01-01
Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
Han, Qi; Li, Hong-Hai; Fan, Cui-Ping; Liu, Chun; Liang, Yong-Lin
2016-07-01
Nausea is distinctive in its symptoms and differs from hiccups and vomiting. The main symptom is the regular regurgitation of undigested food from the stomach: food eaten at dinner is vomited the next morning, and food eaten at breakfast is vomited at night. Nausea is common in clinic, and different physicians may use different treatment methods for it. The disease also cannot be treated efficiently with Western medicine and may recur repeatedly. In this study, the composition principles of prescriptions for nausea in past traditional Chinese medicine were analyzed and summarized by using the traditional Chinese medicine inheritance support system (V2.5), hoping to provide guidance for clinical drug use and to summarize the basic rules for the treatment of nausea. The prescriptions for nausea in "the prescription of traditional Chinese medicine dictionary" were selected, and the information was entered into the traditional Chinese medicine inheritance support system (TCMISS) to build a database. Data mining methods such as frequency statistics, association rules, and complex system entropy clustering were used to analyze and summarize the composition principles of these prescriptions. The herb frequencies of the prescriptions were determined, herbs with higher use frequencies were obtained, and the association rules between herbs were found; 19 commonly used herb pairs, 10 core combinations, and 10 newly developed prescriptions were identified. The basic pathogenesis of nausea in traditional Chinese medicine is weakness and coldness of the spleen and stomach, and Qi adverseness of the stomach. For generations, physicians' main therapeutic method for nausea has been to warm the middle and invigorate the spleen, and to lower Qi and regulate the stomach. The commonly used herbs for nausea are ginger, ginseng, largehead atractylodes, tuckahoe, and licorice, appropriately supplemented with herbs that eliminate dampness and phlegm and regulate Qi flow to harmonize the stomach. In addition, treatment should be adjusted according to the different accompanying syndromes such as phlegm, blood stasis, and yin deficiency. Copyright© by the Chinese Pharmaceutical Association.
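The frequency-statistics step of such prescription mining reduces to counting herbs and herb-pair co-occurrences across the prescription database. A Python sketch with a hypothetical handful of prescriptions follows; association-rule mining and entropy clustering would build on counts like these.

```python
from collections import Counter
from itertools import combinations

# Hypothetical prescriptions (ingredient lists), standing in for the database
prescriptions = [
    ["ginger", "ginseng", "licorice"],
    ["ginger", "tuckahoe", "licorice"],
    ["ginger", "ginseng", "largehead atractylodes", "licorice"],
    ["tuckahoe", "ginseng", "licorice"],
]

# Frequency statistics: single-herb counts and herb-pair co-occurrences
herb_counts = Counter(h for p in prescriptions for h in p)
pair_counts = Counter(frozenset(pair)
                      for p in prescriptions
                      for pair in combinations(sorted(p), 2))

print(herb_counts.most_common(3))   # most frequent herbs
print(pair_counts.most_common(3))   # most frequent herb pairs
```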
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
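A kriging metamodel of a deterministic code, the case the paper's cautions apply to, can be sketched with scikit-learn's Gaussian process regressor. The "analysis code" here is a stand-in function, not any real solver; the near-zero noise term reflects that a deterministic code returns the same output for the same input, which is exactly why classical error-based statistics need care in this setting.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_analysis(x):
    """Stand-in for a deterministic, expensive analysis code (hypothetical)."""
    return np.sin(3.0 * x) + 0.5 * x**2

# Small design of experiments, then a kriging metamodel of the code
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_analysis(X_train).ravel()

# alpha ~ 0: noise-free interpolation, appropriate for deterministic outputs
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-10)
gp.fit(X_train, y_train)

X_new = np.array([[0.37], [1.42]])
y_hat, y_std = gp.predict(X_new, return_std=True)
print(y_hat, y_std)   # cheap surrogate predictions with model uncertainty
```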
The Interaction of TXNIP and AF1q Genes Increases the Susceptibility to Schizophrenia.
Su, Yousong; Ding, Wenhua; Xing, Mengjuan; Qi, Dake; Li, Zezhi; Cui, Donghong
2017-08-01
Although previous studies showed a reduced risk of cancer in patients with schizophrenia, whether patients with schizophrenia possess genetic factors that also contribute to tumor suppression is still unknown. In the present study, based on our previous microarray data, we focused on the tumor suppressor genes TXNIP and AF1q, which are differentially expressed in patients with schizophrenia. A total of 413 patients and 578 healthy controls were recruited. We found no significant differences in genotype, allele, or haplotype frequencies at the five selected single nucleotide polymorphisms (SNPs) (rs2236566 and rs7211 in the TXNIP gene; rs10749659, rs2140709, and rs3738481 in the AF1q gene) between patients with schizophrenia and controls. However, we found an association between the interaction of TXNIP and AF1q and schizophrenia by using the multifactor dimensionality reduction (MDR) method followed by traditional statistical analysis. The best gene-gene interaction model identified was a three-locus model, TXNIP (rs2236566, rs7211)-AF1q (rs2140709). After traditional statistical analysis, we found that the high-risk genotype combination was rs2236566 (GG)-rs7211 (CC)-rs2140709 (CC) (OR = 1.35 [1.03-1.76]) and the low-risk genotype combination was rs2236566 (GT)-rs7211 (CC)-rs2140709 (CC) (OR = 0.67 [0.49-0.91]). Our findings suggest a statistically significant role of the interaction of TXNIP and AF1q polymorphisms (TXNIP-rs2236566, TXNIP-rs7211, and AF1q-rs2769605) in schizophrenia susceptibility.
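The odds ratios quoted for the genotype combinations come from standard 2x2-table calculations. A Python sketch with hypothetical cell counts (the paper's counts are not given in the abstract), using the usual Woolf logit confidence interval:

```python
import numpy as np

def odds_ratio(case_with, case_without, control_with, control_without):
    """Odds ratio and Woolf 95% CI for carrying a genotype combination."""
    or_ = (case_with * control_without) / (case_without * control_with)
    se_log = np.sqrt(1 / case_with + 1 / case_without +
                     1 / control_with + 1 / control_without)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se_log)
    return or_, lo, hi

# Hypothetical counts: carriers/non-carriers among cases and controls
print(odds_ratio(120, 293, 130, 448))
```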
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Background Machine learning methods may complement traditional analytic methods for medical device surveillance. Methods and results Using data from the National Cardiovascular Data Registry for implantable cardioverter–defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%–20.9%; nonfatal ICD-related adverse events, 19.3%–26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%–37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, k=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, k=−0.028), and between the DELTA and embedded feature selection approaches was 88.1% (349 of 396, k=−0.042). Conclusion Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance. PMID:28860874
Efficient Statistically Accurate Algorithms for the Fokker-Planck Equation in Large Dimensions
NASA Astrophysics Data System (ADS)
Chen, N.; Majda, A.
2017-12-01
Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat tailed highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method, which is based on an effective data assimilation framework, provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace. Therefore, it is computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from the traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has a significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method only requires an order of O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
Antibacterial Effect of Diode Laser in Pulpectomy of Primary Teeth.
Bahrololoomi, Zahra; Fekrazad, Reza; Zamaninejad, Shiva
2017-01-01
Introduction: Laser irradiation has been suggested as an adjunct to traditional methods of canal preparation, but few studies are available on the antibacterial effect of the diode laser in pulpectomy of primary teeth. The purpose of the present study was to investigate the antibacterial effect of the diode laser in pulpectomy of primary teeth, and to define optimal and harmless diode lasing conditions in the root canal. Methods: A total of 125 single-rooted primary teeth were selected. After traditional canal cleaning, they were divided into 2 groups. Sixty-five specimens, after culturing of Enterococcus faecalis into the canals, were divided into 3 groups: (1) traditional canal cleaning with 0.5% NaOCl irrigation, (2) the method of group 1 + 1.5 W diode laser (980 nm, pulsed), (3) without treatment (5 specimens). The specimens were then cultured and, after colony counting under a light microscope, were statistically analyzed with the Kruskal-Wallis and Mann-Whitney tests. For 60 specimens, the temperature rise of the apical and cervical parts of the external root surface was measured using 2 type-K thermocouples while radiating a 1.5 W diode laser into the canal. Results: In the first experiment, the diode laser group showed the greatest reduction in bacterial count. In the second experiment, the mean temperature rise of the external root surface was less than the threshold of periodontal ligament (PDL) damage. Conclusion: A diode laser with a power output of 1.5 W is effective in reducing the E. faecalis bacterial count without damaging periodontal structures.
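The reported nonparametric tests map directly onto scipy; a minimal sketch with hypothetical colony counts standing in for the study's data:

```python
from scipy import stats

# Hypothetical colony counts per treatment group (illustrative only)
naocl = [120, 98, 150, 110, 134]          # irrigation only
naocl_laser = [12, 8, 20, 15, 9]          # irrigation + 1.5 W diode laser
untreated = [900, 850, 1020, 980, 890]    # no treatment

h, p = stats.kruskal(naocl, naocl_laser, untreated)
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.4f}")

# Pairwise follow-up with the Mann-Whitney U test
u, p_pair = stats.mannwhitneyu(naocl, naocl_laser, alternative="two-sided")
print(f"Mann-Whitney (NaOCl vs NaOCl+laser): U = {u:.1f}, p = {p_pair:.4f}")
```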
Mann, Caroline E; Himelein, Melissa J
2008-07-01
This research aims to compare the effectiveness of two methods of teaching psychopathology in reducing stigma toward mental illness. Based on previous stigma research, a first-person, narrative approach was contrasted with traditional, diagnosis-centered education. STUDY 1: Participants consisted of 53 undergraduates at a small, public university enrolled in two introductory psychology classes. During six hours of class time focused on psychopathology, one class received the experimental pedagogy while the other served as a control, receiving traditional instruction. Stigma was assessed pre- and post-intervention using a social distance scale and vignette design. Statistical analyses compared means and change scores between the two classes. Students in the experimental classroom showed a significant decrease in stigma following the intervention, whereas those in the control group showed no change. STUDY 2: A follow-up study was conducted to replicate the promising effects demonstrated in Study 1. Two additional classrooms (n = 48) were both exposed to the first-person, narrative pedagogy, and their stigma was monitored pre- and post-intervention. Students reported a significant decrease in stigma following the intervention. Together, these studies suggest that traditional methods of teaching psychopathology do not lessen mental illness stigma, a serious concern that can potentially be remedied by incorporating more person-centered instructional methods. Results are discussed in terms of their implications for the way psychopathology is taught throughout the mental health field, as well as the practical application of stigma interventions woven into the curriculum.
Direct Statistical Simulation of Astrophysical and Geophysical Flows
NASA Astrophysics Data System (ADS)
Marston, B.; Tobias, S.
2011-12-01
Astrophysical and geophysical flows are amenable to direct statistical simulation (DSS), the calculation of statistical properties that does not rely upon accumulation by direct numerical simulation (DNS) (Tobias and Marston, 2011). Anisotropic and inhomogeneous flows, such as those found in the atmospheres of planets, in rotating stars, and in disks, provide the starting point for an expansion in fluctuations about the mean flow, leading to a hierarchy of equations of motion for the equal-time cumulants. The method is described for a general set of evolution equations, and then illustrated for two specific cases: (i) a barotropic jet on a rotating sphere (Marston, Conover, and Schneider, 2008); and (ii) a model of a stellar tachocline driven by relaxation to an underlying flow with shear (Cally, 2001), for which a joint instability arises from the combination of shearing forces and magnetic stress. The reliability of DSS is assessed by comparing statistics so obtained against those accumulated from DNS, the traditional approach. The simplest non-trivial closure, CE2, sets the third and higher cumulants to zero yet yields qualitatively accurate low-order statistics for both systems. Physically, CE2 retains only the eddy-mean flow interaction and drops the eddy-eddy interaction. Quantitatively accurate zonal means are found for the barotropic jet for long and short (but not intermediate) relaxation times, and for the Cally problem in the case of strong shearing and large magnetic fields. Deficiencies in CE2 can be repaired at the CE3 level, that is, by retaining the third cumulant (Marston, 2011). We conclude by discussing possible extensions of the method, both in terms of computational methods and the range of astrophysical and geophysical problems that are of interest.
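The contrast between DSS and DNS can be made concrete with a linear stochastic toy, for which the second-cumulant equation closes exactly (the situation CE2 approximates). This is an illustrative analogue with a synthetic system, not the barotropic-jet problem:

```python
import numpy as np

# For dx = A x dt + F dW the equal-time second cumulant C obeys a closed
# (Lyapunov) equation, dC/dt = A C + C A^T + F F^T, which DSS integrates
# directly: no trajectories, no statistical accumulation.
A = np.array([[-1.0, 2.0], [0.0, -0.5]])
F = np.array([[0.5, 0.0], [0.0, 0.3]])
Q = F @ F.T
dt, nsteps = 2e-3, 10000

C = np.zeros((2, 2))
for _ in range(nsteps):                    # DSS: evolve the statistics
    C = C + dt * (A @ C + C @ A.T + Q)

rng = np.random.default_rng(1)             # DNS: accumulate from an ensemble
x = np.zeros((20000, 2))
for _ in range(nsteps):
    x += dt * x @ A.T + np.sqrt(dt) * rng.standard_normal(x.shape) @ F.T

print("DSS covariance:\n", C)
print("DNS covariance:\n", np.cov(x, rowvar=False))
```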
Liu, Yu-Qi; Liu, Meng-Yu; Li, Chun; Shi, Nan-Nan; Wang, Yue-Xi; Wang, Li-Ying; Zhao, Xue-Yao; Kou, Shuang; Han, Xue-Jie; Wang, Yan-Ping
2017-09-01
This study assesses the Guidelines for Diagnosis and Treatment of Common Diseases of Otolaryngology in Traditional Chinese Medicine in clinical application and provides evidence for further guideline revision. The assessment was divided into an applicability assessment and a practicability assessment. The applicability assessment was based on a questionnaire survey in which traditional Chinese medicine (TCM) practitioners were asked to independently fill in the Questionnaire for Applicability Assessment on the Guidelines for Diagnosis and Treatment in Traditional Chinese Medicine. The practicability assessment was based on a prospective case investigation and analysis method in which the TCM practitioners-in-charge filled in the Case Investigation Questionnaire for Practicability Assessment on the Guidelines for Diagnosis and Treatment in Traditional Chinese Medicine. The data were analyzed with descriptive statistics. In total, 151 questionnaires were collected for the applicability assessment and 1,016 patients were included for the practicability assessment. The results showed that 88.74% of the respondents were familiar with the guidelines and 45.70% used them. Guideline quality and related items were rated similarly in the applicability and practicability assessments, scoring above 85.00% on all items except "recuperating and prevention". The results suggest that the quality of the Guidelines for Diagnosis and Treatment of Common Diseases of Otolaryngology in Traditional Chinese Medicine is high and that the guidelines can effectively guide clinical practice. The "recuperating and prevention" part should be improved and evidence data should be included in future guideline revisions, so that the clinical utilization rate can be increased. Copyright© by the Chinese Pharmaceutical Association.
Laparoscopic skills acquisition: a study of simulation and traditional training.
Marlow, Nicholas; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy J
2014-12-01
Training in basic laparoscopic skills can be undertaken using traditional methods, where trainees are educated by experienced surgeons through a process of graduated responsibility, or by simulation-based training. This study aimed to assess whether simulation-trained individuals reach the same level of proficiency in basic laparoscopic skills as traditionally trained participants when assessed in a simulated environment. A prospective study was undertaken. Participants were allocated to one of two cohorts according to surgical experience. Participants from the inexperienced cohort were randomized to receive training in basic laparoscopic skills on either a box trainer or a virtual reality simulator. They were then assessed on the simulator on which they did not receive training. Participants from the experienced cohort, considered to have received traditional training in basic laparoscopic skills, did not receive simulation training and were randomized to either the box trainer or the virtual reality simulator for skills assessment. The assessment scores of the two cohorts on either simulator were then compared. A total of 138 participants completed the assessment session: 101 in the inexperienced, simulation-trained cohort and 37 in the experienced, traditionally trained cohort. There was no statistically significant difference between the training outcomes of simulation- and traditionally trained participants, irrespective of the simulator type used. The results demonstrated that participants trained on either a box trainer or a virtual reality simulator achieved a level of basic laparoscopic skills, assessed in a simulated environment, that was not significantly different from that of participants who had been traditionally trained in basic laparoscopic skills. © 2013 Royal Australasian College of Surgeons.
Raffelt, David A.; Smith, Robert E.; Ridgway, Gerard R.; Tournier, J-Donald; Vaughan, David N.; Rose, Stephen; Henderson, Robert; Connelly, Alan
2015-01-01
In brain regions containing crossing fibre bundles, voxel-average diffusion MRI measures such as fractional anisotropy (FA) are difficult to interpret, and lack within-voxel single fibre population specificity. Recent work has focused on the development of more interpretable quantitative measures that can be associated with a specific fibre population within a voxel containing crossing fibres (herein we use fixel to refer to a specific fibre population within a single voxel). Unfortunately, traditional 3D methods for smoothing and cluster-based statistical inference cannot be used for voxel-based analysis of these measures, since the local neighbourhood for smoothing and cluster formation can be ambiguous when adjacent voxels may have different numbers of fixels, or ill-defined when they belong to different tracts. Here we introduce a novel statistical method to perform whole-brain fixel-based analysis called connectivity-based fixel enhancement (CFE). CFE uses probabilistic tractography to identify structurally connected fixels that are likely to share underlying anatomy and pathology. Probabilistic connectivity information is then used for tract-specific smoothing (prior to the statistical analysis) and enhancement of the statistical map (using a threshold-free cluster enhancement-like approach). To investigate the characteristics of the CFE method, we assessed sensitivity and specificity using a large number of combinations of CFE enhancement parameters and smoothing extents, using simulated pathology generated with a range of test-statistic signal-to-noise ratios in five different white matter regions (chosen to cover a broad range of fibre bundle features). The results suggest that CFE input parameters are relatively insensitive to the characteristics of the simulated pathology. We therefore recommend a single set of CFE parameters that should give near optimal results in future studies where the group effect is unknown. We then demonstrate the proposed method by comparing apparent fibre density between motor neurone disease (MND) patients with control subjects. The MND results illustrate the benefit of fixel-specific statistical inference in white matter regions that contain crossing fibres. PMID:26004503
Mayrink, Gabriela; Sawazaki, Renato; Asprino, Luciana; de Moraes, Márcio; Fernandes Moreira, Roger William
2011-11-01
Compare the traditional method of mounting dental casts on a semiadjustable articulator and the new method suggested by Wolford and Galiano, analyzing the inclination of the maxillary occlusal plane in relation to the FHP. Two casts of 10 patients were obtained. One was used for mounting the models on a traditional articulator using a face-bow transfer system, and the other was used to mount the models on an Occlusal Plane Indicator platform (OPI) using the SAM articulator. An analysis of the accuracy of the model mounting was then performed: the angle made by the occlusal plane and the FHP on the cephalogram should equal the angle between the occlusal plane and the upper member of the articulator. The measures were tabulated in Microsoft Excel(®) and analyzed using one-way analysis of variance. Statistically, the results did not reveal significant differences among the measures. The OPI and the face bow give similar results, but more studies are needed to verify the accuracy of the maxillary cant in the OPI or to develop new techniques able to resolve the disadvantages of each technique. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Chen, Qian-Qian; Liu, Xiao-Dong; Liu, Wen-Qi; Jiang, Shan
2011-10-01
Compared with traditional chemical analysis methods, reflectance spectroscopy has the advantages of speed, minimal or no sample preparation, non-destructiveness, and low cost. In order to explore the potential application of spectroscopy technology in palaeolimnological studies of Antarctic lakes, we took a lake sediment core from Mochou Lake near Zhongshan Station, Antarctica, and analyzed the near-infrared reflectance spectroscopy (NIRS) data of the sedimentary samples. The results showed that the factor loadings of principal component analysis (PCA) displayed a depth-profile change pattern very similar to that of the S2 index, a reliable proxy for changes in historical lake primary productivity. Correlation analysis showed that the PCA factor loadings and the S2 values were significantly correlated, suggesting that it is feasible to infer palaeoproductivity changes recorded in Antarctic lakes using NIRS technology. Compared to the traditional method based on the trough area between 650 and 700 nm, the PCA statistical approach was more accurate for reconstructing changes in historical lake primary productivity. The results reported here demonstrate that reflectance spectroscopy can provide a rapid method for the reconstruction of lake palaeoenvironmental change in remote Antarctic regions.
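The spectra-to-proxy workflow can be sketched with scikit-learn; the spectra and the S2-like productivity proxy below are simulated, not the Mochou Lake core data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 300
productivity = np.cumsum(rng.standard_normal(n_samples))   # down-core S2-like trend
basis = np.sin(np.linspace(0, 3 * np.pi, n_wavelengths))   # one absorption feature
spectra = productivity[:, None] * basis \
          + 0.3 * rng.standard_normal((n_samples, n_wavelengths))

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)        # per-sample PCA factor loadings

# Correlate the first principal component with the productivity proxy
r = np.corrcoef(scores[:, 0], productivity)[0, 1]
print(f"PC1 explains {pca.explained_variance_ratio_[0]:.1%}; r(PC1, S2) = {r:.2f}")
```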
Computing the nucleon charge and axial radii directly at Q2=0 in lattice QCD
NASA Astrophysics Data System (ADS)
Hasan, Nesreen; Green, Jeremy; Meinel, Stefan; Engelhardt, Michael; Krieg, Stefan; Negele, John; Pochinsky, Andrew; Syritsyn, Sergey
2018-02-01
We describe a procedure for extracting momentum derivatives of nucleon matrix elements on the lattice directly at Q2=0 . This is based on the Rome method for computing momentum derivatives of quark propagators. We apply this procedure to extract the nucleon isovector magnetic moment and charge radius as well as the isovector induced pseudoscalar form factor at Q2=0 and the axial radius. For comparison, we also determine these quantities with the traditional approach of computing the corresponding form factors, i.e. GEv(Q2) and GMv(Q2) for the case of the vector current and GPv(Q2) and GAv(Q2) for the axial current, at multiple Q2 values followed by z -expansion fits. We perform our calculations at the physical pion mass using a 2HEX-smeared Wilson-clover action. To control the effects of excited-state contamination, the calculations were done at three source-sink separations and the summation method was used. The derivative method produces results consistent with those from the traditional approach but with larger statistical uncertainties especially for the isovector charge and axial radii.
Poon, Woei Bing; Tagamolila, Vina; Toh, Ying Pin Anne; Cheng, Zai Ru
2015-01-01
INTRODUCTION Various meta-analyses have shown that e-learning is as effective as traditional methods of continuing professional education. However, there are some disadvantages to e-learning, such as possible technical problems, the need for greater self-discipline, cost involved in developing programmes and limited direct interaction. Currently, most strategies for teaching amplitude-integrated electroencephalography (aEEG) in neonatal intensive care units (NICUs) worldwide depend on traditional teaching methods. METHODS We implemented a programme that utilised an integrated approach to e-learning. The programme consisted of three sessions of supervised protected time e-learning in an NICU. The objective and subjective effectiveness of the approach was assessed through surveys administered to participants before and after the programme. RESULTS A total of 37 NICU staff (32 nurses and 5 doctors) participated in the study. 93.1% of the participants appreciated the need to acquire knowledge of aEEG. We also saw a statistically significant improvement in the subjective knowledge score (p = 0.041) of the participants. The passing rates for identifying abnormal aEEG tracings (defined as ≥ 3 correct answers out of 5) also showed a statistically significant improvement (from 13.6% to 81.8%, p < 0.001). Among the participants who completed the survey, 96.0% felt the teaching was well structured, 77.8% felt the duration was optimal, 80.0% felt that they had learnt how to systematically interpret aEEGs, and 70.4% felt that they could interpret normal aEEG with confidence. CONCLUSION An integrated approach to e-learning can help improve subjective and objective knowledge of aEEG. PMID:25820847
NASA Astrophysics Data System (ADS)
Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.
2016-03-01
Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity of spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and the best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.
Toti, Giulia; Vilalta, Ricardo; Lindner, Peggy; Lefer, Barry; Macias, Charles; Price, Daniel
2016-11-01
Traditional studies on the effects of outdoor pollution on asthma have been criticized for questionable statistical validity and inefficacy in exploring the effects of multiple air pollutants, alone and in combination. Association rule mining (ARM), a method that is easily interpretable and suitable for the analysis of the effects of multiple exposures, could be of use, but the traditional interest metrics of support and confidence need to be substituted with metrics that focus on risk variations caused by different exposures. We present an ARM-based methodology that produces rules associated with relevant odds ratios and limits the number of final rules even at very low support levels (0.5%), thanks to post-pruning criteria that limit rule redundancy and control for statistical significance. The methodology has been applied to a case-crossover study to explore the effects of multiple air pollutants on the risk of asthma in pediatric subjects. We identified 27 rules with interesting odds ratios among the more than 10,000 having the required support. The only rule including a single chemical is exposure to ozone on the day preceding the reported asthma attack (OR=1.14). The 26 combinatory rules highlight the limitations of air quality policies based on single-pollutant thresholds and suggest that exposure to mixtures of chemicals is more harmful, with odds ratios as high as 1.54 (associated with the combination day0 SO2, day0 NO, day0 NO2, day1 PM). The proposed method can be used to analyze risk variations caused by single and multiple exposures. The method is reliable and requires fewer assumptions on the data than parametric approaches. Rules including more than one pollutant highlight interactions that deserve further investigation, while helping to limit the search field. Copyright © 2016 Elsevier B.V. All rights reserved.
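A compact sketch of ARM re-scored with odds ratios, using synthetic binary exposures and outcomes; Fisher's exact test stands in for the significance-based post-pruning (the authors' exact pruning criteria may differ):

```python
from itertools import combinations
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
n = 5000
exposures = {"day0_O3": rng.random(n) < 0.3,
             "day0_SO2": rng.random(n) < 0.2,
             "day1_PM": rng.random(n) < 0.25}
# Planted interaction: SO2 together with next-day PM raises risk
risk = 0.05 + 0.03 * (exposures["day0_SO2"] & exposures["day1_PM"])
case = rng.random(n) < risk

min_support = 0.005                       # the paper's very low support level
for k in (1, 2):
    for items in combinations(exposures, k):
        mask = np.logical_and.reduce([exposures[i] for i in items])
        if mask.mean() < min_support:     # support-based pre-filter
            continue
        table = [[int((mask & case).sum()), int((mask & ~case).sum())],
                 [int((~mask & case).sum()), int((~mask & ~case).sum())]]
        odds_ratio, p = fisher_exact(table)
        if p < 0.05 and odds_ratio > 1:   # post-pruning on significance
            print(items, f"OR = {odds_ratio:.2f}, p = {p:.3g}")
```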
A Commercial IOTV Cleaning Study
2010-04-12
[Report excerpt; only table-of-contents fragments were recovered: equipment depreciation cost was calculated from the manufacturer's list price without taking possible volume discounts into consideration; shrinkage statistical data are tabulated for traditional wet laundering (with and without prewash spot cleaning), computer-controlled wet cleaning (without prewash spot cleaning), and liquid CO2 cleaning.]
ERIC Educational Resources Information Center
Owens, Susan T.
2017-01-01
Technology is becoming an integral tool in the classroom and can make a positive impact on how students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…
ERIC Educational Resources Information Center
Hldreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade
2018-01-01
This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…
Dipnall, Joanna F.
2016-01-01
Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Survey (2009–2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. Results After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin with the Mexican American/Hispanic group (p = 0.016), and with current smokers (p<0.001). Conclusion The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future hypothesis generation: red cell distribution width, serum glucose and total bilirubin. PMID:26848571
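The screen-then-model pattern at the core of the hybrid methodology is straightforward to sketch; multiple imputation and survey weights are omitted here, and the data are synthetic:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

# Step 1: boosted trees screen many candidate biomarkers.
# Step 2: a traditional logistic model is fit on the retained few.
rng = np.random.default_rng(0)
n, p = 2000, 67
X = rng.standard_normal((n, p))
logit = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.4 * X[:, 2]   # three true signals
y = rng.random(n) < 1 / (1 + np.exp(-logit))

booster = GradientBoostingClassifier(n_estimators=300, max_depth=2).fit(X, y)
keep = np.argsort(booster.feature_importances_)[::-1][:3]   # screening step

final = LogisticRegression().fit(X[:, keep], y)
for j, beta in zip(keep, final.coef_[0]):
    print(f"biomarker {j}: OR = {np.exp(beta):.2f}")
```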
a Data Field Method for Urban Remotely Sensed Imagery Classification Considering Spatial Correlation
NASA Astrophysics Data System (ADS)
Zhang, Y.; Qin, K.; Zeng, C.; Zhang, E. B.; Yue, M. X.; Tong, X.
2016-06-01
Spatial correlation between pixels is important information for remotely sensed imagery classification. The data field method and spatial autocorrelation statistics have been utilized to describe and model the spatial information of local pixels. The original data field method can represent the spatial interactions of neighbourhood pixels effectively. However, its focus on measuring the grey-level change between the central pixel and the neighbourhood pixels results in exaggerating the contribution of the central pixel to the whole local window. Besides, Geary's C has also been proven to characterise and quantify well the spatial correlation between each pixel and its neighbourhood pixels, but the extracted object is badly delineated, with a distracting salt-and-pepper effect of isolated misclassified pixels. To correct this defect, we introduce the data field method for filtering and noise limitation. Moreover, the original data field method is enhanced by considering each pixel in the window as the central pixel to compute statistical characteristics between it and its neighbourhood pixels. The last step employs a support vector machine (SVM) for the classification of multiple features (e.g. the spectral feature and the spatial correlation feature). In order to validate the effectiveness of the developed method, experiments were conducted on different remotely sensed images containing multiple complex object classes. The results show that the developed method outperforms the traditional method in terms of classification accuracy.
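One plausible local form of Geary's C (not necessarily the paper's exact formulation) can be computed per pixel over a 3x3 neighbourhood and stacked with spectral bands as the spatial-correlation feature for the SVM. A minimal sketch on a synthetic single-band image:

```python
import numpy as np

def local_geary(img):
    """Local Geary-style statistic over a 3x3 neighbourhood per pixel."""
    img = img.astype(float)
    padded = np.pad(img, 1, mode="edge")
    sq_diffs = np.zeros_like(img)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            neighbour = padded[1 + di : 1 + di + img.shape[0],
                               1 + dj : 1 + dj + img.shape[1]]
            sq_diffs += (img - neighbour) ** 2
    # Normalise by the global variance so values are comparable across images
    return sq_diffs / (8 * 2 * img.var() + 1e-12)

demo = np.random.default_rng(0).integers(0, 255, size=(64, 64))
g = local_geary(demo)
print("local Geary's C range:", g.min(), g.max())
```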
Olokundun, Maxwell; Moses, Chinonye Love; Iyiola, Oluwole; Ibidunni, Stephen; Ogbari, Mercy; Peter, Fred; Borishade, Taiye
2018-08-01
Traditional methods of teaching entrepreneurship in universities involve more theoretical approaches, which are less effective in motivating consideration of an entrepreneurship career, because such techniques essentially leave students with a dormant attitude rather than active participation. Expert views suggest that experiential entrepreneurship teaching methods in universities, which involve practical activities and active participation, can be considered salient to students' development of entrepreneurial interest and business startup potential. This study presents data on the extent to which experiential teaching methods in entrepreneurship adopted by Nigerian universities stimulate students' entrepreneurial interest and business startups. Data were gathered through a descriptive cross-sectional quantitative survey conducted among university students (N = 600) of four selected institutions in Nigeria offering a degree programme in entrepreneurship. Hierarchical multiple regression analysis was used to confirm the hypothesis proposed in the study, using the Statistical Package for the Social Sciences (SPSS) version 22. The findings from the analysis showed that the adoption of experiential practical activities, considered best practices in entrepreneurship teaching in Nigerian universities, can stimulate students' interest in and drive for engaging in business start-up activities even as undergraduates. The field data set is made extensively available to allow for critical investigation.
Halloran, L
1995-01-01
Computers are increasingly being integrated into nursing education. One method of integration is computer-managed instruction (CMI). Recently, technology has become available that allows the integration of keypad questions into CMI, bringing a new type of interactivity between students and teachers into the classroom. The purpose of this study was to evaluate differences in achievement between a control group taught by traditional classroom lecture (TCL) and an experimental group taught using CMI and keypad questions. Both the control and experimental groups consisted of convenience samples of junior nursing students in a baccalaureate program taking a medical/surgical nursing course. Achievement was measured by three instructor-developed multiple-choice examinations. Findings demonstrated that although the experimental group showed increasingly higher test scores as the semester progressed, no statistical difference in achievement was found between the two groups. One reason for this may be the phenomenon of 'vampire video': initially, the method of presentation overshadowed the content; as students became desensitized to the method, they were able to focus on and absorb more content. This study suggests that CMI and keypads are a viable teaching option for nursing education, equal to TCL in student achievement while providing a new level of interaction in the classroom setting.
NASA Astrophysics Data System (ADS)
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
2018-04-01
In this paper, a statistical forecast model using a time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). The method decomposes the rainfall over the MLYRV into three time-scale components: the interannual component with periods of less than 8 years, the interdecadal component with periods of 8 to 30 years, and the multidecadal component with periods longer than 30 years. Predictors are then selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict each of the three time-scale components of the FPR. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. A hindcast of FPR over the 14 years from 2001 to 2014 shows that the FPR was predicted successfully in 11 of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
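The decomposition step can be imitated with simple moving-average low-pass filters; the annual rainfall series and the predictor below are synthetic, and the moving-average filter is an assumption of this sketch (the abstract does not specify the paper's exact filter):

```python
import numpy as np

def moving_avg(x, window):
    """Crude low-pass filter: centred moving average."""
    return np.convolve(x, np.ones(window) / window, mode="same")

rng = np.random.default_rng(0)
years = np.arange(1951, 2015)
rainfall = 100 + 10 * np.sin(2 * np.pi * years / 4) \
             + 8 * np.sin(2 * np.pi * years / 15) \
             + 5 * rng.standard_normal(years.size)

low8 = moving_avg(rainfall, 8)      # keeps periods longer than ~8 yr
low30 = moving_avg(rainfall, 30)    # keeps periods longer than ~30 yr
interannual = rainfall - low8
interdecadal = low8 - low30
multidecadal = low30

# Each component then gets its own multiple linear regression on selected
# predictors; a trivial one-predictor fit is shown for the interannual part.
predictor = np.sin(2 * np.pi * years / 4) + 0.1 * rng.standard_normal(years.size)
slope, intercept = np.polyfit(predictor, interannual, 1)
print("interannual regression slope:", round(slope, 2))
```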
Statistical alignment: computational properties, homology testing and goodness-of-fit.
Hein, J; Wiuf, C; Knudsen, B; Møller, M B; Wibling, G
2000-09-08
The model of insertions and deletions in biological sequences, first formulated by Thorne, Kishino, and Felsenstein in 1991 (the TKF91 model), provides a basis for performing alignment within a statistical framework. Here we investigate this model. Firstly, we show how to accelerate the statistical alignment algorithms by several orders of magnitude. The main innovations are to confine likelihood calculations to a band close to the similarity-based alignment, to get good initial guesses of the evolutionary parameters, and to apply an efficient numerical optimisation algorithm for finding the maximum likelihood estimate. In addition, the recursions originally presented by Thorne, Kishino and Felsenstein can be simplified. Two proteins, about 1500 amino acids long, can be analysed with this method in less than five seconds on a fast desktop computer, which makes this method practical for actual data analysis. Secondly, we propose a new homology test based on this model, where homology means that an ancestor to a sequence pair can be found finitely far back in time. This test has statistical advantages relative to the traditional shuffle test for proteins. Finally, we describe a goodness-of-fit test that allows testing the proposed insertion-deletion (indel) process inherent to this model, and find that real sequences (here globins) probably experience indels longer than one, contrary to what is assumed by the model. Copyright 2000 Academic Press.
A novel energy conversion based method for velocity correction in molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Hanhui; Liu, Ningning
2017-05-01
Molecular dynamics (MD) simulation has become an important tool for studying micro- or nano-scale dynamics and the statistical properties of fluids and solids. In MD simulations, there are mainly two approaches: equilibrium and non-equilibrium molecular dynamics (EMD and NEMD). In this paper, a new energy conversion based correction (ECBC) method for MD is developed. Unlike the traditional systematic correction based on macroscopic parameters, the ECBC method is developed strictly based on the physical interaction processes between the pair of molecules or atoms. The developed ECBC method can apply to EMD and NEMD directly. While using MD with this method, the difference between the EMD and NEMD is eliminated, and no macroscopic parameters such as external imposed potentials or coefficients are needed. With this method, many limits of using MD are lifted. The application scope of MD is greatly extended.
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
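The ANOVA-based partition of variability by factor and interaction can be sketched in a few lines; the organisms, analysts and counts in the data frame below are simulated, not the paper's data set:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for organism in ["A", "B", "C"]:
    for analyst in ["1", "2", "3", "4"]:
        base = {"A": 50, "B": 80, "C": 30}[organism]     # organism effect
        for _ in range(5):                               # replicate plates
            count = base + {"1": 0, "2": 5, "3": -4, "4": 2}[analyst] \
                         + rng.normal(0, 6)              # reading error
            rows.append({"organism": organism, "analyst": analyst, "count": count})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction: main effects plus organism x analyst term
model = smf.ols("count ~ C(organism) * C(analyst)", data=df).fit()
print(anova_lm(model, typ=2))
```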
DOE Office of Scientific and Technical Information (OSTI.GOV)
Letant, S E; Kane, S R; Murphy, G A
2008-05-30
This note presents a comparison of Most-Probable-Number Rapid Viability (MPN-RV) PCR and traditional culture methods for the quantification of Bacillus anthracis Sterne spores in macrofoam swabs generated by the Centers for Disease Control and Prevention (CDC) for a multi-center validation study aimed at testing environmental swab processing methods for recovery, detection, and quantification of viable B. anthracis spores from surfaces. Results show that spore numbers provided by the MPN RV-PCR method were in statistical agreement with the CDC conventional culture method for all three levels of spores tested (10^4, 10^2, and 10 spores), even in the presence of dirt. In addition to detecting low levels of spores in environmental conditions, the MPN RV-PCR method is specific and compatible with automated high-throughput sample processing and analysis protocols.
Guerrero, Luis; Guàrdia, Maria Dolors; Xicola, Joan; Verbeke, Wim; Vanhonacker, Filiep; Zakowska-Biemans, Sylwia; Sajdakowska, Marta; Sulmont-Rossé, Claire; Issanchou, Sylvie; Contel, Michele; Scalvedi, M Luisa; Granli, Britt Signe; Hersleth, Margrethe
2009-04-01
Traditional food products (TFP) are an important part of European culture, identity, and heritage. In order to maintain and expand the market share of TFP, further improvement in safety, health, or convenience is needed by means of different innovations. The aim of this study was to obtain a consumer-driven definition for the concept of TFP and innovation and to compare these across six European countries (Belgium, France, Italy, Norway, Poland and Spain) by means of semantic and textual statistical analyses. Twelve focus groups were performed, two per country, under similar conditions. The transcriptions obtained were submitted to an ordinary semantic analysis and to a textual statistical analysis using the software ALCESTE. Four main dimensions were identified for the concept of TFP: habit-natural, origin-locality, processing-elaboration and sensory properties. Five dimensions emerged around the concept of innovation: novelty-change, variety, processing-technology, origin-ethnicity and convenience. TFP were similarly perceived in the countries analysed, while some differences were detected for the concept of innovation. Semantic and statistical analyses of the focus groups led to similar results for both concepts. In some cases and according to the consumers' point of view the application of innovations may damage the traditional character of TFP.
Reconstruction of early phase deformations by integrated magnetic and mesotectonic data evaluation
NASA Astrophysics Data System (ADS)
Sipos, András A.; Márton, Emő; Fodor, László
2018-02-01
Markers of brittle faulting are widely used for recovering past deformation phases. Rocks often have oriented magnetic fabrics, which can be interpreted as connected to ductile deformation before cementation of the sediment. This paper reports a novel statistical procedure for the simultaneous evaluation of AMS (Anisotropy of Magnetic Susceptibility) and fault-slip data. The new method analyzes the AMS data without linearization techniques, so that weak AMS lineation and rotational AMS, which are beyond the scope of classical methods, can be assessed. This idea is extended to the evaluation of fault-slip data. While the traditional assumptions of stress inversion are not rejected, the method recovers the stress field via statistical hypothesis testing. In addition, it provides the statistical information needed for the combined evaluation of the AMS and mesotectonic (0.1 to 10 m) data. In the combined evaluation, a statistical test is carried out that helps to decide whether the AMS lineation and the mesotectonic markers (in the case of repeated deformation of the oldest set of markers) were formed in the same or different deformation phases. If this condition is met, the combined evaluation can improve the precision of the reconstruction. When the two data sets do not have a common solution for the direction of extension, the deformational origin of the AMS is questionable; in this case the orientation of the stress field responsible for the AMS lineation might differ from that which caused the brittle deformation. Although most of the examples demonstrate the reconstruction of weak deformations in sediments, the new method is readily applicable to investigating the ductile-brittle transition of any rock formation as long as AMS and fault-slip data are available.
Ethnobotanical study on medicinal plants used by Maonan people in China.
Hong, Liya; Guo, Zhiyong; Huang, Kunhui; Wei, Shanjun; Liu, Bo; Meng, Shaowu; Long, Chunlin
2015-04-30
This paper is based on an ethnobotanical investigation that focused on the traditional medicinal plants used by local Maonan people to treat human diseases in Maonan concentration regions. The Maonan people have relied on traditional medicine, especially medicinal plants, since ancient times. The aim of this study is to document the medicinal plants used by the Maonans and to report the status of these plants and the associated traditional knowledge. Ethnobotanical data were collected from June 2012 to September 2014 in Huanjiang Maonan Autonomous County, northern Guangxi, southwest China. In total, 118 knowledgeable informants were interviewed. Following a statistical sampling method, eighteen villages from 5 townships were selected for field investigations. Information was collected through the approaches of participatory observation, semi-structured interviews, ranking exercises, key informant interviews, focus group discussions, and participatory rural appraisals. A total of 368 medicinal plant species were investigated and documented together with their medicinal uses by the Maonans, most of which were obtained from wild ecosystems. The plants were used to treat 95 human diseases. Grinding was a widely used method for preparing traditional herbal medicines. There were significant relationships between gender and age, and between gender and informants' knowledge of medicinal plant use. Deforestation for agricultural purposes was identified as the most destructive factor for medicinal plants, followed by drought and over-harvesting. The species diversity of medicinal plants used by the Maonans in the study area was very rich, and medicinal plants played a significant role in healing various human disorders in the Maonan communities. However, conflicts between the traditional inheriting system and recent socio-economic changes (and other factors) have resulted in the reduction or loss of both medicinal plants and the associated indigenous knowledge. Thus, conservation efforts and policies, and innovation of the inheriting system, are necessary for protecting medicinal plants and the associated indigenous knowledge. Awareness also needs to be raised among local Maonans, focusing on the sustainable utilization and management of both medicinal plants and traditional knowledge.
[Design and implementation of Chinese materia medica resources survey results display system].
Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Wang, Ling; Zhao, Yan-Ping; Jing, Zhi-Xian; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
From the beginning of the fourth national census of traditional Chinese medicine resources in 2011, a large amount of data has been collected and compiled, including wild medicinal plant resource data, cultivated medicinal plant information, traditional knowledge, and specimen information. The traditional paper-based recording method is inconvenient for query and application. The B/S architecture, JavaWeb framework and SOA were used to design and develop the fourth national census results display platform. Through data integration and sorting, users are provided with integrated data services and data query and display solutions. The platform realizes fine-grained data classification and offers simple data retrieval and general statistical analysis functions. The platform uses ECharts components, GeoServer, OpenLayers and other technologies to provide a variety of data display forms such as charts, maps and other visualization forms, intuitively reflecting the number, distribution and types of Chinese materia medica resources. It meets the data mapping requirements of different levels of users and provides support for management decision-making. Copyright© by the Chinese Pharmaceutical Association.
Jiang, Wei; Yu, Weichuan
2017-01-01
In genome-wide association studies, we normally discover associations between genetic variants and diseases/traits in primary studies, and validate the findings in replication studies. We consider the associations identified in both primary and replication studies as true findings. An important question under this two-stage setting is how to determine the significance levels in both studies. In traditional methods, the significance levels of the primary and replication studies are determined separately. We argue that this separate determination strategy reduces the power of the overall two-stage study, and we therefore propose a novel method to determine the significance levels jointly. Our method is a reanalysis method that needs summary statistics from both studies. We find the most powerful significance levels while controlling the false discovery rate in the two-stage study. To enjoy the power improvement from the joint determination method, we need to select single nucleotide polymorphisms for replication at a less stringent significance level. This is a common practice in studies designed for discovery purposes, and we suggest that this practice is also suitable in studies with a validation purpose in order to identify more true findings. Simulation experiments show that our method can provide more power than traditional methods and that the false discovery rate is well controlled. Empirical experiments on data sets of five diseases/traits demonstrate that our method can help identify more associations. The R package is available at http://bioinformatics.ust.hk/RFdr.html.
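A toy version of the joint determination idea, with simulated z-scores from both stages; the grid search below is this sketch's own device, not the authors' algorithm:

```python
import numpy as np
from scipy.stats import norm

# Choose the (primary, replication) z-thresholds jointly so that the number
# of two-stage discoveries is maximised subject to an estimated FDR <= 0.05.
rng = np.random.default_rng(0)
m, m_true = 10000, 200
signal = np.zeros(m)
signal[:m_true] = 3.0                          # planted true associations
z1 = signal + rng.standard_normal(m)           # primary-study z-scores
z2 = signal + rng.standard_normal(m)           # replication-study z-scores

best = None
for c1 in np.linspace(1.0, 4.0, 31):
    for c2 in np.linspace(1.0, 4.0, 31):
        hits = int(np.sum((z1 > c1) & (z2 > c2)))
        # Expected false discoveries if all m SNPs were null
        expected_false = m * norm.sf(c1) * norm.sf(c2)
        if hits > 0 and expected_false / hits <= 0.05:
            if best is None or hits > best[0]:
                best = (hits, c1, c2)

hits, c1, c2 = best
print(f"best thresholds z1 > {c1:.2f}, z2 > {c2:.2f}: {hits} discoveries")
```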
Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting
2018-08-30
Semen sojae praeparatum, with its homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometrics methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum. In total, 21 chemicals were selected and characterized by liquid chromatography with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy of "statistical analysis-based fingerprint establishment", which should be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation
Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...
2016-12-02
Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits, introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific features of the fragmentation functions obtained using the new IMC methodology compared with those from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.
International experience on the use of artificial neural networks in gastroenterology.
Grossi, E; Mancini, A; Buscema, M
2007-03-01
In this paper, we reconsider the scientific background for the use of artificial intelligence tools in medicine. A review of some recent significant papers shows that artificial neural networks, the most advanced and effective artificial intelligence technique, can improve the classification accuracy and survival prediction for a number of gastrointestinal diseases. We discuss the 'added value' that artificial neural network-based tools can bring to the field of gastroenterology, both at the research and clinical application levels, when compared with traditional statistical or clinical-pathological methods.
Gooya, Ali; Lekadir, Karim; Alba, Xenia; Swift, Andrew J; Wild, Jim M; Frangi, Alejandro F
2015-01-01
Construction of Statistical Shape Models (SSMs) from arbitrary point sets is a challenging problem due to significant shape variation and lack of explicit point correspondence across the training data set. In medical imaging, point sets can generally represent different shape classes that span healthy and pathological exemplars. In such cases, the constructed SSM may not generalize well, largely because the probability density function (pdf) of the point sets deviates from the underlying assumption of Gaussian statistics. To this end, we propose a generative model for unsupervised learning of the pdf of point sets as a mixture of distinctive classes. A Variational Bayesian (VB) method is proposed for making joint inferences on the labels of point sets and the principal modes of variation in each cluster. The method provides a flexible framework to handle point sets with no explicit point-to-point correspondences. We also show that by maximizing the marginalized likelihood of the model, the optimal number of clusters of point sets can be determined. We illustrate this work in the context of understanding the anatomical phenotype of the left and right ventricles in the heart. For this purpose, we use a database containing hearts of healthy subjects, patients with Pulmonary Hypertension (PH), and patients with Hypertrophic Cardiomyopathy (HCM). We demonstrate that our method can outperform traditional PCA in both generalization and specificity measures.
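scikit-learn's variational Bayesian Gaussian mixture can stand in for the paper's VB inference to show how the effective number of clusters is inferred rather than fixed; the 2-D "shape descriptors" below are synthetic, not cardiac point sets:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Three synthetic shape classes (healthy / PH-like / HCM-like descriptors)
rng = np.random.default_rng(0)
healthy = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
ph = rng.normal([3.0, 0.0], 0.5, size=(80, 2))
hcm = rng.normal([0.0, 3.0], 0.5, size=(60, 2))
X = np.vstack([healthy, ph, hcm])

# Start with more components than needed; the sparse Dirichlet prior on the
# mixture weights prunes the superfluous ones during variational inference.
vb = BayesianGaussianMixture(n_components=10, weight_concentration_prior=0.01,
                             max_iter=500, random_state=0).fit(X)
active = int(np.sum(vb.weights_ > 0.01))
print("effective clusters:", active)      # expect ~3
```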
Rtop - an R package for interpolation along the stream network
NASA Astrophysics Data System (ADS)
Skøien, J. O.
2009-04-01
Geostatistical methods have been used only to a limited extent for estimation along stream networks, with a few exceptions (Gottschalk, 1993; Gottschalk et al., 2006; Sauquet et al., 2000; Skøien et al., 2006). Interpolation of runoff characteristics is more complicated than for the traditional random variables estimated by geostatistical methods, as the measurements have a more complicated support and many catchments are nested. Skøien et al. (2006) presented the Top-kriging model, which takes these effects into account for the interpolation of stream flow characteristics (exemplified by the 100-year flood). The method has here been implemented as a package in the statistical environment R (R Development Core Team, 2004). Taking advantage of the existing methods in R for working with spatial objects, and the extensive possibilities for visualizing the results, this makes it considerably easier to apply the method to new data sets than earlier implementations of the method. References: Gottschalk, L. 1993. Interpolation of runoff applying objective methods. Stochastic Hydrology and Hydraulics, 7, 269-281. Gottschalk, L., I. Krasovskaia, E. Leblois, and E. Sauquet. 2006. Mapping mean and variance of runoff in a river basin. Hydrology and Earth System Sciences, 10, 469-484. R Development Core Team. 2004. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Sauquet, E., L. Gottschalk, and E. Leblois. 2000. Mapping average annual runoff: a hierarchical approach applying a stochastic interpolation scheme. Hydrological Sciences Journal, 45 (6), 799-815. Skøien, J. O., R. Merz, and G. Blöschl. 2006. Top-kriging - geostatistics on stream networks. Hydrology and Earth System Sciences, 10, 277-287.
A practical method of predicting client revisit intention in a hospital setting.
Lee, Kyun Jick
2005-01-01
Data mining (DM) models are an alternative to traditional statistical methods for examining whether higher customer satisfaction leads to higher revisit intention. This study used satisfaction data from a total of 906 outpatients, collected in nationwide face-to-face interviews conducted by professional interviewers in South Korea in 1998. Analyses showed that the relationship between overall satisfaction with hospital services and outpatients' revisit intention, with word-of-mouth recommendation as an intermediate variable, developed into a nonlinear relationship. The five strongest predictors of revisit intention were overall satisfaction, intention to recommend to others, awareness of hospital promotion, satisfaction with the physician's kindness, and satisfaction with the level of treatment.
Mesquita, Cristina S; Oliveira, Raquel; Bento, Fátima; Geraldo, Dulce; Rodrigues, João V; Marcos, João C
2014-08-01
This work proposes a modification of the 2,4-dinitrophenylhydrazine (DNPH) spectrophotometric assay commonly used to evaluate the concentration of carbonyl groups in oxidized proteins. In this approach, NaOH is added to the protein solution after the addition of DNPH, shifting the maximum absorbance wavelength of the derivatized protein from 370 to 450 nm. This reduces the interference of DNPH and allows direct quantification in the sample solution without the precipitation, washing, and resuspension steps that are carried out in the traditional DNPH method. The two methods were compared under various conditions and are statistically equivalent. Copyright © 2014 Elsevier Inc. All rights reserved.
Norris, Peter M; da Silva, Arlindo M
2016-07-01
A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.
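A minimal Metropolis step illustrates why the sampler can "jump" into cloudy states from a clear background; the skewed, bounded toy posterior below is illustrative, not the paper's moisture PDF:

```python
import numpy as np

def log_post(q):
    """Toy log-posterior for a bounded 'layer moisture' q in (0, 1),
    skewed toward the moist end (Beta(4, 1.5)-shaped)."""
    if not 0.0 < q < 1.0:                 # physical bounds on moisture
        return -np.inf
    return 3.0 * np.log(q) + 0.5 * np.log(1.0 - q)

rng = np.random.default_rng(0)
q, chain = 0.05, []                        # start from a near-dry background
for _ in range(20000):
    prop = q + 0.1 * rng.standard_normal()  # symmetric random-walk proposal
    # Metropolis acceptance: no gradients, so finite jumps into regions of
    # non-zero cloud probability are possible even from a clear state.
    if np.log(rng.random()) < log_post(prop) - log_post(q):
        q = prop
    chain.append(q)

print("posterior mean moisture:", np.mean(chain[5000:]))
```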
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 values were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
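As a hedged illustration of how an ED50 is obtained once a discounting model has been chosen, the sketch below fits Mazur's one-parameter hyperbolic model to hypothetical indifference points and reports ED50 = 1/k. The abstract's software performs approximate Bayesian model selection across several candidate models first; this sketch simply assumes the hyperbolic model has already been selected, and the data are fabricated.

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(delay, k):
    """Mazur's hyperbolic discounting model: V = 1 / (1 + k * delay)."""
    return 1.0 / (1.0 + k * delay)

# Hypothetical indifference points: subjective value of a delayed reward,
# normalized to the immediate amount, at each delay in days.
delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)
values = np.array([0.95, 0.80, 0.55, 0.35, 0.21, 0.12])

(k_hat,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
ed50 = 1.0 / k_hat  # delay at which value falls to 50% under this model
print(f"k = {k_hat:.4f}, ED50 = {ed50:.1f} days")
```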
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure rates of 10^-4 or better). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are: A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications. B) Quantify the impact of these methods on software reliability. C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level. D) Quantify and justify the reliability estimate for systems developed using various methods.
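A schematic illustration of the core claim, under assumptions of our own rather than the paper's actual framework: if the credit from qualitative V&V is encoded as a conjugate Beta prior on the per-demand failure probability, prior evidence directly reduces the number of failure-free statistical tests required to certify a reliability target.

```python
import numpy as np
from scipy.stats import beta

def tests_needed(target=1e-4, confidence=0.99, a=1.0, b=1.0, n_max=200_000):
    """Smallest number of failure-free tests n such that a Beta(a, b) prior
    on the per-demand failure probability, updated to Beta(a, b + n),
    places `confidence` mass below `target`. A larger b encodes prior
    successes credited from qualitative V&V and QA evidence."""
    n = np.arange(n_max)
    posterior_mass = beta.cdf(target, a, b + n)  # vectorized over candidate n
    hits = np.nonzero(posterior_mass >= confidence)[0]
    return int(hits[0]) if hits.size else None

print(tests_needed())             # uninformative prior: ~46,000 tests
print(tests_needed(b=20_000.0))   # strong V&V credit: ~26,000 tests
```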
Accounting for spatial effects in land use regression for urban air pollution modeling.
Bertazzon, Stefania; Johnson, Markey; Eccles, Kristin; Kaplan, Gilaad G
2015-01-01
In order to accurately assess air pollution risks, health studies require spatially resolved pollution concentrations. Land-use regression (LUR) models estimate ambient concentrations at a fine spatial scale. However, spatial effects such as spatial non-stationarity and spatial autocorrelation can reduce the accuracy of LUR estimates by increasing regression errors and uncertainty, and statistical methods for resolving these effects--e.g., spatially autoregressive (SAR) and geographically weighted regression (GWR) models--may be difficult to apply simultaneously. We used an alternate approach to address spatial non-stationarity and spatial autocorrelation in LUR models for nitrogen dioxide. Traditional models were re-specified to include a variable capturing wind speed and direction, and re-fit as GWR models. Mean R(2) values for the resulting GWR-wind models (summer: 0.86, winter: 0.73) showed a 10-20% improvement over traditional LUR models. GWR-wind models effectively addressed both spatial effects and produced meaningful predictive models. These results suggest a useful method for improving spatially explicit models. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
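A sketch of the modeling step, assuming the open-source mgwr package and entirely synthetic monitoring data (the site coordinates, land-use covariates, wind covariate, and NO2 values below are all fabricated for illustration, not the study's data):

```python
import numpy as np
from mgwr.gwr import GWR
from mgwr.sel_bw import Sel_BW

rng = np.random.default_rng(0)
n = 120
coords = rng.uniform(0, 50, size=(n, 2))    # monitor x, y positions (km)
road = rng.uniform(0, 1, size=n)            # e.g., road density
industry = rng.uniform(0, 1, size=n)        # e.g., industrial land use
wind = rng.uniform(-1, 1, size=n)           # wind speed x cos(direction)
X = np.column_stack([road, industry, wind])
no2 = (10 + 5 * road + 3 * wind + rng.normal(0, 1, n)).reshape(-1, 1)

bw = Sel_BW(coords, no2, X).search()        # optimal bandwidth by AICc
results = GWR(coords, no2, X, bw).fit()     # locally varying coefficients
print(results.localR2.mean())               # cf. the mean R(2) reported above
```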
Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang
2015-05-01
In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for the discrimination between raw and processed herbs. The dried fruit of Arctium lappa L. and its processed products are widely used in traditional Chinese medicine, yet their therapeutic effects differ. In this study, a novel strategy using high-performance liquid chromatography with diode array detection coupled with multivariate statistical analysis to rapidly discriminate between raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
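The chemometric step generalizes readily. A minimal sketch using scikit-learn and SciPy on a hypothetical 30-batch peak-area matrix (the data below are simulated stand-ins, not the study's HPLC measurements):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical peak-area matrix: 30 batches x 10 characteristic peaks.
rng = np.random.default_rng(0)
raw = rng.normal(1.0, 0.1, size=(15, 10))        # raw Fructus Arctii batches
processed = rng.normal(1.4, 0.1, size=(15, 10))  # processed batches
peaks = StandardScaler().fit_transform(np.vstack([raw, processed]))

scores = PCA(n_components=2).fit_transform(peaks)  # PCA score-plot coordinates
clusters = fcluster(linkage(peaks, method="ward"), t=2, criterion="maxclust")
print(scores[:3])   # first few batches in principal-component space
print(clusters)     # ideally separates raw from processed batches
```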
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi(sup 2) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and matched these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997, and is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings, including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croft, Stephen; Santi, Peter A.; Henzlova, Daniela
The Feynman-Y statistic is a type of autocorrelation analysis. It is defined as the excess variance-to-mean ratio, Y = VMR - 1, of the number count distribution formed by sampling a pulse train using a series of non-overlapping gates. It is a measure of the degree of correlation present on the pulse train, with Y = 0 for Poisson data. In the context of neutron coincidence counting we show that the same information can be obtained from the accidentals histogram acquired using the multiplicity shift-register method, which is currently the common autocorrelation technique applied in nuclear safeguards. In the case of multiplicity shift-register analysis, however, overlapping gates, either triggered by the incoming pulse stream or by a periodic clock, are used. The overlap introduces additional covariance but does not alter the expectation values. In this paper we discuss, for a particular data set, the relative merit of the Feynman and shift-register methods in terms of both precision and dead time correction. Traditionally, the Feynman approach is applied with a relatively long gate width compared to the dieaway time. The main reason for this is so that the gate utilization factor can be taken as unity rather than being treated as a system parameter to be determined at characterization/calibration. But because the random trigger interval gate utilization factor is slow to saturate, this procedure requires a gate width many times the effective 1/e dieaway time. In the traditional approach this limits the number of gates that can be fitted into a given assay duration. We empirically show that much shorter gates, similar in width to those used in traditional shift-register analysis, can be used. Because the way in which the correlated information present on the pulse train is extracted differs between the moments-based method of Feynman and the various shift-register-based approaches, the dead time losses are manifested differently for the two approaches. The resulting estimates for the dead-time-corrected first and second order reduced factorial moments should be independent of the method, however, and this allows the respective dead time formalisms to be checked. We discuss how to make dead time corrections in both the shift-register and the Feynman approaches.
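Since the statistic is defined directly in the text (Y = VMR - 1 over non-overlapping gates), a short Python sketch of its computation on a synthetic Poisson pulse train may be useful; for such an uncorrelated train Y should be near zero, and the event rate and gate width below are arbitrary illustrative choices.

```python
import numpy as np

def feynman_y(timestamps, gate_width):
    """Feynman-Y from a pulse train: the excess variance-to-mean ratio,
    Y = VMR - 1, of counts collected in non-overlapping gates.
    Y = 0 (in expectation) for an uncorrelated Poisson train."""
    t = np.asarray(timestamps)
    n_gates = int((t[-1] - t[0]) // gate_width)
    edges = t[0] + gate_width * np.arange(n_gates + 1)
    counts, _ = np.histogram(t, bins=edges)
    return counts.var(ddof=1) / counts.mean() - 1.0

# Synthetic Poisson train: exponential inter-arrival times at 10 kHz.
rng = np.random.default_rng(1)
train = np.cumsum(rng.exponential(1e-4, size=100_000))  # seconds
print(feynman_y(train, gate_width=512e-6))              # expect ~0
```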
Yu, Jihnhee; Yang, Luge; Vexler, Albert; Hutson, Alan D
2016-06-15
The receiver operating characteristic (ROC) curve is a popular technique with applications such as investigating the accuracy of a biomarker in delineating between disease and non-disease groups. A common measure of accuracy of a given diagnostic marker is the area under the ROC curve (AUC). In contrast with the AUC, the partial area under the ROC curve (pAUC) looks only at the area over a certain range of specificities (i.e., true negative rates), and it can often be clinically more relevant than examining the entire ROC curve. The pAUC is commonly estimated based on a U-statistic with a plug-in sample quantile, making the estimator a non-traditional U-statistic. In this article, we propose an accurate and easy method to obtain the variance of the nonparametric pAUC estimator. The proposed method is easy to implement for both a single biomarker test and the comparison of two correlated biomarkers, because it simply adapts the existing variance estimator of U-statistics. We show the accuracy and other advantages of the proposed variance estimation method by broadly comparing it with previously existing methods. Further, we develop an empirical likelihood inference method based on the proposed variance estimator through a simple implementation. In an application, we demonstrate that, depending on whether inference is based on the AUC or the pAUC, we can reach different decisions about the prognostic ability of the same set of biomarkers. Copyright © 2016 John Wiley & Sons, Ltd.
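For orientation, here is a minimal sketch of the pAUC point estimator described above (a U-statistic with a plug-in sample quantile) on synthetic scores; the paper's variance estimator and empirical likelihood machinery are not reproduced, and the data and specificity bound are illustrative assumptions.

```python
import numpy as np

def pauc(disease, control, spec_low=0.8):
    """Nonparametric pAUC over specificities in [spec_low, 1]: a
    U-statistic estimating P(X > Y, Y > q) for diseased scores X and
    control scores Y, where q is the plug-in spec_low sample quantile
    of the control scores. The maximum attainable value over this
    range is 1 - spec_low."""
    x = np.asarray(disease)
    y = np.asarray(control)
    q = np.quantile(y, spec_low)  # plug-in sample quantile
    kernel = (x[:, None] > y[None, :]) & (y[None, :] > q)
    return kernel.mean()

rng = np.random.default_rng(2)
print(pauc(rng.normal(1, 1, 200), rng.normal(0, 1, 200)))
```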