ERIC Educational Resources Information Center
Luh, Wei-Ming; Guo, Jiin-Huarng
2005-01-01
To deal with nonnormal and heterogeneous data for the one-way fixed effect analysis of variance model, the authors adopted a trimmed means method in conjunction with Hall's invertible transformation into a heteroscedastic test statistic (Alexander-Govern test or Welch test). The results of simulation experiments showed that the proposed technique…
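The trimming step in the entry above can be illustrated with a minimal sketch. This shows only the general trimmed-mean technique, not the authors' exact procedure (which combines trimmed means with Hall's transformation inside a heteroscedastic test):

```python
def trimmed_mean(data, proportion_to_cut):
    """Mean after removing the given proportion of values from each tail."""
    xs = sorted(data)
    g = int(len(xs) * proportion_to_cut)      # values cut from each end
    trimmed = xs[g:len(xs) - g] if g else xs
    return sum(trimmed) / len(trimmed)

# A single large outlier barely moves the 20% trimmed mean:
print(trimmed_mean([1, 2, 3, 4, 100], 0.2))  # -> 3.0
```

SciPy offers the same quantity as `scipy.stats.trim_mean(data, proportiontocut=0.2)`.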
Inhibition of Orthopaedic Implant Infections by Immunomodulatory Effects of Host Defense Peptides
2014-12-01
significance was determined by t-tests or by one-way analysis of variance (ANOVA) followed by Bonferroni post hoc tests in experiments with multiple...groups. Non-parametric Mann-Whitney tests, Kruskal-Wallis ANOVA followed by Newman-Keuls post hoc tests, or van Elteren's two-way tests were applied to...in D, and black symbols in A), statistical analysis was by one-way ANOVA followed by Bonferroni versus control post hoc tests. Otherwise, statistical
Statistical analysis of Skylab 3. [endocrine/metabolic studies of astronauts]
NASA Technical Reports Server (NTRS)
Johnston, D. A.
1974-01-01
The results of endocrine/metabolic studies of astronauts on Skylab 3 are reported. One-way analysis of variance, contrasts, two-way unbalanced analysis of variance, and analysis of periodic changes in flight are included. Results for blood tests and urine tests are presented.
The Heuristics of Statistical Argumentation: Scaffolding at the Postsecondary Level
ERIC Educational Resources Information Center
Pardue, Teneal Messer
2017-01-01
Language plays a key role in statistics and, by extension, in statistics education. Enculturating students into the practice of statistics requires preparing them to communicate results of data analysis. Statistical argumentation is one way of providing structure to facilitate discourse in the statistics classroom. In this study, a teaching…
Experimental and Computational Analysis of Modes in a Partially Constrained Plate
2004-03-01
way to quantify a structure. One technique utilizing an energy method is the Statistical Energy Analysis (SEA). The SEA process involves regarding...B.R. Mace. "Statistical Energy Analysis of Two Edge-Coupled Rectangular Plates: Ensemble Averages," Journal of Sound and Vibration, 193(4): 793-822
ERIC Educational Resources Information Center
Lix, Lisa M.; And Others
1996-01-01
Meta-analytic techniques were used to summarize the statistical robustness literature on Type I error properties of alternatives to the one-way analysis of variance "F" test. The James (1951) and Welch (1951) tests performed best under violations of the variance homogeneity assumption, although their use is not always appropriate. (SLD)
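The Welch (1951) procedure named above replaces the pooled error variance of the classical F test with group-specific weights. A minimal pure-Python sketch of the statistic and its approximate denominator degrees of freedom (the p-value, from an F distribution with df1 and df2 degrees of freedom, is omitted):

```python
from statistics import mean, variance

def welch_anova_statistic(groups):
    """Welch's heteroscedastic one-way test: returns (F*, df1, df2)."""
    k = len(groups)
    w = [len(g) / variance(g) for g in groups]        # weights n_i / s_i^2
    m = [mean(g) for g in groups]
    W = sum(w)
    grand = sum(wi * mi for wi, mi in zip(w, m)) / W  # weighted grand mean
    a = sum(wi * (mi - grand) ** 2 for wi, mi in zip(w, m)) / (k - 1)
    h = sum((1 - wi / W) ** 2 / (len(g) - 1) for wi, g in zip(w, groups))
    b = 1 + 2 * (k - 2) / (k ** 2 - 1) * h
    df2 = (k ** 2 - 1) / (3 * h)
    return a / b, k - 1, df2

# Identical groups give F* = 0:
f_star, df1, df2 = welch_anova_statistic([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
print(round(f_star, 6), df1, round(df2, 2))  # -> 0.0 2 4.0
```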
ERIC Educational Resources Information Center
Larson-Hall, Jenifer; Herrington, Richard
2010-01-01
In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…
Examples of Data Analysis with SPSS-X.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…
ERIC Educational Resources Information Center
Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa
2009-01-01
This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
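Normal-based power analysis of the kind the article implements in Excel can be sketched with the standard library alone. This is a hedged illustration with hypothetical inputs; 1.959964 is the standard two-sided 5% normal critical value:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def z_test_power(delta, sigma, n, z_crit=1.959964):
    """Approximate power of a two-sided one-sample z test at alpha = 0.05."""
    shift = delta * sqrt(n) / sigma           # standardized noncentrality
    return normal_cdf(shift - z_crit) + normal_cdf(-shift - z_crit)

# Detecting a half-SD shift with n = 25 gives power of about 0.70:
print(round(z_test_power(0.5, 1.0, 25), 3))  # -> 0.705
```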
Engaging Students in Survey Research Projects across Research Methods and Statistics Courses
ERIC Educational Resources Information Center
Lovekamp, William E.; Soboroff, Shane D.; Gillespie, Michael D.
2017-01-01
One innovative way to help students make sense of survey research has been to create a multifaceted, collaborative assignment that promotes critical thinking, comparative analysis, self-reflection, and statistical literacy. We use a short questionnaire adapted from the Higher Education Research Institute's Cooperative Institutional Research…
1983-03-01
should be familiar with typical military/navy terms and elementary statistical tests (t-test, chi-square, and one-way analysis of variance). The ...and the media. One theory is that the gradual internalization or acceptance of values and ideals (which is influenced by the individual's class, family...completed with a comparison of the two. A similar format is followed for the commanding officer's data. Three sets of statistical tests were done on the
Dynamic association rules for gene expression data analysis.
Chen, Shu-Chuan; Tsai, Tsung-Hsien; Chung, Cheng-Han; Li, Wen-Hsiung
2015-10-14
The purpose of gene expression analysis is to look for the association between regulation of gene expression levels and phenotypic variations. This association based on gene expression profile has been used to determine whether the induction/repression of genes corresponds to phenotypic variations including cell regulations, clinical diagnoses and drug development. Statistical analyses on microarray data have been developed to resolve the gene selection issue. However, these methods do not inform us of causality between genes and phenotypes. In this paper, we propose the dynamic association rule algorithm (DAR algorithm), which helps one to efficiently select a subset of significant genes for subsequent analysis. The DAR algorithm is based on association rules from market basket analysis in marketing. We first propose a statistical way, based on constructing a one-sided confidence interval and hypothesis testing, to determine if an association rule is meaningful. Based on the proposed statistical method, we then developed the DAR algorithm for gene expression data analysis. The method was applied to analyze four microarray datasets and one Next Generation Sequencing (NGS) dataset: the Mice Apo A1 dataset, the whole genome expression dataset of mouse embryonic stem cells, expression profiling of the bone marrow of Leukemia patients, the Microarray Quality Control (MAQC) dataset and the RNA-seq dataset of a mouse genomic imprinting study. A comparison of the proposed method with the t-test on the expression profiling of the bone marrow of Leukemia patients was conducted. We developed a statistical way, based on the concept of confidence interval, to determine the minimum support and minimum confidence for mining association relationships among items. With the minimum support and minimum confidence, one can find significant rules in one single step. The DAR algorithm was then developed for gene expression data analysis.
Four gene expression datasets showed that the proposed DAR algorithm not only was able to identify a set of differentially expressed genes that largely agreed with that of other methods, but also provided an efficient and accurate way to find influential genes of a disease. In the paper, the well-established association rule mining technique from marketing has been successfully modified to determine the minimum support and minimum confidence based on the concept of confidence interval and hypothesis testing. It can be applied to gene expression data to mine significant association rules between gene regulation and phenotype. The proposed DAR algorithm provides an efficient way to find influential genes that underlie the phenotypic variance.
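The support and confidence quantities at the heart of the approach described above can be sketched as follows. This is a generic illustration with a textbook normal-approximation lower bound (z = 1.645 for a one-sided 95% interval), not the paper's exact construction, and the basket data are invented:

```python
from math import sqrt

def rule_stats(transactions, lhs, rhs, z=1.645):
    """Support and confidence of the rule lhs -> rhs, plus one-sided
    95% lower confidence bounds (normal approximation)."""
    n = len(transactions)
    n_lhs = sum(1 for t in transactions if lhs <= t)           # lhs subset of basket
    n_both = sum(1 for t in transactions if (lhs | rhs) <= t)  # both sides present
    support = n_both / n
    confidence = n_both / n_lhs
    lower = lambda p, m: max(0.0, p - z * sqrt(p * (1 - p) / m))
    return support, confidence, lower(support, n), lower(confidence, n_lhs)

# Toy "market basket": the rule {A} -> {B} holds in 6 of the 8 A-baskets.
baskets = [{"A", "B"}] * 6 + [{"A"}] * 2 + [{"C"}] * 2
s, c, s_lo, c_lo = rule_stats(baskets, {"A"}, {"B"})
print(s, c)  # -> 0.6 0.75
```

In the paper's setting the items would be discretized gene-regulation states and phenotypes rather than basket items.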
[Bayesian statistics in medicine -- part II: main applications and inference].
Montomoli, C; Nichelatti, M
2008-01-01
Bayesian statistics is not only used when one is dealing with 2-way tables, but it can be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing its foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of analysis are compared to those of frequentist (classical) statistical analysis. The Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
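The diagnostic analogy can be made concrete with a one-line application of Bayes' theorem. The prevalence, sensitivity and specificity below are hypothetical, chosen only for illustration:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A rare disease (1% prevalence) with a fairly good test still yields a low PPV:
ppv = positive_predictive_value(0.01, 0.90, 0.95)
print(round(ppv, 3))  # -> 0.154
```

The posterior (PPV) depends strongly on the prior (prevalence), which is exactly the Bayesian point the article makes.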
1989-06-01
letters on one line and several letters on the next line, there is no accurate way to credit these extra letters for statistical analysis. The decimal and...contains the descriptive statistics of the objective refractive error components of infantrymen. Figures 8-11 show the frequency distributions for sphere...equivalents. Nonspectacle wearers Table 12 contains the descriptive statistics for non-spectacle wearers. Based on these refractive error data, about 30
Chan, Y; Walmsley, R P
1997-12-01
When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
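The H statistic described above is computed from midranks of the pooled data. A minimal sketch (the usual tie-correction divisor and the chi-square p-value lookup are omitted for brevity):

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic, using midranks for tied values."""
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    ranks = {}
    i = 0
    while i < n:                       # assign the average rank to each tie run
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        avg = (i + 1 + j) / 2          # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[pooled[k]] = avg
        i = j
    return 12 / (n * (n + 1)) * sum(
        len(g) * (sum(ranks[x] for x in g) / len(g) - (n + 1) / 2) ** 2
        for g in groups)

print(round(kruskal_wallis_h([[1, 2, 3], [4, 5, 6], [7, 8, 9]]), 1))  # -> 7.2
```

`scipy.stats.kruskal` provides a full implementation, including the p-value.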
Estimating an Effect Size in One-Way Multivariate Analysis of Variance (MANOVA)
ERIC Educational Resources Information Center
Steyn, H. S., Jr.; Ellis, S. M.
2009-01-01
When two or more univariate population means are compared, the proportion of variation in the dependent variable accounted for by population group membership is eta-squared. This effect size can be generalized by using multivariate measures of association, based on the multivariate analysis of variance (MANOVA) statistics, to establish whether…
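For the univariate case mentioned above, eta-squared is simply the between-group sum of squares divided by the total sum of squares; a minimal sketch with made-up data:

```python
def eta_squared(groups):
    """Proportion of total variation explained by group membership."""
    allx = [x for g in groups for x in g]
    grand = sum(allx) / len(allx)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_total = sum((x - grand) ** 2 for x in allx)
    return ss_between / ss_total

print(round(eta_squared([[1, 2, 3], [4, 5, 6]]), 4))  # -> 0.7714
```

The multivariate generalizations the article discusses replace these sums of squares with MANOVA sums-of-squares matrices.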
[How to fit and interpret multilevel models using SPSS].
Pardo, Antonio; Ruiz, Miguel A; San Martín, Rafael
2007-05-01
Hierarchic or multilevel models are used to analyse data when cases belong to known groups and sample units are selected both from the individual level and from the group level. In this work, the multilevel models most commonly discussed in the statistical literature are described, explaining how to fit these models using the SPSS program (version 11 or later) and how to interpret the outcomes of the analysis. Five particular models are described, fitted, and interpreted: (1) one-way analysis of variance with random effects, (2) regression analysis with means-as-outcomes, (3) one-way analysis of covariance with random effects, (4) regression analysis with random coefficients, and (5) regression analysis with means- and slopes-as-outcomes. All models are explained, trying to make them understandable to researchers in health and behaviour sciences.
DOT National Transportation Integrated Search
2015-11-01
One of the most efficient ways to solve the damage detection problem using the statistical pattern recognition : approach is that of exploiting the methods of outlier analysis. Cast within the pattern recognition framework, : damage detection assesse...
Pacifici, Edoardo; Bossù, Maurizio; Giovannetti, Agostino; La Torre, Giuseppe; Guerra, Fabrizio; Polimeni, Antonella
2013-01-01
Background: Even today, use of Glass Ionomer Cements (GIC) as restorative material is indicated for uncooperative patients. Aim: The study aimed at estimating the surface roughness of different GICs, used with or without their proprietary surface coatings, and at observing the interfaces between cement and coating through SEM. Materials and methods: Forty specimens were obtained and divided into 4 groups: Fuji IX (IX), Fuji IX/G-Coat Plus (IXC), Vitremer (V), Vitremer/Finishing Gloss (VFG). Samples were obtained using silicone moulds to simulate class I restorations. All specimens were processed for profilometric evaluation. The statistical differences in surface roughness between groups were assessed using One-Way Analysis of Variance (One-Way ANOVA) (p<0.05). The Two-Way Analysis of Variance (Two-Way ANOVA) was used to evaluate the influence of two factors: restoration material and presence of coating. Coated restoration specimens (IXC and VFG) were sectioned perpendicular to the restoration surface and processed for SEM evaluation. Results: No statistical differences in roughness could be noticed between groups or factors. Following microscopic observation, interfaces between restoration material and coating were better for group IXC than for group VFG. Conclusions: When specimens are obtained simulating normal clinical procedures, the presence of surface protection does not significantly improve the surface roughness of GICs. PMID:24611090
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, PhillipN.; Gelman, Andrew
2014-11-24
In this chapter we elucidate four main themes. The first is that modern data analyses, including "Big Data" analyses, often rely on data from different sources, which can present challenges in constructing statistical models that can make effective use of all of the data. The second theme is that although data analysis is usually centralized, frequently the final outcome is to provide information or allow decision-making for individuals. Third, data analyses often have multiple uses by design: the outcomes of the analysis are intended to be used by more than one person or group, for more than one purpose. Finally, issues of privacy and confidentiality can cause problems in more subtle ways than are usually considered; we will illustrate this point by discussing a case in which there is substantial and effective political opposition to simply acknowledging the geographic distribution of a health hazard. A researcher analyzes some data and learns something important. What happens next? What does it take for the results to make a difference in people's lives? In this chapter we tell a story - a true story - about a statistical analysis that should have changed government policy, but didn't. The project was a research success that did not make its way into policy, and we think it provides some useful insights into the interplay between locally-collected data, statistical analysis, and individual decision making.
GLIMMPSE Lite: Calculating Power and Sample Size on Smartphone Devices
Munjal, Aarti; Sakhadeo, Uttara R.; Muller, Keith E.; Glueck, Deborah H.; Kreidler, Sarah M.
2014-01-01
Researchers seeking to develop complex statistical applications for mobile devices face a common set of difficult implementation issues. In this work, we discuss general solutions to the design challenges. We demonstrate the utility of the solutions for a free mobile application designed to provide power and sample size calculations for univariate, one-way analysis of variance (ANOVA), GLIMMPSE Lite. Our design decisions provide a guide for other scientists seeking to produce statistical software for mobile platforms. PMID:25541688
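The power computation such an app performs can be sketched with SciPy's central and noncentral F distributions. This is a generic textbook calculation, not GLIMMPSE Lite's actual code; the noncentrality parameterization nc = f² · N follows Cohen's effect size f:

```python
from scipy.stats import f as f_dist, ncf

def anova_power(k, n_per_group, effect_f, alpha=0.05):
    """Power of a balanced one-way ANOVA with k groups of n_per_group,
    for Cohen's effect size f."""
    df1, df2 = k - 1, k * n_per_group - k
    nc = (effect_f ** 2) * k * n_per_group       # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, df1, df2)     # rejection threshold
    return 1 - ncf.cdf(f_crit, df1, df2, nc)     # P(reject | effect present)

# A medium effect (f = 0.25) with 4 groups of 45 subjects each:
print(round(anova_power(4, 45, 0.25), 2))
```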
Salt preference: age and sex related variability.
Verma, Punam; Mittal, Sunita; Ghildiyal, Archana; Chaudhary, Lalita; Mahajan, K K
2007-01-01
Salt preference was assessed in 60 adults of 18-21 yrs of age (30 males and 30 females) and in 60 children of 7-12 yrs of age (30 boys and 30 girls). Subjects rated the preference on a Likert scale for popcorn samples of five salt concentrations (0M, 1M, 2M, 3M and +3M). Statistical analysis using two-way ANOVA revealed a statistically significant effect of age and sex on salt preference (F4,100 = 15.027, P < 0.01), and one-way ANOVA revealed a statistically significant sex difference in the salt preference of adults (F4,50 = 16.26, P < 0.01) but no statistically significant sex difference in the salt preference of children (F4,50 = 4.08, P > 0.05). Dietary experiences during development and greater physical activity in children may be responsible for the higher salt preference in children, while the finding of no sex variability in children favours the role of sex hormones in the salt preference of males and females.
Gencrypt: one-way cryptographic hashes to detect overlapping individuals across samples
Turchin, Michael C.; Hirschhorn, Joel N.
2012-01-01
Summary: Meta-analysis across genome-wide association studies is a common approach for discovering genetic associations. However, in some meta-analysis efforts, individual-level data cannot be broadly shared by study investigators due to privacy and Institutional Review Board concerns. In such cases, researchers cannot confirm that each study represents a unique group of people, leading to potentially inflated test statistics and false positives. To resolve this problem, we created a software tool, Gencrypt, which utilizes a security protocol known as one-way cryptographic hashes to allow overlapping participants to be identified without sharing individual-level data. Availability: Gencrypt is freely available under the GNU general public license v3 at http://www.broadinstitute.org/software/gencrypt/. Contact: joelh@broadinstitute.org. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22302573
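The idea of comparing one-way hashes instead of raw identifiers can be sketched with the standard library. This is a simplified illustration only; Gencrypt's actual protocol differs, and the salt and subject IDs below are hypothetical:

```python
import hashlib

def hashed_ids(identifiers, salt):
    """One-way SHA-256 hashes of salted identifiers; the raw IDs
    never need to leave the originating site."""
    return {hashlib.sha256((salt + i).encode()).hexdigest() for i in identifiers}

# Two studies share one participant; only the hash sets are compared.
SALT = "shared-secret"   # hypothetical salt agreed on by both sites
study_a = hashed_ids(["subj001", "subj002"], SALT)
study_b = hashed_ids(["subj002", "subj003"], SALT)
print(len(study_a & study_b))  # -> 1
```

Because the hash is one-way, neither site learns the other's non-overlapping identifiers.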
NASA Astrophysics Data System (ADS)
Rubtsov, Vladimir; Kapralov, Sergey; Chalyk, Iuri; Ulianova, Onega; Ulyanov, Sergey
2013-02-01
Statistical properties of laser speckles formed in the skin and mucosa of the colon have been analyzed and compared. It has been demonstrated that the first- and second-order statistics of "skin" speckles and "mucosal" speckles are quite different. It is shown that speckles formed in the mucosa are not Gaussian ones. The layered structure of the colon mucosa causes the formation of speckled biospeckles. First- and second-order statistics of speckled speckles have been reviewed in this paper. Statistical properties of Fresnel and Fraunhofer doubly scattered and cascade speckles are described. The non-Gaussian statistics of biospeckles may lead to high localization of the intensity of coherent light in human tissue during laser surgery. A way of suppressing highly localized non-Gaussian speckles is suggested.
ERIC Educational Resources Information Center
Guerra, Jorge
2012-01-01
The purpose of this research was to examine the relationship between teaching readiness and teaching excellence with three variables of preparedness of adjunct professors teaching career technical education courses through student surveys, using a correlational design with two statistical techniques: least-squares regression and one-way analysis of…
ERIC Educational Resources Information Center
Park, Hyeran; Nielsen, Wendy; Woodruff, Earl
2014-01-01
This study examined and compared students' understanding of nature of science (NOS) with 521 Grade 8 Canadian and Korean students using a mixed methods approach. The concepts of NOS were measured using a survey that had both quantitative and qualitative elements. Descriptive statistics and one-way multivariate analysis of variances examined the…
Sorting: Groups and Graphs. Used Numbers. Grades 2-3.
ERIC Educational Resources Information Center
Russell, Susan Jo; Corwin, Rebecca B.
A unit of study that introduces sorting and classification as a way of organizing data is presented. Suitable for students in grades 2 and 3, it provides a foundation for further work in statistics and data analysis. The investigations may extend from one to five class sessions and are grouped into three parts: "Introduction to Sorting"; "Sorting…
Jothika, Mohan; Vanajassun, P. Pranav; Someshwar, Battu
2015-01-01
Aim: To determine the short-term efficiency of probiotic, chlorhexidine, and fluoride mouthwashes on plaque Streptococcus mutans level at four periodic intervals. Materials and Methods: This was a single-blind, randomized control study in which each subject was tested with only one mouthwash regimen. Fifty-two healthy qualified adult patients were selected randomly for the study and were divided into the following groups: group 1- 10 ml of distilled water, group 2- 10 ml of 0.2% chlorhexidine mouthwash, group 3- 10 ml of 500 ppm F/400 ml sodium fluoride mouthwash, and group 4- 10 ml of probiotic mouthwash. Plaque samples were collected from the buccal surface of premolars and molars in the maxillary quadrant. Sampling procedure was carried out by a single examiner after 7 days, 14 days, and 30 days, respectively, after the use of the mouthwash. All the samples were subjected to microbiological analysis and statistically analyzed with one-way analysis of variance (ANOVA) and post-hoc test. Results: One-way ANOVA comparison among groups 2, 3, and 4 showed no statistical significance, whereas group 1 showed statistically significant difference when compared with groups 2, 3, and 4 at 7th, 14th, and 30th day. Conclusion: Chlorhexidine, sodium fluoride, and probiotic mouthwashes reduce plaque S. mutans levels. Probiotic mouthwash is effective and equivalent to chlorhexidine and sodium fluoride mouthwashes. Thus, probiotic mouthwash can also be considered as an effective oral hygiene regimen. PMID:25984467
Dusek, Wolfgang; Pierscionek, Barbara K; McClelland, Julie F
2010-05-25
To describe and compare visual function measures of two groups of school age children (6-14 years of age) attending a specialist eyecare practice in Austria; one group referred to the practice from educational assessment centres diagnosed with reading and writing difficulties and the other, a clinical age-matched control group. Retrospective clinical data from one group of subjects with reading difficulties (n = 825) and a clinical control group of subjects (n = 328) were examined. Statistical analysis was performed to determine whether any differences existed between visual function measures from each group (refractive error, visual acuity, binocular status, accommodative function and reading speed and accuracy). Statistical analysis using one-way ANOVA demonstrated no differences between the two groups in terms of refractive error and the size or direction of heterophoria at distance (p > 0.05). Using predominantly one-way ANOVA and chi-square analyses, those subjects in the referred group were statistically more likely to have poorer distance visual acuity, an exophoric deviation at near, a lower amplitude of accommodation, reduced accommodative facility, reduced vergence facility, a reduced near point of convergence, a lower AC/A ratio and a slower reading speed than those in the clinical control group (p < 0.05). This study highlights the high proportion of visual function anomalies in a group of children with reading difficulties in an Austrian population. It confirms the importance of a full assessment of binocular visual status in order to detect and remedy these deficits and to prevent the visual problems continuing to impact upon educational development.
Multivariate analysis in thoracic research.
Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego
2015-03-01
Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. Multivariate methods emerged to analyze large databases and increasingly complex data. Since modeling is the best way to represent knowledge of reality, we should use multivariate statistical methods. Multivariate methods are designed to analyze data sets simultaneously, i.e., to analyze multiple variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each should be employed according to the type of variables to analyze: dependence, interdependence and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause-and-effect relationships between variables; there is a wide range of analysis types that we can use.
An evaluation of shear bond strength of self-etch adhesive on pre-etched enamel: an in vitro study.
Rao, Bhadra; Reddy, Satti Narayana; Mujeeb, Abdul; Mehta, Kanchan; Saritha, G
2013-11-01
To determine the shear bond strength of the self-etch adhesive G-bond on pre-etched enamel. Thirty caries-free human mandibular premolars extracted for orthodontic purposes were used for the study. Occlusal surfaces of all the teeth were flattened with a diamond bur, and silicon carbide paper was used for surface smoothening. The thirty samples were randomly grouped into three groups. Three different etch systems were used for the composite build-up: group 1 (G-bond self-etch adhesive system), group 2 (G-bond) and group 3 (Adper Single Bond). Light curing was applied for 10 seconds with an LED unit for composite build-up on the occlusal surface of each tooth, 8 millimeters (mm) in diameter and 3 mm in thickness. The specimens in each group were tested in shear mode using a knife-edge testing apparatus in a universal testing machine at a crosshead speed of 1 mm/minute. Shear bond strength values in MPa were calculated from the peak load at failure divided by the specimen surface area. The mean shear bond strength of each group was calculated, and statistical analysis was carried out using one-way Analysis of Variance (ANOVA). The mean bond strength of group 1 was 15.5 MPa, of group 2 was 19.5 MPa and of group 3 was 20.1 MPa. Group 1 showed statistically significantly lower bond strength when compared to groups 2 and 3. There was no statistically significant difference between groups 2 and 3 (p > 0.05). The self-etch adhesive G-bond showed an increase in shear bond strength on pre-etched enamel.
On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.
Li, Bing; Chun, Hyonho; Zhao, Hongyu
2014-09-01
We introduce a nonparametric method for estimating non-Gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use a one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the Gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the Gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.
Statistical and Machine Learning forecasting methods: Concerns and ways forward
Makridakis, Spyros; Assimakopoulos, Vassilios
2018-01-01
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
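One of the simplest statistical benchmarks in such forecasting comparisons is simple exponential smoothing; a minimal sketch (illustrative only, not the paper's implementation, and with an invented series):

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing: one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level  # blend new obs into the level
    return level

# alpha = 1 reduces to the naive (last-value) forecast:
print(ses_forecast([3.0, 5.0, 4.0, 6.0], 1.0))  # -> 6.0
```

Despite its simplicity, this family of methods was among the statistical baselines that the ML methods in the study failed to beat.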
NASA Astrophysics Data System (ADS)
Simatos, N.; Perivolaropoulos, L.
2001-01-01
We use the publicly available code CMBFAST, as modified by Pogosian and Vachaspati, to simulate the effects of wiggly cosmic strings on the cosmic microwave background (CMB). Using the modified CMBFAST code, which takes into account vector modes and models wiggly cosmic strings by the one-scale model, we go beyond the angular power spectrum to construct CMB temperature maps with a resolution of a few degrees. The statistics of these maps are then studied using conventional and recently proposed statistical tests optimized for the detection of hidden temperature discontinuities induced by the Gott-Kaiser-Stebbins effect. We show, however, that these realistic maps cannot be distinguished in a statistically significant way from purely Gaussian maps with an identical power spectrum.
The Effects of Four Decades of Recession on Higher Education Enrollments in the United States
ERIC Educational Resources Information Center
Wright, Dianne A.; Ramdin, Gianna; Vásquez-Colina, María D.
2013-01-01
The United States experienced six economic recessions between 1970 and 2009. The impact of economic recession on higher education enrollment was examined using seasonally adjusted data from the U.S. Census and the U.S. Department of Labor Bureau of Labor Statistics, Unemployment Level-Civilian Labor Force. One-way analysis of variance, factorial…
Measuring: From Paces to Feet. Used Numbers: Real Data in the Classroom. Grades 3-4.
ERIC Educational Resources Information Center
Corwin, Rebecca B.; Russell, Susan Jo
A unit of study that introduces measuring as a way of collecting data is presented. Suitable for students in grades 3 and 4, it provides a foundation for further work in statistics and data analysis. The investigations may extend from one to four class sessions and are grouped into three parts: "Introduction to Measurement"; "Using Standard…
Exact and Monte carlo resampling procedures for the Wilcoxon-Mann-Whitney and Kruskal-Wallis tests.
Berry, K J; Mielke, P W
2000-12-01
Exact and Monte Carlo resampling FORTRAN programs are described for the Wilcoxon-Mann-Whitney rank sum test and the Kruskal-Wallis one-way analysis of variance for ranks test. The program algorithms compensate for tied values and do not depend on asymptotic approximations for probability values, unlike most algorithms contained in PC-based statistical software packages.
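The FORTRAN programs themselves are not reproduced here, but the resampling idea can be sketched in Python (an assumption of this note, not the authors' code): a Monte Carlo permutation of mid-ranks, which compensates for ties and does not rely on asymptotic approximations for the probability value.

```python
import numpy as np
from scipy.stats import rankdata

def mc_ranksum_pvalue(x, y, n_resamples=20000, seed=0):
    """Two-sided Monte Carlo permutation p-value for the Wilcoxon-Mann-Whitney
    rank-sum statistic, using average (mid-) ranks so ties are handled exactly."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    ranks = rankdata(pooled)                   # average ranks resolve ties
    observed = ranks[:n_x].sum()
    expected = n_x * (len(pooled) + 1) / 2.0
    count = 0
    for _ in range(n_resamples):
        perm = rng.permutation(ranks)
        if abs(perm[:n_x].sum() - expected) >= abs(observed - expected):
            count += 1
    return count / n_resamples

x = [1.1, 2.3, 2.3, 4.0, 5.2]   # note the tie at 2.3
y = [3.8, 4.9, 5.5, 6.1, 7.0]
print(mc_ranksum_pvalue(x, y))
```

An exact version would enumerate all label permutations rather than sampling them; the Monte Carlo variant trades exactness for speed on larger samples.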
Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models
NASA Astrophysics Data System (ADS)
Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter
Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
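For readers without access to SOCR, the same parametric and non-parametric comparisons listed above can be run with scipy; the three treatment groups below are hypothetical, and the scipy calls are analogues of, not the implementations behind, the SOCR tools.

```python
from scipy import stats

# Three small hypothetical treatment groups (for illustration only).
g1 = [12.1, 13.4, 11.8, 12.9, 13.1]
g2 = [14.2, 15.0, 14.8, 13.9, 15.3]
g3 = [12.5, 12.8, 13.0, 12.2, 12.7]

# Parametric: one-way ANOVA, as in the SOCR linear-models component.
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Non-parametric counterparts named in the abstract.
u_stat, p_mwu = stats.mannwhitneyu(g1, g2, alternative="two-sided")
h_stat, p_kw = stats.kruskal(g1, g2, g3)
chi2, p_fried = stats.friedmanchisquare(g1, g2, g3)

print(round(p_anova, 4), round(p_mwu, 4), round(p_kw, 4), round(p_fried, 4))
```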
1998-04-28
be discussed. 2.1 ECONOMIC REPLACEMENT THEORY Decisions about heavy equipment should be made based on sound economic principles, not emotions...Life) will be less than L*. The converse is also true. 2.1.3 The Repair Limit Theory A different way of looking at the economic replacement decision...Summary Three different economic models have been reviewed in this section. The output of each is distinct. One seeks to minimize costs, one seeks to
Recent advances in statistical energy analysis
NASA Technical Reports Server (NTRS)
Heron, K. H.
1992-01-01
Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
NASA Astrophysics Data System (ADS)
Gourdol, L.; Hissler, C.; Pfister, L.
2012-04-01
The Luxembourg sandstone aquifer is of major relevance for the national supply of drinking water in Luxembourg. The city of Luxembourg (20% of the country's population) gets almost 2/3 of its drinking water from this aquifer. As a consequence, the study of the groundwater hydrochemistry and of its spatial and temporal variations is considered of the highest priority. Since 2005, a monitoring network has been implemented by the Water Department of Luxembourg City, with a view to more sustainable management of this strategic water resource. The data collected to date form a large and complex dataset describing the spatial and temporal variations of many hydrochemical parameters. The issue of data treatment is tightly connected to this kind of water monitoring program and its complex databases. Standard multivariate statistical techniques, such as principal components analysis and hierarchical cluster analysis, have been widely used as unbiased methods for extracting meaningful information from groundwater quality data and are now classically used in many hydrogeological studies, in particular to characterize temporal or spatial hydrochemical variations induced by natural and anthropogenic factors. But these classical multivariate methods deal with two-way matrices, usually parameters/sites or parameters/time, whereas the dataset resulting from a water quality monitoring program should often be seen as a datacube of parameters/sites/time. Three-way matrices, such as the one we propose here, are difficult to handle and analyse with classical multivariate statistical tools, and should therefore be treated with approaches designed for three-way data structures. One such approach is partial triadic analysis (PTA). PTA has previously been used with success in many ecological studies, but never to date in the domain of hydrogeology.
Applied to the dataset of the Luxembourg sandstone aquifer, PTA appears to be a promising new statistical instrument for hydrogeologists, in particular for characterizing temporal and spatial hydrochemical variations induced by natural and anthropogenic factors. This new approach to groundwater management offers potential for 1) identifying a common multivariate spatial structure, 2) uncovering the different hydrochemical patterns and explaining their controlling factors, and 3) analysing the temporal variability of this structure and grasping hydrochemical changes.
2005-06-01
Qimaging, Burnaby, BC, Canada). Statistical analysis: assessment of apoptosis in MCF-7/dox cells. One-way analysis of variance followed by Tukey's...labelling of P-gp by azidopine. Wood et al. (1996) proposed that the drug resistance modification by...expression of MDR1 and MRP1 in MCF-7/dox cells. Mobile ionophores are a novel class of P-glycoprotein inhibitors. The
Downside Risk analysis applied to the Hedge Funds universe
NASA Astrophysics Data System (ADS)
Perelló, Josep
2007-09-01
Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM's simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
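A minimal sketch of two core Downside Risk quantities discussed above: downside deviation (the root-mean-square of shortfalls below the investor's target, i.e. the "bad" returns only) and the Sortino-style ratio of excess return to downside risk. The monthly return series is hypothetical, not the Credit Suisse/Tremont index.

```python
import numpy as np

def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the target return."""
    r = np.asarray(returns, float)
    shortfall = np.minimum(r - target, 0.0)   # zero for "good" returns
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target=0.0):
    """Downside-risk analogue of the Sharpe ratio."""
    r = np.asarray(returns, float)
    return (r.mean() - target) / downside_deviation(r, target)

monthly = [0.021, -0.013, 0.008, 0.035, -0.027, 0.011, 0.004, -0.002]
print(round(downside_deviation(monthly), 4))
print(round(sortino_ratio(monthly), 3))
```

Unlike the ordinary standard deviation used in the Sharpe ratio, only the three negative months contribute to the denominator here.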
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper. PMID:25878958
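The article's recommendation, reporting a confidence interval alongside the P value, can be illustrated on hypothetical two-group data (the groups and numbers below are invented for illustration):

```python
import math
from scipy import stats

a = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3, 5.2, 4.7]
b = [4.4, 4.6, 4.2, 4.8, 4.5, 4.3, 4.7, 4.1]

# The P value alone: is the difference "significant"?
t, p = stats.ttest_ind(a, b)

# The 95% CI of the mean difference: how large is the effect, plausibly?
na, nb = len(a), len(b)
ma, mb = sum(a) / na, sum(b) / nb
va = sum((x - ma) ** 2 for x in a) / (na - 1)
vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)   # pooled variance
se = math.sqrt(sp2 * (1 / na + 1 / nb))
tcrit = stats.t.ppf(0.975, na + nb - 2)
diff = ma - mb
ci = (diff - tcrit * se, diff + tcrit * se)

print(round(p, 4), round(diff, 3), [round(v, 3) for v in ci])
```

The interval conveys what the P value cannot: the estimated size of the difference and its plausible range.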
A Comparison of Analytical and Data Preprocessing Methods for Spectral Fingerprinting
LUTHRIA, DEVANAND L.; MUKHOPADHYAY, SUDARSAN; LIN, LONG-ZE; HARNLY, JAMES M.
2013-01-01
Spectral fingerprinting, as a method of discriminating between plant cultivars and growing treatments for a common set of broccoli samples, was compared for six analytical instruments. Spectra were acquired for finely powdered solid samples using Fourier transform infrared (FT-IR) and Fourier transform near-infrared (NIR) spectrometry. Spectra were also acquired for unfractionated aqueous methanol extracts of the powders using molecular absorption in the ultraviolet (UV) and visible (VIS) regions and mass spectrometry with negative (MS−) and positive (MS+) ionization. The spectra were analyzed using nested one-way analysis of variance (ANOVA) and principal component analysis (PCA) to statistically evaluate the quality of discrimination. All six methods showed statistically significant differences between the cultivars and treatments. The significance of the statistical tests was improved by the judicious selection of spectral regions (IR and NIR), masses (MS+ and MS−), and derivatives (IR, NIR, UV, and VIS). PMID:21352644
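A sketch of the PCA step described above, applied to synthetic "spectra" in place of the broccoli data (which are not reproduced here). Two hypothetical cultivars differ in two spectral regions, and PCA is computed from the SVD of the mean-centred data matrix.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical spectra: 2 cultivars x 10 samples x 50 wavelengths, differing
# in two spectral regions (analogous to judicious region selection).
n, p = 10, 50
base = rng.normal(0.0, 0.05, size=(2 * n, p))
base[:n, 10:15] += 0.4      # cultivar A absorbs more here
base[n:, 30:35] += 0.4      # cultivar B absorbs more here

# PCA via SVD of the mean-centred data matrix.
X = base - base.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                          # sample scores on the PCs
explained = s ** 2 / np.sum(s ** 2)     # variance fraction per PC

# With a strong class difference, PC1 should separate the two cultivars.
pc1_a, pc1_b = scores[:n, 0], scores[n:, 0]
separated = (pc1_a.max() < pc1_b.min()) or (pc1_b.max() < pc1_a.min())
print(bool(separated), round(float(explained[0]), 2))
```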
Analysis of Statistical Methods Currently used in Toxicology Journals
Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min
2014-01-01
Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Sciences and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was dominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and the standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being the most popular (52/93, 56%), yet few studies conducted either a normality or an equal-variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health. PMID:25343012
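The normality and equal-variance checks that the surveyed papers rarely report take only a few lines; the three groups below are hypothetical, sized like the typical studies in the survey (n around 6):

```python
from scipy import stats

# Three hypothetical treatment groups with small, survey-typical sample sizes.
g1 = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]
g2 = [4.0, 4.4, 3.9, 4.6, 4.2, 4.1]
g3 = [3.6, 3.4, 3.8, 3.5, 3.9, 3.7]

# Normality within each group (Shapiro-Wilk).
for g in (g1, g2, g3):
    w, p_norm = stats.shapiro(g)
    print(round(p_norm, 2))

# Equal-variance check across groups (Levene's test).
lev_stat, p_var = stats.levene(g1, g2, g3)

# Only then the usual one-way ANOVA.
f_stat, p_anova = stats.f_oneway(g1, g2, g3)
print(round(p_var, 3), round(p_anova, 5))
```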
Comparison of the flexural strength of six reinforced restorative materials.
Cohen, B I; Volovich, Y; Musikant, B L; Deutsch, A S
2001-01-01
This study calculated the flexural strength for six reinforced restorative materials and demonstrated that flexural strength values can be determined simply by using physical parameters (diametral tensile strength and Young's modulus values) that are easily determined experimentally. A one-way ANOVA analysis demonstrated a statistically significant difference between the two reinforced glass ionomers and the four composite resin materials, with the composite resin being stronger than the glass ionomers.
Data Preparation 101: How to Use Query-by-Example to Get Your Research Dataset Ready for Primetime
ERIC Educational Resources Information Center
Lazarony, Paul J.; Driscoll, Donna A.
2011-01-01
Researchers are often distressed to discover that the data they wanted to use in their landmark study is not configured in a way that is usable by a Statistical Analysis Software Package (SASP). For example, the data needed may come from two or more sources and it may not be clear to the researcher how to get them combined into one analyzable…
NASA Astrophysics Data System (ADS)
Rivera Landa, Rogelio; Cardenas Cardenas, Eduardo; Fossion, Ruben; Pérez Zepeda, Mario Ulises
2014-11-01
Technological advances in the last few decades allow the monitoring of many physiological observables in a continuous way, which in physics is called a "time series". The best studied physiological time series is that of the heart rhythm, which can be derived from an electrocardiogram (ECG). Studies have shown that a healthy heart is characterized by a complex time series and high heart rate variability (HRV). In adverse conditions, the cardiac time series degenerates towards randomness (as seen in, e.g., fibrillation) or rigidity (as seen in, e.g., ageing), both corresponding to a loss of HRV as described by, e.g., Goldberger et al. [1]. Cardiac and digestive rhythms are regulated by the autonomic nervous system (ANS), which consists of two antagonistic branches: the orthosympathetic branch (ONS), which accelerates the cardiac rhythm but decelerates the digestive system, and the parasympathetic branch (PNS), which works in the opposite way. For this reason, one might expect that the statistics of gastro-esophageal time series, as described by Gardner et al. [2,3], reflect the health state of the digestive system in a similar way as HRV does in the cardiac case, as described by Minocha et al. In the present project, we apply statistical methods derived from HRV analysis to time series of esophageal acidity (24h pHmetry). The study is carried out on data from a large patient population from the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán. Our focus is on patients with functional disease (symptoms but no anatomical damage). We find that traditional statistical approaches (e.g. Fourier spectral analysis) are unable to distinguish between different degenerations of the digestive system, such as gastro-esophageal reflux disease (GERD) or functional gastrointestinal disorder (FGID).
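Two standard time-domain HRV statistics of the kind borrowed here for pH series can be sketched as follows; the RR-interval series are invented to contrast a variable ("healthy-like") series with a rigid one of the same mean, illustrating the loss of variability described above.

```python
import numpy as np

def hrv_stats(rr_ms):
    """Two classic time-domain HRV statistics: SDNN (overall variability)
    and RMSSD (short-term, beat-to-beat variability)."""
    rr = np.asarray(rr_ms, float)
    sdnn = rr.std(ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return sdnn, rmssd

# Hypothetical RR intervals in milliseconds.
variable = [810, 870, 795, 905, 840, 780, 900, 860, 820, 880]
rigid    = [846, 847, 846, 845, 846, 847, 846, 845, 846, 846]

for series in (variable, rigid):
    sdnn, rmssd = hrv_stats(series)
    print(round(sdnn, 1), round(rmssd, 1))
```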
An Investigation of Dental Luting Cement Solubility as a Function of the Marginal Gap.
1988-05-01
way ANOVA for the Phase 1 Diffusion Study revealed that there were statistically significant differences between the test groups. A Duncan's Multiple...cement. The 25, 50, and 75 micron groups demonstrated no statistically significant differences in the amount of remaining luting cement (p < 0.05). A...one-way ANOVA was also performed on the Phase 2 Dynamic Study. This test revealed that there were statistically significant differences among the test
Wald, David S; Butt, Shahena; Bestwick, Jonathan P
2015-10-01
Mobile telephone text messaging is a simple potential solution to the failure to take medications as directed. There is uncertainty over the effectiveness of 1-way text messaging (sending text message reminders only) compared with 2-way text messaging (sending reminders and receiving replies confirming whether medication has been taken) as a means of improving medication adherence. A meta-analysis of 8 randomized trials (1994 patients) that tested the effectiveness of text messaging on medication adherence was performed. The trials were divided into 2 groups: trials using 1-way text messaging versus no text messaging and trials using 2-way text messaging versus no text messaging. The summary estimates of the effect of the 2 methods of text messaging (1-way or 2-way) were compared. The summary relative risk estimate was 1.04 (95% confidence interval, 0.97-1.11) for 1-way text messaging and 1.23 (95% confidence interval, 1.13-1.35) for 2-way text messaging. The difference in effect between the 2 methods was statistically significant (P = .007). Two-way text messaging is associated with substantially improved medication adherence compared with 1-way text messaging. This has important implications in the provision of mobile-based messaging in the management of patients taking medication for the prevention of chronic disease.
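The pooling behind such a summary relative risk can be sketched with inverse-variance fixed-effect weighting of per-trial log relative risks. The three trials below are hypothetical, not the eight analyzed in the paper.

```python
import math

def pooled_rr(log_rrs, ses):
    """Inverse-variance fixed-effect pooling of per-trial log relative risks;
    returns the summary RR and its 95% CI."""
    weights = [1.0 / se ** 2 for se in ses]
    wsum = sum(weights)
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / wsum
    se_pooled = math.sqrt(1.0 / wsum)
    lo = math.exp(pooled - 1.96 * se_pooled)
    hi = math.exp(pooled + 1.96 * se_pooled)
    return math.exp(pooled), (lo, hi)

# Three hypothetical 2-way-messaging trials: log RR and its standard error.
log_rrs = [math.log(1.30), math.log(1.15), math.log(1.25)]
ses = [0.10, 0.08, 0.12]
rr, ci = pooled_rr(log_rrs, ses)
print(round(rr, 2), [round(v, 2) for v in ci])
```

Precise trials (small standard errors) dominate the weighted average, which is why the pooled CI is narrower than any single trial's.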
One Yard Below: Education Statistics from a Different Angle.
ERIC Educational Resources Information Center
Education Intelligence Agency, Carmichael, CA.
This report offers a different perspective on education statistics by highlighting rarely used "stand-alone" statistics on public education, inputs, outputs, and descriptions, and it uses interactive statistics that combine two or more statistics in an unusual way. It is a report that presents much evidence, but few conclusions. It is not intended…
Targeted versus statistical approaches to selecting parameters for modelling sediment provenance
NASA Astrophysics Data System (ADS)
Laceby, J. Patrick
2017-04-01
One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties, where the properties of sediment sources remain constant, or at the very least, any variation in these properties should occur in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to select conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. the Kruskal-Wallis H-test, the Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
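The discrimination step of the statistical approach, screening candidate properties with a Kruskal-Wallis H-test, can be sketched on hypothetical source samples; the property names and values below are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical geochemical properties measured on two sediment sources:
# prop_A discriminates between sources, prop_B does not.
surface    = {"prop_A": rng.normal(10.0, 1.0, 12), "prop_B": rng.normal(5.0, 1.0, 12)}
subsurface = {"prop_A": rng.normal(14.0, 1.0, 12), "prop_B": rng.normal(5.1, 1.0, 12)}

# Keep only properties whose Kruskal-Wallis H-test rejects
# "no difference between sources" at the 5% level.
selected = []
for prop in ("prop_A", "prop_B"):
    h, p = stats.kruskal(surface[prop], subsurface[prop])
    if p < 0.05:
        selected.append(prop)
print(selected)
```

In a full fingerprinting workflow the surviving properties would then go through a parameter selection step (e.g. Discriminant Function Analysis) before unmixing.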
Stewart, Gavin B.; Altman, Douglas G.; Askie, Lisa M.; Duley, Lelia; Simmonds, Mark C.; Stewart, Lesley A.
2012-01-01
Background Individual participant data (IPD) meta-analyses that obtain “raw” data from studies rather than summary data typically adopt a “two-stage” approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of “one-stage” approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare “two-stage” and “one-stage” models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way. Methods and Findings We included data from 24 randomised controlled trials, evaluating antiplatelet agents, for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using anti-platelets (Relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of women benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model. Conclusions For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure and are useful where across study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights provided by their usage may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled trials. 
Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials. PMID:23056232
Two-Way Tables: Issues at the Heart of Statistics and Probability for Students and Teachers
ERIC Educational Resources Information Center
Watson, Jane; Callingham, Rosemary
2014-01-01
Some problems exist at the intersection of statistics and probability, creating a dilemma in relation to the best approach to assist student understanding. Such is the case with problems presented in two-way tables representing conditional information. The difficulty can be confounded if the context within which the problem is set is one where…
On the Distribution of Orbital Poles of Milky Way Satellites
NASA Astrophysics Data System (ADS)
Palma, Christopher; Majewski, Steven R.; Johnston, Kathryn V.
2002-01-01
In numerous studies of the outer Galactic halo some evidence for accretion has been found. If the outer halo did form in part or wholly through merger events, we might expect to find coherent streams of stars and globular clusters following orbits similar to those of their parent objects, which are assumed to be present or former Milky Way dwarf satellite galaxies. We present a study of this phenomenon by assessing the likelihood of potential descendant "dynamical families" in the outer halo. We conduct two analyses: one that involves a statistical analysis of the spatial distribution of all known Galactic dwarf satellite galaxies (DSGs) and globular clusters, and a second, more specific analysis of those globular clusters and DSGs for which full phase space dynamical data exist. In both cases our methodology is appropriate only to members of descendant dynamical families that retain nearly aligned orbital poles today. Since the Sagittarius dwarf (Sgr) is considered a paradigm for the type of merger/tidal interaction event for which we are searching, we also undertake a case study of the Sgr system and identify several globular clusters that may be members of its extended dynamical family. In our first analysis, the distribution of possible orbital poles for the entire sample of outer (Rgc>8 kpc) halo globular clusters is tested for statistically significant associations among globular clusters and DSGs. Our methodology for identifying possible associations is similar to that used by Lynden-Bell & Lynden-Bell, but we put the associations on a more statistical foundation. Moreover, we study the degree of possible dynamical clustering among various interesting ensembles of globular clusters and satellite galaxies. Among the ensembles studied, we find the globular cluster subpopulation with the highest statistical likelihood of association with one or more of the Galactic DSGs to be the distant, outer halo (Rgc>25 kpc), second-parameter globular clusters.
The results of our orbital pole analysis are supported by the great circle cell count methodology of Johnston, Hernquist, & Bolte. The space motions of the clusters Pal 4, NGC 6229, NGC 7006, and Pyxis are predicted to be among those most likely to show the clusters to be following stream orbits, since these clusters are responsible for the majority of the statistical significance of the association between outer halo, second-parameter globular clusters and the Milky Way DSGs. In our second analysis, we study the orbits of the 41 globular clusters and six Milky Way-bound DSGs having measured proper motions to look for objects with both coplanar orbits and similar angular momenta. Unfortunately, the majority of globular clusters with measured proper motions are inner halo clusters that are less likely to retain memory of their original orbit. Although four potential globular cluster/DSG associations are found, we believe three of these associations involving inner halo clusters to be coincidental. While the present sample of objects with complete dynamical data is small and does not include many of the globular clusters that are more likely to have been captured by the Milky Way, the methodology we adopt will become increasingly powerful as more proper motions are measured for distant Galactic satellites and globular clusters, and especially as results from the Space Interferometry Mission (SIM) become available.
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal.
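A minimal sketch of the Welch-type test on trimmed means with Winsorized variances (Yuen's test), one of the robust procedures discussed above. The two samples are invented: skewed, heteroscedastic, and each containing a gross outlier.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's Welch-type test: trimmed means with Winsorized variances."""
    def winsorized_var(a, g):
        s = np.sort(np.asarray(a, float))
        s[:g] = s[g]                      # pull the lowest g values up
        s[len(s) - g:] = s[len(s) - g - 1]  # pull the highest g values down
        return s.var(ddof=1)
    x, y = np.asarray(x, float), np.asarray(y, float)
    gx, gy = int(trim * len(x)), int(trim * len(y))
    hx, hy = len(x) - 2 * gx, len(y) - 2 * gy      # sizes after trimming
    dx = (len(x) - 1) * winsorized_var(x, gx) / (hx * (hx - 1))
    dy = (len(y) - 1) * winsorized_var(y, gy) / (hy * (hy - 1))
    t = (stats.trim_mean(x, trim) - stats.trim_mean(y, trim)) / np.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx ** 2 / (hx - 1) + dy ** 2 / (hy - 1))
    return t, 2 * stats.t.sf(abs(t), df)

# Skewed, unequal-variance samples with one gross outlier in each group.
x = [1.2, 1.5, 1.1, 1.4, 1.3, 1.6, 1.2, 9.0]
y = [2.4, 2.9, 2.6, 3.1, 2.8, 2.5, 3.0, 14.0]
t, p = yuen_test(x, y)
print(round(p, 4))
```

Trimming discards the outliers before the means are compared, so the group difference remains detectable where a classical t test would be inflated by the extreme values.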
NF-kB2/p52 Activation and Androgen Receptor Signaling in Prostate Cancer
2010-08-01
Materials and Methods; ref. 38). Statistical analysis: Data are shown as the mean ± SD. Multiple group comparison was performed by one-way ANOVA followed... Moroz, Byron Crawford, Asim Abdel-Mageed, New Orleans, LA. INTRODUCTION AND OBJECTIVES: African American men (AA) have twice the incidence and mortality
Stawarczyk, Bogna; Ozcan, Mutlu; Roos, Malgorzata; Trottmann, Albert; Hämmerle, Christoph H F
2011-01-01
This study determined the fracture load of zirconia crowns veneered with four overpressed and four layered ceramics after chewing simulation. The veneered zirconia crowns were cemented and subjected to chewing cycling. Subsequently, the specimens were loaded at an angle of 45° in a Universal Testing Machine to determine the fracture load. One-way ANOVA, followed by a post-hoc Scheffé test, t-test and Weibull statistic were performed. Overpressed crowns showed significantly lower fracture load (543-577 N) compared to layered ones (805-1067 N). No statistical difference was found between the fracture loads within the overpressed group. Within the layered groups, LV (1067 N) presented significantly higher results compared to LC (805 N). The mean values of all other groups were not significantly different. Single zirconia crowns veneered with overpressed ceramics exhibited lower fracture load than those of the layered ones after chewing simulation.
Catalog of Observed Tangents to the Spiral Arms in the Milky Way Galaxy
NASA Astrophysics Data System (ADS)
Vallée, Jacques P.
2014-11-01
From the Sun's location in the Galactic disk, one can use different arm tracers (CO, H I, thermal or ionized or relativistic electrons, masers, cold and hot dust, etc.) to locate a tangent to each spiral arm in the disk of the Milky Way. We present a master catalog of the astronomically observed tangents to the Galaxy's spiral arms, using different arm tracers from the literature. Some arm tracers can have slightly divergent results from several papers, so a mean value is taken—see the Appendix for CO, H II, and masers. The catalog of means currently consists of 63 mean tracer entries, spread over many arms (Carina, Crux-Centaurus, Norma, Perseus origin, near 3 kpc, Scutum, Sagittarius), stemming from 107 original arm tracer entries. Additionally, we updated and revised a previous statistical analysis of the angular offset and linear separation from the mid-arm for each different mean arm tracer. Given enough arm tracers, and summing and averaging over all four spiral arms, one could determine if arm tracers have separate and parallel lanes in the Milky Way. This statistical analysis allows a cross-cut of a Galactic spiral arm to be made, confirming a recent discovery of a linear separation between arm tracers. Here, from the mid-arm's CO to the inner edge's hot dust, the arm halfwidth is about 340 pc; doubling would yield a full arm width of 680 pc. We briefly compare these observations with the predictions of many spiral arm theories, notably the density wave theory.
GWAMA: software for genome-wide association meta-analysis.
Mägi, Reedik; Morris, Andrew P
2010-05-28
Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
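The core computation behind meta-analysis of summary statistics is inverse-variance-weighted pooling of the per-study effect estimates. The sketch below shows only that computation; GWAMA itself adds error trapping, file handling, and many more options, so this is a generic illustration rather than the tool's implementation.

```python
import numpy as np
from scipy import stats

def fixed_effect_meta(betas, ses):
    """Pool per-study effect sizes by inverse-variance weighting."""
    betas = np.asarray(betas, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weight = 1 / SE^2
    beta = (w * betas).sum() / w.sum()            # pooled effect estimate
    se = np.sqrt(1.0 / w.sum())                   # pooled standard error
    z = beta / se
    p = 2.0 * stats.norm.sf(abs(z))               # two-sided p-value
    return beta, se, z, p
```

For example, two studies with betas 0.1 and 0.3 and equal standard errors of 0.1 pool to beta = 0.2 with a smaller standard error than either study alone, which is exactly where the gain in power comes from.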
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is a viable approach for EIT images of neural activity.
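The non-parametric technique mentioned above can be sketched as a sign-flip permutation test that builds the null distribution of the maximum voxel statistic, which controls the family-wise error rate across voxels. This is a generic illustration under invented data, not the paper's pipeline; the function name and parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def maxstat_threshold(images, n_perm=500, alpha=0.05):
    """Family-wise-error threshold from the sign-flip permutation null
    distribution of the maximum one-sample t statistic across voxels."""
    images = np.asarray(images, dtype=float)      # subjects x voxels
    n = images.shape[0]

    def tmap(x):
        return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(n))

    t_obs = tmap(images)
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n, 1))  # flip whole subject maps
        max_null[i] = np.abs(tmap(images * signs)).max()
    return t_obs, np.quantile(max_null, 1.0 - alpha)
```

Voxels whose observed |t| exceeds the returned threshold are significant with family-wise error controlled at `alpha`, the role random field theory plays parametrically.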
Study of long term structural and functional changes in medically controlled glaucoma
Pandey, Achyut N; Sujata, S
2014-01-01
AIM Prospectively analyze the long term structural and functional changes in patients of primary open angle glaucoma (POAG) receiving medical therapy (beta blockers and non beta blockers). In this study an attempt has been made to evaluate whether medical reduction of IOP prevents or delays the progression of glaucomatous visual field loss and/or optic nerve damage in patients with open angle glaucoma. METHODS Study conducted over a period of 27 months, at a tertiary eye care hospital including both eyes of 40 patients with POAG. Group 1 (20 patients, 40 eyes) received beta-blockers, and Group 2 (20 patients, 40 eyes) received non-beta-blockers. Each patient underwent intraocular pressure measurement, best corrected visual acuity, slit-lamp, fundus examination, gonioscopy, central corneal thickness, visual field assessment by Humphrey automated perimetry and retinal nerve fibre layer thickness by Stratus optical coherence tomography at baseline and at two subsequent visits. The average time interval between each visit was 10-11 months. The statistical analysis was done using one-way analysis of variance (ANOVA). A post-hoc test using Tukey's method was adopted. A probability (P) value of 0.05 or less was considered to be statistically significant. RESULTS A total of 80 eyes of 40 patients of POAG were enrolled, 24 males, 16 females, age group 50-80 years. In both the beta and non beta blocker groups, the reduction (improvement) in mean IOP from initial levels to the levels achieved at the 2nd and 3rd visits was statistically significant. One-way ANOVA (df=2), Fisher F value=11.64, P=0.000; one-way ANOVA (df=3), Fisher F value=35.61, P=0.000. Both mean deviation (MD) and pattern standard deviation (PSD) in both beta and non beta blockers at different visits were not statistically significant.
Retinal nerve fibre layer thickness (RNFL): only for the mean inferior retinal nerve fibre layer was the difference between the mean values in the beta and non beta blocker groups statistically significant [unpaired t test value (df=78) =2.27, P=0.03]. Side effects with beta blockers were conjunctival hyperemia (10%) and burning (5%); with non beta blockers, conjunctival hyperemia (5%). CONCLUSION Non-beta-blockers are as effective as beta-blockers in bringing about a significant lowering of intraocular pressure to the normal range, and in preventing progressive damage to the visual fields and retinal nerve fibre layer. The absence of systemic side effects and superior IOP lowering efficacy has made non beta-blockers attractive as first line therapy for the treatment of glaucoma worldwide. PMID:24634878
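The analysis pipeline in this abstract (one-way ANOVA followed by a Tukey post-hoc test) can be sketched with SciPy. The IOP readings below are invented illustrative numbers, not the study's data, and `scipy.stats.tukey_hsd` is only available in recent SciPy versions (1.8+).

```python
from scipy import stats

# Invented IOP readings (mmHg) at three visits; not the study's data.
visit1 = [22, 24, 23, 25, 26, 24]
visit2 = [18, 19, 17, 20, 18, 19]
visit3 = [16, 17, 15, 16, 17, 16]

F, p = stats.f_oneway(visit1, visit2, visit3)    # omnibus one-way ANOVA
post = stats.tukey_hsd(visit1, visit2, visit3)   # pairwise Tukey HSD
print(f"F = {F:.1f}, p = {p:.2g}")
print(post)                                      # pairwise CIs and p-values
```

The omnibus F test says only that some group means differ; the Tukey step identifies which pairs of visits differ while keeping the family-wise error rate at the nominal level.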
Gómez, Miguel A; Lorenzo, Alberto; Barakat, Rubén; Ortega, Enrique; Palao, José M
2008-02-01
The aim of the present study was to identify game-related statistics that differentiate winning and losing teams according to game location. The sample included 306 games of the 2004-2005 regular season of the Spanish professional men's league (ACB League). The independent variables were game location (home or away) and game result (win or loss). The game-related statistics registered were free throws (successful and unsuccessful), 2- and 3-point field goals (successful and unsuccessful), offensive and defensive rebounds, blocks, assists, fouls, steals, and turnovers. Descriptive and inferential analyses were done (one-way analysis of variance and discriminate analysis). The multivariate analysis showed that winning teams differ from losing teams in defensive rebounds (SC = .42) and in assists (SC = .38). Similarly, winning teams differ from losing teams when they play at home in defensive rebounds (SC = .40) and in assists (SC = .41). On the other hand, winning teams differ from losing teams when they play away in defensive rebounds (SC = .44), assists (SC = .30), successful 2-point field goals (SC = .31), and unsuccessful 3-point field goals (SC = -.35). Defensive rebounds and assists were the only game-related statistics common to all three analyses.
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994
WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.
Grech, Victor
2018-02-01
Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
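For readers working outside Excel, the same dynamic cross-tabulation can be sketched with pandas' `pivot_table`; the dataset below is invented purely for illustration and is not from the WASP series.

```python
import pandas as pd

# Invented records for illustration.
df = pd.DataFrame({
    "ward":   ["A", "A", "B", "B", "A", "B"],
    "sex":    ["F", "M", "F", "M", "M", "F"],
    "weight": [3.1, 3.4, 2.9, 3.6, 3.2, 3.0],
})

# Rows = ward, columns = sex, cells = mean weight, plus "All" margins,
# mirroring a drag-and-drop Excel pivot table.
table = pd.pivot_table(df, values="weight", index="ward",
                       columns="sex", aggfunc="mean", margins=True)
print(table)
```

Changing `index`, `columns`, or `aggfunc` reshapes the summary interactively, much as dragging fields does in Excel's pivot table interface.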
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.
Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip
2018-02-01
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
ERIC Educational Resources Information Center
Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.
2012-01-01
Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…
Analysis of repeated measurement data in the clinical trials
Singh, Vineeta; Rana, Rakesh Kumar; Singhal, Richa
2013-01-01
Statistics is an integral part of Clinical Trials. Elements of statistics span Clinical Trial design, data monitoring, analyses and reporting. A solid understanding of statistical concepts by clinicians improves the comprehension and the resulting quality of Clinical Trials. In biomedical research it has been seen that researchers frequently use the t-test and ANOVA to compare means between the groups of interest irrespective of the nature of the data. In Clinical Trials we record the data on the patients more than two times. In such a situation using the standard ANOVA procedures is not appropriate as they do not consider dependencies between observations within subjects in the analysis. To deal with such study data, Repeated Measures ANOVA should be used. In this article the application of One-way Repeated Measures ANOVA has been demonstrated by using the software SPSS (Statistical Package for Social Sciences) Version 15.0 on the data collected at four time points 0 day, 15th day, 30th day, and 45th day of a multicentre clinical trial conducted on Pandu Roga (~Iron Deficiency Anemia) with an Ayurvedic formulation Dhatrilauha. PMID:23930038
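The one-way repeated measures ANOVA described here partitions total variability into time, subject, and error components. The from-scratch sketch below (an illustration of the standard formulas, not the trial's SPSS output) makes that partition explicit; statsmodels' `AnovaRM` provides the same test as a library call.

```python
import numpy as np
from scipy import stats

def rm_anova_oneway(data):
    """One-way repeated measures ANOVA on a subjects x time-points array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_time = n * ((data.mean(axis=0) - grand) ** 2).sum()    # treatment SS
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()    # subject SS
    ss_err = ((data - grand) ** 2).sum() - ss_time - ss_subj  # residual SS
    df_time, df_err = k - 1, (k - 1) * (n - 1)
    F = (ss_time / df_time) / (ss_err / df_err)
    return F, stats.f.sf(F, df_time, df_err)
```

Removing the subject term from the error is exactly what distinguishes this from the standard one-way ANOVA the article cautions against: consistent between-subject differences no longer inflate the error variance.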
Statistical analysis of trypanosomes' motility
NASA Astrophysics Data System (ADS)
Zaburdaev, Vasily; Uppaluri, Sravanti; Pfohl, Thomas; Engstler, Markus; Stark, Holger; Friedrich, Rudolf
2010-03-01
The trypanosome is a parasite that causes sleeping sickness. The way it moves in the blood stream and penetrates various obstacles is an area of active research. Our goal was to investigate free trypanosome motion in planar geometry. Our analysis of trypanosomes' trajectories reveals that there are two correlation times - one is associated with the fast motion of its body and the second one with a slower rotational diffusion of the trypanosome as a point object. We propose a system of Langevin equations to model such motion. One of its peculiarities is the presence of multiplicative noise, predicting a higher level of noise for higher velocity of the trypanosome. Theoretical and numerical results give a comprehensive description of the experimental data such as the mean squared displacement, velocity distribution and auto-correlation function.
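A Langevin model with multiplicative noise of the kind described can be sketched with a simple Euler-Maruyama integration. The drift and noise terms and all parameter values below are illustrative stand-ins, not the authors' fitted model: the noise amplitude is simply made to grow linearly with speed.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_langevin(n_steps=2000, dt=1e-3, gamma=1.0, sigma0=0.1, c=0.5):
    """Euler-Maruyama integration of a 2-D velocity process whose noise
    amplitude grows linearly with speed (multiplicative noise)."""
    v = np.zeros((n_steps, 2))
    for i in range(1, n_steps):
        speed = np.linalg.norm(v[i - 1])
        noise = (sigma0 + c * speed) * np.sqrt(dt) * rng.normal(size=2)
        v[i] = v[i - 1] - gamma * v[i - 1] * dt + noise  # damping + noise
    x = np.cumsum(v, axis=0) * dt                        # integrate positions
    return v, x
```

From simulated trajectories like these one can estimate the mean squared displacement and velocity auto-correlation function and compare them with measured ones.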
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach and ways of overcoming these obstacles are presented.
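The chemometric grouping step described above can be sketched with hierarchical clustering on standardized measurements. The glass "fragments" below (refractive index plus two element concentrations) are invented for illustration; z-scoring each variable first keeps the large-magnitude concentrations from dominating the distances.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Invented fragments: refractive index, Ca and Zr concentrations (ppm).
X = np.array([
    [1.5180, 3600.0, 120.0],
    [1.5181, 3590.0, 118.0],
    [1.5179, 3610.0, 122.0],
    [1.5230, 2100.0, 310.0],
    [1.5229, 2080.0, 305.0],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)  # z-score so no variable dominates
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
```

Fragments assigned the same cluster label are candidates for a common source; against a population database, the cluster separation could then be given a quantitative significance.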
Active Structural Acoustic Control as an Approach to Acoustic Optimization of Lightweight Structures
2001-06-01
appropriate approach based on Statistical Energy Analysis (SEA) would facilitate investigations of the structural behavior at a high modal density. On the way...higher frequency investigations an approach based on the Statistical Energy Analysis (SEA) is recommended to describe the structural dynamic behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael; Turitsyn, Konstantin; Sulc, Petr
The anticipated increase in the number of plug-in electric vehicles (EV) will put additional strain on electrical distribution circuits. Many control schemes have been proposed to control EV charging. Here, we develop control algorithms based on randomized EV charging start times and simple one-way broadcast communication allowing for a time delay between communication events. Using arguments from queuing theory and statistical analysis, we seek to maximize the utilization of excess distribution circuit capacity while keeping the probability of a circuit overload negligible.
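The benefit of randomized start times can be sketched with a back-of-the-envelope queuing argument: if each vehicle is drawing power at any given moment with some duty fraction, the number of simultaneous loads is binomial, and the circuit capacity can be checked against its upper tail. All numbers below are illustrative, not from the study.

```python
from scipy import stats

# Illustrative numbers only: 200 EVs on a circuit, each drawing power at a
# random moment with duty fraction 0.3 thanks to randomized start times.
n_ev, duty = 200, 0.3
capacity = 80                  # simultaneous charges the circuit can carry

# Concurrent loads ~ Binomial(n_ev, duty); overload = P(load > capacity).
overload = stats.binom.sf(capacity, n_ev, duty)
```

With these numbers the expected load is 60 concurrent charges, and the probability of exceeding the 80-charge capacity is well under 1%, the sense in which overload can be kept negligible while utilization stays high.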
2009-01-01
representation to a simple curve in 3D by using the Whitney embedding theorem. In a very ludic way, we propose to combine phases one and two to...elimination principle which takes advantage of the designed parametrization. To further refine discrimination among objects, we introduce a post...packing numbers and design of principal curves. IEEE transactions on Pattern Analysis and Machine Intel- ligence, 22(3):281-297, 2000. [68] M. H. Yang, Face
A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.
ERIC Educational Resources Information Center
Liu, Tung; Stone, Courtenay C.
1999-01-01
Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…
Experimental Investigations on Two Potential Sound Diffuseness Measures in Enclosures
NASA Astrophysics Data System (ADS)
Bai, Xin
This study investigates two different approaches to measure sound field diffuseness in enclosures from monophonic room impulse responses. One approach quantifies sound field diffuseness in enclosures by calculating the kurtosis of the pressure samples of room impulse responses. Kurtosis is a statistical measure that is known to describe the peakedness or tailedness of the distribution of a set of data. High kurtosis indicates low diffuseness of the sound field of interest. The other one relies on multifractal detrended fluctuation analysis which is a way to evaluate the statistical self-affinity of a signal to measure diffuseness. To test these two approaches, room impulse responses are obtained under varied room-acoustic diffuseness configurations, achieved by using varied degrees of diffusely reflecting interior surfaces. This paper will analyze experimentally measured monophonic room impulse responses, and discuss results from these two approaches.
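The kurtosis-based measure can be sketched directly on synthetic signals. The two invented "impulse responses" below contrast a decaying Gaussian-like tail (modest kurtosis, diffuse-like) with a sparse train of strong discrete reflections (very high kurtosis, non-diffuse); `scipy.stats.kurtosis` returns excess kurtosis, i.e. 0 for an exactly Gaussian sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

n = 20000
t = np.arange(n) / 48000.0                   # time axis at 48 kHz

# Diffuse-like response: Gaussian noise under an exponential decay.
diffuse = rng.normal(size=n) * np.exp(-t / 0.3)

# Non-diffuse response: a few strong discrete reflections over weak noise.
sparse = 0.05 * rng.normal(size=n)
sparse[rng.choice(n, size=40, replace=False)] += 5.0 * rng.normal(size=40)

k_diffuse = stats.kurtosis(diffuse)   # modest excess kurtosis
k_sparse = stats.kurtosis(sparse)     # far larger: heavy-tailed sample
```

The ordering of the two kurtosis values reproduces the paper's premise that high kurtosis signals a field dominated by discrete reflections rather than a diffuse mix.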
Digital versus conventional techniques for pattern fabrication of implant-supported frameworks
Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour
2018-01-01
Objective: The aim of this experimental study was to compare retention of frameworks cast from wax patterns fabricated by three different methods. Materials and Methods: Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). Computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, three-dimensional printer, and conventional technique were used for fabrication of waxing patterns. All laboratory procedures were performed by an expert-reliable technician to eliminate intra-operator bias. The wax patterns were cast, finished, and seated on related abutment analogs. The number of adjustment times was recorded and analyzed by Kruskal–Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement and tensile resistance test was used to measure retention value. Statistical Analysis Used: One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. Level of significance was set at P < 0.05. Results: The mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for CAD/CAM, rapid prototyping, and conventional group, respectively. One-way ANOVA test revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). CAD/CAM group required significantly more adjustments (P < 0.001). Conclusions: CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be a valuable help when there are limitations in the retention of single-unit implant restorations. PMID:29657528
Hosalkar, Harish; Bomar, James D
2012-08-01
This study hypothesizes that the use of continuous passive motion (CPM) following open femoroacetabular impingement (FAI) surgery in the adolescent population improves clinical outcomes in terms of the modified Harris hip score (mHHS). Twenty-nine symptomatic adolescent FAI patients were postoperatively divided into one of three groups: no CPM, two days of inpatient CPM, and two weeks of CPM. mHHS was used preoperatively and postoperatively at six weeks, three months, six months, and nine months in all cases. Kruskal-Wallis (KW) analysis was performed to determine statistical differences in mHHS. mHHS was then re-evaluated using the Mann-Whitney test. There were no statistically significant differences in hip scores between the three groups preoperatively (p = 0.158). There were statistically significant differences (p < 0.001) in mHHS between the three groups at all postoperative time periods. The group that received two weeks of CPM had the best outcome scores. The results of this study suggest that postoperative CPM use following open hip preservation surgery for symptomatic FAI in adolescents improves clinical outcomes. These benefits seem to be related to the duration of CPM. Retrospective comparative study, Level III: patients treated one way compared with patients treated another way at the same institution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Metaplot: a novel stata graph for assessing heterogeneity at a glance.
Poorolajal, J; Mahmoodi, M; Majdzadeh, R; Fotouhi, A
2010-01-01
Heterogeneity is usually a major concern in meta-analysis. Although there are some statistical approaches for assessing variability across studies, here we present a new approach to heterogeneity using "MetaPlot", which investigates the influence of a single study on the overall heterogeneity. MetaPlot is a two-way (x, y) graph, which can be considered a complementary graphical approach for testing heterogeneity. This method shows graphically as well as numerically the results of an influence analysis, in which Higgins' I(2) statistic with its 95% confidence interval (CI) is computed omitting one study at a time and then plotted against the reciprocal of the standard error (1/SE), or "precision". In this graph, "1/SE" lies on the x-axis and the "I(2) results" lie on the y-axis. At a first glance at MetaPlot, one can predict to what extent omission of a single study may influence the overall heterogeneity. The precision on the x-axis enables us to distinguish the size of each trial. The graph describes the I(2) statistic with 95% CI graphically as well as numerically in one view for prompt comparison. It is possible to implement MetaPlot for meta-analysis of different types of outcome data and summary measures. This method presents a simple graphical approach to identify an outlier and its effect on overall heterogeneity at a glance. We suggest that Stata experts prepare a MetaPlot module for the software.
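The leave-one-out computation behind MetaPlot can be sketched with the standard library alone: Cochran's Q and Higgins' I(2) are recomputed with each study omitted in turn and paired with that study's precision (1/SE) for plotting. The effect sizes and standard errors below are illustrative, not taken from any real meta-analysis.

```python
# Stdlib-only sketch of MetaPlot's leave-one-out I^2 values.
def i_squared(effects, ses):
    """Higgins' I^2 (%) from a fixed-effect Q statistic."""
    w = [1.0 / se ** 2 for se in ses]                       # inverse-variance weights
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

def metaplot_points(effects, ses):
    """(precision, leave-one-out I^2) pairs: one point per omitted study."""
    return [
        (1.0 / ses[i],
         i_squared(effects[:i] + effects[i + 1:], ses[:i] + ses[i + 1:]))
        for i in range(len(effects))
    ]

effects = [0.10, 0.12, 0.11, 0.55, 0.09]   # study 4 is an outlier
ses     = [0.05, 0.06, 0.05, 0.05, 0.07]
points = metaplot_points(effects, ses)
# Omitting the outlier collapses heterogeneity; keeping it leaves I^2 high.
print(points[3][1] < 10, i_squared(effects, ses) > 75)  # → True True
```

Plotting these (precision, I(2)) pairs, rather than printing them, gives the graph the abstract describes.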
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... statistically significant relationship is evaluated by way of the correlation coefficient (r) with statistical... . The analysis revealed a significant high correlation between reduced predicted crew effectiveness (as...
Cutting efficiency of Reciproc and WaveOne reciprocating instruments.
Plotino, Gianluca; Giansiracusa Rubini, Alessio; Grande, Nicola M; Testarelli, Luca; Gambarini, Gianluca
2014-08-01
The aim of the present study was to evaluate the cutting efficiency of 2 new reciprocating instruments, Reciproc and WaveOne. Twenty-four new Reciproc R25 and 24 new WaveOne Primary files were activated by using a torque-controlled motor (Silver Reciproc) and divided into 4 groups (n = 12): group 1, Reciproc activated by Reciproc ALL program; group 2, Reciproc activated by WaveOne ALL program; group 3, WaveOne activated by Reciproc ALL program; and group 4, WaveOne activated by WaveOne ALL program. The device used for the cutting test consisted of a main frame to which a mobile plastic support for the handpiece is connected and a stainless steel block containing a Plexiglas block (inPlexiglass, Rome, Italy) against which the cutting efficiency of the instruments was tested. The length of the block cut in 1 minute was measured in a computerized program with a precision of 0.1 mm. Means and standard deviations of each group were calculated, and data were statistically analyzed with 1-way analysis of variance and Bonferroni test (P < .05). Reciproc R25 displayed greater cutting efficiency than WaveOne Primary for both the movements used (P < .05); in particular, Reciproc instruments used with their proper reciprocating motion presented a statistically significant higher cutting efficiency than WaveOne instruments used with their proper reciprocating motion (P < .05). There was no statistically significant difference between the 2 movements for both instruments (P > .05). Reciproc instruments demonstrated statistically higher cutting efficiency than WaveOne instruments. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Barrows, Russell D.
2007-01-01
A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
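A one-way ANOVA like the one in this exercise can be sketched with the standard library alone: an F ratio comparing three calibration methods on the same analyte. The method names and concentration readings below are illustrative, not the exercise's data.

```python
# Stdlib-only sketch of a one-way ANOVA F statistic across k groups.
def one_way_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_b, df_w = len(groups) - 1, n - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Illustrative concentrations (ppm) from three standardization methods.
external  = [5.1, 5.3, 5.2, 5.0]   # external standard
internal  = [5.2, 5.4, 5.3, 5.1]   # internal standard
additions = [5.1, 5.2, 5.3, 5.2]   # standard additions
f = one_way_f([external, internal, additions])
print(round(f, 2))  # → 0.75
```

An F this small (well below the critical value for 2 and 9 degrees of freedom) is the "methods are not statistically different" outcome the exercise is probing.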
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
MANCOVA for one way classification with homogeneity of regression coefficient vectors
NASA Astrophysics Data System (ADS)
Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.
2017-11-01
MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional or vector-valued observations. The assumption of a Gaussian distribution is replaced with the multivariate Gaussian distribution for the observation vectors and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether statistically reliable mean differences can be demonstrated between groups after adjusting the dependent variables for the covariates. When randomized assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting dependent variables as if all subjects scored the same on the covariates. In this research article, the MANCOVA technique is extended to a larger number of covariates, and homogeneity of the regression coefficient vectors is also tested.
Albonetti, Paolo; Marletta, Antonio; Repetto, Ivano; Sasso, Emanuela Assunta
2015-01-01
This study describes the results of a retrospective evaluation (8 years: 2005-2012) of the efficacy of the anti-fertility drug Ovistop® nicarbazin (800 ppm), added to corn kernels used to feed non-migratory feral pigeon colonies, Columba livia var. domestica, in the city of Genoa, Italy. The observation involved 4 non-migratory feral pigeon colonies located in well-defined areas of the city. Three of these colonies were treated for 12 months, with 10 g of drug (Ovistop®) provided per bird per day for 5 days each week; the other colony was treated in the same way but with a placebo (control station). Each colony and the area where it was located were both monitored with the same daily examination. Statistical analysis techniques were applied to the findings recorded, both descriptive (indices of central tendency and dispersion) and comparative (one-way analysis of variance). In the colonies treated with the drug, following an initial increase in the population ('magnet effect'), a reduction was observed over the following 4 years (-35% >x> -45%) and a further decrease (-65% >x> -70%) was observed over the subsequent 4 years (statistically significant, one-way ANOVA p<0.01). This phenomenon was recorded across the board in the 3 treated stations, compared to the overall unstable trend observed for the control station. As no external or exceptional anthropic or natural factors were observed, it can be stated that, given the results observed, the drug seemed effective in reducing the treated bird populations.
Students' Perspectives of Using Cooperative Learning in a Flipped Statistics Classroom
ERIC Educational Resources Information Center
Chen, Liwen; Chen, Tung-Liang; Chen, Nian-Shing
2015-01-01
Statistics has been recognised as one of the most anxiety-provoking subjects to learn in the higher education context. Educators have continuously endeavoured to find ways to integrate digital technologies and innovative pedagogies in the classroom to eliminate the fear of statistics. The purpose of this study is to systematically identify…
Sruamsiri, Kamphee; Chenthanakij, Boriboon; Wittayachamnankul, Borwon
2014-09-01
Management of patients with severe hypertension without progressive target organ damage remains controversial. Some guidelines mention oral anti-hypertensive medication as a treatment to reduce blood pressure in the emergency department, while others recommend against such treatment. To review the management of patients with severe hypertension without progressive target organ damage in the emergency department, Maharaj Nakorn Chiang Mai Hospital. In a retrospective descriptive analysis study, medical records of adult patients diagnosed with severe hypertension without progressive target organ damage between January 2011 and December 2012 were reviewed. Patient demographics and data on management, including investigations ordered and treatment given, were collected. Statistical analysis was done using descriptive statistics and the Kruskal-Wallis one-way analysis of variance test. One hundred fifty-one medical records were reviewed. Four oral anti-hypertensive medications were used to reduce blood pressure: Amlodipine, Captopril, Hydralazine, and Nifedipine. There were no significant differences between the medications in terms of their effect on blood pressure reduction (p = 0.513). No side effects or other complications from the use of oral anti-hypertensive medication were recorded. The choice of medication used for the treatment of hypertensive urgency ranged over Amlodipine, Captopril, Hydralazine, and Nifedipine, which varied in dosage. However, their efficacies were the same when compared with each other, and none produced any notable side effects.
One-way ANOVA based on interval information
NASA Astrophysics Data System (ADS)
Hesamian, Gholamreza
2016-08-01
This paper deals with extending one-way analysis of variance (ANOVA) to the case where the observed data are represented by closed intervals rather than real numbers. In this approach, first a notion of an interval random variable is introduced. In particular, a normal distribution with interval parameters is introduced to investigate hypotheses about the equality of interval means or to test the homogeneity-of-interval-variances assumption. Moreover, the least significant difference (LSD) method for multiple comparison of interval means is developed for the case in which the null hypothesis about the equality of means is rejected. Then, at a given interval significance level, an index is applied to compare the interval test statistic and the related interval critical value as a criterion to accept or reject the null interval hypothesis of interest. Finally, the decision-making method leads to degrees of acceptance or rejection of the interval hypotheses. An applied example is used to show the performance of this method.
Effects of Inaccurate Identification of Interictal Epileptiform Discharges in Concurrent EEG-fMRI
NASA Astrophysics Data System (ADS)
Gkiatis, K.; Bromis, K.; Kakkos, I.; Karanasiou, I. S.; Matsopoulos, G. K.; Garganis, K.
2017-11-01
Concurrent continuous EEG-fMRI is a novel multimodal technique that is finding its way into clinical practice in epilepsy. EEG time series are used to identify the timing of interictal epileptiform discharges (IEDs), which is then included in a GLM analysis in fMRI to localize the epileptic onset zone. Nevertheless, there are still some concerns about its reliability concerning BOLD changes correlated with IEDs. Even though IEDs are identified by an experienced neurologist-epileptologist, the reliability and concordance of the mark-ups depend on many factors, including the level of fatigue, the amount of time spent or, in some cases, even the screen being used for the display of the time series. This investigation aims to unravel the effect of misidentification or inaccuracy in the mark-ups of IEDs on the fMRI statistical parametric maps. Concurrent EEG-fMRI was conducted in six subjects with various types of epilepsy. IEDs were identified by an experienced neurologist-epileptologist. Analysis of EEG was performed with EEGLAB and analysis of fMRI was conducted in FSL. Preliminary results revealed lower statistical significance for missed events or IED periods longer than the actual ones, and the introduction of false positives and false negatives in statistical parametric maps when random events were included in the GLM on top of the IEDs. Our results suggest that mark-ups in EEG for simultaneous EEG-fMRI should be done with caution by an experienced and well-rested neurologist, as they affect the fMRI results in various and unpredictable ways.
NASA Astrophysics Data System (ADS)
McPhee, Sidney A.
1985-12-01
This study was designed to survey and compare attitudes and perceptions toward school counseling and student personnel programs as held by educators in the Caribbean. The subjects in the study comprised 275 teachers and administrators employed in public and private junior and senior high schools in Nassau, Bahamas. The statistical tests used to analyze the data were the Kruskal-Wallis one-way analysis of variance and the Friedman two-way analysis for repeated measures. The findings indicate that administrators at all levels expressed significantly more favorable attitudes and perceptions toward counseling and student personnel programs in the schools than teachers. Teachers in the study expressed the following: (a) serious concern regarding the competency of practicing counselors in their schools; (b) a need for clarification of their role and function in the guidance process and a clarification of the counselor's role; and (c) minimum acceptable standards should be established for school counseling positions.
Statistics for People Who (Think They) Hate Statistics. Third Edition
ERIC Educational Resources Information Center
Salkind, Neil J.
2007-01-01
This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…
Implication of correlations among some common stability statistics - a Monte Carlo simulation.
Piepho, H P
1995-03-01
Stability analysis of multilocation trials is often based on a mixed two-way model. Two stability measures in frequent use are the environmental variance (S_i^2) and the ecovalence (W_i). Under the two-way model the rank orders of the expected values of these two statistics are identical for a given set of genotypes. By contrast, empirical rank correlations among these measures are consistently low. This suggests that the two-way mixed model may not be appropriate for describing real data. To check this hypothesis, a Monte Carlo simulation was conducted. It revealed that the low empirical rank correlation among S_i^2 and W_i is most likely due to sampling errors. It is concluded that the observed low rank correlation does not invalidate the two-way model. The paper also discusses tests for homogeneity of S_i^2 as well as implications of the two-way model for the classification of stability statistics.
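The two stability measures the paper compares can be sketched with the standard library alone: the environmental variance S_i^2 (variance of a genotype's yields across environments) and Wricke's ecovalence W_i (a genotype's squared contribution to the genotype-by-environment interaction). The yield matrix is illustrative, not the paper's data.

```python
# Stdlib-only sketch of environmental variance S_i^2 and ecovalence W_i
# for a genotypes-by-environments yield matrix.
def stability(yields):
    """Return ([S_i^2], [W_i]) for a g x e matrix of yields."""
    g, e = len(yields), len(yields[0])
    gen_means = [sum(row) / e for row in yields]
    env_means = [sum(yields[i][j] for i in range(g)) / g for j in range(e)]
    grand = sum(gen_means) / g
    s2 = [sum((x - gen_means[i]) ** 2 for x in yields[i]) / (e - 1)
          for i in range(g)]                                  # S_i^2
    w = [sum((yields[i][j] - gen_means[i] - env_means[j] + grand) ** 2
             for j in range(e))
         for i in range(g)]                                   # W_i
    return s2, w

# Three genotypes in four environments; the third genotype tracks the
# environment means closely, so its ecovalence should be lowest.
yields = [
    [4.0, 5.0, 6.0, 7.0],
    [5.0, 5.1, 5.0, 4.9],
    [4.5, 5.0, 5.5, 6.0],
]
s2, w = stability(yields)
print(w.index(min(w)))  # → 2
```

Note how the two measures can disagree: the first genotype has the largest S_i^2 (it varies most across environments) even though its ecovalence is not uniquely informative about that variation, which is the kind of divergence the paper's simulation examines.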
Resolvent analysis of shear flows using One-Way Navier-Stokes equations
NASA Astrophysics Data System (ADS)
Rigas, Georgios; Schmidt, Oliver; Towne, Aaron; Colonius, Tim
2017-11-01
For three-dimensional flows, questions of stability, receptivity, secondary flows, and coherent structures require the solution of large partial-derivative eigenvalue problems. Reduced-order approximations are thus required for engineering prediction since these problems are often computationally intractable or prohibitively expensive. For spatially slowly evolving flows, such as jets and boundary layers, the One-Way Navier-Stokes (OWNS) equations permit a fast spatial marching procedure that results in a huge reduction in computational cost. Here, an adjoint-based optimization framework is proposed and demonstrated for calculating optimal boundary conditions and optimal volumetric forcing. The corresponding optimal response modes are validated against modes obtained in terms of global resolvent analysis. For laminar base flows, the optimal modes reveal modal and non-modal transition mechanisms. For turbulent base flows, they predict the evolution of coherent structures in a statistical sense. Results from the application of the method to three-dimensional laminar wall-bounded flows and turbulent jets will be presented. This research was supported by the Office of Naval Research (N00014-16-1-2445) and Boeing Company (CT-BA-GTA-1).
Wright, David K.; MacEachern, Scott; Lee, Jaeyong
2014-01-01
The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883
The effects of hands-on-science instruction on the science achievement of middle school students
NASA Astrophysics Data System (ADS)
Wiggins, Felita
Student achievement in the twenty-first century demands a new rigor in student science knowledge, since advances in science and technology require students to think and act like scientists. As a result, students must acquire proficient levels of knowledge and skills to support a knowledge base that is expanding exponentially with new scientific advances. This study examined the effects of hands-on-science instruction on the science achievement of middle school students. More specifically, this study was concerned with the influence of hands-on science instruction versus traditional science instruction on the science test scores of middle school students. The subjects in this study were one hundred and twenty sixth-grade students in six classes. Instruction involved lecture/discussion and hands-on activities carried out for a three-week period. Specifically, the study ascertained the influence of the variables gender, ethnicity, and socioeconomic status on the science test scores of middle school students. Additionally, this study assessed the effect of the variables gender, ethnicity, and socioeconomic status on the attitudes of sixth grade students toward science. The two instruments used to collect data for this study were the Prentice Hall unit ecosystem test and the Scientific Work Experience Programs for Teachers Study (SWEPT) student's attitude survey. Moreover, the data for the study were analyzed using one-way analysis of covariance and one-way analysis of variance. The following findings were made based on the results: (1) A statistically significant difference existed in the science performance of middle school students exposed to hands-on science instruction. These students had significantly higher scores than middle school students exposed to traditional instruction. (2) A statistically significant difference did not exist between the science scores of male and female middle school students. 
(3) A statistically significant difference did not exist between the science scores of African American and non-African American middle school students. (4) A statistically significant difference existed in the socioeconomic status of students who were not provided with assisted lunches. Students with unassisted lunches had significantly higher science scores than those middle school students who were provided with assisted lunches. (5) A statistically significant difference was not found in the attitude scores of middle school students who were exposed to hands-on or traditional science instruction. (6) A statistically significant difference was not found in the observed attitude scores of middle school students who were exposed to either hands-on or traditional science instruction by their socioeconomic status. (7) A statistically significant difference was not found in the observed attitude scores of male and female students. (8) A statistically significant difference was not found in the observed attitude scores of African American and non African American students.
Data Warehousing: How To Make Your Statistics Meaningful.
ERIC Educational Resources Information Center
Flaherty, William
2001-01-01
Examines how one school district found a way to turn data collection from a disparate mountain of statistics into more useful information by using their Instructional Decision Support System. System software is explained as is how the district solved some data management challenges. (GR)
CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi
2010-01-01
Statistical analysis interpretation is a critical field in scientific research. When more than one main variable is being studied in a piece of research, the effect of the interaction between those variables is fundamental to the discussion of the experiments. However, some doubts can occur when the p-value of the interaction is greater than the significance level. Objective: To determine the most adequate interpretation for factorial experiments with p-values of the interaction slightly higher than the significance level. Materials and methods: The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results: Different findings were observed between the two analyses, and the studies' results became more coherent when the significant interaction was used. Conclusion: The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analysis carefully in order to discuss the findings of their experiments properly. PMID:20857003
Lakshmi, K Bhagya; Yelchuru, Sri Harsha; Chandrika, V; Lakshmikar, O G; Sagar, V Lakshmi; Reddy, G Vivek
2018-01-01
The main aim was to determine whether growth pattern had an effect on the upper airway by comparing different craniofacial patterns with pharyngeal widths, and to assess its importance during clinical examination. Sixty lateral cephalograms of patients aged between 16 and 24 years with no pharyngeal pathology or nasal obstruction were selected for the study. These were divided into skeletal Class I (n = 30) and skeletal Class II (n = 30) using the ANB angle, and subdivided into normodivergent, hyperdivergent, and hypodivergent facial patterns based on the SN-GoGn angle. McNamara's airway analysis was used to determine the upper- and lower-airway dimensions. One-way ANOVA was used for the intergroup comparisons, with Tukey's test as the secondary statistical analysis. A statistically significant difference exists between the upper-airway dimensions in both skeletal malocclusions with hyperdivergent growth patterns when compared to other growth patterns. In both skeletal malocclusions, vertical growers showed a significantly smaller airway size than the horizontal and normal growers. There was no statistically significant relationship between the lower airway and craniofacial growth pattern.
NASA Astrophysics Data System (ADS)
Pearl, Judea
2000-03-01
Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.
Effects of perceived parental attitudes on children's views of smoking.
Ozturk, Candan; Kahraman, Seniha; Bektas, Murat
2013-01-01
The aim of this study was to examine the effects of perceived parental attitudes on children's views of cigarettes. The study sample consisted of 250 children attending grades 6, 7 and 8. Data were collected via a socio-demographic survey questionnaire, the Parental Attitude Scale (PAS) and the Decisional Balance Scale (DBS). Data analysis covered percentages, medians, one-way analysis of variance (ANOVA) and post-hoc tests using a statistical package. There were 250 participants; 117 were male, 133 were female. The mean age was 13.1 ± 0.98 for the females and 13.3 ± 0.88 for the males. A statistically significant difference was found in the children's mean scores for the 'pros' subscale of the Decisional Balance Scale (DBS) according to perceived parental attitudes (F=3.172, p=0.025). There were no statistically significant differences in the DBS 'cons' subscale scores by perceived parental attitudes. It was determined that while perceived parental attitudes affect children's views on the advantages of smoking, they have no effect on children's views on its disadvantages.
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.
Data series embedding and scale invariant statistics.
Michieli, I; Medved, B; Ristov, S
2010-06-01
Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. For quantifying scale invariant parameters of physiological signals several methods have been proposed. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and recently, in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying content of noise. The possibility for the method to falsely detect long-range dependence in artificially generated short-range-dependent series was investigated. (c) 2009 Elsevier B.V. All rights reserved.
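The embedding step the paper builds on, mapping a scalar time series into delay-coordinate vectors in a pseudo-phase space, can be sketched in a few lines. The stride-interval values and the embedding parameters below are illustrative, not the paper's.

```python
# Minimal sketch of delay-coordinate embedding: each point of the series
# becomes a vector [x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay}].
def embed(series, dim, delay):
    """Return all full delay vectors of the given dimension and delay."""
    span = (dim - 1) * delay
    return [series[t:t + span + 1:delay] for t in range(len(series) - span)]

# Illustrative stride intervals (seconds).
stride_intervals = [1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00]
vectors = embed(stride_intervals, dim=3, delay=2)
print(vectors[0], len(vectors))  # → [1.02, 1.01, 1.03] 3
```

Scale-invariant statistics are then computed over these vectors rather than over the raw one-dimensional series.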
Muhammad, Said; Tahir Shah, M; Khan, Sardar
2010-10-01
The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams and the Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physicochemical parameters with permissible limits set by the Pakistan Environmental Protection Agency and the World Health Organization. Most of the studied parameters were found within their respective permissible limits. However, in some samples the iron and arsenic concentrations exceeded their permissible limits. For health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated by using statistical formulas. The values of HQ were found >1 in the samples collected from Jabba and Dubair, while HQ values were <1 in the rest of the samples. This level of contamination should carry low chronic risk and medium cancer risk when compared with US EPA guidelines. Furthermore, the inter-dependence of physicochemical parameters and pollution load was also calculated by using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
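The risk formulas this abstract applies are standard: average daily dose from drinking water ADD = C x IR / BW, hazard quotient HQ = ADD / RfD, and cancer risk = ADD x CSF. The intake rate, body weight, and arsenic toxicity values below are typical defaults assumed for illustration, not the paper's exact exposure parameters.

```python
# Sketch of drinking-water arsenic risk screening. The reference dose
# (3e-4 mg/kg/day) and cancer slope factor (1.5 per mg/kg/day) are typical
# US EPA values, assumed here; the paper's parameters may differ.
def arsenic_risk(conc_mg_per_l, intake_l=2.0, body_kg=70.0,
                 rfd=3e-4, csf=1.5):
    add = conc_mg_per_l * intake_l / body_kg   # average daily dose, mg/kg/day
    return add / rfd, add * csf                # (hazard quotient, cancer risk)

# 0.05 mg/L (50 ug/L), well above the 10 ug/L WHO guideline value.
hq, risk = arsenic_risk(0.05)
print(hq > 1, 1e-4 < risk < 1e-2)  # → True True
```

HQ > 1 flags potential non-carcinogenic risk, which is the criterion the abstract uses for the Jabba and Dubair samples.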
Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis
NASA Astrophysics Data System (ADS)
Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.
2013-05-01
Seismic hazard analysis has become a very important issue in the last few decades. New technologies and improved data availability have helped many scientists understand where and why earthquakes happen, the physics of earthquakes, etc., and to recognize the role of uncertainty in seismic hazard analysis. However, how to handle the existing uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites using geological terms, these site coefficients are not sharply classified at all. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California alongside the conventional approach. The standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical method, are compared. The results show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical method, which is direct evidence of less uncertainty.
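As an illustration of how membership functions soften a hard class boundary, a triangular fuzzy membership lets a site near the border belong partly to two neighboring classes instead of being forced into one. The two site classes and the Vs30-like boundary values below are invented for the sketch and are not taken from the study:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# two overlapping (hypothetical) site classes on a shear-wave-velocity axis (m/s)
def soft(v):
    return triangular(v, 0, 180, 360)

def stiff(v):
    return triangular(v, 180, 360, 760)

# a border site is not forced into one class -- it belongs partly to both
print(soft(270.0), stiff(270.0))  # 0.5 0.5
```

Site coefficients can then be averaged with these memberships as weights, which is what removes the artificial jump at the class border.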
The effects of academic grouping on student performance in science
NASA Astrophysics Data System (ADS)
Scoggins, Sally Smykla
The current action research study explored how student placement in heterogeneous or homogeneous classes in seventh-grade science affected students' eighth-grade Science State of Texas Assessment of Academic Readiness (STAAR) scores, and how ability grouping affected students' scores based on race and socioeconomic status. The population included all eighth-grade students in the target district who took the regular eighth-grade science STAAR over four academic school years. The researcher ran three statistical tests: a t-test for independent samples, a one-way between-subjects analysis of variance (ANOVA), and a two-way between-subjects ANOVA. The results showed no statistically significant difference between eighth-grade Pre-AP students from seventh-grade Pre-AP classes and eighth-grade Pre-AP students from heterogeneous seventh-grade classes, and no statistically significant difference between Pre-AP students' scores based on socioeconomic status. There was no statistically significant interaction between socioeconomic status and the seventh-grade science classes. The scores of regular eighth-grade students who were in heterogeneous seventh-grade classes were statistically significantly higher than the scores of regular eighth-grade students who were in regular seventh-grade classes. The results also revealed that the scores of students who were White were statistically significantly higher than the scores of students who were Black and Hispanic; Black and Hispanic scores did not differ significantly. Further results indicated that the STAAR Level II and Level III scores were statistically significantly higher for the Pre-AP eighth-grade students who were in heterogeneous seventh-grade classes than for the Pre-AP eighth-grade students who were in Pre-AP seventh-grade classes.
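The tests named above are routine to run; a hedged sketch with synthetic, hypothetical score distributions (the real STAAR data are of course not in the abstract) shows the independent-samples t-test and one-way ANOVA steps in SciPy:

```python
import numpy as np
from scipy import stats

# hypothetical score distributions for the comparisons described above
rng = np.random.default_rng(1)
pre_ap_from_pre_ap = rng.normal(4000, 300, 80)
pre_ap_from_hetero = rng.normal(4010, 300, 80)
regular_from_hetero = rng.normal(3800, 300, 80)
regular_from_regular = rng.normal(3650, 300, 80)

# independent-samples t-test: Pre-AP placement comparison
t, p_t = stats.ttest_ind(pre_ap_from_pre_ap, pre_ap_from_hetero)
# one-way between-subjects ANOVA across three placements
f, p_f = stats.f_oneway(pre_ap_from_hetero, regular_from_hetero, regular_from_regular)
print(f"t-test p={p_t:.3f}, ANOVA p={p_f:.2g}")
```

A two-way ANOVA (placement × socioeconomic status) would normally be fitted with a linear-model package rather than `scipy.stats`.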
Zhang, Jian; Li, Li; Gao, Nianfa; Wang, Depei; Gao, Qiang; Jiang, Shengping
2010-03-10
This work was undertaken to evaluate whether it is possible to determine the variety of a Chinese wine on the basis of its volatile compounds, and to investigate whether discrimination models developed with the experimental wines could be used for commercial ones. A headspace solid-phase microextraction gas chromatographic (HS-SPME-GC) procedure was used to determine the volatile compounds, and a blind analysis based on Ac/Ais (peak area of volatile compound/peak area of internal standard) was carried out for statistical purposes. One-way analysis of variance (ANOVA), principal component analysis (PCA) and stepwise linear discriminant analysis (SLDA) were used to process the data and to develop discriminant models. Only 11 peaks were needed to differentiate and classify the experimental wines. SLDA achieved 100% recognition ability for three grape varieties and 100% prediction ability for Cabernet Sauvignon and Cabernet Gernischt wines, but only 92.31% for Merlot wines. A more valid and robust approach was to perform the discriminant analysis on the PCA scores; when SLDA was performed this way, 100% recognition ability and 100% prediction ability were obtained. Finally, the 11 peaks selected by SLDA from the raw analysis set were identified. When the models were demonstrated on commercial wines, they showed 100% recognition ability for the wines collected directly from the winery and without ageing, but only 65% for the others. Therefore, the varietal factor is currently discredited as a differentiating parameter for commercial wines in China. Nevertheless, this method could be applied as a screening tool and as a complement to other methods for grape base liquors which do not need ageing and blending procedures. 2010 Elsevier B.V. All rights reserved.
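As a hedged sketch of the "discriminant analysis on PCA scores" idea, the snippet below computes PCA scores via SVD and then classifies with a simple nearest-centroid rule standing in for SLDA; the 11-peak Ac/Ais table is simulated, not the paper's data:

```python
import numpy as np

def pca_scores(X, n_components):
    """Project mean-centred rows of X onto the top principal components (SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# simulated Ac/Ais table: rows = wines, columns = 11 peaks, 3 varieties
rng = np.random.default_rng(2)
centers = rng.normal(0, 1, (3, 11))
X = np.vstack([c + 0.1 * rng.standard_normal((20, 11)) for c in centers])
y = np.repeat([0, 1, 2], 20)

scores = pca_scores(X, 3)
# nearest-centroid rule on the scores (a stand-in for SLDA)
centroids = np.array([scores[y == k].mean(axis=0) for k in range(3)])
pred = ((scores[:, None, :] - centroids) ** 2).sum(-1).argmin(axis=1)
print(f"recognition ability: {100 * (pred == y).mean():.0f}%")
```

Working on a few PCA scores rather than the raw peak table is what makes the discriminant step robust when peaks are correlated.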
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, Erik; Blume-Kohout, Robin; Rudinger, Kenneth
PyGSTi is an implementation of Gate Set Tomography in the Python programming language. Gate Set Tomography (GST) is a theory and protocol for simultaneously estimating the state preparation, gate operations, and measurement effects of a physical system of one or many quantum bits (qubits). These estimates are based entirely on the statistics of experimental measurements, and their interpretation and analysis can provide a detailed understanding of the types of errors and imperfections in the physical system. In this way, GST provides not only a means of certifying the "goodness" of qubits but also a means of debugging (i.e., improving) them.
Did Tanzania Achieve the Second Millennium Development Goal? Statistical Analysis
ERIC Educational Resources Information Center
Magoti, Edwin
2016-01-01
Development Goal "Achieve universal primary education", the challenges faced, along with the way forward towards achieving the fourth Sustainable Development Goal "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Statistics show that Tanzania has made very promising steps…
NASA Astrophysics Data System (ADS)
Licquia, Timothy C.; Newman, Jeffrey A.
2016-11-01
The exponential scale length (L_d) of the Milky Way's (MW's) disk is a critical parameter for describing the global physical size of our Galaxy, important both for interpreting other Galactic measurements and helping us to understand how our Galaxy fits into extragalactic contexts. Unfortunately, current estimates span a wide range of values and are often statistically incompatible with one another. Here, we perform a Bayesian meta-analysis to determine an improved, aggregate estimate for L_d, utilizing a mixture-model approach to account for the possibility that any one measurement has not properly accounted for all statistical or systematic errors. Within this machinery, we explore a variety of ways of modeling the nature of problematic measurements, and then employ a Bayesian model averaging technique to derive net posterior distributions that incorporate any model-selection uncertainty. Our meta-analysis combines 29 different (15 visible and 14 infrared) photometric measurements of L_d available in the literature; these involve a broad assortment of observational data sets, MW models and assumptions, and methodologies, all tabulated herein. Analyzing the visible and infrared measurements separately yields estimates for L_d of 2.71 (+0.22/-0.20) kpc and 2.51 (+0.15/-0.13) kpc, respectively, whereas considering them all combined yields 2.64 ± 0.13 kpc. The ratio between the visible and infrared scale lengths determined here is very similar to that measured in external spiral galaxies. We use these results to update the model of the Galactic disk from our previous work, constraining its stellar mass to be 4.8 (+1.5/-1.1) × 10^10 M_⊙, and the MW's total stellar mass to be 5.7 (+1.5/-1.1) × 10^10 M_⊙.
Jindal, Rahul; Singh, Smita; Gupta, Siddharth; Jindal, Punita
2012-01-01
The purpose of this study was to evaluate and compare the apical extrusion of debris and irrigant using various rotary instruments with the crown-down technique in the instrumentation of root canals. Thirty freshly extracted human permanent straight-rooted mandibular premolars with a root curvature of 0-10° were divided into three groups of 10 teeth each. Each group was instrumented using one of three rotary instrumentation systems: Rotary Hero Shapers, Rotary ProTaper and Rotary Mtwo. One mL of sterile water was used as an irrigant after each instrument. Extruded debris was collected in pre-weighed glass vials, and the extruded irrigant was measured quantitatively by the Myers and Montgomery method and later evaporated. The weight of the dry extruded debris was calculated by comparing the pre- and post-instrumentation weights of the glass vials for each group. Statistical analysis was performed using the Kruskal-Wallis one-way ANOVA test. The analysis showed that all the rotary instruments used in this study caused apical extrusion of debris and irrigant. A statistically significant difference was observed for the Rotary ProTaper and Rotary Mtwo groups when compared with Rotary Hero Shapers, but no significant difference was observed between the Rotary ProTaper and Rotary Mtwo groups. After instrumentation with the different rotary instruments, Hero Shapers showed less apical extrusion of debris and irrigant.
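A Kruskal-Wallis test on three groups of this kind takes one call in SciPy; the debris weights below are hypothetical stand-ins for the measured values, chosen only to mirror the pattern reported (Hero Shapers lowest):

```python
from scipy import stats

# hypothetical dry-debris weights (mg), n = 10 teeth per group
hero     = [0.31, 0.28, 0.35, 0.30, 0.27, 0.33, 0.29, 0.32, 0.26, 0.34]
protaper = [0.52, 0.47, 0.55, 0.49, 0.51, 0.46, 0.58, 0.50, 0.53, 0.48]
mtwo     = [0.50, 0.45, 0.54, 0.47, 0.52, 0.44, 0.56, 0.49, 0.51, 0.46]

h, p = stats.kruskal(hero, protaper, mtwo)
print(f"H={h:.2f}, p={p:.4f}")  # p < 0.05: at least one system extrudes differently
```

The Kruskal-Wallis test is the rank-based (non-parametric) analogue of one-way ANOVA, appropriate here because debris weights from small groups need not be normally distributed.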
ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antcheva, I.; /CERN; Ballintijn, M.
2009-01-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. Results can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which takes care of optimally distributing the work over the available resources in a transparent way.
Luebbert, Joshua; Ghoneima, Ahmed; Lagravère, Manuel O
2016-03-01
The aim of this study was to determine the skeletal and dental changes in rapid maxillary expansion treatments in two different populations assessed through cone-beam computed tomography (CBCT). Twenty-one patients from Edmonton, Canada and 16 patients from Cairo, Egypt with maxillary transverse deficiency (11-17 years old) were treated with a tooth-borne maxillary expander (Hyrax). CBCTs were obtained from each patient at two time points (initial, T1, and at removal of the appliance at 3-6 months, T2). CBCTs were analyzed using AVIZO software, and landmarks were placed on skeletal and dental anatomical structures on the cranial base, maxilla and mandible. Descriptive statistics, intraclass correlation coefficients and one-way ANOVA were used to determine whether there were skeletal and dental changes and whether these changes were statistically different between the two populations. Descriptive statistics show that dental changes were larger than skeletal changes for both populations. Skeletal and dental changes between populations were not statistically different (P>0.05) from each other, with the exception of the upper incisor proclination, which was larger in the Indiana group (P<0.05). Rapid maxillary expansion treatments in different populations demonstrate similar skeletal and dental changes. These changes are greater on the dental structures compared to the skeletal ones, in a 4:1 ratio. Copyright © 2015 CEO. Published by Elsevier Masson SAS. All rights reserved.
Pawar, Ajinkya M.; Pawar, Mansing G.; Metzger, Zvi; Kokate, Sharad R.
2015-01-01
Aim: The present ex vivo study aimed to evaluate debris extrusion after instrumenting root canals with three different file systems. Materials and Methods: Sixty extracted human mandibular premolars with single canals were selected and randomly divided into three groups (n = 20) for instrumentation with three different files. Group 1: WaveOne (primary) single reciprocating file (WO; Dentsply Maillefer, Ballaigues, Switzerland) (25/08); Group 2: Self-adjusting file (SAF; ReDent-Nova, Ra'anana, Israel) (1.5 mm); and Group 3: ProTaper NEXT X1 and X2 (PTN; Dentsply Tulsa Dental, Tulsa, OK) (25/06). Debris extruded during instrumentation was collected into pre-weighed Eppendorf tubes. These tubes were then stored in an incubator at 70°C for 5 days and weighed again to obtain the final weight, with the extruded debris. Statistical analysis of the apically extruded debris was performed using one-way analysis of variance and the post hoc Tukey test. Results: The statistical analysis showed a significant difference among the three groups tested (P < 0.01). The subsequent post hoc Tukey test confirmed that Group 2 (SAF) exhibited the least (P < 0.01) debris extrusion among the three groups tested. Conclusions: The SAF resulted in significantly less extrusion of debris when compared to the reciprocating WO and rotary PTN files. PMID:25829683
SQC: secure quality control for meta-analysis of genome-wide association studies.
Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre
2017-08-01
Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
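SQC's cryptographic layer is beyond a short sketch, but the statistical core of GWAMA that such a quality-controlled pipeline feeds is the classic inverse-variance fixed-effect combination of per-study effect estimates; the betas and standard errors below are made up for illustration:

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance fixed-effect meta-analysis of per-study estimates."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se  # pooled effect, its SE, and the z-score

# three made-up study estimates for one genetic variant
beta, se, z = fixed_effect_meta([0.12, 0.09, 0.15], [0.04, 0.05, 0.06])
print(f"beta={beta:.3f} se={se:.3f} z={z:.2f}")
```

Pooling shrinks the standard error below any single study's, which is exactly the power gain that motivates forming a consortium, and why per-study QC (here, done securely) matters before the combination step.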
Bradshaw, Elizabeth J; Keogh, Justin W L; Hume, Patria A; Maulder, Peter S; Nortje, Jacques; Marnewick, Michel
2009-06-01
The purpose of this study was to examine the role of neuromotor noise in golf swing performance in high- and low-handicap players. Selected two-dimensional kinematic measures of 20 male golfers (n=10 per high- or low-handicap group) performing 10 golf swings with a 5-iron club were obtained through video analysis. Neuromotor noise was calculated by deducting the standard error of measurement from the coefficient of variation obtained from intra-individual analysis. Statistical methods included linear regression analysis and one-way analysis of variance using SPSS. Absolute invariance in the key technical positions (e.g., at the top of the backswing) of the golf swing appears to be a more favorable technique for skilled performance.
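The noise measure described, coefficient of variation minus standard error of measurement, is a one-liner; the swing angles and SEM value below are hypothetical:

```python
import numpy as np

def neuromotor_noise(trials, sem_percent):
    """Intra-individual coefficient of variation (%) minus the standard
    error of measurement (%), per the approach described above."""
    trials = np.asarray(trials, dtype=float)
    cv = 100.0 * trials.std(ddof=1) / trials.mean()
    return cv - sem_percent

# hypothetical joint angle (deg) at the top of the backswing, 10 swings
angles = [268, 271, 269, 272, 270, 267, 273, 269, 271, 270]
print(round(neuromotor_noise(angles, sem_percent=0.4), 2))  # 0.28
```

Subtracting the SEM removes the measurement system's contribution, so what remains is attributed to variability in the player's own motor output.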
Interactive Visualisations and Statistical Literacy
ERIC Educational Resources Information Center
Sutherland, Sinclair; Ridgway, Jim
2017-01-01
Statistical literacy involves engagement with the data one encounters. New forms of data and new ways to engage with data--notably via interactive data visualisations--are emerging. Some of the skills required to work effectively with these new visualisation tools are described. We argue that interactive data visualisations will have as profound…
Automatic analysis of attack data from distributed honeypot network
NASA Astrophysics Data System (ADS)
Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel
2013-05-01
There are many ways of obtaining real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them gives us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring IP telephony traffic. IP telephony servers can easily be exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
NASA Astrophysics Data System (ADS)
Gifford, Fay Evan
The purpose of this study was to determine the difference in gender participation in the college physical science laboratory as perceived by students. The sample in this study consisted of 168 college sophomores: architecture students (56 males and 33 females) and engineering students (61 males and 18 females). Depending on the type of information desired, a number of analyses were used, including independent-samples t-tests, two-way ANOVA, general linear model analysis, univariate analysis of variance, and descriptive statistics. In the analysis of the data for the first fourteen questions of the questionnaire, which are called descriptive data, both gender and academic discipline differences were examined. It was found that both genders picked personal choice as the way roles were assigned in the lab, and that the roles they played were recorder, computer operator, and set-up. There was no major difference between the two disciplines here except for the engineers (by four to one over the architecture students), who thought one member took the lead and assigned the roles. There was no statistically significant difference in attitude toward group laboratory work between the two genders, but there was a significant difference by academic discipline. There was a significant difference between genders in the way that students preferred to be assigned to small groups (i.e., the females would prefer the professor assign the roles). About one-third of the students responded to the open-ended question asking for suggestions for improving student participation in the labs. One major difference between the disciplines was that the architecture students, by a twenty-to-one ratio over the engineers, thought they did not need a physics lab. For Hypothesis 4, there was general agreement between the students' responses and the instructors', with no difference by student gender.
For Hypothesis 5, the responses to the four special gender questions show that the male students do not agree with the instructors on any of the four questions, while the female students agree with the instructors on two of the questions.
The impact of clinical use on the torsional behavior of Reciproc and WaveOne instruments.
Magalhães, Rafael Rodrigues Soares de; Braga, Lígia Carolina Moreira; Pereira, Érika Sales Joviano; Peixoto, Isabella Faria da Cunha; Buono, Vicente Tadeu Lopes; Bahia, Maria Guiomar de Azevedo
2016-01-01
The aim of this study was to assess the influence of clinical use, in vivo, on the torsional behavior of Reciproc and WaveOne instruments, considering the possibility that they degrade with use. The diameter at each millimeter, the pitch length, and the cross-sectional area at 3 mm from the tip were determined for both types of instruments. Twenty-four instruments, size 25, 0.08 taper, of each system were divided into two groups (n=12 each): a Control Group (CG), in which new Reciproc (RC) and WaveOne Primary (WO) instruments were tested in torsion until rupture based on ISO 3630-1; and an Experimental Group (EG), in which each new instrument was clinically used to clean and shape the root canals of one molar. After clinical use, the instruments were analyzed using optical and scanning electron microscopy and subsequently tested in torsion until fracture. Data were analyzed using one-way analysis of variance at α=.05. WO instruments showed significantly higher mean values of the cross-sectional area A3 (P=0.000) and smaller pitch lengths than RC instruments, with no statistically significant difference in the diameter at D3 (P=0.521). No significant difference in torsional resistance between the new RC and WO instruments (P=0.134) was found. Clinical use resulted in a tendency toward reduction in the maximum torque of the analyzed instruments, but no statistically significant difference was observed between them (P=0.327). During the preparation of the root canals, two RC instruments fractured, and longitudinal and transversal cracks in RC and WO instruments were observed through SEM analysis. After clinical use, no statistically significant reduction in torsional resistance was observed.
Characterising the disintegration properties of tablets in opaque media using texture analysis.
Scheuerle, Rebekah L; Gerrard, Stephen E; Kendall, Richard A; Tuleu, Catherine; Slater, Nigel K H; Mahbubani, Krishnaa T
2015-01-01
Tablet disintegration characterisation is used in pharmaceutical research, development, and quality control. The standard methods used to characterise tablet disintegration often depend on visual observation to measure disintegration times. This presents a challenge for disintegration studies of tablets in opaque, physiologically relevant media that could be useful for tablet formulation optimisation. This study explored an application of texture-analysis disintegration testing, a non-visual, quantitative means of determining the tablet disintegration end point, by analysing the disintegration behaviour of two tablet formulations in opaque media. The disintegration behaviour of one tablet formulation manufactured in-house, and of Sybedia Flashtab placebo tablets, in water, bovine milk, and human milk was characterised. A novel method is presented to characterise the disintegration process and to quantify the disintegration end points of the tablets in various media using load data generated by a texture analyser probe. The disintegration times in the different media were found to be statistically different (P<0.0001) from one another for both tablet formulations using one-way ANOVA. Using the Tukey post hoc test, the Sybedia Flashtab placebo tablets were found not to have statistically significantly different disintegration times in human versus bovine milk (adjusted P value 0.1685). Copyright © 2015 Elsevier B.V. All rights reserved.
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling's model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists, and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012, Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model's behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
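A toy simulation conveys the model's flavour. This sketch uses one simple variant (two agent types on a fully occupied torus, with discontent agents swapped pairwise at random), which is not the exact dynamics analysed in the paper:

```python
import numpy as np

def like_counts(grid):
    """For each cell, how many of its 8 torus neighbours share its type."""
    like = np.zeros(grid.shape)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0):
                like += np.roll(np.roll(grid, dr, 0), dc, 1) == grid
    return like

def schelling_sweep(grid, tau, rng):
    """Randomly pair up discontent agents (< tau like neighbours) and swap them."""
    unhappy = np.argwhere(like_counts(grid) / 8 < tau)
    rng.shuffle(unhappy)
    for (r1, c1), (r2, c2) in zip(unhappy[::2], unhappy[1::2]):
        grid[r1, c1], grid[r2, c2] = grid[r2, c2], grid[r1, c1]
    return grid

rng = np.random.default_rng(3)
grid = rng.integers(0, 2, (50, 50))  # two agent types, fully occupied torus
n_ones = int(grid.sum())             # population of type 1 (conserved by swaps)
before = like_counts(grid).mean() / 8
for _ in range(200):
    grid = schelling_sweep(grid, tau=0.5, rng=rng)
after = like_counts(grid).mean() / 8
print(f"mean like-neighbour fraction: {before:.2f} -> {after:.2f}")
```

The mean like-neighbour fraction is one crude segregation measure; the threshold phenomena described above concern how the eventual cluster structure changes with the intolerance parameter tau.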
Murga Oporto, L; Menéndez-de León, C; Bauzano Poley, E; Núñez-Castaín, M J
Among the different techniques for motor unit number estimation (MUNE) is the statistical (Poisson) one, in which the activation of motor units is carried out by electrical stimulation and the estimation is performed by means of a statistical analysis based on the Poisson distribution. The study was undertaken in order to provide an accessible account of the Poisson MUNE technique, with a comprehensible view of its methodology, and also to obtain normal values in the extensor digitorum brevis muscle (EDB) from a healthy population. One hundred fourteen normal volunteers with ages ranging from 10 to 88 years were studied using the MUNE software contained in a Viking IV system. The normal subjects were divided into two age groups (10-59 and 60-88 years). The EDB MUNE for all of them was 184 ± 49. Both the MUNE and the amplitude of the compound muscle action potential (CMAP) were significantly lower in the older age group (p<0.0001), with the MUNE showing a better correlation with age than the CMAP amplitude (-0.5002 and -0.4142, respectively; p<0.0001). The statistical MUNE method is an important tool for assessing the physiology of the motor unit. The value of MUNE correlates better with the neuromuscular aging process than the CMAP amplitude does.
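The Poisson method rests on the fact that for Poisson-distributed unit activation, the variance-to-mean ratio of the submaximal scan responses estimates the mean single-unit amplitude, so MUNE = maximal CMAP / (variance/mean). A sketch with simulated responses; the unit size, firing level, and maximal CMAP are invented (the maximal CMAP here is simply chosen so the toy estimate lands near the study's normal value):

```python
import numpy as np

def poisson_mune(responses_mv, cmap_max_mv):
    """Statistical (Poisson) MUNE: variance/mean of submaximal scan responses
    estimates the mean single-unit amplitude; MUNE = maximal CMAP / that size."""
    r = np.asarray(responses_mv, dtype=float)
    single_unit = r.var(ddof=1) / r.mean()
    return cmap_max_mv / single_unit

# simulated scan: invented unit size 0.05 mV, ~40 units firing per stimulus
rng = np.random.default_rng(4)
responses = 0.05 * rng.poisson(40, 300)
print(f"MUNE estimate: {poisson_mune(responses, cmap_max_mv=9.2):.0f}")
```

For a Poisson count scaled by the unit amplitude a, mean = a·λ and variance = a²·λ, so variance/mean recovers a without knowing the firing rate λ.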
Herbison, N; Cobb, S; Gregson, R; Ash, I; Eastgate, R; Purdy, J; Hepburn, T; MacKeith, D; Foss, A
2013-09-01
A computer-based interactive binocular treatment system (I-BiT) for amblyopia has been developed, which utilises commercially available 3D 'shutter glasses'. The purpose of this pilot study was to report the effect of treatment on visual acuity (VA) in children with amblyopia. Thirty minutes of I-BiT treatment was given once weekly for 6 weeks. Treatment sessions consisted of playing a computer game and watching a DVD through the I-BiT system. VA was assessed at baseline, mid-treatment, at the end of treatment, and at 4 weeks post treatment. Standard summary statistics and an exploratory one-way analysis of variance (ANOVA) were performed. Ten patients with strabismic, anisometropic, or mixed amblyopia were enrolled. The mean age was 5.4 years. Nine patients (90%) completed the full course of I-BiT treatment, with a mean improvement of 0.18 (SD=0.143). Six of the nine patients (67%) who completed the treatment showed a clinically significant improvement of 0.125 LogMAR units or more at follow-up. The exploratory one-way ANOVA showed an overall effect over time (F=7.95, P=0.01). No adverse effects were reported. This small, uncontrolled study has shown VA gains with 3 hours of I-BiT treatment. Although it is recognised that this pilot study had significant limitations (it was unblinded, uncontrolled, and too small to permit formal statistical analysis), these results suggest that further investigation of I-BiT treatment is worthwhile.
Spectral transmittance of UV-blocking soft contact lenses: a comparative study.
Rahmani, Saeed; Mohammadi Nia, Mohadeseh; Akbarzadeh Baghban, Alireza; Nazari, Mohammad Reza; Ghassemi-Broumand, Mohammad
2014-12-01
Three major parts of sunlight are visible, ultraviolet and infrared radiation. Exposure to ultraviolet radiation (UVR) can result in a spectrum of skin and ocular diseases. UV-blocking contact lenses help provide protection against harmful UV radiation. We studied the transmission of ultraviolet and visible light in several soft UV-blocking contact lenses. Four available tinted soft lenses (Acuvue Moist, Zeiss CONTACT Day 30 Air spheric, Pretty Eyes and Sauflon 56 UV) were evaluated for UV and visible transmission. One-way ANOVA was performed to establish whether there is a statistically significant difference among the mean transmittances of the contact lenses in the UV and visible regions (α=0.05). Pretty Eyes, Zeiss CONTACT, Acuvue Moist and Sauflon 56 UV showed UV-B transmittance values of 0.65%, 10.69%, 1.22%, and 5.78%, respectively. Pretty Eyes and Acuvue Moist had UV-A transmittance values of 32% and 34%; Sauflon 56 UV and Zeiss CONTACT had values of 48% and 43%, respectively. All of the studied lenses transmitted at least 94.6% of the visible spectrum. The one-way ANOVA shows that a statistically significant difference exists within the group of contact lenses tested for the visible (p<0.001), UV-B (p<0.001) and UV-A (p<0.001) portions of the spectrum (α=0.05). Acuvue Moist had the best combination of UV-blocking properties and visible transmission among the contact lenses tested in this study. Copyright © 2014 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.
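Band-averaged transmittance of the kind reported here is a simple masked mean over the spectrometer scan; the lens profile below is a toy curve, and the band edges (UV-B 280-315 nm, UV-A 315-380 nm) are conventional choices, not necessarily those used in the study:

```python
import numpy as np

def band_mean_transmittance(wavelengths_nm, transmittance, lo, hi):
    """Average spectral transmittance over the wavelength band [lo, hi] nm."""
    mask = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
    return transmittance[mask].mean()

# toy spectrometer scan of one lens, 280-780 nm in 1 nm steps
wl = np.arange(280, 781)
t = np.clip((wl - 300) / 100.0, 0.01, 0.95)  # invented UV-blocking profile

uvb = band_mean_transmittance(wl, t, 280, 315)
uva = band_mean_transmittance(wl, t, 315, 380)
vis = band_mean_transmittance(wl, t, 380, 780)
print(f"UV-B {uvb:.1%}  UV-A {uva:.1%}  visible {vis:.1%}")
```

Computing one band mean per lens yields exactly the kind of per-band table reported above, ready for the one-way ANOVA comparison across lenses.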
The use and misuse of statistical analyses [in geophysics and space physics]
NASA Technical Reports Server (NTRS)
Reiff, P. H.
1983-01-01
The statistical techniques most often used in space physics include Fourier analysis, linear correlation, auto- and cross-correlation, power spectral density, and superposed epoch analysis. Tests are presented which can evaluate the significance of the results obtained through each of these. Data presented without some form of error analysis are frequently useless, since they offer no way of assessing whether a bump on a spectrum or on a superposed epoch analysis is real or merely a statistical fluctuation. Among many of the published linear correlations, for instance, the uncertainty in the intercept and slope is not given, so that the significance of the fitted parameters cannot be assessed.
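The check the abstract calls for on published linear correlations, assessing whether a fitted correlation is significant rather than a statistical fluctuation, can be sketched as follows. This is a generic illustration with made-up data, not the paper's code: the Pearson coefficient is converted to a t value to be compared against Student's t with n-2 degrees of freedom.

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t value for H0: rho = 0; compare with Student t, n-2 dof."""
    return r * math.sqrt((n - 2) / (1 - r * r))

xs = [1, 2, 3, 4, 5, 6]                  # made-up measurements
ys = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]
r = pearson_r(xs, ys)
print(round(r, 3), round(t_statistic(r, len(xs)), 1))
```

Quoting r together with its t value (or equivalently the uncertainty on slope and intercept) is exactly the error analysis the abstract says is often missing.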
A Vignette (User's Guide) for “An R Package for Statistical ...
StatCharrms is a graphical user front-end for ease of use in analyzing data generated from OCSPP 890.2200, Medaka Extended One Generation Reproduction Test (MEOGRT) and OCSPP 890.2300, Larval Amphibian Gonad Development Assay (LAGDA). The analyses StatCharrms is capable of performing are: Rao-Scott adjusted Cochran-Armitage test for trend By Slices (RSCABS), a Standard Cochran-Armitage test for trend By Slices (SCABS), mixed effects Cox proportional model, Jonckheere-Terpstra step down trend test, Dunn test, one way ANOVA, weighted ANOVA, mixed effects ANOVA, repeated measures ANOVA, and Dunnett test. This document provides a User's Manual (termed a Vignette by the Comprehensive R Archive Network (CRAN)) for the previously created R-code tool called StatCharrms (Statistical analysis of Chemistry, Histopathology, and Reproduction endpoints using Repeated measures and Multi-generation Studies). The StatCharrms R-code has been publicly available directly from EPA staff since the approval of OCSPP 890.2200 and 890.2300, and is now publicly available on CRAN.
Enhancing the Development of Statistical Literacy through the Robot Bioglyph
ERIC Educational Resources Information Center
Bragg, Leicha A.; Koch, Jessica; Willis, Ashley
2017-01-01
One way to heighten students' interest in the classroom is by personalising tasks. Through designing Robot Bioglyphs students are able to explore personalised data through a creative and engaging process. By understanding, producing and interpreting data, students are able to develop their statistical literacy, which is an essential skill in…
Diversity of social ties in scientific collaboration networks
NASA Astrophysics Data System (ADS)
Shi, Quan; Xu, Bo; Xu, Xiaomin; Xiao, Yanghua; Wang, Wei; Wang, Hengshan
2011-11-01
Diversity is one of the important perspectives for characterizing the behavior of individuals in social networks. It is intuitively believed that diversity of social ties accounts for competitive advantage and idea innovation. However, quantitative evidence from a real large social network can rarely be found in previous research. Thanks to the availability of scientific publication records on the Web, we can now construct a large scientific collaboration network, which gives us a chance to gain insight into the diversity of relationships in a real social network through statistical analysis. In this article, we perform a systematic empirical analysis of a scientific collaboration network extracted from DBLP, an online bibliographic database in computer science, finding the following: distributions of diversity indices tend to decay in an exponential or Gaussian way; diversity indices are not trivially correlated with existing vertex importance measures; and authors with diverse social ties tend to connect to each other and are generally more competitive than others.
Method for Identifying Probable Archaeological Sites from Remotely Sensed Data
NASA Technical Reports Server (NTRS)
Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel
2011-01-01
Archaeological sites are being compromised or destroyed at a catastrophic rate in most regions of the world. The best solution to this problem is for archaeologists to find and study these sites before they are compromised or destroyed. One way to facilitate the rapid, wide-area surveys needed to find these archaeological sites is through the generation of maps of probable archaeological sites from remotely sensed data. We describe an approach for identifying probable locations of archaeological sites over a wide area based on detecting subtle anomalies in vegetative cover through a statistically based analysis of remotely sensed data from multiple sources. We further developed this approach under a recent NASA ROSES Space Archaeology Program project. Under this project we refined and elaborated this statistical analysis to compensate for potential slight misregistrations between the remote sensing data sources and the archaeological site location data. We also explored data quantization approaches (required by the statistical analysis) and identified a superior data quantization approach based on a unique image segmentation method. In our presentation we will summarize our refined approach and demonstrate the effectiveness of the overall approach with test data from Santa Catalina Island off the southern California coast. Finally, we discuss our future plans for further improving our approach.
Tsvetkova, A V; Murtazina, Z A; Markusheva, T V; Mavzutov, A R
2015-05-01
Bacterial vaginosis is one of the most frequent reasons women visit a gynecologist. Its diagnosis is predominantly based on the Amsel criteria (1983), whose objectivity is now increasingly disputed. Discharge from the mucous membranes of the posterolateral fornix of the vagina was analyzed in 640 women with a clinical diagnosis of bacterial vaginosis. Light microscopy of smears of the discharge confirmed the diagnosis of bacterial vaginosis in the laboratory in 100 (15.63%) women. Complaints of burning and unpleasant odor, and the Amsel criterion of "clue cells" detected against a background of pH > 4.5, proved statistically significant for bacterial vaginosis. According to the study data, the mere presence of discharge is not a statistically reliable basis for differentiating bacterial vaginosis from other inflammatory pathological conditions of the female reproductive system. At the same time, detection of "clue cells" in a smear reliably correlated with bacterial vaginosis.
NASA Astrophysics Data System (ADS)
Salmahaminati; Husnaqilati, Atina; Yahya, Amri
2017-01-01
Waste management is one way that communities participate in maintaining good hygiene, locally and nationally. Trash is the residue of everyday consumption that must be disposed of; processing it can be beneficial and improve hygiene. One approach is to sort plastics so that they can be made into new goods. In this study, we identify the factors that affect residents' willingness to process their waste. These factors involve the identity and circumstances of each resident; once they are known, education about waste management can be targeted, and the effect of the outreach can be compared using preliminary data collected before it and final data collected after it. The analysis uses multiple logistic regression to identify the factors that influence residents' willingness to process waste, while the before-and-after comparison uses a t test. Data were collected with a statistical instrument in the form of a questionnaire.
Navigation with noncoherent data - A demonstration for VEGA Venus flyby phase
NASA Technical Reports Server (NTRS)
Bhat, Ramachandra S.; Ellis, Jordan; Mcelrath, Timothy P.
1988-01-01
Deep Space navigation with noncoherent (one-way) data types is demonstrated for the VEGA Venus flyby phase under extreme conditions. Estimates and statistics are computed using one-way Doppler and wideband Very Long Baseline Interferometry (VLBI) data. The behavior of the onboard oscillator is modeled for both spacecraft to obtain useful orbit determination results. Even with this limitation, it is demonstrated that one-way data solutions are comparable with the solutions using both Soviet sparse coherent (two-way) and wideband VLBI data. During the useful life time of VEGA balloons, the two solutions differ by a maximum of 4.7 km in position and 7.6 cm/sec in velocity for VEGA 1 and by a maximum of 8 km and 42 cm/sec for VEGA 2.
Physiological Efficacy of a Lightweight Ambient Air Cooling Unit for Various Applications
1993-10-01
10. Mean skin temperature during continuous work. 11. Thermal comfort rate during continuous work. ...perceived exertion (RPE) and thermal comfort (TC) were taken every 10 min. Statistical analysis using a 3-way analysis of variance (ANOVA) was conducted...may account for the fact that no statistically significant differences were seen for thermal comfort and ratings of perceived exertion between the IC
Wang, Ling; Xia, Jie-lai; Yu, Li-li; Li, Chan-juan; Wang, Su-zhen
2008-06-01
To explore several numerical methods for ordinal variables in a one-way ordinal contingency table and their interrelationship, and to compare corresponding statistical analysis methods such as Ridit analysis and the rank sum test. Formula deduction was based on five simplified grading approaches: rank_r(i), ridit_r(i), ridit_r(ci), ridit_r(mi), and table scores. A practical data set was verified with SAS 8.2 in clinical practice (testing the effect of Shiwei solution in the treatment of chronic tracheitis). Because of the linear relationship rank_r(i) = N ridit_r(i) + 1/2 = N ridit_r(ci) = (N + 1) ridit_r(mi), the exact chi2 values in Ridit analysis based on ridit_r(i), ridit_r(ci), and ridit_r(mi) were completely the same, and they were equivalent to the Kruskal-Wallis H test. Traditional Ridit analysis is based on ridit_r(i), and its corresponding chi2 value calculated with an approximate variance (1/12) is conservative. The exact chi2 test of Ridit analysis should be used when comparing multiple groups in clinical research because of its special merits, such as the distribution of mean ridit values on (0,1) and clear graphical expression. The exact chi2 test of Ridit analysis can be output directly by proc freq of SAS 8.2 with the ridit and modridit options (SCORES=). The exact chi2 test of Ridit analysis is equivalent to the Kruskal-Wallis H test and should be used when comparing multiple groups in clinical research.
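The linear relation the paper derives between mid-ranks and ridits, rank_r(i) = N ridit_r(i) + 1/2, can be verified numerically. This is an illustrative sketch with made-up category counts, not the Shiwei trial data:

```python
# Verify the paper's identity rank_r(i) = N * ridit_r(i) + 1/2 for a
# one-way ordinal table (hypothetical frequencies, for illustration).

def mid_ranks(freqs):
    """Mid-rank assigned to each ordinal category."""
    ranks, below = [], 0
    for f in freqs:
        ranks.append(below + (f + 1) / 2)   # ranks below + midpoint within
        below += f
    return ranks

def ridits(freqs):
    """Classical ridit: proportion below plus half the proportion within."""
    n = sum(freqs)
    out, below = [], 0
    for f in freqs:
        out.append((below + f / 2) / n)
        below += f
    return out

freqs = [12, 30, 45, 13]                    # hypothetical category counts
N = sum(freqs)
for rank, ridit in zip(mid_ranks(freqs), ridits(freqs)):
    assert abs(rank - (N * ridit + 0.5)) < 1e-9
print("identity holds for N =", N)           # → identity holds for N = 100
```

Because ranks and ridits are related by this fixed linear map, rank-based statistics such as Kruskal-Wallis H are unchanged by the re-scoring, which is why the exact chi2 of the ridit-based analyses coincides with the H test.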
What can health care professionals in the United Kingdom learn from Malawi?
Neville, Ron; Neville, Jemma
2009-01-01
Debate on how resource-rich countries and their health care professionals should help the plight of sub-Saharan Africa appears locked in a mind-set dominated by gloomy statistics and one-way monetary aid. Having established a project to link primary care clinics based on two-way sharing of education rather than one-way aid, our United Kingdom colleagues often ask us: "But what can we learn from Malawi?" A recent fact-finding visit to Malawi helped us clarify some aspects of health care that may be of relevance to health care professionals in the developed world, including the United Kingdom. This commentary article is focused on encouraging debate and discussion as to how we might wish to re-think our relationship with colleagues in other health care environments and consider how we can work together on a theme of two-way shared learning rather than one-way aid. PMID:19327137
Gap Shape Classification using Landscape Indices and Multivariate Statistics
Wu, Chih-Da; Cheng, Chi-Chuan; Chang, Che-Chang; Lin, Chinsu; Chang, Kun-Cheng; Chuang, Yung-Chung
2016-01-01
This study proposed a novel methodology to classify the shape of gaps using landscape indices and multivariate statistics. Patch-level indices were used to collect the qualified shape and spatial configuration characteristics for canopy gaps in the Lienhuachih Experimental Forest in Taiwan in 1998 and 2002. Non-hierarchical cluster analysis was used to assess the optimal number of gap clusters and canonical discriminant analysis was used to generate the discriminant functions for canopy gap classification. The gaps for the two periods were optimally classified into three categories. In general, gap type 1 had a more complex shape, gap type 2 was more elongated and gap type 3 had the largest gaps that were more regular in shape. The results were evaluated using Wilks’ lambda as satisfactory (p < 0.001). The agreement rate of confusion matrices exceeded 96%. Differences in gap characteristics between the classified gap types that were determined using a one-way ANOVA showed a statistical significance in all patch indices (p = 0.00), except for the Euclidean nearest neighbor distance (ENN) in 2002. Taken together, these results demonstrated the feasibility and applicability of the proposed methodology to classify the shape of a gap. PMID:27901127
More Powerful Tests of Simple Interaction Contrasts in the Two-Way Factorial Design
ERIC Educational Resources Information Center
Hancock, Gregory R.; McNeish, Daniel M.
2017-01-01
For the two-way factorial design in analysis of variance, the current article explicates and compares three methods for controlling the Type I error rate for all possible simple interaction contrasts following a statistically significant interaction, including a proposed modification to the Bonferroni procedure that increases the power of…
NASA Astrophysics Data System (ADS)
Abba, Habu Tela; Hassan, Wan Muhamad Saridan Wan; Saleh, Muneer Aziz; Aliyu, Abubakar Sadiq; Ramli, Ahmad Termizi
2017-11-01
In-situ measurement of terrestrial gamma radiation dose rates (TGRD) was conducted in the northern zone of the Jos Plateau, and a statistical relationship between the TGRD and the underlying geological formations was investigated. The TGRD rates in all the measurements ranged from 40 to 1265 nGy h-1 with a mean value of 250 nGy h-1. The maximum TGRD was recorded on geological type G8 (Younger Granites) at Bisitchi, and the lowest TGRD was recorded on G6 (Basaltic rocks) at Gabia. A one-way analysis of variance (ANOVA) statistical test was used to compare the data. The results of this study indicated a strong relationship between TGRD levels and the geological structures of a place. An isodose map was plotted to represent exposure rates due to TGRD. The results of this investigation could serve multiple public interests, such as evaluating the public dose for the area.
Mechanical properties of experimental composites with different calcium phosphates fillers.
Okulus, Zuzanna; Voelkel, Adam
2017-09-01
Calcium phosphates (CaPs)-containing composites have already shown good properties from the point of view of dental restorative materials. The purpose of this study was to examine the crucial mechanical properties of twelve hydroxyapatite- or tricalcium phosphate-filled composites. The raw and surface-treated forms of both CaP fillers were applied. As reference materials, two experimental glass-containing composites and one commercial dental restorative composite were used. Nano-hardness, elastic modulus, and compressive, flexural and diametral tensile strength of all studied materials were determined. Application of statistical methods (one-way analysis of variance and agglomerative cluster analysis) allowed the similarities between the examined materials to be assessed according to the values of the studied parameters. The obtained results show that in almost all cases the mechanical properties of the experimental CaP composites are comparable to or even better than those of the examined reference materials. Copyright © 2017 Elsevier B.V. All rights reserved.
Creating an Amazing Montessori Toddler Home Environment
ERIC Educational Resources Information Center
Woo, Stephanie
2014-01-01
The author states that raising her twins the Montessori way has made her life easy. Imagine two 1-year-olds eating entire meals on their own, setting their own tables by 20 months, and becoming potty-trained before 2. These are not statistics found in just one household. Children raised the Montessori way can take care of themselves and their…
Replicate This! Creating Individual-Level Data from Summary Statistics Using R
ERIC Educational Resources Information Center
Morse, Brendan J.
2013-01-01
Incorporating realistic data and research examples into quantitative (e.g., statistics and research methods) courses has been widely recommended for enhancing student engagement and comprehension. One way to achieve these ends is to use a data generator to emulate the data in published research articles. "MorseGen" is a free data generator that…
Radiation shielding quality assurance
NASA Astrophysics Data System (ADS)
Um, Dallsun
For radiation shielding quality assurance, the validity and reliability of the neutron transport code MCNP, now one of the most widely used radiation shielding analysis codes, were checked against many benchmark experiments. As a practical example, the following work was performed in this thesis. An integral neutron transport experiment measuring the effect of neutron streaming through iron and void was performed with the Dog-Legged Void Assembly at Knolls Atomic Power Laboratory in 1991. Neutron flux was measured at six different places with methane detectors and a BF-3 detector. The main purpose of the measurements was to provide a benchmark against which various neutron transport calculation tools could be compared. Those data were used to verify the Monte Carlo Neutron & Photon Transport Code, MCNP, with a model of the assembly. Experimental and calculated results were compared in two ways: as the total integrated value of neutron flux over the energy range from 10 keV to 2 MeV, and as the neutron spectrum across that energy range. The two agree within the statistical error of +/-20%. MCNP results were also compared with those of TORT, a three-dimensional discrete ordinates code developed by Oak Ridge National Laboratory. The MCNP results are superior to the TORT results at all detector locations except one. This shows that MCNP is a very powerful tool for the analysis of neutron transport through iron and air, and that it could further serve as a powerful tool for radiation shielding analysis. As one application of the analysis of variance (ANOVA) to neutron and gamma transport problems, uncertainties in the calculated values of the critical parameter k were evaluated by performing ANOVA on the statistical data.
Fulzele, Punit; Baliga, Sudhindra; Thosar, Nilima; Pradhan, Debaprya
2011-01-01
Aims: Evaluation of calcium ion and hydroxyl ion release and pH levels in various calcium hydroxide based intracanal medicaments. Objective: The purpose of this study was to evaluate calcium and hydroxyl ion release and pH levels of calcium hydroxide based products, namely, RC Cal, Metapex, calcium hydroxide with distilled water, along with the new gutta-percha points with calcium hydroxide. Materials and Methods: The materials were inserted in polyethylene tubes and immersed in deionized water. The pH variation, Ca++ and OH- release were monitored periodically for 1 week. Statistical Analysis Used: Statistical analysis was carried out using one-way analysis of variance and Tukey's post hoc tests with PASW Statistics version 18 software to compare the statistical difference. Results: After 1 week, calcium hydroxide with distilled water and RC Cal raised the pH to 12.7 and 11.8, respectively, while a small change was observed for Metapex and calcium hydroxide gutta-percha points. The calcium released after 1 week was 15.36 mg/dL from RC Cal, followed by 13.04, 1.296, and 3.064 mg/dL from calcium hydroxide with sterile water, Metapex and calcium hydroxide gutta-percha points, respectively. Conclusions: Calcium hydroxide with sterile water and RC Cal pastes liberate significantly more calcium and hydroxyl ions and raise the pH higher than Metapex and calcium hydroxide gutta-percha points. PMID:22346155
Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
Warpage is often encountered during the injection moulding of thin-shell parts, depending on the process conditions. A statistical design-of-experiments method integrating finite element (FE) analysis, Moldflow analysis and response surface methodology (RSM) was investigated as a way to minimize the warpage values in x, y and z on thin-shell plastic parts. The battery cover of a remote controller is one such thin-shell plastic part produced by injection moulding. The optimum process conditions were determined so as to minimize warpage. Four parameters were considered in this study: packing pressure, cooling time, melt temperature and mould temperature. A two-level full factorial experimental design was conducted in Design-Expert for the RSM analysis to combine these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
Open-source platform to benchmark fingerprints for ligand-based virtual screening
2013-01-01
Similarity-search methods using molecular fingerprints are an important tool for ligand-based virtual screening. A huge variety of fingerprints exist and their performance, usually assessed in retrospective benchmarking studies using data sets with known actives and known or assumed inactives, depends largely on the validation data sets used and the similarity measure used. Comparing new methods to existing ones in any systematic way is rather difficult due to the lack of standard data sets and evaluation procedures. Here, we present a standard platform for the benchmarking of 2D fingerprints. The open-source platform contains all source code, structural data for the actives and inactives used (drawn from three publicly available collections of data sets), and lists of randomly selected query molecules to be used for statistically valid comparisons of methods. This allows the exact reproduction and comparison of results for future studies. The results for 12 standard fingerprints together with two simple baseline fingerprints assessed by seven evaluation methods are shown together with the correlations between methods. High correlations were found between the 12 fingerprints and a careful statistical analysis showed that only the two baseline fingerprints were different from the others in a statistically significant way. High correlations were also found between six of the seven evaluation methods, indicating that despite their seeming differences, many of these methods are similar to each other. PMID:23721588
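The similarity measure at the heart of such fingerprint benchmarks is most commonly the Tanimoto coefficient on bit-vector fingerprints. A minimal sketch with toy bitstrings (not real molecular fingerprints):

```python
# Tanimoto similarity between two bit-vector fingerprints:
# |A ∩ B| / |A ∪ B| over the sets of "on" bits (toy example).

def tanimoto(a, b):
    on_a = {i for i, bit in enumerate(a) if bit}
    on_b = {i for i, bit in enumerate(b) if bit}
    inter = len(on_a & on_b)
    return inter / (len(on_a) + len(on_b) - inter)

fp1 = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical 8-bit fingerprints
fp2 = [1, 0, 0, 1, 0, 1, 1, 0]
print(round(tanimoto(fp1, fp2), 3))   # → 0.6
```

In a retrospective benchmark, each query active is compared against actives and inactives with such a measure, and the ranking quality is then scored by the evaluation methods the platform provides.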
Statistical characterization of short wind waves from stereo images of the sea surface
NASA Astrophysics Data System (ADS)
Mironov, Alexey; Yurovskaya, Maria; Dulov, Vladimir; Hauser, Danièle; Guérin, Charles-Antoine
2013-04-01
We propose a methodology to extract short-scale statistical characteristics of the sea surface topography by means of stereo image reconstruction. The possibilities and limitations of the technique are discussed and tested on a data set acquired from an oceanographic platform at the Black Sea. The analysis shows that reconstruction of the topography based on the stereo method is an efficient way to derive non-trivial statistical properties of short and intermediate surface waves (say from 1 centimeter to 1 meter). Most technical issues pertaining to this type of dataset (limited range of scales, lacunarity of data or irregular sampling) can be partially overcome by appropriate processing of the available points. The proposed technique also allows one to avoid linear interpolation, which dramatically corrupts the properties of retrieved surfaces. The processing technique requires that the field of elevation be polynomially detrended, which has the effect of filtering out the large scales. Hence the statistical analysis can only address the small-scale components of the sea surface. The precise cut-off wavelength, which is approximately half the patch size, can be obtained by applying a high-pass frequency filter on the reference gauge time records. The results obtained for the one- and two-point statistics of small-scale elevations are shown to be consistent, at least in order of magnitude, with the corresponding gauge measurements as well as other experimental measurements available in the literature. The calculation of the structure functions provides a powerful tool to investigate spectral and statistical properties of the field of elevations. Experimental parametrization of the third-order structure function, the so-called skewness function, is one of the most important and original outcomes of this study. This function is of primary importance in analytical scattering models of the sea surface and was up to now unavailable in field conditions.
Due to the lack of precise reference measurements for the small-scale wave field, we could not quantify exactly the accuracy of the retrieval technique. However, it appeared clearly that the obtained accuracy is good enough for the estimation of second-order statistical quantities (such as the correlation function), acceptable for third-order quantities (such as the skewness function) and insufficient for fourth-order quantities (such as kurtosis). Therefore, the stereo technique in its present stage should not be thought of as a self-contained universal tool to characterize the surface statistics. Instead, it should be used in conjunction with other well-calibrated but sparse reference measurements (such as wave gauges) for cross-validation and calibration. It then completes the statistical analysis inasmuch as it provides a snapshot of the three-dimensional field and allows for the evaluation of higher-order spatial statistics.
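The structure functions discussed above can be estimated directly from a gridded elevation record. A hedged sketch, using a toy one-dimensional profile rather than the reconstructed surfaces: the order-p structure function is S_p(r) = <[eta(x+r) - eta(x)]^p>, where p=2 relates to the correlation function and p=3 is the skewness function.

```python
import math

def structure_function(eta, lag, p):
    """Empirical order-p structure function of a regularly sampled profile."""
    diffs = [(eta[i + lag] - eta[i]) ** p for i in range(len(eta) - lag)]
    return sum(diffs) / len(diffs)

# Toy "elevation" record standing in for a detrended stereo profile.
eta = [math.sin(0.3 * i) for i in range(200)]
for lag in (1, 5, 10):
    s2 = structure_function(eta, lag, 2)   # second order (correlation)
    s3 = structure_function(eta, lag, 3)   # third order (skewness function)
    print(lag, round(s2, 4), round(s3, 4))
```

On real 3D reconstructions the average runs over all point pairs at a given spatial lag, but the estimator is the same moment of elevation increments.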
ERIC Educational Resources Information Center
Sidorov, Oleg V.; Kozub, Lyubov' V.; Goferberg, Alexander V.; Osintseva, Natalya V.
2018-01-01
The article discusses the methodological approach to the technology of the educational experiment performance, the ways of the research data processing by means of research methods and methods of mathematical statistics. The article shows the integrated use of some effective approaches to the training of the students majoring in…
ERIC Educational Resources Information Center
Dancer, Diane; Morrison, Kellie; Tarr, Garth
2015-01-01
Peer-assisted study session (PASS) programs have been shown to positively affect students' grades in a majority of studies. This study extends that analysis in two ways: controlling for ability and other factors, with focus on international students, and by presenting results for PASS in business statistics. Ordinary least squares, random effects…
Anatomy of emotion: a 3D study of facial mimicry.
Ferrario, V F; Sforza, C
2007-01-01
Alterations in facial motion severely impair the quality of life and social interaction of patients, and an objective grading of facial function is necessary. A method for the non-invasive detection of 3D facial movements was developed. Sequences of six standardized facial movements (maximum smile; free smile; surprise with closed mouth; surprise with open mouth; right side eye closure; left side eye closure) were recorded in 20 healthy young adults (10 men, 10 women) using an optoelectronic motion analyzer. For each subject, 21 cutaneous landmarks were identified by 2-mm reflective markers, and their 3D movements during each facial animation were computed. Three repetitions of each expression were recorded (within-session error), and four separate sessions were used (between-session error). To assess the within-session error, the technical error of the measurement (random error, TEM) was computed separately for each sex, movement and landmark. To assess the between-session repeatability, the standard deviation among the mean displacements of each landmark (four independent sessions) was computed for each movement. TEM for the single landmarks ranged between 0.3 and 9.42 mm (intrasession error). The sex- and movement-related differences were statistically significant (two-way analysis of variance, p=0.003 for sex comparison, p=0.009 for the six movements, p<0.001 for the sex x movement interaction). Among four different (independent) sessions, the left eye closure had the worst repeatability, the right eye closure had the best one; the differences among various movements were statistically significant (one-way analysis of variance, p=0.041). In conclusion, the current protocol demonstrated a sufficient repeatability for a future clinical application. Great care should be taken to assure a consistent marker positioning in all the subjects.
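The technical error of the measurement (TEM) reported above is conventionally computed with Dahlberg's formula; a minimal sketch for paired repeat measurements (the study used three repetitions, and the numbers below are made up for illustration):

```python
import math

def tem(rep1, rep2):
    """Dahlberg technical error of measurement between paired repeats:
    sqrt(sum(d^2) / (2 n)), d = difference between the two repeats."""
    n = len(rep1)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(rep1, rep2)) / (2 * n))

# Hypothetical repeated landmark displacements (mm) for three landmarks.
print(round(tem([10.1, 12.3, 9.8], [10.4, 12.0, 10.0]), 3))   # → 0.191
```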
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
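The "Normal Distribution Estimates" calculation described above, finding the value corresponding to a cumulative probability given a sample mean and standard deviation, can be sketched with the Python standard library (an illustration, not the toolset's own code; mean and standard deviation are hypothetical):

```python
# Inverse-CDF lookup for a normal distribution: given mu, sigma and a
# cumulative probability, return the corresponding value.
from statistics import NormalDist

dist = NormalDist(mu=100.0, sigma=15.0)   # hypothetical sample mean / sd
value = dist.inv_cdf(0.95)                # value at cumulative prob 0.95
print(round(value, 2))                    # → 124.67
```

The same object's `cdf` method does the reverse lookup, which covers the complementary calculation the spreadsheet performs.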
Statistical hadronization and microcanonical ensemble
Becattini, F.; Ferroni, L.
2004-01-01
We present a Monte Carlo calculation of the microcanonical ensemble of the ideal hadron-resonance gas including all known states up to a mass of 1.8 GeV, taking into account quantum statistics. The computing method is a development of a previous one based on a Metropolis Monte Carlo algorithm, with the grand-canonical limit of the multi-species multiplicity distribution as the proposal matrix. The microcanonical average multiplicities of the various hadron species are found to converge to the canonical ones for moderately low values of the total energy. This algorithm opens the way for event generators based on the statistical hadronization model.
NASA Astrophysics Data System (ADS)
Guadagnini, A.; Riva, M.; Neuman, S. P.
2016-12-01
Environmental quantities such as log hydraulic conductivity (or transmissivity), Y(x) = ln K(x), and their spatial (or temporal) increments, ΔY, are known to be generally non-Gaussian. Documented evidence of such behavior includes symmetry of increment distributions at all separation scales (or lags) between incremental values of Y with sharp peaks and heavy tails that decay asymptotically as lag increases. This statistical scaling occurs in porous as well as fractured media characterized by either one or a hierarchy of spatial correlation scales. In hierarchical media one observes a range of additional statistical ΔY scaling phenomena, all of which are captured comprehensively by a novel generalized sub-Gaussian (GSG) model. In this model Y forms a mixture Y(x) = U(x) G(x) of single- or multi-scale Gaussian processes G having random variances, U being a non-negative subordinator independent of G. Elsewhere we developed ways to generate unconditional and conditional random realizations of isotropic or anisotropic GSG fields which can be embedded in numerical Monte Carlo flow and transport simulations. Here we present and discuss expressions for probability distribution functions of Y and ΔY as well as their lead statistical moments. We then focus on a simple flow setting of mean uniform steady state flow in an unbounded, two-dimensional domain, exploring ways in which non-Gaussian heterogeneity affects stochastic flow and transport descriptions. Our expressions represent (a) lead order autocovariance and cross-covariance functions of hydraulic head, velocity and advective particle displacement as well as (b) analogues of preasymptotic and asymptotic Fickian dispersion coefficients. We compare them with corresponding expressions developed in the literature for Gaussian Y.
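The heavy-tailed, sharply peaked behavior that the GSG mixture Y(x) = U(x) G(x) is designed to capture can be illustrated numerically. In this sketch, i.i.d. standard normal draws stand in for the spatially correlated Gaussian process G, and a lognormal draw stands in for the subordinator U — both choices are assumptions for the illustration, not the paper's model specification:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Stand-in for the Gaussian process G: i.i.d. standard normal values
G = rng.standard_normal(n)
# Non-negative subordinator U, independent of G (lognormal is one choice)
U = rng.lognormal(mean=0.0, sigma=0.5, size=n)

Y = U * G  # generalized sub-Gaussian mixture Y(x) = U(x) G(x)

# Excess kurtosis: 0 for a Gaussian, positive for the heavy-tailed mixture
def excess_kurtosis(v):
    v = v - v.mean()
    return (v**4).mean() / (v**2).mean() ** 2 - 3.0

k_G = excess_kurtosis(G)
k_Y = excess_kurtosis(Y)
print(k_G, k_Y)
```

The Gaussian sample has excess kurtosis near zero, while the mixture's is markedly positive, reflecting the sharp peak and heavy tails of the increment distributions described above.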
Effect of various putty-wash impression techniques on marginal fit of cast crowns.
Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael
2013-01-01
Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.
Cro, Suzie; Mehta, Saahil; Farhadi, Jian; Coomber, Billie; Cornelius, Victoria
2018-01-01
Essential strategies are needed to help reduce the number of post-operative complications and associated costs for breast cancer patients undergoing reconstructive breast surgery. Evidence suggests that local heat preconditioning could help improve the provision of this procedure by reducing skin necrosis. Before testing the effectiveness of heat preconditioning in a definitive randomised controlled trial (RCT), we must first establish the best way to measure skin necrosis and estimate the event rate using this definition. PREHEAT is a single-blind randomised controlled feasibility trial comparing local heat preconditioning, using a hot water bottle, against standard care on skin necrosis among breast cancer patients undergoing reconstructive breast surgery. The primary objective of this study is to determine the best way to measure skin necrosis and to estimate the event rate using this definition in each trial arm. Secondary feasibility objectives include estimating recruitment and 30 day follow-up retention rates, levels of compliance with the heating protocol, length of stay in hospital and the rates of surgical versus conservative management of skin necrosis. The information from these objectives will inform the design of a larger definitive effectiveness and cost-effectiveness RCT. This article describes the PREHEAT trial protocol and detailed statistical analysis plan, which includes the pre-specified criteria and process for establishing the best way to measure necrosis. This study will provide the evidence needed to establish the best way to measure skin necrosis, to use as the primary outcome in a future RCT to definitively test the effectiveness of local heat preconditioning. The pre-specified statistical analysis plan, developed prior to unblinded data extraction, sets out the analysis strategy and a comparative framework to support a committee evaluation of skin necrosis measurements. 
It will increase the transparency of the data analysis for the PREHEAT trial. ISRCTN ISRCTN15744669. Registered 25 February 2015.
A Survey of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, Chris; Holloway, C. M.
2003-01-01
Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.
Koplenig, Alexander; Meyer, Peter; Wolfer, Sascha; Müller-Spitzer, Carolin
2017-01-01
Languages employ different strategies to transmit structural and grammatical information. While, for example, grammatical dependency relationships in sentences are mainly conveyed by the ordering of the words for languages like Mandarin Chinese or Vietnamese, the word ordering is much less restricted for languages such as Inupiatun or Quechua, as these languages (also) use the internal structure of words (e.g. inflectional morphology) to mark grammatical relationships in a sentence. Based on a quantitative analysis of more than 1,500 unique translations of different books of the Bible in almost 1,200 different languages that are spoken as a native language by approximately 6 billion people (more than 80% of the world population), we present large-scale evidence for a statistical trade-off between the amount of information conveyed by the ordering of words and the amount of information conveyed by internal word structure: languages that rely more strongly on word order information tend to rely less on word structure information and vice versa. Put differently, if less information is carried within the word, more information has to be spread among words in order to communicate successfully. In addition, we find that, despite differences in the way information is expressed, there is also evidence for a trade-off between different books of the biblical canon that recurs with little variation across languages: the more informative the word order of the book, the less informative its word structure and vice versa. We argue that this might suggest that, on the one hand, languages encode information in very different (but efficient) ways. On the other hand, content-related and stylistic features are statistically encoded in very similar ways. PMID:28282435
Holt, Peter James Edward; Sinha, Sidhartha; Ozdemir, Baris Ata; Karthikesalingam, Alan; Poloniecki, Jan Dominik; Thompson, Matt Merfyn
2014-06-19
The quality of care delivered and clinical outcomes of care are of paramount importance. Wide variations in the outcome of emergency care have been suggested, but the scale of variation, and the way in which outcomes are inter-related, are poorly defined and are critical to understanding how best to improve services. This study quantifies the scale of variation in three outcomes for a contemporary cohort of patients undergoing emergency medical and surgical admissions. The way in which the outcomes of different diagnoses relate to each other is investigated. A retrospective study using the English Hospital Episode Statistics 2005-2010 with one-year follow-up for all patients with one of 20 of the commonest and highest-risk emergency medical or surgical conditions. The primary outcome was in-hospital all-cause risk-standardised mortality rate (in-RSMR). Secondary outcomes were 1-year all-cause risk-standardised mortality rate (1 yr-RSMR) and 28-day all-cause emergency readmission rate (RSRR). 2,406,709 adult patients underwent emergency medical or surgical admissions in the groups of interest. Clinically and statistically significant variations in outcome were observed between providers for all three outcomes (p < 0.001). For some diagnoses including heart failure, acute myocardial infarction, stroke and fractured neck of femur, more than 20% of hospitals lay above the upper 95% control limit and were statistical outliers. The risk-standardised outcomes within a given hospital for an individual diagnostic group were significantly associated with the aggregated outcome of the other clinical groups. Hospital-level risk-standardised outcomes for emergency admissions across a range of specialties vary considerably and cross traditional speciality boundaries. This suggests that global institutional infrastructure and processes of care influence outcomes.
The implications are far reaching, both in terms of investigating performance at individual hospitals and in understanding how hospitals can learn from the best performers to improve outcomes.
Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C
2015-01-01
Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175
1990-03-01
equation of the statistical energy analysis (SEA) using the procedure indicated in equation (13) [8, 9]. Similarly, one may state the quantities and...CONGRESS ON ACOUSTICS, July 24-31 1986, Toronto, Canada, Paper D6-1. 5. CUSCHIERI, J.M., Power flow as a complement to statistical energy analysis and..."Random response of identical one-dimensional subsystems", Journal of Sound and Vibration, 1980, Vol. 70, p. 343-353. 8. LYON, R.H., Statistical Energy Analysis of
Mortazavi, Vajihesadat; Fathi, Mohammadhosein; Ataei, Ebrahim; Khodaeian, Niloufar; Askari, Navid
2012-01-01
In this laboratory study, the shear bond strengths of three filled and one unfilled adhesive systems to enamel and dentine were compared. Forty-eight extracted intact noncarious human mandibular molars were randomly assigned to two groups of 24: one for bonding to enamel and the other for bonding to dentine. Buccal and lingual surfaces of each tooth were randomly assigned for application of one of the filled (Prime & Bond NT (PBNT), Optibond Solo Plus (OBSP), and Clearfil SE Bond (CSEB)) or the unfilled (Single Bond (SB)) adhesive systems (n = 12). A universal resin composite was placed into translucent plastic cylinders (3 mm in diameter and 2 mm in length), seated against the enamel and dentine surfaces, and polymerized for 40 seconds. Shear bond strength was determined using a universal testing machine, and the results were statistically analyzed using two-way ANOVA, one-way ANOVA, t-test, and Tukey HSD post hoc test with a 5% level of significance. There were no statistically significant differences in bond strength between the adhesive systems in enamel, but CSEB and SB exhibited significantly higher and lower bond strength to dentine, respectively, than the other tested adhesive systems, while there were no statistically significant differences between PBNT and OBSP. PMID:23209471
[Personal traits and a sense of job-related stress in a military aviation crew].
Cabarkapa, Milanko; Korica, Vesna; Rodjenkov, Sanja
2011-02-01
Accelerated technological and organizational changes in numerous professions lead to an increase in job-related stress. Since these changes are particularly common in military aviation, this study examined the way a military aviation crew experiences job-related stress during a regular aviation drill, depending on particular socio-demographic factors and personal traits. The modified Cooper questionnaire was used to examine the stress-related factors at work. The questionnaire was adapted for the aviation crew in the army environment. Personal characteristics were examined using the NEO-PI-R personality inventory. The study included 50 examinees (37 pilots and 13 other crew members) employed in the Serbian Army. The examinations were performed during routine physical examinations at the Institute for Aviation Medicine during the year 2007. Statistical analysis of the study results comprised descriptive analysis, one-way analysis of variance and correlation analysis. It was shown that the army aviation crew works under high stress. The highest stress values were found for the intrinsic factor (AS = 40.94) and role in organisation (AS = 39.92), while the lowest was found for the interpersonal relationship factor (AS = 29.98). The results also showed that some socio-demographic variables (such as younger age and shorter working experience) and neuroticism as a personality trait were in correlation with job-related stress. Stress evaluation and examination of certain personality characteristics can be used for the development of basic anti-stress programs and measures in order to achieve better psychological selection, adaptation, career leadership and organization of military pilots and other crew members.
Comparing Networks from a Data Analysis Perspective
NASA Astrophysics Data System (ADS)
Li, Wei; Yang, Jing-Yu
To probe network characteristics, the two predominant ways of network comparison are global property statistics and subgraph enumeration. However, global statistics convey limited information, and subgraph enumeration is computationally expensive. Here, we present an approach to compare networks from the perspective of data analysis. Initially, the approach projects each node of the original network as a high-dimensional data point, so that the network is seen as a cloud of data points. Then the dispersion information of the principal component analysis (PCA) projection of the generated data clouds can be used to distinguish networks. We applied this node projection method to the yeast protein-protein interaction networks and the Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. Identical results on different datasets from independent sources also indicated that the method is a robust and universal framework.
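The node-projection idea can be sketched in a few lines. The sketch below makes two simplifying assumptions not stated in the abstract: each node is represented by its adjacency-matrix row, and "dispersion information" is taken as the fraction of total variance captured by the top principal axes; the two toy graphs are illustrative, not the paper's datasets:

```python
import numpy as np

def pca_dispersion(adjacency, k=5):
    """Project each node as a high-dimensional point (its adjacency row),
    run PCA via SVD, and return the fraction of total variance captured
    by each of the top-k principal axes."""
    X = adjacency - adjacency.mean(axis=0)   # center the node cloud
    s = np.linalg.svd(X, compute_uv=False)   # singular values, descending
    var = s ** 2
    return var[:k] / var.sum()

rng = np.random.default_rng(0)
n = 60

# Toy network 1: Erdos-Renyi-style random graph (symmetric, no self-loops)
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Toy network 2: ring lattice, each node linked to neighbours at distance 1 and 2
R = np.zeros((n, n))
for i in range(n):
    for d in (1, 2):
        R[i, (i + d) % n] = R[(i + d) % n, i] = 1.0

disp_A, disp_R = pca_dispersion(A), pca_dispersion(R)
print(disp_A, disp_R)
```

The two dispersion profiles differ markedly: the variance of the random graph's node cloud is spread across many axes, while the regular lattice concentrates it in a few, which is the kind of signature the method uses to tell networks apart.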
Application of one-way ANOVA in completely randomized experiments
NASA Astrophysics Data System (ADS)
Wahid, Zaharah; Izwan Latiff, Ahmad; Ahmad, Kartini
2017-12-01
This paper describes an application of the statistical technique one-way ANOVA to completely randomized experiments with three replicates. The technique was applied to a single factor with four levels and multiple observations at each level. The aim of this study is to investigate the relationship between the chemical oxygen demand index and on-site location. Two different approaches are employed for the analyses: the critical-value approach and the p-value approach. The paper also presents the key assumptions that the data must satisfy for the technique to yield valid results. Pairwise comparisons by the Tukey method are also considered and discussed, to determine where the significant differences among the means lie after the ANOVA has been performed. The results revealed a statistically significant relationship between the chemical oxygen demand index and the on-site location.
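The two decision approaches described above (critical value and p-value) can be sketched for a single factor with four levels and three replicates. The numbers below are illustrative stand-ins, not the paper's chemical-oxygen-demand data:

```python
from scipy import stats

# One factor (location), four levels, three replicates each -- toy values
locations = [
    [12.1, 11.8, 12.5],
    [14.0, 13.6, 14.3],
    [11.5, 11.9, 11.2],
    [13.1, 13.4, 12.8],
]

f_stat, p_value = stats.f_oneway(*locations)

# p-value approach: reject H0 (equal means) if p < alpha
alpha = 0.05
reject_by_p = p_value < alpha

# critical-value approach: reject if F exceeds F(alpha; df1, df2)
df1 = len(locations) - 1                                 # 3
df2 = sum(len(g) for g in locations) - len(locations)    # 8
f_crit = stats.f.ppf(1 - alpha, df1, df2)
reject_by_crit = f_stat > f_crit

print(f_stat, p_value, f_crit, reject_by_p, reject_by_crit)
```

The two rules are mathematically equivalent and always reach the same decision; the p-value approach simply reports how extreme the observed F is, while the critical-value approach fixes the rejection threshold in advance.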
A framework for building hypercubes using MapReduce
NASA Astrophysics Data System (ADS)
Tapiador, D.; O'Mullane, W.; Brown, A. G. A.; Luri, X.; Huedo, E.; Osuna, P.
2014-05-01
The European Space Agency's Gaia mission will create the largest and most precise three dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalog will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of the deployment of the framework on a public cloud provider, benchmark against other popular solutions available (that are not always the best for such ad-hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis techniques workshops.
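The map/reduce pattern for building a multidimensional histogram can be sketched without any Hadoop machinery. The pure-Python stand-in below bins toy two-column records into hypercube cells; the field names, bin width, and sample values are assumptions for illustration, not the framework's actual interface:

```python
from collections import Counter
from itertools import chain

def map_phase(records, bin_width=1.0):
    """Map: emit ((binned magnitude, binned colour), 1) pairs."""
    for mag, colour in records:
        yield (int(mag // bin_width), int(colour // bin_width)), 1

def reduce_phase(pairs):
    """Reduce: sum the counts for each hypercube cell."""
    cells = Counter()
    for key, count in pairs:
        cells[key] += count
    return dict(cells)

# Toy "catalogue": (magnitude, colour index) rows split across 2 workers
partitions = [
    [(10.2, 0.3), (10.7, 0.4), (11.1, 1.2)],
    [(10.9, 0.1), (12.5, 1.8)],
]
mapped = chain.from_iterable(map_phase(p) for p in partitions)
hypercube = reduce_phase(mapped)
print(hypercube)
```

In a real MapReduce deployment the partitions live on different nodes and the framework shuffles the keyed pairs to the reducers; the logic per record is the same.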
Radiomic analysis in prediction of Human Papilloma Virus status.
Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu
2017-12-01
Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status is tested through an invasive lab test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. The quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans and analyzed using statistical feature selection and prediction methods. Our approach ranked highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis of the most relevant radiomic features distinguishing HPV-positive and HPV-negative subjects suggested that HPV-positive patients usually have smaller and simpler tumors.
Identifying city PV roof resource based on Gabor filter
NASA Astrophysics Data System (ADS)
Ruhang, Xu; Zhilin, Liu; Yong, Huang; Xiaoyu, Zhang
2017-06-01
To identify a city’s PV roof resources, the area and ownership distribution of residential buildings in an urban district should be assessed. For this assessment, analysing remote sensing data is a promising approach. Urban building roof area estimation is a major topic in remote sensing image information extraction. There are normally three ways to solve this problem: the first is pixel-based analysis, based on mathematical morphology or statistical methods; the second is object-based analysis, which can combine semantic information and expert knowledge; the third is a signal-processing method. This paper presents a Gabor-filter-based method. The results show that the method is fast and reasonably accurate.
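A Gabor filter is a Gaussian envelope multiplied by a sinusoidal carrier at a chosen orientation. The NumPy sketch below builds the real part of such a kernel; the parameter values are illustrative assumptions, not the paper's tuned settings:

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, wavelength=6.0, gamma=0.5):
    """Real part of a Gabor filter: a Gaussian envelope times a cosine
    carrier, oriented at angle theta (gamma sets the envelope aspect ratio)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / wavelength)
    return envelope * carrier

k = gabor_kernel()
# Convolving an image with a bank of such kernels (several theta values)
# highlights oriented, repetitive structure such as building roofs.
print(k.shape, k[7, 7])
```

In practice a filter bank over several orientations and wavelengths is applied, and the filter responses feed the roof-area classification step.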
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management.
- Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians
- Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more
- Deemphasizes computer coding in favor of basic principles
- Explains how to write out properly factored statistical expressions representing Bayesian models
Trial Sequential Analysis in systematic reviews with meta-analysis.
Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian
2017-03-06
Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D²) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance.
Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated that the Trial Sequential Analysis provides better control of type I errors and of type II errors than the traditional naïve meta-analysis. Trial Sequential Analysis represents analysis of meta-analytic data, with transparent assumptions, and better control of type I and type II errors than the traditional meta-analysis using naïve unadjusted confidence intervals.
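The diversity-adjusted required information size can be sketched numerically: compute a conventional two-sample size from α, β, the anticipated effect and variance, then inflate it by 1/(1 − D²). This is a simplified illustration with assumed numbers; TSA software uses more elaborate variance estimates:

```python
from scipy.stats import norm

def required_information_size(delta, sd, alpha=0.05, beta=0.10, diversity=0.0):
    """Total participants needed to detect a mean difference `delta`
    (per-arm standard deviation `sd`) at two-sided alpha with power
    1 - beta, inflated for the meta-analytic diversity D^2."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(1 - beta)
    n_fixed = 4 * (z_a + z_b) ** 2 * sd ** 2 / delta ** 2  # both arms combined
    return n_fixed / (1 - diversity)

n_plain = required_information_size(delta=0.5, sd=1.0)
n_adjusted = required_information_size(delta=0.5, sd=1.0, diversity=0.25)
print(round(n_plain), round(n_adjusted))
```

With D² = 0.25 the required information size grows by a third, which is why heterogeneous meta-analyses reach firm conclusions later than a naïve sample-size calculation suggests.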
The effect of single and repeated UVB radiation on rabbit cornea.
Fris, Miroslav; Tessem, May-Britt; Cejková, Jitka; Midelfart, Anna
2006-12-01
Cumulative effect of ultraviolet radiation (UVR) is an important aspect of UV corneal damage. The purpose of this study was to apply high resolution magic angle spinning proton nuclear magnetic resonance (HR-MAS 1H NMR) spectroscopy to evaluate the effect of single and repeated UV radiation exposure of the same overall dose on the rabbit cornea. Corneal surfaces of 24 normal rabbit eyes were examined for the effects of UVB exposure (312 nm). In the first group (UVB1), animals were irradiated with a single dose (3.12 J/cm2; 21 min) of UVB radiation. The animals in the second group (UVB2) were irradiated three times for 7 min every other day (dose of 1.04 J/cm2; days 1, 3, 5) to give the same overall dose (3.12 J/cm2). The third group served as an untreated control group. One day after the last irradiation, the animals were sacrificed, and the corneas were removed and frozen. HR-MAS 1H NMR spectra from intact corneas were obtained. Special grouping patterns among the tissue samples and the relative percentage changes in particular metabolite concentrations were evaluated using modern statistical methods (multivariate analysis, one-way ANOVA). The metabolic profile of both groups of UVB-irradiated samples was significantly different from the control corneas. Substantial decreases in taurine, hypo-taurine and choline-derivatives concentrations and substantial elevation in glucose and betaine levels were observed following the UVR exposure. There was no significant difference between the effect of a single and repeated UVB irradiation of the same overall dose. For the first time, the effects of single and repeated UVR doses on the metabolic profile of the rabbit cornea were analysed and compared. The combination of HR-MAS 1H NMR spectroscopy and modern statistical methods (multivariate analysis, one-way ANOVA) proved suitable to assess the overall view of the metabolic alterations in the rabbit corneal tissue following UVB radiation exposure.
The slip resistance of common footwear materials measured with two slipmeters.
Chang, W R; Matz, S
2001-12-01
The slip resistance of 16 commonly used footwear materials was measured with the Brungraber Mark II and the English XL on 3 floor surfaces under surface conditions of dry, wet, oily and oily wet. Three samples were used for each material combination and surface condition. The results of a one-way ANOVA indicated that the differences among different samples were statistically significant for a large number of material combinations and surface conditions. The results indicated that the ranking of materials based on their slip resistance values depends highly on the slipmeters, floor surfaces and surface conditions. For contaminated surfaces, including wet, oily and oily wet surfaces, the slip resistance obtained with the English XL was usually higher than that measured with the Brungraber Mark II. The correlation coefficients between the slip resistance obtained with these two slipmeters, calculated for different surface conditions, indicated a strong correlation with statistical significance.
Angular power spectrum in publicly released ALICE events
NASA Astrophysics Data System (ADS)
Llanes-Estrada, Felipe J.; Muñoz Martinez, Jose L.
2018-02-01
We study the particles emitted in the fireball following a relativistic heavy ion collision with the traditional angular analysis employed in cosmology and the Earth sciences, producing Mollweide plots of the number and pT distribution of a few actual, publicly released ALICE-collaboration events and calculating their angular power spectrum. We also examine the angular spectrum of a simple two-particle correlation. While this may not be the optimal way of analyzing heavy ion data, our intention is to provide a one-to-one comparison to analysis in cosmology. With the limited statistics at hand, we do not find evidence for acoustic peaks, but rather a decrease of C_l that is reminiscent of viscous attenuation, though subject to a strong effect from the rapidity acceptance which probably dominates (so we also subtract the m = 0 component). As an exercise, we still extract a characteristic Silk damping length (proportional to the square root of the viscosity over entropy density ratio) to illustrate the method. The absence of acoustic-like peaks is also compatible with a crossover from the QGP to the hadron gas (because a surface tension at domain boundaries would effect a restoring force that could have driven acoustic oscillations). Presently we do not understand a depression of the l = 6 multipole strength; perhaps ALICE could reexamine it with full statistics.
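For a set of emission directions, the angular power spectrum can be estimated without any cosmology toolkit using the spherical-harmonic addition theorem, C_l = (1/4π) Σ_{i,j} P_l(cos γ_ij), where γ_ij is the angle between directions i and j. The sketch below uses isotropic random directions as a stand-in for real event data (an assumption — this is not ALICE data):

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(7)
n = 300

# Toy emission directions, drawn isotropically on the sphere
phi = rng.uniform(0, 2 * np.pi, n)
cos_theta = rng.uniform(-1, 1, n)
sin_theta = np.sqrt(1 - cos_theta**2)
vecs = np.stack([sin_theta * np.cos(phi),
                 sin_theta * np.sin(phi),
                 cos_theta], axis=1)

# Addition theorem: C_l = (1/4pi) * sum_{i,j} P_l(cos gamma_ij)
cosg = np.clip(vecs @ vecs.T, -1, 1)   # pairwise cosines of separation
l_max = 8
C = np.empty(l_max + 1)
for l in range(l_max + 1):
    coeffs = np.zeros(l + 1)
    coeffs[l] = 1.0                    # coefficient vector selecting P_l
    C[l] = legendre.legval(cosg, coeffs).sum() / (4 * np.pi)

print(C)
```

By construction C_0 equals n²/4π exactly, and for an isotropic sample the higher multipoles fluctuate around the shot-noise level rather than showing acoustic-peak structure.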
Comparative analysis of the performance of One-Way and Two-Way urban road networks
NASA Astrophysics Data System (ADS)
Gheorghe, Carmen
2017-10-01
The fact that the number of vehicles is increasing year after year represents a challenge in road traffic management, because traffic must be adjusted to prevent incidents while using largely the same road infrastructure. At present, a one-way road network provides efficient traffic flow for vehicles but is not ideal for pedestrians. Therefore, a proper solution must be found and applied when and where necessary. Replacing a one-way road network with a two-way network may be a viable solution, especially in areas with high pedestrian traffic. The paper aims to highlight the influence of both one-way and two-way urban road networks through experimental research performed using traffic data collected in the field. Each of the two scenarios analyzed was based on the same traffic data, the same geometrical conditions of the road (lane width, total road segment width, road slopes, total length of the road network), and the same signaling conditions (signalized intersection or roundabout). The two-way scenario reveals changes in performance parameters such as average delay, average number of stops, average stop delay, and average vehicle speed. Based on the values obtained, it was possible to perform a comparative analysis between the real (one-way) scenario and the theoretical (two-way) scenario.
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.
2011-01-01
Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package based on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing its vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.
Utilization of the Deep Space Atomic Clock for Europa Gravitational Tide Recovery
NASA Technical Reports Server (NTRS)
Seubert, Jill; Ely, Todd
2015-01-01
Estimation of Europa's gravitational tide can provide strong evidence of the existence of a subsurface liquid ocean. Due to limited close-approach tracking data, a Europa flyby mission suffers strong coupling between the gravity solution quality and the tracking data quantity and quality. This work explores utilizing Low Gain Antennas (LGAs) with the Deep Space Atomic Clock (DSAC) to provide abundant, high-accuracy, uplink-only radiometric tracking data. DSAC's performance, expected to exhibit an Allan deviation of less than 3e-15 at one day, provides long-term stability and accuracy on par with the Deep Space Network ground clocks, enabling one-way radiometric tracking data with accuracy equivalent to that of its two-way counterpart. The feasibility of uplink-only Doppler tracking via the coupling of LGAs and DSAC, and the expected Doppler data quality, are presented. Violations of the Kalman filter's linearization assumptions when state perturbations are included in the flyby analysis result in poor determination of the Europa gravitational tide parameters. B-plane targeting constraints are statistically determined, and a solution to the linearization issues via pre-flyby approach orbit determination is proposed and demonstrated.
Ordinal pattern statistics for the assessment of heart rate variability
NASA Astrophysics Data System (ADS)
Graff, G.; Graff, B.; Kaczkowska, A.; Makowiec, D.; Amigó, J. M.; Piskorski, J.; Narkiewicz, K.; Guzik, P.
2013-06-01
The recognition of all main features of a healthy heart rhythm (the so-called sinus rhythm) is still one of the biggest challenges in contemporary cardiology. Recently the interesting physiological phenomenon of heart rate asymmetry has been observed. This phenomenon is related to unbalanced contributions of heart rate decelerations and accelerations to heart rate variability. In this paper we apply methods based on the concept of ordinal pattern to the analysis of electrocardiograms (inter-peak intervals) of healthy subjects in the supine position. This way we observe new regularities of the heart rhythm related to the distribution of ordinal patterns of lengths 3 and 4.
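Ordinal-pattern analysis maps each window of the inter-peak (RR) series to the permutation that sorts it; counting pattern frequencies, and summarizing them with, e.g., permutation entropy, is then straightforward. A generic sketch (not the authors' implementation):

```python
import numpy as np
from collections import Counter
from math import log, factorial

def ordinal_patterns(series, length):
    """Frequency counts of ordinal patterns of the given length."""
    series = np.asarray(series)
    windows = [tuple(np.argsort(series[i:i + length]))
               for i in range(len(series) - length + 1)]
    return Counter(windows)

def permutation_entropy(series, length):
    """Normalized permutation entropy in [0, 1]."""
    counts = ordinal_patterns(series, length)
    total = sum(counts.values())
    h = -sum((c / total) * log(c / total) for c in counts.values())
    return h / log(factorial(length))

# A strictly increasing RR series uses a single pattern (zero entropy);
# an irregular series spreads over several patterns.
rr_monotone = [800, 810, 820, 830, 840, 850]
rr_varied = [800, 760, 820, 790, 845, 770, 812]
```

Asymmetry between decelerations and accelerations shows up as unequal frequencies of mirror-image patterns (e.g. (0,1,2) versus (2,1,0)).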
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berco, Dan, E-mail: danny.barkan@gmail.com; Tseng, Tseung-Yuen, E-mail: tseng@cc.nctu.edu.tw
This study presents an evaluation method for resistive random access memory retention reliability based on the Metropolis Monte Carlo algorithm and Gibbs free energy. The method, which does not rely on time evolution, provides an extremely efficient way to compare the relative retention properties of metal-insulator-metal structures. It requires a small number of iterations and may be used for statistical analysis. The presented approach is used to compare the relative robustness of a single-layer ZrO{sub 2} device with a double-layer ZnO/ZrO{sub 2} device, and obtains results which are in good agreement with experimental data.
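The core of a Metropolis Monte Carlo evaluation is the standard acceptance rule driven by a free-energy difference. A generic sketch of that rule and a minimal chain follows; the quadratic "free energy", proposal step, and temperature are placeholder assumptions for illustration, not the paper's device model:

```python
import math
import random

def metropolis_accept(delta_g, kt, rng=random):
    """Standard Metropolis criterion: always accept a move that lowers the
    Gibbs free energy; accept an uphill move with probability exp(-dG/kT)."""
    if delta_g <= 0:
        return True
    return rng.random() < math.exp(-delta_g / kt)

def run_chain(energy_fn, proposal_fn, state, kt, steps, rng):
    """Tiny Metropolis Monte Carlo loop over an abstract state."""
    e = energy_fn(state)
    for _ in range(steps):
        cand = proposal_fn(state, rng)
        e_cand = energy_fn(cand)
        if metropolis_accept(e_cand - e, kt, rng):
            state, e = cand, e_cand
    return state, e

# Example: minimize a 1-D quadratic "free energy" by a random walk; at low
# kT the chain settles near the minimum at x = 3.
rng = random.Random(42)
state, e = run_chain(lambda x: (x - 3.0) ** 2,
                     lambda x, r: x + r.uniform(-0.5, 0.5),
                     state=0.0, kt=0.01, steps=2000, rng=rng)
```

Because acceptance depends only on free-energy differences between configurations, comparisons of this kind need no physical time axis, which is consistent with the abstract's point about not relying on time evolution.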
NASA Astrophysics Data System (ADS)
Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.
2007-03-01
Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high-resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor-intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer-aided techniques are necessary, particularly texture analysis techniques that classify various lung tissue types. Second- and higher-order statistics, which capture the spatial variation of the intensity values, are good discriminatory features for various textures. The intensity values in lung CT scans span the range [-1024, 1024]. Calculating second-order statistics over this full range is too computationally intensive, so the data are typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray-level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second- and higher-order statistics for more accurate quantification of diffuse lung disease.
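Optimal nonlinear binning of a gray-level histogram can be posed as a dynamic program over contiguous ranges. The sketch below minimizes total weighted within-bin variance in O(n²k) time; it illustrates the general technique, not necessarily the paper's exact objective function:

```python
import numpy as np

def optimal_bins(values, weights, k):
    """Dynamic programming: split the sorted `values` (with nonnegative
    `weights`, e.g. histogram counts) into k contiguous bins minimizing the
    total weighted within-bin sum of squared deviations from the bin mean."""
    n = len(values)
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    W = np.concatenate([[0.0], np.cumsum(weights)])
    S = np.concatenate([[0.0], np.cumsum(weights * values)])
    Q = np.concatenate([[0.0], np.cumsum(weights * values ** 2)])

    def sse(i, j):  # weighted SSE of a single bin covering values[i:j]
        w = W[j] - W[i]
        if w == 0.0:
            return 0.0
        s = S[j] - S[i]
        return (Q[j] - Q[i]) - s * s / w

    dp = np.full((n + 1, k + 1), np.inf)
    back = np.zeros((n + 1, k + 1), dtype=int)
    dp[0][0] = 0.0
    for j in range(1, n + 1):
        for b in range(1, min(k, j) + 1):
            for i in range(b - 1, j):
                c = dp[i][b - 1] + sse(i, j)
                if c < dp[j][b]:
                    dp[j][b], back[j][b] = c, i
    # Walk back to recover the bin boundaries.
    cuts, j = [], n
    for b in range(k, 0, -1):
        i = back[j][b]
        cuts.append((i, j))
        j = i
    return dp[n][k], cuts[::-1]

# Example: gray levels 0..63 whose counts cluster in four tight pairs; the
# optimal 4-bin split isolates each pair (cost = 4 * 0.5 = 2.0).
levels = np.arange(64)
counts = np.zeros(64)
counts[[0, 1, 20, 21, 40, 41, 60, 61]] = 1
cost, bins = optimal_bins(levels, counts, k=4)
```

Because the cost of every candidate bin comes from three prefix sums, each subproblem evaluation is O(1), which is what makes the DP practical even for a full 2049-level CT intensity range.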
Did You Hear the One about the Professor?
ERIC Educational Resources Information Center
Bartlett, Thomas
2003-01-01
Describes the ways in which a professor of statistics uses humor in the classroom. Ronald A. Berk uses humor as a systematic teaching tool even though some other faculty and administrators consider his approach frivolous. (SLD)
McLeod, Lauren; Hernández, Ivonne A; Heo, Giseon; Lagravère, Manuel O
2016-09-01
The aim of this study was to determine the presence of condylar spatial changes in patients undergoing rapid maxillary expansion treatment compared to a control group. Thirty-seven patients with maxillary transverse deficiency (11-17 years old) were randomly allocated into two groups (one treatment group, receiving a tooth-borne expander [hyrax], and one control group). Cone-beam computed tomography (CBCT) scans were obtained from each patient at two time points (T1, initial; T2, at appliance removal at 6 months). CBCTs were analyzed using AVIZO software, and landmarks were placed on the upper first molars and premolars, cranial base, condyles, and glenoid fossa. Descriptive statistics, intraclass correlation coefficients, and one-way ANOVA were used to determine whether the condyle position changed with respect to the glenoid fossa and cranial base and whether there was a statistically significant difference between groups. Descriptive statistics show that changes in the condyle position with respect to the glenoid fossa were minor in both groups (<1.9 mm on average for both groups). The largest difference in both groups was found when measuring the distance between the left and right condyle heads. When comparing changes between the groups, no statistically significant difference was found in the condylar changes (at the P<0.05 level). Rapid maxillary expansion treatment has mild effects on condylar position. Nevertheless, these changes do not differ significantly from controls, and thus do not constitute a limitation for applying this treatment. Copyright © 2016 CEO. Published by Elsevier Masson SAS. All rights reserved.
The impact of clinical use on the torsional behavior of Reciproc and WaveOne instruments
de MAGALHÃES, Rafael Rodrigues Soares; BRAGA, Lígia Carolina Moreira; PEREIRA, Érika Sales Joviano; PEIXOTO, Isabella Faria da Cunha; BUONO, Vicente Tadeu Lopes; BAHIA, Maria Guiomar de Azevedo
2016-01-01
ABSTRACT Torsional overload is a fracture-relevant parameter for instruments in single-file techniques. Objective The aim of this study was to assess the influence of clinical use, in vivo, on the torsional behavior of Reciproc and WaveOne instruments, considering the possibility that they degrade with use. Material and Methods The diameter at each millimeter, pitch length, and cross-sectional area at 3 mm from the tip were determined for both types of instruments. Twenty-four instruments, size 25, 0.08 taper, of each system were divided into two groups (n=12 each): a Control Group (CG), in which new Reciproc (RC) and WaveOne Primary (WO) instruments were tested in torsion until rupture based on ISO 3630-1; and an Experimental Group (EG), in which each new instrument was clinically used to clean and shape the root canals of one molar. After clinical use, the instruments were analyzed using optical and scanning electron microscopy and subsequently tested in torsion until fracture. Data were analyzed using one-way analysis of variance at α=.05. Results WO instruments showed significantly higher mean values of cross-sectional area A3 (P<0.001) and smaller pitch lengths than RC instruments, with no statistically significant difference in diameter at D3 (P=0.521). No significant difference in torsional resistance between the new RC and WO instruments (P=0.134) was found. Clinical use tended to reduce the maximum torque of the analyzed instruments, but no statistically significant difference was observed between them (P=0.327). During the preparation of the root canals, two RC instruments fractured, and longitudinal and transverse cracks in RC and WO instruments were observed by SEM analysis. Conclusion After clinical use, no statistically significant reduction in torsional resistance was observed. PMID:27556200
Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.
Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos
To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P
2015-05-01
The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates river flow on an hourly basis accurately and efficiently. This model is based on a methodology that attempts to resolve a very difficult problem: the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided into two subsets: one for the dry and one for the wet periods of the hydrological year. This way, two ANNs were created, trained, tested, and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component, a statistical downscaling tool was used to create meteorological data according to the higher- and lower-emission climate change scenarios A2 and B1. These data are used as input to the ANN for forecasting river flow for the next two decades. The final component is the application of a meteorological index to the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events. Copyright © 2015 Elsevier Ltd. All rights reserved.
Natural time analysis of critical phenomena: The case of pre-fracture electromagnetic emissions
NASA Astrophysics Data System (ADS)
Potirakis, S. M.; Karadimitrakis, A.; Eftaxias, K.
2013-06-01
Criticality of complex systems reveals itself in various ways. One way to monitor a system at a critical state is to analyze its observable manifestations using the recently introduced method of natural time. Pre-fracture electromagnetic (EM) emissions, in agreement with laboratory experiments, have been consistently detected in the MHz band prior to significant earthquakes. It has been proposed that these emissions stem from the fracture of the heterogeneous materials surrounding the strong entities (asperities) distributed along the fault, preventing relative slipping. It has also been proposed that the fracture of heterogeneous material can be described in analogy to critical phase transitions in statistical physics. In this work, natural time analysis is applied for the first time to pre-fracture MHz EM signals, revealing their critical nature. Seismicity and pre-fracture EM emissions should be two sides of the same coin in the earthquake generation process. Therefore, we also examine the corresponding foreshock seismic activity, as another manifestation of the same complex system at a critical state. We conclude that the foreshock seismicity data present criticality features as well.
An Empirical Taxonomy of Hospital Governing Board Roles
Lee, Shoou-Yih D; Alexander, Jeffrey A; Wang, Virginia; Margolin, Frances S; Combes, John R
2008-01-01
Objective To develop a taxonomy of governing board roles in U.S. hospitals. Data Sources 2005 AHA Hospital Governance Survey, 2004 AHA Annual Survey of Hospitals, and Area Resource File. Study Design A governing board taxonomy was developed using cluster analysis. Results were validated and reviewed by industry experts. Differences in hospital and environmental characteristics across clusters were examined. Data Extraction Methods One thousand three hundred thirty-four hospitals with complete information on the study variables were included in the analysis. Principal Findings Five distinct clusters of hospital governing boards were identified. Statistical tests showed that the five clusters had high internal reliability and high internal validity. Statistically significant differences in hospital and environmental conditions were found among clusters. Conclusions The developed taxonomy provides policy makers, health care executives, and researchers a useful way to describe and understand hospital governing board roles. The taxonomy may also facilitate valid and systematic assessment of governance performance. Further, the taxonomy could be used as a framework for governing boards themselves to identify areas for improvement and direction for change. PMID:18355260
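Cluster-analysis taxonomies of this kind are typically built with an algorithm such as k-means; the study does not specify its clustering variant here, so the following is a generic Lloyd's-algorithm sketch with toy data:

```python
import numpy as np

def kmeans(X, centers, iters=100):
    """Lloyd's algorithm: alternate nearest-center assignment and
    centroid update until the centers stop moving."""
    centers = np.asarray(centers, dtype=float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Distance from every observation to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == c].mean(axis=0) if np.any(labels == c)
                        else centers[c] for c in range(len(centers))])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy example: two well-separated groups of "governance profiles"
# (synthetic 2-D feature vectors, not the survey data).
X = np.array([[0.0, 0.1], [0.1, 0.0], [0.2, 0.1],
              [5.0, 5.1], [5.1, 5.0], [4.9, 5.2]])
labels, centers = kmeans(X, centers=[[0.0, 0.0], [5.0, 5.0]])
```

In practice the cluster count (five, in the study) would be chosen by comparing solutions for several k, followed by the kind of reliability and validity checks the abstract describes.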
Hailu, Fikadu Balcha; Kassahun, Chanyalew Worku; Kerie, Mirkuzie Woldie
2016-01-01
Nurse-physician communication has been shown to have a significant impact on the job satisfaction and retention of staff. In settings where it has been studied, communication failure between nurses and physicians was found to be one of the leading causes of preventable patient injuries, complications, deaths, and medical malpractice claims. The objective of this study was to determine the perceptions of nurses and physicians towards nurse-physician communication in patient care, and associated factors, in public hospitals of Jimma zone, southwest Ethiopia. An institution-based cross-sectional survey was conducted from March 10 to April 16, 2014 among 341 nurses and 168 physicians working in public hospitals in Jimma zone. Data were collected using a pre-tested self-administered questionnaire, entered into EpiData version 3.1, and exported to Statistical Package for the Social Sciences (SPSS) version 16.0 for analysis. Factor analysis was carried out. Descriptive statistics, independent-sample t-tests, linear regression, and one-way analysis of variance were used. Variables with P-value < 0.05 were considered statistically significant. The response rate of the study was 91.55%. The mean perceived nurse-physician communication scores were 50.88±19.7% for perceived professional respect and satisfaction, and 48.52±19.7% for perceived openness and sharing of patient information in nurse-physician communication. Age, salary, and organizational factors were statistically significant predictors of perceived respect and satisfaction, whereas sex, working hospital, work attitude, individual factors, and organizational factors were significant predictors of perceived openness and sharing of patient information in nurse-physician communication during patient care. The mean perceived level of nurse-physician communication was lower among nurses than among physicians, a gap that warrants attention.
Hence, our findings suggest the need to develop and implement nurse-physician communication improvement strategies to prevent communication mishaps in patient care.
Chopra, Karan; Gowda, Arvind U; Morrow, Chris; Holton, Luther; Singh, Devinder P
2016-04-01
Complex abdominal wall reconstruction is beset by postoperative complications. A recent meta-analysis comparing the use of closed-incision negative-pressure therapy to standard dressings found a statistically significant reduction in surgical-site infection. The use of closed-incision negative-pressure therapy is gaining acceptance in this population; however, the economic impact of this innovative dressing remains unknown. In this study, a cost-utility analysis was performed assessing closed-incision negative-pressure therapy and standard dressings following closure of abdominal incisions in high-risk patients. Cost-utility methodology involved reviewing literature related to closed-incision negative-pressure therapy in abdominal wall surgery, obtaining utility estimates to calculate quality-adjusted life-year scores for successful surgery and surgery complicated by surgical-site infection, summing costs using Medicare Current Procedural Terminology codes, and creating a decision tree illuminating the most cost-effective dressing strategy. One-way sensitivity analysis was performed to assess the robustness of the results. The aforementioned meta-analysis comparing closed-incision negative-pressure therapy to standard dressings included a subset of five studies assessing abdominal wall surgery in 829 patients (260 closed-incision negative-pressure therapy and 569 standard dressings). Decision tree analysis revealed an estimated savings of $1546.52 and a gain of 0.0024 quality-adjusted life-year with closed-incision negative-pressure therapy compared with standard dressings; therefore, closed-incision negative-pressure therapy is a dominant treatment strategy. One-way sensitivity analysis revealed that closed-incision negative-pressure therapy is a cost-effective option when the surgical-site infection rate is greater than 16.39 percent. The use of closed-incision negative-pressure therapy is cost-saving following closure of abdominal incisions in high-risk patients.
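The decision-tree comparison reduces to an expected cost (dressing cost plus probability-weighted infection cost) for each strategy, and the one-way sensitivity threshold is the SSI rate at which the two expected costs are equal. A sketch with purely hypothetical inputs follows; the costs, relative risk, and resulting 0.08 threshold below are illustrative assumptions, not the study's figures:

```python
def expected_cost(dressing_cost, ssi_rate, ssi_cost):
    """Expected cost of a strategy: dressing plus probability-weighted SSI cost."""
    return dressing_cost + ssi_rate * ssi_cost

# Hypothetical inputs (illustrative only, not the study's data).
C_NPWT, C_STD = 500.0, 100.0   # dressing costs per patient
C_SSI = 10_000.0               # incremental cost of one SSI
RR = 0.5                       # relative risk of SSI with NPWT vs standard

def threshold_ssi_rate(c_npwt, c_std, c_ssi, rr):
    """SSI rate at which closed-incision NPWT breaks even with standard
    dressings: c_npwt + rr*p*c_ssi = c_std + p*c_ssi."""
    return (c_npwt - c_std) / ((1.0 - rr) * c_ssi)

p_star = threshold_ssi_rate(C_NPWT, C_STD, C_SSI, RR)  # 0.08 with these inputs
# Above the threshold, NPWT has the lower expected cost.
npwt_cheaper = (expected_cost(C_NPWT, RR * 0.10, C_SSI)
                < expected_cost(C_STD, 0.10, C_SSI))
```

One-way sensitivity analysis simply sweeps a single input (here, the baseline SSI rate) across a plausible range while holding the others fixed, and reports where the preferred strategy flips.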
Lepesqueur, Laura Soares; de Figueiredo, Viviane Maria Gonçalves; Ferreira, Leandro Lameirão; Sobrinho, Argemiro Soares da Silva; Massi, Marcos; Bottino, Marco Antônio; Nogueira Junior, Lafayette
2015-01-01
To determine the effect of maintaining torque after mechanical cycling of abutment screws that are coated with diamondlike carbon and coated with diamondlike carbon doped with diamond nanoparticles, with external and internal hex connections. Sixty implants were divided into six groups according to the type of connection (external or internal hex) and the type of abutment screw (uncoated, coated with diamondlike carbon, and coated with diamondlike carbon doped with diamond nanoparticles). The implants were inserted into polyurethane resin and crowns of nickel chrome were cemented on the implants. The crowns had a hole for access to the screw. The initial torque and the torque after mechanical cycling were measured. The torque values maintained (in percentages) were evaluated. Statistical analysis was performed using one-way analysis of variance and the Tukey test, with a significance level of 5%. The largest torque value was maintained in uncoated screws with external hex connections, a finding that was statistically significant (P = .0001). No statistically significant differences were seen between the groups with and without coating in maintaining torque for screws with internal hex connections (P = .5476). After mechanical cycling, the diamondlike carbon with and without diamond doping on the abutment screws showed no improvement in maintaining torque in external and internal hex connections.
Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Ma, Hsin-I; Hsu, Hsian-He; Juan, Chun-Jung
2018-01-01
We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-slice magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and by cuboidal, ellipsoidal, Linskey's, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement were tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests, with P < 0.05 considered statistically significant. Overall tumor size was 4.80 ± 6.8 mL (mean ± standard deviation). All ICCs were no less than 0.992, indicating high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, and the ellipsoidal and Linskey's formulas, significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). The ice cream cone method and the other two-component formulas, including the ellipsoidal and Linskey's formulas, allow vestibular schwannoma volume to be estimated more accurately than all one-component formulas.
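The competing estimators are closed-form volume formulas applied to measured diameters. Plausible forms are sketched below; in particular, modeling the "ice cream cone" as a cone capped by a hemisphere is an assumed interpretation, and none of these definitions are taken verbatim from the paper:

```python
import math

def cuboidal(d1, d2, d3):
    """Bounding-box style estimate: product of the three diameters."""
    return d1 * d2 * d3

def ellipsoidal(d1, d2, d3):
    """Ellipsoid from three orthogonal diameters: (pi/6) * d1 * d2 * d3."""
    return math.pi / 6.0 * d1 * d2 * d3

def spherical(d_mean):
    """Sphere from a single mean diameter."""
    return math.pi / 6.0 * d_mean ** 3

def ice_cream_cone(r, h):
    """Two-component 'ice cream cone': a cone of height h capped by a
    hemisphere, both of radius r (an assumed interpretation)."""
    return math.pi * r ** 2 * h / 3.0 + 2.0 * math.pi * r ** 3 / 3.0
```

For equal diameters the cuboidal value exceeds the ellipsoidal one by a factor of 6/π ≈ 1.9, which is consistent with the direction of the overestimation the abstract reports for cuboidal formulas.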
Singh, Abhishek; Arunagiri, Doraiswamy; Pushpa, Shankarappa; Sawhny, Asheesh; Misra, Abhinav; Khetan, Kirti
2015-01-01
The purpose of this ex vivo study was to evaluate and compare the weight of debris and volume of irrigant extruded apically from teeth using different preparation techniques. Thirty extracted human mandibular premolars with single canals and similar lengths were instrumented using hand ProTaper F2 (25, 0.08; Dentsply Maillefer, Ballaigues, Switzerland), M-two (25, 0.06; VDW, Munich, Germany) and WaveOne Primary (25, 0.08; Dentsply Maillefer, Ballaigues, Switzerland). Debris and irrigant extruded during instrumentation were collected into preweighed Eppendorf tubes. The volume of the irrigant was measured, and then the tubes were stored in an incubator at 70°C for 2 days. The Eppendorf tubes were weighed to obtain the final weight when the extruded debris was included. Three consecutive weights were obtained for each tube. Data were statistically analyzed by one-way analysis of variance and Student's t-test. There were no statistically significant differences among the groups. The WaveOne reciprocating system showed the maximum amount of apical extrusion of debris and irrigant among all the groups. The least amount of debris and irrigant was observed in ProTaper hand instrument (P > 0.05). All instrumentation techniques were associated with debris and irrigant extrusion.
Heritability of antisocial behaviour at 9: do callous-unemotional traits matter?
Viding, Essi; Jones, Alice P; Frick, Paul J; Moffitt, Terrie E; Plomin, Robert
2008-01-01
A previous finding from our group indicated that teacher-rated antisocial behaviour (AB) among 7-year-olds is particularly heritable in the presence of callous-unemotional (CU) traits. Using a sample of 1865 same-sex twin pairs, we employed DeFries-Fulker extremes analysis to investigate whether teacher-rated AB with/without CU traits also shows aetiological differences among 9-year-olds. Furthermore, we assessed whether the differences in the magnitude of heritability would be evident even when hyperactive symptoms were controlled for in the statistical analysis. AB among 9-year-olds was more heritable with than without concomitant CU. The heritability difference was even more pronounced in magnitude when hyperactive symptoms were controlled. CU traits thus appear to index one valid way of sub-typing children with early-onset AB.
Yu, Marcia M L; Sandercock, P Mark L
2012-01-01
During the forensic examination of textile fibers, fibers are usually mounted on glass slides for visual inspection and identification under the microscope. One method that has the capability to accurately identify single textile fibers without subsequent demounting is Raman microspectroscopy. The effect of the mountant Entellan New on the Raman spectra of fibers was investigated to determine if it is suitable for fiber analysis. Raman spectra of synthetic fibers mounted in three different ways were collected and subjected to multivariate analysis. Principal component analysis score plots revealed that while spectra from different fiber classes formed distinct groups, fibers of the same class formed a single group regardless of the mounting method. The spectra of bare fibers and those mounted in Entellan New were found to be statistically indistinguishable by analysis of variance calculations. These results demonstrate that fibers mounted in Entellan New may be identified directly by Raman microspectroscopy without further sample preparation. © 2011 American Academy of Forensic Sciences.
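The principal component analysis step, which projects each spectrum onto a few orthogonal components before building score plots, can be sketched with an SVD. This is a generic implementation, not the authors' software pipeline, and the toy "spectra" are synthetic:

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the mean-centered data matrix.
    Returns (scores, components, explained_variance_ratio)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2 / (len(X) - 1)
    ratio = var / var.sum()
    scores = U[:, :n_components] * s[:n_components]
    return scores, Vt[:n_components], ratio[:n_components]

# Toy "spectra": every row is a scaled copy of one base spectrum, so a
# single principal component explains essentially all the variance.
base = np.array([0.1, 0.9, 0.4, 0.7, 0.2])
X = np.outer([1.0, 2.0, 3.0, 4.0], base)
scores, comps, ratio = pca(X, n_components=2)
```

In a score plot like the one described in the abstract, spectra of the same fiber class fall into one cluster regardless of the mounting method if the mountant contributes negligible variance.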
Association Analysis in Rice: From Application to Utilization
Zhang, Peng; Zhong, Kaizhen; Shahid, Muhammad Qasim; Tong, Hanhua
2016-01-01
Association analysis based on linkage disequilibrium (LD) is an efficient way to dissect complex traits and to identify gene functions in rice. Although association analysis is an effective way to construct fine maps for quantitative traits, there are a few issues which need to be addressed. In this review, we will first summarize the type, structure, and LD level of populations used for association analysis in rice, and then discuss the genotyping methods and statistical approaches used for association analysis in rice. Moreover, we will review the current shortcomings and benefits of association analysis as well as specific types of future research to overcome these shortcomings. Furthermore, we will analyze the reasons for the underutilization of association analysis results in rice breeding. PMID:27582745
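The LD statistic underlying such association mapping can be made concrete with a small sketch: the squared correlation r² between two biallelic loci, computed from phased two-locus haplotype frequencies. The haplotype sample below is invented for illustration and has no connection to any rice data set.

```python
def ld_r_squared(haplotypes, a="A", b="B"):
    """Squared LD correlation r^2 between two biallelic loci,
    from phased two-locus haplotypes. `a` and `b` are the
    reference alleles at locus 1 and locus 2."""
    n = len(haplotypes)
    p_a = sum(h[0] == a for h in haplotypes) / n
    p_b = sum(h[1] == b for h in haplotypes) / n
    p_ab = sum(h == (a, b) for h in haplotypes) / n
    d = p_ab - p_a * p_b                      # disequilibrium coefficient D
    return d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Ten hypothetical phased haplotypes at two linked loci
haps = [("A", "B")] * 4 + [("a", "b")] * 4 + [("A", "b"), ("a", "B")]
r2 = ld_r_squared(haps)
```

Association mapping power decays as r² between marker and causal locus drops, which is why the population's LD level, summarized in the review, matters so much.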
Wang, Cheng; Peng, Jingjin; Kuang, Yanling; Zhang, Jiaqiang; Dai, Luming
2017-01-01
Pleural effusion is a common clinical manifestation with various causes. Current diagnostic and therapeutic methods have exhibited numerous limitations. By analyzing dynamic changes in low molecular weight catabolites, metabolomics has been widely applied to various types of disease and has provided platforms to identify many novel biomarkers. However, to the best of our knowledge, there are few studies regarding metabolic profiling of pleural effusion. In the current study, 58 pleural effusion samples were collected, among which 20 were malignant pleural effusions, 20 were tuberculous pleural effusions and 18 were transudative pleural effusions. The small molecule metabolite spectra were obtained using 1H nuclear magnetic resonance technology, and pattern-recognition multivariable statistical analysis was used to screen out differing metabolites. One-way analysis of variance, and the Student-Newman-Keuls and Kruskal-Wallis tests were adopted for statistical analysis. Over 400 metabolites were identified in the untargeted metabolomic analysis and 26 metabolites differed significantly among tuberculous, malignant and transudative pleural effusions. These metabolites were predominantly involved in the metabolic pathways of amino acid metabolism, glycometabolism and lipid metabolism. Statistical analysis revealed that eight metabolites contributed to the distinction between the three groups: tuberculous, malignant and transudative pleural effusion. In the current study, the feasibility of identifying small molecule biochemical profiles in different types of pleural effusion was investigated to reveal novel biological insights into the underlying mechanisms.
The results provide specific insights into the biology of tubercular, malignant and transudative pleural effusion and may offer novel strategies for the diagnosis and therapy of associated diseases, including tuberculosis, advanced lung cancer and congestive heart failure. PMID:28627685
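The one-way ANOVA used above reduces to a ratio of between-group to within-group variance. A minimal sketch with made-up concentration values for one metabolite in three effusion types (not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way fixed-effects ANOVA:
    between-group mean square over within-group mean square."""
    values = [x for g in groups for x in g]
    grand = sum(values) / len(values)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical metabolite concentrations in three sample groups
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [7, 8, 9]])
```

The F statistic is then referred to an F distribution with (k-1, N-k) degrees of freedom; in practice a library routine such as scipy.stats.f_oneway performs the whole computation, and the Kruskal-Wallis test substitutes ranks when normality is doubtful.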
NASA Astrophysics Data System (ADS)
Magazù, Salvatore; Mezei, Ferenc; Migliardo, Federica
2018-05-01
In a variety of applications of inelastic neutron scattering spectroscopy the goal is to single out the elastic scattering contribution from the total scattered spectrum as a function of momentum transfer and sample environment parameters. The elastic part of the spectrum is defined in such a case by the energy resolution of the spectrometer. Variable elastic energy resolution offers a way to distinguish between elastic and quasi-elastic intensities. Correlation spectroscopy lends itself as an efficient, high intensity approach for accomplishing this both at continuous and pulsed neutron sources. On the one hand, in beam modulation methods the Liouville theorem coupling between intensity and resolution is relaxed, and time-of-flight analysis of the neutron velocity distribution can be performed with a 50% duty factor exposure for all available resolutions. On the other hand, the (quasi)elastic part of the spectrum generally contains the major part of the integrated intensity at a given detector, and thus correlation spectroscopy can be applied with the most favorable signal-to-statistical-noise ratio. The novel spectrometer CORELLI at SNS is an example of this type of application of the correlation technique at a pulsed source. On a continuous neutron source a statistical chopper can be used for quasi-random time-dependent beam modulation, and the total time-of-flight of the neutron from the statistical chopper to detection is determined by analyzing the correlation between the temporal fluctuation of the neutron detection rate and the statistical chopper beam modulation pattern. The correlation analysis can be used to determine either the incoming or the scattered neutron velocity, depending on the position of the statistical chopper along the neutron trajectory.
These two options are considered together with an evaluation of spectrometer performance compared to conventional spectroscopy, in particular for variable resolution elastic neutron scattering (RENS) studies of relaxation processes and the evolution of mean square displacements. A particular focus of our analysis is the unique feature of correlation spectroscopy of delivering high, resolution-independent beam intensity: the same statistical chopper scan contains both high intensity and high resolution information at the same time, and can be evaluated both ways. This flexibility in data handling represents an additional asset for correlation spectroscopy in variable resolution work. Changing the beam width for the same statistical chopper additionally allows resolution to be traded for intensity between two different experimental runs, as in conventional single-slit chopper spectroscopy. The combination of these two approaches is a capability of particular value in neutron spectroscopy studies requiring variable energy resolution, such as the systematic study of quasi-elastic scattering and mean square displacements. Furthermore, the statistical chopper approach is particularly advantageous for studying samples with low scattering intensity in the presence of a high, sample-independent background.
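The core of the correlation technique, recovering a time-of-flight spectrum by cross-correlating the detector signal with the pseudo-random chopper pattern, can be sketched as follows. The 7-element maximum-length chopper sequence and the spectrum are toy inventions with no relation to CORELLI's actual parameters:

```python
def m_sequence():
    """7-element maximum-length (pseudo-random) chopper sequence
    from a 3-bit linear-feedback shift register (taps 3 and 2)."""
    state = [1, 1, 1]
    seq = []
    for _ in range(7):
        seq.append(state[2])
        feedback = state[2] ^ state[1]
        state = [feedback, state[0], state[1]]
    return seq

def circular_correlate(signal, pattern):
    """Circular cross-correlation of detector signal with chopper pattern."""
    n = len(signal)
    return [sum(signal[t] * pattern[(t - tau) % n] for t in range(n))
            for tau in range(n)]

chopper = m_sequence()
n = len(chopper)
spectrum = [0, 0, 0, 5, 1, 0, 0]   # "true" TOF spectrum, peak at bin 3
# Detector counts: the spectrum smeared by the open/closed chopper pattern
detector = [sum(chopper[(t - s) % n] * spectrum[s] for s in range(n))
            for t in range(n)]
recovered = circular_correlate(detector, chopper)
# An m-sequence has a flat off-peak autocorrelation, so the correlation
# reproduces the spectrum on top of a constant pedestal
peak = recovered.index(max(recovered))
```

Because the pseudo-random sequence is open half the time, the full spectrum is measured at a 50% duty factor, which is the intensity advantage the abstract describes.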
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Three Dimensional CFD Analysis of the GTX Combustor
NASA Technical Reports Server (NTRS)
Steffen, C. J., Jr.; Bond, R. B.; Edwards, J. R.
2002-01-01
The annular combustor geometry of a combined-cycle engine has been analyzed with three-dimensional computational fluid dynamics. Both subsonic combustion and supersonic combustion flowfields have been simulated. The subsonic combustion analysis was executed in conjunction with a direct-connect test rig. Results are presented for two cold-flow cases and one hot-flow case. The simulations compare favorably with the test data for the two cold-flow calculations; the hot-flow data were not yet available. The hot-flow simulation indicates that the conventional ejector-ramjet cycle would not provide adequate mixing at the conditions tested. The supersonic combustion ramjet flowfield was simulated with a frozen chemistry model. A five-parameter test matrix was specified according to statistical design-of-experiments theory. Twenty-seven separate simulations were used to assemble surrogate models for combustor mixing efficiency and total pressure recovery. Scramjet injector design parameters (injector angle, location, and fuel split) as well as mission variables (total fuel massflow and freestream Mach number) were included in the analysis. A promising injector design has been identified that provides good mixing characteristics with low total pressure losses. The surrogate models can be used to develop performance maps of different injector designs. Several complex three-way variable interactions appear within the dataset that are not adequately resolved with the current statistical analysis.
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
The epistemology of mathematical and statistical modeling: a quiet methodological revolution.
Rodgers, Joseph Lee
2010-01-01
A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. I then discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.
The impact of young drivers' lifestyle on their road traffic accident risk in greater Athens area.
Chliaoutakis, J E; Darviri, C; Demakakos, P T
1999-11-01
Young drivers (18-24) both in Greece and elsewhere appear to have high rates of road traffic accidents. Many factors contribute to these high accident rates, and lifestyle has been suggested as an important one. The main objective of this study is to identify and clarify the potential relationship between young drivers' lifestyle and the road traffic accident risk they face, and to examine whether all young drivers face the same elevated risk on the road. The sample consisted of 241 young Greek drivers of both sexes. The statistical analysis included factor analysis and logistic regression analysis. Through the principal component analysis, a ten-factor scale was created which included the basic lifestyle traits of young Greek drivers. The logistic regression analysis showed that young drivers whose dominant lifestyle trait is alcohol consumption or driving without destination have high accident risk, while those whose dominant lifestyle trait is culture face low accident risk. Furthermore, young drivers who are religious in one way or another seem to have low accident risk. Finally, some preliminary observations on how health promotion should be put into practice are discussed.
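The logistic-regression step can be sketched in miniature. The feature scores and outcomes below are invented for illustration, with two lifestyle factors standing in for the study's ten, and plain gradient descent standing in for a statistics package's fitting routine:

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Logistic regression via per-sample gradient descent on the log-loss.
    X: list of feature vectors, y: list of 0/1 outcomes."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                       # gradient of the log-loss
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, x):
    """Predicted probability of the outcome (here: an accident)."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical factor scores: [alcohol-consumption factor, culture factor]
X = [[2.0, 0.0], [3.0, 0.5], [2.5, 0.0], [0.0, 2.0], [0.5, 3.0], [0.0, 2.5]]
y = [1, 1, 1, 0, 0, 0]                       # 1 = reported an accident
w, b = fit_logistic(X, y)
```

The fitted model assigns higher accident probability to a high-alcohol profile than to a high-culture profile, mirroring the direction of the study's findings on synthetic data.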
Huang, Yuan; Teng, Zhongzhao; Sadat, Umar; Graves, Martin J; Bennett, Martin R; Gillard, Jonathan H
2014-04-11
Compositional and morphological features of carotid atherosclerotic plaques provide complementary information to luminal stenosis in predicting clinical presentations. However, they alone cannot predict cerebrovascular risk. Mechanical stress within the plaque induced by cyclical changes in blood pressure has potential to assess plaque vulnerability. Various modeling strategies have been employed to predict stress, including 2D and 3D structure-only, 3D one-way and fully coupled fluid-structure interaction (FSI) simulations. However, differences in stress predictions using different strategies have not been assessed. Maximum principal stress (Stress-P1) within 8 human carotid atherosclerotic plaques was calculated based on geometry reconstructed from in vivo computerized tomography and high resolution, multi-sequence magnetic resonance images. Stress-P1 within the diseased region predicted by 2D and 3D structure-only, and 3D one-way FSI simulations was compared to 3D fully coupled FSI analysis. Compared to 3D fully coupled FSI, 2D structure-only simulation significantly overestimated stress level (94.1 kPa [65.2, 117.3] vs. 85.5 kPa [64.4, 113.6]; median [inter-quartile range], p=0.0004). However, when slices around the bifurcation region were excluded, stresses predicted by 2D structure-only simulations showed a good correlation (R² = 0.69) with values obtained from 3D fully coupled FSI analysis. The 3D structure-only model produced a small yet statistically significant stress overestimation compared to 3D fully coupled FSI (86.8 kPa [66.3, 115.8] vs. 85.5 kPa [64.4, 113.6]; p<0.0001). In contrast, one-way FSI underestimated stress compared to 3D fully coupled FSI (78.8 kPa [61.1, 100.4] vs. 85.5 kPa [64.4, 113.7]; p<0.0001). A 3D structure-only model thus seems to be a computationally inexpensive yet reasonably accurate approximation for stress within carotid atherosclerotic plaques with mild to moderate luminal stenosis, as compared to fully coupled FSI analysis.
Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
A critique of the usefulness of inferential statistics in applied behavior analysis
Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.
1998-01-01
Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304
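The interval-estimation alternative mentioned above is easy to state concretely. A sketch of a 95% confidence interval for a mean, using the large-sample normal critical value 1.96 (a t critical value would be more appropriate for samples this small; the data are invented):

```python
import math
from statistics import mean, stdev

def confidence_interval(data, z=1.96):
    """Approximate 95% CI for the mean, using the normal critical value."""
    m = mean(data)
    se = stdev(data) / math.sqrt(len(data))   # standard error of the mean
    return m - z * se, m + z * se

lo, hi = confidence_interval([4, 5, 6, 7, 8])
```

The width of the interval, rather than a binary reject/retain decision, conveys the precision of the estimate, which is the attackers' central argument for this style of reporting.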
Some historical relationships between science and technology with implications for behavior analysis
Moxley, Roy A.
1989-01-01
The relationship between science and technology is examined in terms of some implications for behavior analysis. Problems result when this relationship is seen as one in which science generally begets technology in a one-way, or hierarchical, relationship. These problems are not found when the relationship between science and technology is seen as two-way, or symmetrical, within a larger context of relationships. Some historical examples are presented. Collectively, these and other examples in the references weaken the case for a prevailing one-way, hierarchical relationship and strengthen the case for a two-way, symmetrical relationship. In addition to being more accurate historically, the symmetrical relationship is also more consistent with the principles of behavior analysis. PMID:22478016
Sener, Sevgi; Guler, Ozkan
2012-01-01
The aim of this research was to compare the differences between patients with myofascial pain and disc displacement and asymptomatic individuals based on aspects of psychologic status and sleep quality. One hundred thirty patients (81 women, 49 men; mean ages: 30.0 and 31.0 years, respectively) with temporomandibular disorder were selected, and 64 control subjects (32 women, 32 men; mean ages: 27.2 and 27.5 years, respectively) were included in the investigation over a period of 1 year. Clinical diagnosis of 65 patients with myofascial pain and 65 patients with disc displacement with or without limitation and joint pain was determined according to the Research Diagnostic Criteria for Temporomandibular Disorders. The Pittsburgh Sleep Quality Index (PSQI) was used to evaluate sleep quality. Psychologic status was assessed using Symptom Checklist-90-Revised (SCL-90-R). Chi-square, Kolmogorov-Smirnov, one-way analysis of variance, and Tukey Honestly Significant Difference post hoc multiple comparison or Tamhane T2 tests were used for statistical analysis. There was a significant difference between patients with myofascial pain and disc displacement regarding somatization and paranoid ideation. No statistically significant difference was found between patients with disc displacements and controls in all dimensions of the SCL-90-R. Total score for the PSQI was statistically significantly different between patients with myofascial pain and controls; no significant differences were found between patients with disc displacement and those with myofascial pain or controls regarding the PSQI. To manage patients with myofascial pain, psychologic assessments including sleep quality should be considered.
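The chi-square test listed first measures the discrepancy between observed and expected cell counts under independence. A small sketch with a hypothetical 2x2 table (not the study's counts):

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table:
    sum of (observed - expected)^2 / expected over all cells."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: diagnostic group (rows) vs. symptom present/absent
stat = chi_square([[30, 10], [20, 40]])
```

The statistic is referred to a chi-square distribution with (r-1)(c-1) degrees of freedom, here 1.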
González-García, Lorena; Chemello, Clarice; García-Sánchez, Filomena; Serpa-Anaya, Delia C.; Gómez-González, Carmen; Soriano-Carrascosa, Leticia; Muñoz-de Rueda, Paloma; Moya-Molina, Miguel; Sánchez-García, Fernando; Ortega-Calvo, Manuel
2012-01-01
Background: Bearing in mind the philosophical and pedagogical significance of short phrases for the training of researchers in the health care field, we studied the aphorisms and striking phrases expressed during the epidemiology course at the Andalusian School of Public Health. Methods: This was a qualitative study conducted through a multidisciplinary focus group made up of ten postgraduate students, one of whom acted as moderator. The collection of information lasted four months. Information was classified in two ways: firstly, aphorisms and short phrases with a pedagogical impact; and secondly, data with a statistical, epidemiological, epistemological, pragmatic, heuristic, or scientific-diffusion component. It was decided to perform a triangulation that included a descriptive presentation and a basic categorical analysis. The two teachers with the highest interpretative load were identified. Results: A total of 127 elements regarded as of interest by the focus group were collected. Forty-four of them (34.6%) were aphorisms, and 83 (65.3%) were short phrases with a pedagogical load. Most of them were classified as statistical elements (35.4%), followed by epistemological (21.3%) and epidemiological (15.7%) elements. There was no tendency towards aphorisms or short phrases (P > 0.05) among the teachers with more informative representation. Conclusion: The contents tilted towards the statistical area to the detriment of the epidemiological one. Concept maps were used to visualize the classifications. This sort of qualitative analysis helps researchers review contents acquired during their training process. PMID:22448313
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pande, Monu; Dubey, Vikash K.; Jagannadham, Medicherla V., E-mail: vdubey@iitg.ernet.in
2007-02-01
Cryptolepain, a stable glycosylated novel serine protease, was crystallized by the hanging-drop method. Crystal data were processed to 2.25 Å with acceptable statistics, and structure determination of the enzyme is under way. Cryptolepain is purified from the latex of the medicinally important plant Cryptolepis buchanani. The molecular weight of the enzyme is 50.5 kDa, as determined by mass spectrometry. The sequence of the first 15 N-terminal residues of the protease showed little homology with those of other plant serine proteases, suggesting it to be structurally unique. Thus, it is of interest to solve the structure of the enzyme in order to better understand its structure-function relationship. X-ray diffraction data were collected from a crystal of cryptolepain and processed to 2.25 Å with acceptable statistics. The crystals belong to the orthorhombic space group C222₁, with unit-cell parameters a = 81.78, b = 108.15, c = 119.86 Å. The Matthews coefficient was 2.62 Å³ Da⁻¹, with one molecule in the asymmetric unit. The solvent content was found to be 53%. Structure determination of the enzyme is under way.
Core, Cynthia; Brown, Janean W; Larsen, Michael D; Mahshie, James
2014-01-01
The objectives of this research were to determine whether an adapted version of a Hybrid Visual Habituation procedure could be used to assess speech perception of phonetic and prosodic features of speech (vowel height, lexical stress, and intonation) in individual pre-school-age children who use cochlear implants. Nine children ranging in age from 3;4 to 5;5 participated in this study. Children were prelingually deaf, used cochlear implants, and had no other known disabilities. Children received two speech feature tests using an adaptation of a Hybrid Visual Habituation procedure. Based on results from a Bayesian linear regression analysis, seven of the nine children demonstrated perception of at least one speech feature using this procedure, and at least one child demonstrated perception of each speech feature. An adapted version of the Hybrid Visual Habituation procedure with an appropriate statistical analysis provides a way to assess phonetic and prosodic aspects of speech in pre-school-age children who use cochlear implants.
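A Bayesian analysis of this kind weighs a prior against the evidence from a handful of trials. The conjugate-normal mechanics can be sketched for a single mean; all numbers below are hypothetical, and this is far simpler than a full Bayesian linear regression:

```python
def posterior_mean_var(prior_mean, prior_var, xbar, n, noise_var):
    """Conjugate update of a normal mean with known noise variance:
    posterior precision is the sum of prior and data precisions."""
    post_precision = 1.0 / prior_var + n / noise_var
    post_var = 1.0 / post_precision
    post_mean = post_var * (prior_mean / prior_var + n * xbar / noise_var)
    return post_mean, post_var

# Prior N(0, 1) on a looking-time difference; three trials averaging 2.0,
# with a known noise variance of 1 (all values invented)
m, v = posterior_mean_var(0.0, 1.0, 2.0, 3, 1.0)
```

Even a few trials pull the posterior mean well away from the prior while shrinking its variance, which is how individual-child inferences remain possible with small numbers of habituation trials.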
An Analysis of Variance Framework for Matrix Sampling.
ERIC Educational Resources Information Center
Sirotnik, Kenneth
Significant cost savings can be achieved with the use of matrix sampling in estimating population parameters from psychometric data. The statistical design is intuitively simple, using the framework of the two-way classification analysis of variance technique. For example, the mean and variance are derived from the performance of a certain grade…
Seven ways to increase power without increasing N.
Hansen, W B; Collins, L M
1994-01-01
Many readers of this monograph may wonder why a chapter on statistical power was included. After all, by now the issue of statistical power is in many respects mundane. Everyone knows that statistical power is a central research consideration, and certainly most National Institute on Drug Abuse grantees or prospective grantees understand the importance of including a power analysis in research proposals. However, there is ample evidence that, in practice, prevention researchers are not paying sufficient attention to statistical power. If they were, the findings observed by Hansen (1992) in a recent review of the prevention literature would not have emerged. Hansen (1992) examined statistical power based on 46 cohorts followed longitudinally, using nonparametric assumptions given the subjects' age at posttest and the numbers of subjects. Results of this analysis indicated that, in order for a study to attain 80-percent power for detecting differences between treatment and control groups, the difference between groups at posttest would need to be at least 8 percent (in the best studies) and as much as 16 percent (in the weakest studies). In order for a study to attain 80-percent power for detecting group differences in pre-post change, 22 of the 46 cohorts would have needed relative pre-post reductions of greater than 100 percent. Thirty-three of the 46 cohorts had less than 50-percent power to detect a 50-percent relative reduction in substance use. These results are consistent with other review findings (e.g., Lipsey 1990) that have shown a similar lack of power in a broad range of research topics. Thus, it seems that, although researchers are aware of the importance of statistical power (particularly of the necessity for calculating it when proposing research), they somehow are failing to end up with adequate power in their completed studies. 
This chapter argues that the failure of many prevention studies to maintain adequate statistical power is due to an overemphasis on sample size (N) as the only, or even the best, way to increase statistical power. It is easy to see how this overemphasis has come about. Sample size is easy to manipulate, has the advantage of being related to power in a straightforward way, and usually is under the direct control of the researcher, except for limitations imposed by finances or subject availability. Another option for increasing power is to increase the alpha used for hypothesis testing but, as very few researchers seriously consider significance levels much larger than the traditional .05, this strategy is seldom used. Of course, sample size is important, and the authors of this chapter are not recommending that researchers cease choosing sample sizes carefully. Rather, they argue that researchers should not confine themselves to increasing N to enhance power. It is important to take additional measures to maintain and improve power over and above making sure the initial sample size is sufficient. The authors recommend two general strategies. One strategy involves attempting to maintain the effective initial sample size so that power is not lost needlessly. The other strategy is to take measures to maximize the third factor that determines statistical power: effect size.
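The trade-off described above (power as a joint function of N, alpha, and effect size) can be made concrete with the usual normal approximation for a two-group comparison. This is a sketch, not a substitute for a proper power analysis; 1.96 is the two-sided alpha = .05 critical value, and the negligible lower tail is ignored:

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_group(d, n_per_group, z_crit=1.96):
    """Approximate power to detect standardized effect size d
    with n subjects per group (two-sample z approximation)."""
    noncentrality = d * math.sqrt(n_per_group / 2.0)
    return normal_cdf(noncentrality - z_crit)

power_medium = power_two_group(0.5, 100)
```

Because the noncentrality term scales with d but only with the square root of n, raising the effect size (the chapter's third factor) buys power far faster than adding subjects.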
2011-01-01
Background: Vaginitis is a common complaint in primary care. In uncomplicated candidal vaginitis, there are no differences in effectiveness between oral and vaginal treatment. Some studies report that oral treatment is preferred, but a Cochrane review points out inconsistencies in how preference was reported that limit the use of such data. Risk factors associated with recurrent vulvovaginal candidiasis remain controversial. Methods/Design: This work describes a protocol for a multicentric prospective observational study with one-year follow-up, to describe women's reasons and preferences in choosing the route of administration (oral vs topical) in the treatment of uncomplicated candidal vaginitis. The number of women required is 765, chosen by consecutive sampling. All are aged 16 and over with vaginal discharge and/or vaginal pruritus, diagnosed with uncomplicated vulvovaginitis in primary care in Madrid. The main outcome variable is the patients' preference in treatment choice; secondary outcome variables are time to symptom relief, adverse reactions, and the frequency of and risk factors for recurrent vulvovaginitis. The statistical analysis for the main objective will be descriptive for each of the variables, followed by bivariate analysis and multivariate analysis (logistic regression), with the dependent variable being the type of treatment chosen (oral or topical) and the independent variables being those that, after bivariate analysis, are associated with treatment preference. Discussion: Clinical decisions, recommendations, and practice guidelines must attend not only to the best available evidence, but also to the values and preferences of the informed patient. PMID:21281464
APPLICATION OF STATISTICAL ENERGY ANALYSIS TO VIBRATIONS OF MULTI-PANEL STRUCTURES.
cylindrical shell are compared with predictions obtained from statistical energy analysis. Generally good agreement is observed. The flow of mechanical... the coefficients of proportionality between power flow and average modal energy difference, which one must know in order to apply statistical energy analysis. No...
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous to acquire the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; the input data of this group consists of SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example the summary statistics of genotype quality for each sample; the input data of this group consists of individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group consists of pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; the input data of this group consists of pairs of SNPs/traits. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
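The reported near-linear scaling (roughly eight hours down to one hour on eight processors) is the ideal case of Amdahl's law, which bounds the speed-up any partly serial analysis can achieve. A quick sketch:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Amdahl's law: overall speed-up when only `parallel_fraction`
    of the work can be spread across `workers` processors."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# A fully parallel job (fraction 1.0) scales linearly with workers;
# even 5% serial work caps eight workers well below 8x
ideal = amdahl_speedup(1.0, 8)
capped = amdahl_speedup(0.95, 8)
```

The identity-by-state matrix computation approaches the ideal case because each pair of individuals can be processed independently, with almost no serial bookkeeping.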
Practical statistics in pain research.
Kim, Tae Kyun
2017-10-01
Pain is subjective, while the statistics used in pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues concern the level of the scales often used in pain research, the choice between parametric and nonparametric statistical methods, and the problems that arise from repeated measurements. In pain research, parametric statistics have often been applied erroneously, which is closely related to the scales of the data and to repeated measurements. The levels of scales are nominal, ordinal, interval, and ratio, and the level of scale affects the choice between parametric and non-parametric methods. In pain research, the most frequently used pain assessment scales are ordinal, including the visual analogue scale (VAS). Another view, however, considers the VAS an interval or ratio scale, so that the use of parametric statistics is accepted in practice in some cases. Repeated measurements of the same subjects always complicate statistics: such measurements inevitably correlate with one another, which precludes the application of one-way ANOVA, for which independence between the measurements is necessary. Repeated-measures ANOVA (RMANOVA), however, permits comparison between the correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
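To make the independence assumption concrete, here is the one-way ANOVA F statistic computed from first principles on three independent (unrepeated) groups of hypothetical pain scores; with repeated measures on the same subjects this decomposition is invalid and RMANOVA is the appropriate tool. A Python sketch with invented data, not taken from the review:

```python
# One-way ANOVA F from first principles. Valid only when every
# observation belongs to exactly one group (no repeated measures).
from statistics import mean

def one_way_anova_F(groups):
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n_total - k
    return (ss_between / df_between) / (ss_within / df_within)

# Three independent groups of VAS-like scores (hypothetical).
groups = [[3, 4, 5], [5, 6, 7], [7, 8, 9]]
print(one_way_anova_F(groups))  # → 12.0
```

The F value would then be compared against the F distribution with (k − 1, N − k) degrees of freedom; the point of the sketch is that the within-group sum of squares only estimates error variance correctly when the measurements are uncorrelated.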
Interpreting Association from Graphical Displays
ERIC Educational Resources Information Center
Fitzallen, Noleine
2016-01-01
Research that has explored students' interpretations of graphical representations has not extended to include how students apply understanding of particular statistical concepts related to one graphical representation to interpret different representations. This paper reports on the way in which students' understanding of covariation, evidenced…
DOT National Transportation Integrated Search
2011-01-01
Transportation Satellite Accounts (TSA), produced by the Bureau of Economic Analysis and the Bureau of Transportation Statistics, provides measures of national transportation output. TSA includes both in-house and for-hire transportation services. Fo...
Quantum random oracle model for quantum digital signature
NASA Astrophysics Data System (ADS)
Shang, Tao; Lei, Qi; Liu, Jianwei
2016-10-01
The goal of this work is to provide a general security analysis tool, namely the quantum random oracle (QRO), to facilitate the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way functions. The QRO is used to model a quantum one-way function, and different queries to the QRO are used to model quantum attacks. A typical application of quantum one-way functions is the quantum digital signature, whose progress has been hampered by the slow pace of experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of further quantum cryptographic protocols based on quantum one-way functions.
DOT National Transportation Integrated Search
2000-09-01
This special issue of the Journal of Transportation and Statistics is devoted to the statistical analysis and modeling of automotive emissions. It contains many of the papers presented in the mini-symposium last August and also includes one additiona...
Comparative evaluation of low cost materials as constructed wetland filling media
NASA Astrophysics Data System (ADS)
Pinho, Henrique J. O.; Vaz, Mafalda M.; Mateus, Dina M. R.
2017-11-01
Three waste materials from civil construction activities were assessed as low-cost alternative filling materials for Constructed Wetlands (CW). CW are green processes for wastewater treatment, whose design includes an appropriate selection of vegetation and filling material. The sustainability of such processes may be improved by using recovered wastes as filling materials. The ability of the materials to support plant growth and to contribute to pollutant removal from wastewater was assessed and compared to expanded clay, a filling material commonly used in CW design. Statistical analysis, using one-way ANOVA and Welch's ANOVA, demonstrated that limestone fragments are a better choice of filling material than brick fragments or basalt gravel.
NASA Astrophysics Data System (ADS)
Hashim, S. H. A.; Hamid, F. A.; Kiram, J. J.; Sulaiman, J.
2017-09-01
This paper investigates the relationship between the factors affecting the demand for broadband and the level of satisfaction. Previous researchers have found that the adoption of broadband is greatly influenced by many factors. Thus, in this study, a self-administered questionnaire was developed to obtain the factors affecting demand for broadband among broadband customers, as well as their level of satisfaction. Pearson correlation, one-way analysis of variance (ANOVA), and t-tests were used for statistical interpretation of the relationships. This study shows that several factors are meaningfully related to both broadband demand and satisfaction level.
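Of the three procedures named, Pearson correlation is the simplest to show end to end. A Python sketch computing r directly from its definition on invented demand/satisfaction scores (not the study's data):

```python
# Pearson's r from its definition: covariance of the two variables
# divided by the product of their standard deviations.
# The scores below are invented for illustration.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

demand = [2, 4, 5, 7, 9]
satisfaction = [1, 3, 6, 6, 10]
print(round(pearson_r(demand, satisfaction), 3))
```

Values near +1 or −1 indicate a strong linear relationship, values near 0 a weak one; a significance test on r (or the ANOVA/t-tests the authors used) is still needed before claiming a relationship.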
Gupta, Rajesh K; Reddy, Pooja S
2013-10-01
Jasminum grandiflorum belongs to the family Oleaceae and is known to have anti-inflammatory, antimicrobial, antioxidant, and antiulcer activities. The present study was undertaken to study its analgesic and anticonvulsant effects in rats and mice. The antinociceptive activity of the hydroalcoholic extract of J. grandiflorum leaves (HEJGL) was studied using tail flick and acetic acid - induced writhing method. Similarly, its anticonvulsant activity was observed by maximal electroshock (MES) method and pentylenetetrazol (PTZ) method. Statistical analysis was performed using one-way analysis of variance (ANOVA) followed by Dunnett's test. At doses of 50, 100, and 200 mg/kg, HEJGL showed significant analgesic and anticonvulsant effects in experimental animals. In view of its analgesic and anticonvulsant activity, the JGL extract can be used in painful conditions as well as in seizure disorders.
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
Callan, Richard S; Palladino, Christie L; Furness, Alan R; Bundy, Emily L; Ange, Brittany L
2014-10-01
Recent efforts have been directed towards utilizing CAD/CAM technology in the education of future dentists. The purpose of this pilot study was to investigate the feasibility of implementing CAD/CAM technology in instruction on preparing a tooth for restoration. Students at one dental school were assigned access to CAD/CAM technology vs. traditional preparation methods in a randomized, crossover design. In a convenience sample of a second-year class, seventy-six of the seventy-nine students volunteered to participate, for a response rate of 96 percent. Two analyses were performed on this pilot data: a primary effectiveness analysis comparing students' competency exam scores by intervention group (intention-to-treat analysis) and a secondary efficacy analysis comparing competency exam scores among students who reported using CAD/CAM versus those who did not. The effectiveness analysis showed no difference in outcomes by intervention group assignment. While student survey results indicated interest in utilizing the technology, the actual utilization rate was much less than one might anticipate, yielding a sample size that limited statistical power. The secondary analysis demonstrated higher mean competency exam scores for students reporting use of CAD/CAM compared to those who did not use the technology, but these results did not reach statistical significance (p=0.075). Prior research has investigated the efficacy of CAD/CAM in a controlled educational trial, but this study adds to the literature by investigating student use of CAD/CAM in a real-world, self-study fashion. Further studies should investigate ways in which to increase student utilization of CAD/CAM and whether or not increased utilization, with a larger sample size, would yield significant outcomes.
Do regional methods really help reduce uncertainties in flood frequency analyses?
NASA Astrophysics Data System (ADS)
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the main advantage of regional analyses, the large increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to generate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate to what extent the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches.
On the other hand, these results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
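The simulation logic described above can be miniaturized: draw synthetic flood series from a known parent distribution, fit each series, and compare the spread of the estimated design quantile for a short "local" record versus a larger pooled "regional" record, under the idealized assumption of perfect homogeneity. A Python sketch with an invented Gumbel parent and invented record lengths (the study's distributions and sample sizes differ); with heterogeneity, the regional advantage shown here shrinks or disappears, which is the authors' point:

```python
# Monte Carlo sketch: uncertainty of a 100-year Gumbel quantile
# estimate, local record (30 yr) vs. pooled regional record (300 yr),
# assuming perfect homogeneity. All parameter values are illustrative.
import math
import random

random.seed(1)
MU, BETA, T = 100.0, 30.0, 100   # true Gumbel location/scale, return period

def gumbel_sample(n):
    # Inverse-CDF sampling: x = mu - beta * ln(-ln(U)).
    return [MU - BETA * math.log(-math.log(random.random())) for _ in range(n)]

def quantile_estimate(sample, T):
    # Method-of-moments Gumbel fit, then the T-year quantile.
    n = len(sample)
    m = sum(sample) / n
    s = math.sqrt(sum((x - m) ** 2 for x in sample) / (n - 1))
    beta = s * math.sqrt(6) / math.pi
    mu = m - 0.5772 * beta
    return mu - beta * math.log(-math.log(1 - 1 / T))

def estimation_spread(n_years, n_trials=1000):
    # Standard deviation of the quantile estimate over repeated records.
    ests = [quantile_estimate(gumbel_sample(n_years), T) for _ in range(n_trials)]
    m = sum(ests) / n_trials
    return math.sqrt(sum((e - m) ** 2 for e in ests) / n_trials)

print(estimation_spread(30))    # short local record
print(estimation_spread(300))   # pool of 10 homogeneous sites
```

The pooled spread is several times smaller, reflecting the regional gain under homogeneity; the article's contribution is showing how quickly undetected heterogeneity erodes that gain.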
ERIC Educational Resources Information Center
Luh, Wei-Ming; Guo, Jiin-Huarng
2011-01-01
Sample size determination is an important issue in planning research. In the context of one-way fixed-effect analysis of variance, the conventional sample size formula cannot be applied for the heterogeneous variance cases. This study discusses the sample size requirement for the Welch test in the one-way fixed-effect analysis of variance with…
Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G
2015-01-01
Among other factors, the precision of the dental impression is an important determinant of the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique had a significant influence.
Applying Regression Analysis to Problems in Institutional Research.
ERIC Educational Resources Information Center
Bohannon, Tom R.
1988-01-01
Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
Papadopoulou, Soultana L.; Exarchakos, Georgios; Christodoulou, Dimitrios; Theodorou, Stavroula; Beris, Alexandre; Ploumis, Avraam
2016-01-01
Introduction The Ohkuma questionnaire is a validated screening tool originally used to detect dysphagia among patients hospitalized in Japanese nursing facilities. Objective The purpose of this study is to evaluate the reliability and validity of the adapted Greek version of the Ohkuma questionnaire. Methods Following the steps for cross-cultural adaptation, we delivered the validated Ohkuma questionnaire to 70 patients (53 men, 17 women) who were either suffering from dysphagia or not. All of them completed the questionnaire a second time within a month. For all of them, we performed a bedside and VFSS study of dysphagia and asked participants to undergo a second VFSS screening, with the exception of nine individuals. Statistical analysis included measurement of internal consistency with Cronbach's α coefficient, reliability with Cohen's Kappa, Pearson's correlation coefficient and construct validity with categorical components, and One-Way Anova test. Results According to Cronbach's α coefficient (0.976) for total score, there was high internal consistency for the Ohkuma Dysphagia questionnaire. Test-retest reliability (Cohen's Kappa) ranged from 0.586 to 1.00, exhibiting acceptable stability. We also estimated the Pearson's correlation coefficient for the test-retest total score, which reached high levels (0.952; p = 0.000). The One-Way Anova test in the two measurement times showed statistically significant correlation in both measurements (p = 0.02 and p = 0.016). Conclusion The adapted Greek version of the questionnaire is valid and reliable and can be used for the screening of dysphagia in the Greek-speaking patients. PMID:28050209
Digital image analysis techniques for fiber and soil mixtures.
DOT National Transportation Integrated Search
1999-05-01
The objective of image processing is to visually enhance, quantify, and/or statistically evaluate some aspect of an image not readily apparent in its original form. Processed digital image data can be analyzed in numerous ways. In order to summarize ...
Ecological covariates based predictive model of malaria risk in the state of Chhattisgarh, India.
Kumar, Rajesh; Dash, Chinmaya; Rani, Khushbu
2017-09-01
Malaria is endemic in the state of Chhattisgarh and, as a mosquito-borne disease, is ecologically dependent; this study is therefore intended to identify the ecological covariates of malaria risk in the districts of the state and to build a suitable predictive model based on those predictors, which could assist in developing a weather-based early warning system. This analysis of secondary data used one-month-lagged district-level malaria-positive cases as the response variable and ecological covariates as independent variables, which were tested with fixed-effect panelled negative binomial regression models. Interactions among the covariates were explored using two-way factorial interactions in the model. Although malaria risk in the state is perennial, higher parasitic incidence was observed during the rainy and winter seasons. The univariate analysis indicated that malaria incidence risk was statistically significantly associated with rainfall, maximum humidity, minimum temperature, wind speed, and forest cover (p < 0.05). The most efficient predictive model includes forest cover [IRR 1.033 (1.024-1.042)], maximum humidity [IRR 1.016 (1.013-1.018)], and a two-way factorial interaction between the district-specific average monthly minimum temperature and the monthly minimum temperature; the monthly minimum temperature was statistically significant [IRR 1.44 (1.231-1.695)], whereas the interaction term had a protective effect [IRR 0.982 (0.974-0.990)] against malaria infections. Forest cover, maximum humidity, minimum temperature, and wind speed emerged as potential covariates for predictive models of malaria risk in the state, which could be used efficiently in early warning systems.
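Under a log-linear (negative binomial) model, the reported rate ratios combine multiplicatively: a one-unit rise in monthly minimum temperature multiplies the expected case count by the main-effect IRR times the interaction IRR raised to the district's average minimum temperature. A Python sketch using the abstract's point estimates; note that the break-even district average computed below is derived here for illustration and is not reported in the study:

```python
# How the reported IRRs combine in a log-linear count model with a
# two-way interaction. Point estimates are taken from the abstract;
# the break-even average temperature z_star is derived, not reported.
import math

IRR_MAIN, IRR_INTER = 1.44, 0.982   # min. temperature, interaction term

def temp_rate_ratio(z):
    """Multiplier on expected cases per one-unit rise in monthly
    minimum temperature, in a district with average minimum temp z."""
    return IRR_MAIN * IRR_INTER ** z

# District average at which the main effect and interaction cancel:
z_star = math.log(IRR_MAIN) / -math.log(IRR_INTER)

print(round(temp_rate_ratio(10), 3))  # effect in a z = 10 district
print(round(z_star, 1))               # ratio crosses 1 near this average
```

This is exactly the "protective" behaviour described in the abstract: in districts with high average minimum temperatures, a further rise in monthly minimum temperature raises the expected count less, and eventually not at all.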
Statistical Issues for Uncontrolled Reentry Hazards
NASA Technical Reports Server (NTRS)
Matney, Mark
2008-01-01
A number of statistical tools have been developed over the years for assessing the risk that reentering objects pose to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations and outlining the conditions under which the simplifying assumptions hold. In addition, this paper outlines some new tools for assessing ground hazard risk in useful ways. This study also makes use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the manner the models assume. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave essentially like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints.
The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
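For the simple Kepler baseline mentioned above, the time-averaged sub-satellite latitude density of a circular orbit with inclination i has a standard closed form, f(φ) = cos φ / (π √(sin²i − sin²φ)) for |φ| < i, which concentrates predicted impacts toward latitudes near ±i. A Python sketch that evaluates this density and numerically checks that it is properly normalized (illustrative only; not the DoD or NASA tooling):

```python
# Baseline latitude distribution for a randomized circular Kepler orbit
# with inclination inc:
#   f(lat) = cos(lat) / (pi * sqrt(sin(inc)^2 - sin(lat)^2)),  |lat| < inc.
# This is the standard model the measured reentry latitudes can be
# compared against; a sketch, not the operational risk tools.
import math

def latitude_density(lat, inc):
    s = math.sin(inc) ** 2 - math.sin(lat) ** 2
    return math.cos(lat) / (math.pi * math.sqrt(s))

def integrate_density(inc, n=200_000):
    # Midpoint rule over (-inc, inc); the endpoint singularities are
    # integrable, and the midpoints never touch them.
    h = 2 * inc / n
    return sum(latitude_density(-inc + (k + 0.5) * h, inc)
               for k in range(n)) * h

inc = math.radians(51.6)                 # e.g. an ISS-like inclination
print(round(integrate_density(inc), 2))  # ≈ 1.0: a proper density
print(latitude_density(math.radians(51.0), inc) >
      latitude_density(0.0, inc))        # density peaks toward |lat| = inc
```

The divergence effects the paper lists (gravitational harmonics, the atmospheric bulge, Earth rotation) would show up as systematic departures of the measured latitude histogram from this baseline.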
Wojtas, Katarzyna; Oskedra, Iwona; Cepuch, Grazyna; Świderska, Elzbiera
2014-01-01
Child epilepsy can be a source of negative emotions and stress in parents. Social support is an external personal resource coupled with the process of coping with difficult situations. The aim of the study was to evaluate the severity of negative emotions and the ways of dealing with stress among parents of children with epilepsy in relation to the social support they received. The study was conducted in one of the children's hospitals in Małopolska, on 213 parents (148 women and 65 men) aged 21 to 66 years. The study used the Hospital Anxiety and Depression Scale Modified (HADS-M), the Berlin Social Support Scale (BSSS), the Mini-COPE coping inventory, and the authors' own questionnaire. The statistical analysis used Pearson and Spearman correlations, the Mann-Whitney test, and the t-test. Calculations were performed using IBM SPSS Statistics 20; statistical significance was set at p ≤ 0.05. The dominant emotion that accompanied the parents was anxiety. Parents mostly used strategies based on active coping and on seeking emotional or instrumental support. A correlation was shown between the intensity of negative emotions and social support, and also between social support and ways of coping with stress. Parents expressed negative emotions, but not of high severity, which may be related to the choice of active coping strategies. Steps should be taken not only to assess the prevalence of negative emotions in this group of parents, but also to reduce their intensity by emphasizing the importance of social support.
The statistical reporting quality of articles published in 2010 in five dental journals.
Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti
2015-01-01
Statistical methods play an important role in medical and dental research. Earlier studies have observed that the current use of methods and reporting of statistics is responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed, covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for its use and reporting of statistical methods. A paper with at least one poor reporting item was classified as having 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) contained at least one poor reporting item. The proportion of papers with at least one poor reporting item in this survey was high (91%). The authors of dental journal articles should be encouraged to improve the statistical sections of their research articles and to present their results in line with the policy and presentation of the leading dental journals.
Moazami, Fariborz; Mirhadi, Hosein; Geramizadeh, Bita; Sahebi, Safoura
2012-04-01
The purpose of this study was to evaluate the ability of soymilk, powdered milk, and Hank's balanced salt solution (HBSS) to maintain human periodontal ligament (PDL) cell viability in vitro. PDL cells were obtained from extracted healthy third molars and cultured in Dulbecco's modified Eagle's medium (DMEM). The cultures were exposed for 1, 2, 4, and 8 h to the experimental solutions (tap water served as negative control and DMEM as positive control) at 37°C. The viable cells were then counted using the trypan blue exclusion technique. Data were analyzed using one-way ANOVA, post hoc Scheffé, and two-way ANOVA tests. Statistical analysis showed that HBSS, powdered baby formula, and soymilk maintained cell viability equally well over the different time periods. Tap water cannot keep cells viable as well as the other solutions. Soymilk and powdered baby formula can be recommended as suitable storage media for avulsed teeth for up to 8 h. © 2011 John Wiley & Sons A/S.
Bonding characteristics of self-etching adhesives to intact versus prepared enamel.
Perdigão, Jorge; Geraldeli, Saulo
2003-01-01
This study tested the null hypothesis that preparation of the enamel surface would not affect the enamel microtensile bond strengths of self-etching adhesive materials. Ten bovine incisors were trimmed with a diamond saw to obtain a square enamel surface with an area of 8 x 8 mm. The specimens were randomly assigned to five adhesives: (1) ABF (Kuraray), an experimental two-bottle self-etching adhesive; (2) Clearfil SE Bond (Kuraray), a two-bottle self-etching adhesive; (3) One-Up Bond F (Tokuyama), an all-in-one adhesive; (4) Prompt L-Pop (3M ESPE), an all-in-one adhesive; and (5) Single Bond (3M ESPE), a two-bottle total-etch adhesive used as positive control. For each specimen, one half was roughened with a diamond bur for 5 seconds under water spray, whereas the other half was left unprepared. The adhesives were applied as per the manufacturers' directions. A universal hybrid composite resin (Filtek Z250, 3M ESPE) was inserted in three layers of 1.5 mm each and light-cured. Specimens were sectioned in the X and Y directions to obtain bonded sticks with a cross-sectional area of 0.8 +/- 0.2 mm2. Sticks were tested in tension in an Instron at a crosshead speed of 1 mm per minute. Statistical analysis was carried out with two-way analysis of variance and Duncan's test at p < .05. Ten extra specimens were processed for observation under a field-emission scanning electron microscope. Single Bond, the total-etch adhesive, resulted in statistically higher microtensile bond strength than any of the other adhesives regardless of the enamel preparation (unprepared = 31.5 MPa; prepared = 34.9 MPa, not statistically different at p < .05). All the self-etching adhesives resulted in higher microtensile bond strength when enamel was roughened than when it was left unprepared. However, for ABF and Clearfil SE Bond this difference was not statistically significant at p > .05.
When applied to ground enamel, mean bond strengths of Prompt L-Pop were not statistically different from those of Clearfil SE Bond and ABF. One-Up Bond F did not bond to unprepared enamel. Commercial self-etching adhesives performed better on prepared enamel than on unprepared enamel. The field-emission scanning electron microscope revealed a deep interprismatic etching pattern for the total-etch adhesive, whereas the self-etching systems resulted in an etching pattern ranging from absent to moderate.
NASA Astrophysics Data System (ADS)
Driscoll, Dennis; Stillman, Daniel
2002-08-01
Previous research has revealed that an emotional response to weather might be indicated by calls to telephone counseling services. We analyzed call frequency from such "hotlines", each serving communities in a major metropolitan area of the United States (Detroit, Washington DC, Dallas and Seattle). The periods examined were all, or parts of, the years 1997 and 1998. Associations with subjectively derived synoptic weather types for all cities except Seattle, as well as with individual weather elements [cloudiness (sky cover), precipitation, wind speed, and interdiurnal temperature change] for all four cities, were investigated. Analysis of variance and t-tests (significance of means) were applied to test the statistical significance of differences. Although statistically significant results were obtained in scattered instances, the total number was within that expected by chance, and there was little consistency to these associations. One clear exception was the increased call frequency during destructive (severe) weather, when there is obvious concern about the damage it causes.
Effect of trapidil in myocardial ischemia-reperfusion injury in rabbit.
Liu, Mingjie; Sun, Qi; Wang, Qiang; Wang, Xiuying; Lin, Peng; Yang, Ming; Yan, Yuanyuan
2014-01-01
To evaluate the cardioprotective effects of trapidil on myocardial ischemia-reperfusion injury (MIRI) in rabbits. Rabbits were subjected to 40 min of myocardial ischemia followed by 120 min of reperfusion. Blood levels of superoxide dismutase (SOD) and malondialdehyde (MDA) were estimated. At the end of reperfusion, the rabbits were sacrificed and the hearts were isolated for histological examination. An apoptotic index (AI) was determined using the terminal deoxynucleotidyl transferase (TdT)-mediated dUTP nick-end-labeling (TUNEL) method. The expression of the apoptosis-related proteins Bax and Bcl-2 was analyzed using immunohistochemistry. Statistical analyses were performed by one-way analysis of variance (ANOVA), with P < 0.05 considered statistically significant. Trapidil caused a significant (P < 0.05) increase in SOD activity, decreased MDA levels, and significantly (P < 0.05) reduced the expression of Bax compared with the ischemia-reperfusion (IR) control group. Trapidil may attenuate the myocardial damage produced by IR injury and offers potential cardioprotective action.
Leptin to adiponectin ratio in preeclampsia.
Khosrowbeygi, A; Ahmadvand, H
2013-04-01
The aim of the present study was to assess the leptin/adiponectin ratio in preeclamptic patients compared with normal pregnant women. A cross-sectional study was designed. The study population consisted of 30 preeclamptic patients and 30 healthy pregnant women. Serum levels of total leptin and adiponectin were assessed using commercially available enzyme-linked immunosorbent assay methods. One-way ANOVA, Student's t tests, and Pearson's correlation analysis were used for statistical calculations. Levels of leptin and adiponectin were also adjusted for BMI. A p-value < 0.05 was considered statistically significant. The leptin/adiponectin ratio was significantly increased in preeclamptic patients, and was significantly higher in severe than in mild preeclampsia. The BMI-adjusted leptin/adiponectin ratio was also significantly increased in preeclamptic patients compared with normal pregnant women. The findings of the present study suggest that the leptin/adiponectin ratio is increased in preeclampsia and that an imbalance between these adipocytokines could be involved in the pathogenesis of preeclampsia.
NASA Astrophysics Data System (ADS)
Cernesson, Flavie; Tournoud, Marie-George; Lalande, Nathalie
2018-06-01
Among the various parameters monitored in river monitoring networks, bioindicators provide very informative data. Analysing time variations in bioindicator data is tricky for water managers because the data sets are often short, irregular, and non-normally distributed. It is also a challenging methodological issue for scientists, as in the Saône basin (30 000 km2, France), where, between 1998 and 2010, among 812 IBGN (French macroinvertebrate bioindicator) monitoring stations, only 71 time series had more than 10 data values and were studied here. Combining various analytical tools (three parametric and non-parametric statistical tests plus a graphical analysis), 45 IBGN time series were classified as stationary and 26 as non-stationary (only one of which showed a degradation). Series from sampling stations located within the same hydroecoregion showed similar trends, while river size classes seemed non-significant in explaining temporal trends. So, from a methodological point of view, combining statistical tests and graphical analysis is a relevant option when striving to improve trend detection. Moreover, it was possible to propose a way to summarise series in order to analyse links between ecological river quality indicators and land use stressors.
Non-operative management (NOM) of blunt hepatic trauma: 80 cases.
Özoğul, Bünyami; Kısaoğlu, Abdullah; Aydınlı, Bülent; Öztürk, Gürkan; Bayramoğlu, Atıf; Sarıtemur, Murat; Aköz, Ayhan; Bulut, Özgür Hakan; Atamanalp, Sabri Selçuk
2014-03-01
The liver is the most frequently injured organ in abdominal trauma. We present a group of patients with blunt hepatic trauma who were managed without any invasive diagnostic tools and/or surgical intervention. A total of 80 patients with blunt liver injury who were hospitalized in the general surgery clinic, or in other clinics due to concomitant injuries, were followed non-operatively. Normally distributed numeric variables were evaluated by Student's t-test or one-way analysis of variance, while non-normally distributed variables were analyzed by the Mann-Whitney U-test or Kruskal-Wallis variance analysis. The chi-square test was employed for the comparison of categorical variables. Statistical significance was assumed for p<0.05. There was no significant relationship between patients' Hgb level and liver injury grade, outcome, or mechanism of injury; nor was there a statistical relationship between liver injury grade, outcome, or mechanism of injury and ALT or AST levels. There was no mortality in any of the patients. During the last quarter century, changes in the diagnosis and treatment of liver injury were associated with increased survival. NOM of liver injury in patients with stable hemodynamics seems to be the gold standard.
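The normality-driven choice between parametric and non-parametric tests described here can be sketched as follows; the group values are invented, and the Shapiro-Wilk screen is one common (not the only) way to make the choice:

```python
from scipy import stats

# Hypothetical Hgb values (g/dl) grouped by liver injury grade -- illustrative only
grade_1 = [13.2, 12.8, 14.1, 13.5, 12.9, 13.7]
grade_2 = [12.9, 13.4, 12.5, 13.1, 13.8, 12.6]
grade_3 = [12.1, 13.0, 12.7, 13.3, 12.4, 12.8]
groups = [grade_1, grade_2, grade_3]

# Shapiro-Wilk normality check on each group guides the choice of test
normal = all(stats.shapiro(g)[1] > 0.05 for g in groups)

if normal:
    # Parametric route: one-way ANOVA
    stat, p = stats.f_oneway(*groups)
    test_name = "one-way ANOVA"
else:
    # Non-parametric route: Kruskal-Wallis
    stat, p = stats.kruskal(*groups)
    test_name = "Kruskal-Wallis"

print(f"{test_name}: statistic = {stat:.3f}, p = {p:.3f}")
```

For two groups, the same pattern applies with Student's t-test versus the Mann-Whitney U-test.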
Nagarajappa, Ramesh; Batra, Mehak; Sharda, Archana J; Asawa, Kailash; Sanadhya, Sudhanshu; Daryani, Hemasha; Ramesh, Gayathri
2015-01-01
To assess and compare the antimicrobial potential and determine the minimum inhibitory concentration (MIC) of Jasminum grandiflorum and Hibiscus rosa-sinensis extracts as potential anti-pathogenic agents in dental caries. Aqueous and ethanol (cold and hot) extracts prepared from leaves of Jasminum grandiflorum and Hibiscus rosa-sinensis were screened for in vitro antimicrobial activity against Streptococcus mutans and Lactobacillus acidophilus using the agar well diffusion method. The lowest growth-inhibiting concentration of each extract, taken as the MIC, was determined for both test organisms. Statistical analysis was performed with one-way analysis of variance (ANOVA). At lower concentrations, hot ethanol Jasminum grandiflorum (10 μg/ml) and Hibiscus rosa-sinensis (25 μg/ml) extracts were found to have statistically significant (P≤0.05) antimicrobial activity against S. mutans and L. acidophilus, with MIC values of 6.25 μg/ml and 25 μg/ml, respectively. A proportional increase in antimicrobial activity (zone of inhibition) was observed with increasing extract concentration. Both extracts were found to be antimicrobially active and to contain compounds with therapeutic potential. Nevertheless, clinical trials on the effect of these plants are essential before advocating large-scale therapy.
Al-Qarni, Mohammed A; Shakeela, Nasim Vahid; Alamri, Mohammed Abdullah; Alshaikh, Yahya A
2016-10-01
Eco-friendly or green dentistry can become a reality by effectively designing dental clinics and using more eco-friendly materials in clinical practice. To determine the awareness of eco-friendly dentistry among dental faculty and students in preparation for future implementation, knowledge regarding eco-friendly dentistry was assessed using an 18-item self-administered questionnaire among 160 participants. After baseline data collection, the intervention was carried out by educating participants with a PowerPoint presentation. The post-intervention data were then collected for analysis. Statistical analysis was done using Wilcoxon's signed rank test and one-way ANOVA. The educational intervention increased knowledge about eco-friendly dentistry, confirming the importance of continuing education. There was a statistically significant gain in knowledge among the participants after the presentation. The gain was highest for the department of Preventive Dental Sciences (PDS), followed by Substitute Dental Sciences (SDS), no specialty, Maxillofacial Dental Sciences (MDS), and Restorative Dental Sciences (RDS), respectively (F=5.5091, p<0.05). Lack of knowledge of green dentistry is highly prevalent among the dental fraternity. This can be remedied with effective training in the respective fields, channeled through the curriculum in an educational setting.
Garcia, Luís Filipe; de Oliveira, Luís Caldas; de Matos, David Martins
2016-01-01
This study compared the performance of two statistical location-aware pictogram prediction mechanisms with an all-purpose (All) pictogram prediction mechanism that had no location knowledge. The All approach used a single language model in all locations. One of the location-aware alternatives, the location-specific (Spec) approach, made use of specific language models for pictogram prediction in each location of interest. The other location-aware approach resulted from combining the Spec and All approaches and was designated the mixed approach (Mix). In this approach, the language models acquired knowledge from all locations, but higher relevance was assigned to the vocabulary from the associated location. Results from simulations showed that the Mix and Spec approaches could only outperform the baseline in a statistically significant way if pictogram users reused more than 50% and 75% of their sentences, respectively. Under low sentence reuse conditions there were no statistically significant differences between the location-aware approaches and the All approach. Under these conditions, the Mix approach performed better than the Spec approach in a statistically significant way.
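The Mix approach, weighting location-specific vocabulary more heavily within a model that still knows all locations, resembles linear interpolation of language models; a unigram sketch in which the λ weight and the pictogram counts are invented for illustration:

```python
from collections import Counter

def mix_model(local_counts, all_counts, lam=0.7):
    """Interpolated unigram probabilities:
    P_mix(w) = lam * P_local(w) + (1 - lam) * P_all(w)."""
    n_loc = sum(local_counts.values())
    n_all = sum(all_counts.values())
    vocab = set(local_counts) | set(all_counts)
    return {w: lam * local_counts[w] / n_loc
               + (1 - lam) * all_counts[w] / n_all
            for w in vocab}

# Hypothetical pictogram usage counts for one location vs. all locations
kitchen = Counter({"eat": 6, "drink": 3, "cup": 1})
everywhere = Counter({"eat": 10, "drink": 10, "help": 10, "cup": 5, "go": 15})

p = mix_model(kitchen, everywhere)
# The interpolated model boosts kitchen vocabulary ("eat") over
# globally common but locally unused pictograms ("go")
```

A Spec model would use only the local counts; the interpolation keeps coverage of pictograms never yet used in that location.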
Survival analysis in hematologic malignancies: recommendations for clinicians
Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril
2014-01-01
The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and also make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazards regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
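The Kaplan-Meier estimator discussed in these recommendations can be computed directly; a minimal dependency-free sketch with made-up follow-up times and censoring flags (not from any real cohort):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each patient
    events -- 1 if the event (e.g. death) was observed, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(e for tt, e in data if tt == t)
        at_this_time = sum(1 for tt, _ in data if tt == t)
        if deaths > 0:
            # survival drops only at observed event times
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= at_this_time  # censored subjects leave the risk set
    return curve

# Hypothetical follow-up times (months); 1 = death observed, 0 = censored
times  = [5, 8, 8, 12, 16, 23, 27, 30, 33, 40]
events = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(f"t = {t:2d} months  S(t) = {s:.3f}")
```

Note how the censored subject at t=8 still counts in the risk set at t=8 but not afterwards; mishandling censoring is exactly the kind of pitfall the article warns about.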
Power-law statistics of neurophysiological processes analyzed using short signals
NASA Astrophysics Data System (ADS)
Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.
2018-04-01
We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches which enable characterization of the power spectral density (PSD) and show that application of detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method represents a useful way of indirectly characterizing PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
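A first-order DFA of the kind applied here to EEG can be sketched in a few lines; this is a generic textbook implementation run on synthetic white noise, not the authors' code or data:

```python
import numpy as np

def dfa(signal, scales):
    """First-order detrended fluctuation analysis.

    Returns the fluctuation function F(s) for each window size s.
    The slope of log F(s) vs log s estimates the scaling exponent
    alpha, related to the PSD exponent beta by beta = 2*alpha - 1.
    """
    profile = np.cumsum(signal - np.mean(signal))  # integrated series
    fluctuations = []
    for s in scales:
        n_windows = len(profile) // s
        f2 = []
        for w in range(n_windows):
            seg = profile[w * s:(w + 1) * s]
            x = np.arange(s)
            coeffs = np.polyfit(x, seg, 1)   # local linear trend
            trend = np.polyval(coeffs, x)
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    return np.array(fluctuations)

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)             # uncorrelated noise
scales = np.array([16, 32, 64, 128, 256])
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated alpha = {alpha:.2f}")       # expected near 0.5 for white noise
```

For genuinely short records the choice of the scale range becomes the delicate step, which is the regime the paper examines.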
Trend Analysis and Affecting Components of Human Brucellosis Incidence During 2006 to 2016.
Marvi, Abolfazl; Asadi-Aliabadi, Mehran; Darabi, Mehdi; Abedi, Ghassem; Siamian, Hasan; Rostami-Maskopaee, Fereshteh
2018-02-01
Brucellosis is communicable between humans and animals. In spite of having an active health care system, Iran is considered an endemic area and ranks fourth worldwide. One common method for identifying trends in disease incidence is regression analysis. Therefore, the aim of the study was to investigate the trend of brucellosis incidence during 2006 to 2016 and the components affecting the disease. This was a trend study conducted on a total of 144 brucellosis cases recorded in the registration software of the CDC of the Iranian Ministry of Health. We analyzed the changes in brucellosis incidence in Juybar during 2006 to 2016 by joinpoint regression, also comparing the changes in incidence over one-year intervals. The average age of patients was 18±29 years. About 60% of the patients were men, and 85.4% had used non-pasteurized dairy and meat products. Contact with animals differed significantly between the two genders (P = 0.006). During 2006 to 2016, brucellosis incidence showed a decreasing trend of about 15%. This trend had a breakpoint: a 66.2% decrease was observed during 2006 to 2008 and a 7% increase during 2008 to 2016, but neither of these annual percentage changes (APC) was statistically significant at p = 0.05. The APC of brucellosis incidence also decreased in the age groups below 20 and between 20 and 50 years; in the group under 20, the 26.7% decrease was statistically significant. It is necessary to provide appropriate education and information on human brucellosis for young people and individuals in high-risk professions. Moreover, health behaviors such as avoiding non-pasteurized dairy products, vaccinating animals, and awareness of the disease symptoms are needed.
Kazdal, Hizir; Kanat, Ayhan; Aydin, Mehmet Dumlu; Yazar, Ugur; Guvercin, Ali Riza; Calik, Muhammet; Gundogdu, Betul
2017-01-01
Context: Sudden death from subarachnoid hemorrhage (SAH) is not uncommon. Aims: The goal of this study was to elucidate the effect of the cervical spinal roots and the related dorsal root ganglia (DRGs) on cardiorespiratory arrest following SAH. Settings and Design: This was an experimental study conducted on rabbits. Materials and Methods: The study was conducted on 22 rabbits randomly divided into three groups: control (n = 5), physiologic serum saline (SS; n = 6), and SAH (n = 11). Experimental SAH was performed. Seven of the 11 rabbits with SAH died within the first 2 weeks. After 20 days, the remaining animals were sacrificed. The anterior spinal arteries, arteriae nervorum of the cervical nerve roots (C6–C8), DRGs, and lungs were examined histopathologically and estimated stereologically. Statistical Analysis Used: Statistical analysis was performed using PASW Statistics 18.0 for Windows (SPSS Inc., Chicago, Illinois, USA). Intergroup differences were assessed using one-way ANOVA. Statistical significance was set at P < 0.05. Results: In the SAH group, severe anterior spinal artery (ASA) and arteriae nervorum vasospasm, axonal and neuronal degeneration, and neuronal apoptosis were observed histopathologically. Vasospasm of the ASA did not occur in the SS and control groups. There was a statistically significant increase in degenerated neuron density in the SAH group as compared to the control and SS groups (P < 0.05). Cardiorespiratory disturbances, arrest, and lung edema developed more commonly in animals in the SAH group. Conclusion: Interestingly, we observed that C6–C8 DRG degeneration was secondary to vasospasm of the ASA following SAH. Cardiorespiratory disturbances or arrest can be explained by these mechanisms. PMID:28250634
Hematological Alterations on Sub-acute Exposure to Flubendiamide in Sprague Dawley Rats.
Vemu, Bhaskar; Dumka, Vinod Kumar
2014-01-01
Pesticide poisoning is a common occurrence around the world. Pesticides can act on various body systems, resulting in toxicity. Flubendiamide is a new-generation pesticide reported to have better activity against lepidopteran insects. The present study was carried out to analyze the effects of sub-acute flubendiamide exposure on the hematology of rats. Male and female Sprague Dawley (SD) rats (9-11 weeks) were divided into five groups with six animals in each group. The first group served as control, while the rest were exposed to ascending oral doses of flubendiamide (125, 250, 500 and 1000 mg/kg) for 28 days. After the trial period, blood was collected in heparinized vials and analyzed using a Siemens ADVIA 2120® autoanalyzer. Various erythrocytic, platelet and leukocyte parameters were measured and analyzed by one-way analysis of variance (ANOVA) and t-tests using Statistical Package for Social Sciences (SPSS)® 20 software. Statistical analysis showed that the effect of flubendiamide exposure on female rats was negligible: the only significant change was in total erythrocyte count, while the rest of the parameters showed non-significant bidirectional changes. In males, many parameters, viz. total leukocyte count (TLC), total erythrocyte count (TEC), packed cell volume (PCV), mean corpuscular volume (MCV), platelet count (PC), mean platelet volume (MPV), platelet distribution width (PDW), hemoglobin distribution width (HDW), large platelets (LPT) and plateletcrit (PCT), differed significantly from control. Many of the changes were dose independent but sex specific. This led to the hypothesis that saturation toxicokinetics might be one of the reasons for this varied response, which can only be evaluated after further testing.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
Evaluating collective significance of climatic trends: A comparison of methods on synthetic data
NASA Astrophysics Data System (ADS)
Huth, Radan; Dubrovský, Martin
2017-04-01
The common approach to determining whether climatic trends are significantly different from zero is to conduct individual (local) tests at each single site (station or gridpoint). Whether the number of sites where the trends are significantly non-zero could have occurred by chance is almost never evaluated in trend studies. That is, collective (global) significance of trends is ignored. We compare three approaches to evaluating collective statistical significance of trends at a network of sites, using the following statistics: (i) the number of successful local tests (a successful test here means one in which the null hypothesis of no trend is rejected); this is a standard way of assessing collective significance in various applications in atmospheric sciences; (ii) the smallest p-value among the local tests (Walker test); and (iii) the counts of positive and negative trends regardless of their magnitudes and local significance. The third approach is a new procedure that we propose; the rationale behind it is that the prevalence of one sign of trend at individual sites is indicative of high confidence in the trend not being zero, regardless of the (in)significance of individual local trends. A potentially large amount of information contained in trends that are not locally significant, which are typically deemed irrelevant and neglected, is thus not lost and is retained in the analysis. In this contribution we examine the feasibility of the proposed way of significance testing on synthetic data, produced by a multi-site stochastic generator, and compare it with the two other, now well-established ways of assessing collective significance.
The synthetic dataset, mimicking annual mean temperature on an array of stations (or gridpoints), is constructed assuming a given statistical structure characterized by (i) spatial separation (density of the station network), (ii) local variance, (iii) temporal and spatial autocorrelations, and (iv) the trend magnitude. The probabilistic distributions of the three test statistics (null distributions) and critical values of the tests are determined from multiple realizations of the synthetic dataset in which no trend is imposed at any site (that is, any trend is a result of random fluctuations only). The procedure is then evaluated by determining the type II error (the probability of failing to detect a true trend) in the presence of a trend with a known magnitude, for which the synthetic dataset with an imposed spatially uniform non-zero trend is used. A sensitivity analysis is conducted for various combinations of the trend magnitude and spatial autocorrelation.
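The third statistic, counting trend signs across sites and calibrating the count against a Monte Carlo null distribution, can be prototyped as below; the sketch assumes independent sites with white-noise interannual variability (the paper's generator also models spatial and temporal autocorrelation, which are omitted here), and all parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_years, n_null = 50, 30, 2000

def count_positive_trends(data):
    """Number of sites whose least-squares trend slope is positive."""
    t = np.arange(data.shape[1])
    # sign of the OLS slope equals the sign of the covariance with time
    slopes = (data - data.mean(axis=1, keepdims=True)) @ (t - t.mean())
    return int(np.sum(slopes > 0))

# Null distribution of the count: trend-free noise at every site
null_counts = np.array([
    count_positive_trends(rng.standard_normal((n_sites, n_years)))
    for _ in range(n_null)
])
crit = np.quantile(null_counts, 0.975)  # upper critical value at the 5% level

# "Observed" network: a weak, spatially uniform trend added to noise
trend = 0.03 * np.arange(n_years)
observed = rng.standard_normal((n_sites, n_years)) + trend
obs_count = count_positive_trends(observed)
collectively_significant = obs_count > crit
print(f"positive trends at {obs_count}/{n_sites} sites, critical value {crit:.0f}")
```

Even though a 0.03-per-year trend is weak relative to unit noise at any single site, the near-unanimous sign across 50 sites is far outside the binomial-like null, illustrating the paper's rationale for retaining locally insignificant trends.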
Optimal Access to NASA Water Cycle Data for Water Resources Management
NASA Astrophysics Data System (ADS)
Teng, W. L.; Arctur, D. K.; Espinoza, G. E.; Rui, H.; Strub, R. F.; Vollmer, B.
2016-12-01
A "Digital Divide" in data representation exists between the preferred way of data access by the hydrology community (i.e., as time series of discrete spatial objects) and the common way of data archival by earth science data centers (i.e., as continuous spatial fields, one file per time step). This Divide has been an obstacle, specifically, between the Consortium of Universities for the Advancement of Hydrologic Science, Inc. Hydrologic Information System (CUAHSI HIS) and NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). An optimal approach to bridging the Divide, developed by the GES DISC, is to reorganize data from the way they are archived to a form that is optimal for the desired method of data access. Specifically for CUAHSI HIS, selected data sets were reorganized into time series files, one per geographical "point." These time series files, termed "data rods," are pre-generated or virtual (generated on-the-fly). Data sets available as data rods include North American Land Data Assimilation System (NLDAS), Global Land Data Assimilation System (GLDAS), TRMM Multi-satellite Precipitation Analysis (TMPA), Land Parameter Retrieval Model (LPRM), Modern-Era Retrospective Analysis for Research and Applications (MERRA)-Land, and Groundwater and Soil Moisture Conditions from Gravity Recovery and Climate Experiment (GRACE) Data Assimilation drought indicators for North America Drought Monitor (GRACE-DA-DM). To bring the benefits of optimally reorganized data to the operational water resources community, we have developed multiple methods of making these data more easily accessible and usable. These include direct access via RESTful Web services, a browser-based Web map and statistical tool for selected NLDAS variables for the U.S. (CONUS), a HydroShare app (Data Rods Explorer, under development) on the Tethys Platform, and access via the GEOSS Portal.
Examples of drought-related applications of these data and data access methods are provided.
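The data-rod idea, reorganizing fields archived one file per time step into one contiguous series per grid point, can be illustrated with plain in-memory arrays; the shapes and names below are invented, and the actual GES DISC file formats and services are omitted:

```python
import numpy as np

# Stand-in for an archive of spatial fields: one 2-D grid per time step
n_times, n_lat, n_lon = 365, 4, 5
rng = np.random.default_rng(1)
archive = [rng.standard_normal((n_lat, n_lon)) for _ in range(n_times)]

# "Data rod" reorganization: stack the archive once, so each grid point
# becomes one contiguous time series instead of 365 scattered reads
stacked = np.stack(archive)            # shape (time, lat, lon)
rods = stacked.reshape(n_times, -1).T  # shape (point, time)

def rod(lat_idx, lon_idx):
    """Time series for one grid point, as a hydrologist would request it."""
    return rods[lat_idx * n_lon + lon_idx]

series = rod(2, 3)  # full year at one point, one contiguous slice
```

The one-time reorganization cost is paid at archive time (or on-the-fly for "virtual" rods), in exchange for point-wise access that no longer touches every time-step file.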
Spatial parameters of walking gait and footedness.
Zverev, Y P
2006-01-01
The present study was undertaken to assess whether footedness affects selected spatial and angular parameters of able-bodied gait by evaluating footprints of young adults. A total of 112 males and 93 females were selected from among students and staff members of the University of Malawi using a simple random sampling method. Footedness of subjects was assessed by the Waterloo Footedness Questionnaire Revised. Gait at natural speed was recorded using the footprint method. The following spatial parameters of gait were derived from the inked footprint sequences of subjects: step and stride lengths, gait angle and base of gait. The anthropometric measurements taken were weight, height, leg and foot length, foot breadth, shoulder width, and hip and waist circumferences. The prevalence of right-, left- and mixed-footedness in the whole sample of young Malawian adults was 81%, 8.3% and 10.7%, respectively. One-way analysis of variance did not reveal a statistically significant difference between footedness categories in the mean values of anthropometric measurements (p > 0.05 for all variables). Gender differences in step and stride length values were not statistically significant, and correction of these variables for stature did not change this. Males had significantly broader steps than females, and normalized values of base of gait showed a similar gender difference. The group means of step length and normalized step length of the right and left feet were similar for both males and females. There was a significant side difference in gait angle in both gender groups, with higher mean values on the left side than on the right (t = 2.64, p < 0.05 for males, and t = 2.78, p < 0.05 for females).
One-way analysis of variance did not demonstrate a significant difference between footedness categories in the mean values of step length, gait angle, bilateral differences in step length and gait angle, stride length, gait base, or normalized gait variables of male and female volunteers (p > 0.05 for all variables). The present study demonstrated that footedness does not affect the spatial and angular parameters of walking gait.
Statistical Characteristics of Wrong-Way Driving Crashes on Illinois Freeways.
Zhou, Huaguo; Zhao, Jiguang; Pour-Rouholamin, Mahdi; Tobias, Priscilla A
2015-01-01
Driving the wrong way on freeways, namely wrong-way driving (WWD), has been a major concern for more than six decades. The purpose of this study was to identify the characteristics of this type of crash and to rank locations/interchanges according to their vulnerability to WWD entries. WWD crash data on Illinois freeways were statistically analyzed over a 6-year period (2004 to 2009) from three aspects: crash, vehicle, and person. Temporal distributions, geographical distributions, roadway characteristics, and crash locations were analyzed for WWD crashes. Driver demographic information, physical condition, and injury severity were analyzed for wrong-way drivers. Vehicle characteristics, vehicle operation, and collision results were analyzed for WWD vehicles. A method was developed to identify wrong-way entry points, which was then used to build a relative-importance technique and rank different interchange types in terms of potential WWD incidents. The findings revealed that a large proportion of WWD crashes occurred during the weekend from midnight to 5 a.m. Approximately 80% of WWD crashes were located in urban areas and nearly 70% of wrong-way vehicles were passenger cars. Approximately 58% of wrong-way drivers were driving under the influence (DUI). Of those, nearly 50% were confirmed to be impaired by alcohol, about 4% were impaired by drugs, and more than 3% had been drinking. The analysis of interchange ranking found that compressed diamond interchanges, single point diamond interchanges (SPDIs), partial cloverleaf interchanges, and freeway feeders had the highest wrong-way crash rates (wrong-way crashes per 100 interchanges per year). The findings of this study call for more attention to WWD crashes from different aspects such as driver age group, time of day, day of week, and DUI drivers.
Based on the analysis results of WWD distance, the study explained why a 5-mile radius of WWD crash location should be studied for WWD fatal crashes with unknown entry points.
On the Experimental Determination of the One-Way Speed of Light
ERIC Educational Resources Information Center
Perez, Israel
2011-01-01
In this paper the question of the isotropy of the one-way speed of light is addressed from an experimental perspective. In particular, we analyse two experimental methods commonly used in its determination. The analysis is aimed at clarifying the view that the one-way speed of light cannot be determined by techniques in which physical entities…
Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.
ERIC Educational Resources Information Center
Peacock, Darren
This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
Statistical analysis of arthroplasty data
2011-01-01
It is envisaged that guidelines for statistical analysis and presentation of results will improve the quality and value of research. The Nordic Arthroplasty Register Association (NARA) has therefore developed guidelines for the statistical analysis of arthroplasty register data. The guidelines are divided into two parts: one with an introduction and a discussion of the background to the guidelines (Ranstam et al. 2011a, see pages x-y in this issue), and this one with a more technical statistical discussion of how specific problems can be handled. This second part contains (1) recommendations for the interpretation of methods used to calculate survival, (2) recommendations on how to deal with bilateral observations, and (3) a discussion of problems and pitfalls associated with analysis of factors that influence survival or comparisons between outcomes extracted from different hospitals. PMID:21619500
Tipton, John; Hooten, Mevin B.; Goring, Simon
2017-01-01
Scientific records of temperature and precipitation have been kept for several hundred years, but for many areas, only a shorter record exists. To understand climate change, there is a need for rigorous statistical reconstructions of the paleoclimate using proxy data. Paleoclimate proxy data are often sparse, noisy, indirect measurements of the climate process of interest, making each proxy uniquely challenging to model statistically. We reconstruct spatially explicit temperature surfaces from sparse and noisy measurements recorded at historical United States military forts and other observer stations from 1820 to 1894. One common method for reconstructing the paleoclimate from proxy data is principal component regression (PCR). With PCR, one learns a statistical relationship between the paleoclimate proxy data and a set of climate observations that are used as patterns for potential reconstruction scenarios. We explore PCR in a Bayesian hierarchical framework, extending classical PCR in a variety of ways. First, we model the latent principal components probabilistically, accounting for measurement error in the observational data. Next, we extend our method to better accommodate outliers that occur in the proxy data. Finally, we explore alternatives to the truncation of lower-order principal components using different regularization techniques. One fundamental challenge in paleoclimate reconstruction efforts is the lack of out-of-sample data for predictive validation. Cross-validation is of potential value, but is computationally expensive and potentially sensitive to outliers in sparse data scenarios. To overcome the limitations that a lack of out-of-sample records presents, we test our methods using a simulation study, applying proper scoring rules including a computationally efficient approximation to leave-one-out cross-validation using the log score to validate model performance. 
The result of our analysis is a spatially explicit reconstruction of spatio-temporal temperature from a very sparse historical record.
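The classical PCR baseline that the abstract above extends can be sketched compactly: project the centered proxy matrix onto its leading principal components, then regress the climate target on the component scores. This is a generic illustration of PCR only, not the authors' Bayesian hierarchical model; the function names are hypothetical.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Classical principal component regression: center X, keep the first
    k principal directions, and regress centered y on the scores."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Vk = Vt[:k]                         # k x p principal directions
    scores = Xc @ Vk.T                  # n x k component scores
    beta, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return Vk, beta, x_mean, y_mean

def pcr_predict(model, X_new):
    """Predict at new locations with a fitted PCR model."""
    Vk, beta, x_mean, y_mean = model
    return (X_new - x_mean) @ Vk.T @ beta + y_mean
```

Truncating at k smaller than the number of predictors is what regularizes the fit; the abstract's extensions replace this hard truncation with probabilistic alternatives.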
Ozsu, Damla; Karatas, Ertugrul; Arslan, Hakan; Topcu, Meltem C.
2014-01-01
Objectives: The aim of this study was to compare the amount of apically extruded debris during preparation with ProTaper Universal (Dentsply Maillefer, Ballaigues, Switzerland), ProTaper Next (Dentsply Maillefer), a reciprocating single-file (WaveOne; VDW GmbH, Munich, Germany), and a self-adjusting file (SAF; ReDent Nova, Ra’anana, Israel). Materials and Methods: Fifty-six intact mandibular premolar teeth were randomly assigned to four groups. The root canals were prepared according to the manufacturers’ instructions using the ProTaper Universal, ProTaper Next, WaveOne, and SAF. Apically extruded debris was collected in preweighed Eppendorf tubes during instrumentation. The net weight of the apically extruded debris was determined by subtracting the preweight from the postweight of each tube. The data were statistically analyzed using one-way analysis of variance and least significant difference tests at a significance level of P < 0.05. Results: A measurable amount of debris was apically extruded in all groups, and the differences in debris extrusion among the groups were statistically significant (P < 0.001). The ProTaper Next and WaveOne groups resulted in less debris extrusion than the ProTaper Universal group (P < 0.05), and the SAF group resulted in the least debris extrusion. Conclusions: Within the limitations of the present study, it can be concluded that all systems extruded debris beyond the apical foramen. PMID:25512732
Spirituality and religious coping are related to cancer-bereaved siblings' long-term grief.
Lövgren, Malin; Sveen, Josefin; Steineck, Gunnar; Wallin, Alexandra Eilegård; Eilertsen, Mary-Elizabeth B; Kreicbergs, Ulrika
2017-12-20
Many bereaved siblings have still not come to terms with their grief many years after the loss, but few studies have focused on what can help. The aims of this study were to identify cancer-bereaved adolescents' and young adults' ways of coping with grief after loss of a sibling, and examine whether these ways of coping were related to their experience of having worked through their grief. This nationwide survey of 174 cancer-bereaved siblings (73% participation rate) is based on one open-ended question about coping with grief ("What has helped you to cope with your grief after your sibling's death?") and one closed-ended question about siblings' long-term grief ("Do you think you have worked through your grief over your sibling's death?"). The open-ended question was analyzed with content analysis; descriptive statistics and Fisher's exact test were used to examine the relation between type of coping and siblings' long-term grief. Results: The siblings described four ways of coping: (1) thinking of their dead brother/sister and feeling and expressing their grief; (2) distracting or occupying themselves; (3) engaging in spiritual and religious beliefs/activities; and (4) waiting for time to pass. One of these categories of coping with grief, namely, engaging in spiritual and religious beliefs and activities, was associated with siblings' experience of having worked through their grief two to nine years after the loss (p = 0.016). Significance of results: Those siblings who had used spirituality, religious beliefs, and activities to cope were more likely to have worked through their grief than those who had not.
Hadrévi, Jenny; Hellström, Fredrik; Kieselbach, Thomas; Malm, Christer; Pedrosa-Domellöf, Fatima
2011-08-10
The trapezius muscle is a neck muscle that is susceptible to chronic pain conditions associated with repetitive tasks, commonly referred to as chronic work-related myalgia, hence making the trapezius a muscle of clinical interest. To provide a basis for further investigations of the proteomic traits of the trapezius muscle in disease, two-dimensional difference gel electrophoresis (2D-DIGE) was performed on the healthy trapezius using vastus lateralis as a reference. To obtain as much information as possible from the vast proteomic data set, both one-way ANOVA, with and without false discovery rate (FDR) correction, and partial least squares projection to latent structures with discriminant analysis (PLS-DA) were combined to compare the outcome of the analysis. The trapezius and vastus lateralis showed significant differences in metabolic, contractile and regulatory proteins, with different results depending on the choice of statistical approach and pre-processing technique. Using the standard method, FDR-corrected one-way ANOVA, 42 protein spots differed significantly in abundance between the two muscles. Complementary analysis using immunohistochemistry and western blot confirmed the results from the 2D-DIGE analysis. The proteomic approach used in the present study, combining 2D-DIGE and multivariate modelling, provided a more comprehensive comparison of the protein profiles of the human trapezius and vastus lateralis muscles than was previously possible with immunohistochemistry or SDS-PAGE alone. Although 2D-DIGE has inherent limitations, it is particularly useful to comprehensively screen for important structural and metabolic proteins, and appears to be a promising tool for future studies of patients suffering from chronic work-related myalgia or other muscle diseases.
Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris
2011-10-20
Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. 
Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down-to-earth quantitative analysis works well for the CluPA-aligned spectra. The whole workflow is embedded into a modular and statistically sound framework that is implemented as an R package called "speaq" ("spectrum alignment and quantitation"), which is freely available from http://code.google.com/p/speaq/.
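The BW-ratio described above is straightforward to compute per spectral data point; a minimal numpy sketch follows (this is an illustration of the ratio only, not the speaq implementation, and it omits the bootstrap step used for inference):

```python
import numpy as np

def bw_ratio(spectra, groups):
    """Between-group vs. within-group sum of squares at each data point,
    a distribution-free analogue of the one-way ANOVA F statistic.
    `spectra` is an (n_spectra, n_points) array; `groups` labels rows.
    Assumes nonzero within-group variability at every point."""
    spectra = np.asarray(spectra, float)
    groups = np.asarray(groups)
    grand = spectra.mean(axis=0)
    bss = np.zeros(spectra.shape[1])
    wss = np.zeros(spectra.shape[1])
    for g in np.unique(groups):
        block = spectra[groups == g]
        gm = block.mean(axis=0)
        bss += len(block) * (gm - grand) ** 2   # between-group part
        wss += ((block - gm) ** 2).sum(axis=0)  # within-group part
    return bss / wss
```

Points where the groups separate strongly get a large ratio; points that vary equally within and between groups stay near zero.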
Warren, Johanna B; Hamilton, Andrew
2015-12-01
Seven validated prospective scoring systems and one unvalidated system predict a successful TOLAC based on a variety of clinical factors. Because the systems report different outcome statistics, their predictive accuracy cannot be directly compared.
Valid Statistical Analysis for Logistic Regression with Multiple Sources
NASA Astrophysics Data System (ADS)
Fienberg, Stephen E.; Nardi, Yuval; Slavković, Aleksandra B.
Considerable effort has gone into understanding issues of privacy protection of individual information in single databases, and various solutions have been proposed depending on the nature of the data, the ways in which the database will be used and the precise nature of the privacy protection being offered. Once data are merged across sources, however, the nature of the problem becomes far more complex and a number of privacy issues arise for the linked individual files that go well beyond those that are considered with regard to the data within individual sources. In the paper, we propose an approach that gives full statistical analysis on the combined database without actually combining it. We focus mainly on logistic regression, but the method and tools described may be applied essentially to other statistical models as well.
Gaur, Rajan; Maurya, Madhuri; Kang, Payal Singh
2008-03-01
Somatotypes of a cross-sectional sample of 544 rural adolescents ranging in age from 11 to 17 years are described. The sample included 269 Rajput (141 girls and 128 boys) and 275 Scheduled Caste (135 girls and 140 boys) subjects. Each subject was somatotyped using the Heath-Carter anthropometric somatotype protocol (Carter & Heath 1990). In all, ten anthropometric measurements, namely height, weight, bicondylar diameters of humerus and femur, flexed mid-upper-arm and calf circumferences, and triceps, subscapular, supraspinale and calf skinfolds, were taken. The mean somatotypes of the Rajput boys and girls were 1.62-3.30-3.85 (mesomorphic-ectomorph) and 2.42-2.90-3.99 (balanced ectomorph), respectively. The mean somatotypes of the Scheduled Caste subjects were 1.51-3.02-3.74 (mesomorphic-ectomorph) for boys and 2.38-2.64-3.70 (balanced ectomorph) for girls. A one-way ANOVA revealed that females of both caste groups were significantly (p ≤ 0.05) more endomorphic than the males. The sex differences in the other two components were not significant (p > 0.05). Caste differences, as revealed by a one-way ANOVA, were not significant (p > 0.05) in either sex. With the exception of the Rajput girls, the differences in whole somatotypes between those in an early phase of adolescence and those in an advanced phase of adolescence were not significant (p > 0.05). The results indicate that populations exposed to the same environmental conditions for a long period of time tend to show similarity in physique. A one-way MANOVA, which used Wilks' lambda as the test statistic, revealed that from 11-17 years there was no significant change (p > 0.05) in component dominance of mean somatotypes in the boys and girls of the present sample. Among males of a majority of the Indian populations, ectomorphy dominates over endomorphy and mesomorphy from 11 to 17 years.
Hydrophobicity diversity in globular and nonglobular proteins measured with the Gini index.
Carugo, Oliviero
2017-12-01
Amino acids and their properties are variably distributed in proteins and different compositions determine all protein features, ranging from solubility to stability and functionality. The Gini index, a tool to estimate distribution uniformity, is widely used in macroeconomics and has numerous statistical applications. Here, the Gini index is used to analyze the distribution of hydrophobicity in proteins and to compare hydrophobicity distribution in globular and intrinsically disordered proteins. Based on the analysis of carefully selected high-quality data sets of proteins extracted from the Protein Data Bank (http://www.rcsb.org) and from the DisProt database (http://www.disprot.org/), it is observed that hydrophobicity is distributed in a more diverse way in intrinsically disordered proteins than in folded and soluble globular proteins. This correlates with the observation that the amino acid composition deviates from uniformity (estimated with the Shannon and the Gini-Simpson indices) more in intrinsically disordered proteins than in globular and soluble proteins. Although statistical tools like the Gini index have received little attention in molecular biology, these results show that they allow one to estimate sequence diversity and that they are useful to delineate trends that can hardly be described, otherwise, in simple and concise ways. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
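A minimal sketch of the Gini index, the central tool of the abstract above, on non-negative values. Hydrophobicity scales can take negative values, so applying this to a protein sequence presumes the scale has first been shifted to be non-negative; that preprocessing choice is an assumption here, not something the abstract specifies.

```python
import numpy as np

def gini(values):
    """Gini index of a set of non-negative values with positive sum:
    0 for a perfectly uniform distribution, approaching 1 as a single
    value comes to dominate. Computed from the ordered values
    (equivalently, from the Lorenz curve)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    total = v.sum()
    ranks = np.arange(1, n + 1)
    return (2.0 * np.sum(ranks * v) - (n + 1) * total) / (n * total)
```

For example, four equal values give 0, while concentrating everything in one of four values gives 0.75, the maximum possible for n = 4.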
ERIC Educational Resources Information Center
Horta, Arturo
1985-01-01
Describes a senior-level course that: (1) focuses on the structure and reactions of macromolecules; (2) treats industrial polymers in a unified way; and (3) uses analysis of conformation and conformational statistics as a unifying approach. Also discusses course topics, including polysaccharides, proteins, nucleic acids, and others. (JN)
Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method.
Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels
2014-07-01
The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure using a new method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Previously, haplotype frequency estimation using the discrete Laplace method has been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis to further validate the discrete Laplace method. A very important practical fact is that the calculations can be performed on a normal computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes similar to results of previous studies. We also compared pairwise distances (between geographically separated samples) with those obtained using the AMOVA method and found good agreement. Further analyses that are impossible with AMOVA were made using the discrete Laplace method: analysis of the homogeneity in two different ways and calculating marginal STR distributions. We found that the Y-STR haplotypes from e.g. Finland were relatively homogeneous as opposed to the relatively heterogeneous Y-STR haplotypes from e.g. Lublin, Eastern Poland and Berlin, Germany. We demonstrated that the observed distributions of alleles at each locus were similar to the expected ones. We also compared pairwise distances between geographically separated samples from Africa with those obtained using the AMOVA method and found good agreement. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Saad, Ahmed S; Attia, Ali K; Alaraki, Manal S; Elzanfaly, Eman S
2015-11-05
Five different spectrophotometric methods were applied for simultaneous determination of fenbendazole and rafoxanide in their binary mixture; namely first derivative, derivative ratio, ratio difference, dual wavelength and H-point standard addition spectrophotometric methods. Different factors affecting each of the applied spectrophotometric methods were studied and the selectivity of the applied methods was compared. The applied methods were validated as per the ICH guidelines and good accuracy; specificity and precision were proven within the concentration range of 5-50 μg/mL for both drugs. Statistical analysis using one-way ANOVA proved no significant differences among the proposed methods for the determination of the two drugs. The proposed methods successfully determined both drugs in laboratory prepared and commercially available binary mixtures, and were found applicable for the routine analysis in quality control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
Visible and Extended Near-Infrared Multispectral Imaging for Skin Cancer Diagnosis
Rey-Barroso, Laura; Burgos-Fernández, Francisco J.; Delpueyo, Xana; Ares, Miguel; Malvehy, Josep; Puig, Susana
2018-01-01
With the goal of diagnosing skin cancer in an early and noninvasive way, an extended near infrared multispectral imaging system based on an InGaAs sensor with sensitivity from 995 nm to 1613 nm was built to evaluate deeper skin layers thanks to the higher penetration of photons at these wavelengths. The outcomes of this device were combined with those of a previously developed multispectral system that works in the visible and near infrared range (414 nm–995 nm). Both provide spectral and spatial information from skin lesions. A classification method to discriminate between melanomas and nevi was developed based on the analysis of first-order statistics descriptors, principal component analysis, and support vector machine tools. The system provided a sensitivity of 78.6% and a specificity of 84.6%, the latter one being improved with respect to that offered by silicon sensors. PMID:29734747
The Shock and Vibration Digest. Volume 14, Number 12
1982-12-01
to evaluate the uses of statistical energy analysis for determining sound transmission performance. Coupling loss factors were measured and compared...measurements for the artificial (Also see No. 2623) cracks in mild-steel test pieces. 82-2676 Improvement of the Method of Statistical Energy Analysis for...eters, using a large number of free-response time histories simultaneously in one analysis in the application of the statistical energy analysis theory
Teaching Principles of One-Way Analysis of Variance Using M&M's Candy
ERIC Educational Resources Information Center
Schwartz, Todd A.
2013-01-01
I present an active learning classroom exercise illustrating essential principles of one-way analysis of variance (ANOVA) methods. The exercise is easily conducted by the instructor and is instructive (as well as enjoyable) for the students. This is conducive to demonstrating many theoretical and practical issues related to ANOVA and lends itself…
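The exercise's core computation, the one-way ANOVA F statistic, reduces to a ratio of between-group to within-group mean squares. A self-contained numpy version is below; the color-count-per-bag data the exercise would collect is not given here, so any groups fed to it are illustrative:

```python
import numpy as np

def one_way_anova_F(*groups):
    """F statistic for a one-way fixed-effects ANOVA, computed from the
    between-group (SSB) and within-group (SSW) sums of squares.
    Returns (F, df_between, df_within)."""
    data = [np.asarray(g, float) for g in groups]
    k = len(data)
    n_total = sum(g.size for g in data)
    grand = np.concatenate(data).mean()
    ssb = sum(g.size * (g.mean() - grand) ** 2 for g in data)
    ssw = sum(((g - g.mean()) ** 2).sum() for g in data)
    df_b, df_w = k - 1, n_total - k
    return (ssb / df_b) / (ssw / df_w), df_b, df_w
```

As a worked check: for the toy groups (1, 2, 3), (2, 3, 4), (3, 4, 5), both sums of squares equal 6, giving F = 3 on (2, 6) degrees of freedom.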
Miyagi, Atsushi
2017-09-01
Detailed exploration of sensory perception as well as preference across gender and age for a certain food is very useful for developing a vendible food commodity related to physiological and psychological motivation for food preference. Sensory tests including color, sweetness, bitterness, fried peanut aroma, textural preference and overall liking of deep-fried peanuts with varying frying times (2, 4, 6, 9, 12 and 15 min) at 150 °C were carried out with 417 healthy Japanese consumers. To determine the influence of gender and age on sensory evaluation, systematic statistical analysis including one-way analysis of variance, polynomial regression analysis and multiple regression analysis was conducted using the collected data. The results indicated that females were more sensitive to bitterness than males. This may affect sensory preference; female subjects favored peanuts prepared with a shorter frying time more than male subjects did. With advancing age, textural preference played a more important role in overall preference. Older subjects liked deeper-fried peanuts, which are more brittle, more than younger subjects did. In the present study, systematic statistical analysis based on collected sensory evaluation data using deep-fried peanuts was conducted and the tendency of sensory perception and preference across gender and age was clarified. These results may be useful for engineering optimal strategies to target specific segments to gain greater acceptance in the market. © 2017 Society of Chemical Industry.
Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.
Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin
2017-08-01
The objective of this study was to investigate the full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials; QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC); the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion (DC, %) values for the materials were immediately measured using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM). The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition, the data were statistically analyzed using one-way analysis of variance, and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p < 0.05. According to the ISO 4049 standards, all the tested materials showed acceptable water sorption and solubility, and a halogen light source was an option to polymerize bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.
Zebra: a web server for bioinformatic analysis of diverse protein families.
Suplatov, Dmitry; Kirilin, Evgeny; Takhaveev, Vakil; Svedas, Vytas
2014-01-01
During evolution of proteins from a common ancestor, one functional property can be preserved while others can vary, leading to functional diversity. A systematic study of the corresponding adaptive mutations provides a key to one of the most challenging problems of modern structural biology - understanding the impact of amino acid substitutions on protein function. The subfamily-specific positions (SSPs) are conserved within functional subfamilies but differ between them and, therefore, seem to be responsible for functional diversity in protein superfamilies. Consequently, a corresponding method to perform the bioinformatic analysis of sequence and structural data has to be implemented in common laboratory practice to study the structure-function relationship in proteins and develop novel protein engineering strategies. This paper describes the Zebra web server - a powerful remote platform that implements a novel bioinformatic analysis algorithm to study diverse protein families. It is the first application that provides specificity determinants at different levels of functional classification, therefore addressing the complex functional diversity of large superfamilies. Statistical analysis is implemented to automatically select a set of highly significant SSPs to be used as hotspots for directed evolution or rational design experiments and analyzed to study the structure-function relationship. Zebra results are provided in two ways - (1) as a single all-in-one parsable text file and (2) as PyMol sessions with structural representation of SSPs. The Zebra web server is available at http://biokinet.belozersky.msu.ru/zebra.
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
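A normality check need not be tied to SPSS; the same idea can be scripted. One commonly scripted route (chosen here for illustration, not necessarily one of the procedures the commentary walks through) is the Jarque-Bera statistic, built from sample skewness and kurtosis:

```python
import numpy as np

def jarque_bera(x):
    """Jarque-Bera normality statistic from sample skewness and kurtosis.
    Under normality, JB is asymptotically chi-squared with 2 degrees of
    freedom, so for large samples JB > 5.99 rejects normality at the
    5% level. Uses population-style (divide-by-n) moments."""
    x = np.asarray(x, float)
    n = x.size
    z = x - x.mean()
    s2 = (z ** 2).mean()
    skew = (z ** 3).mean() / s2 ** 1.5
    kurt = (z ** 4).mean() / s2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)
```

Symmetric but platykurtic or heavily skewed samples both inflate the statistic, which is why it catches two common departures from normality at once.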
ERIC Educational Resources Information Center
Green, Jeffrey J.; Stone, Courtenay C.; Zegeye, Abera; Charles, Thomas A.
2009-01-01
Because statistical analysis requires the ability to use mathematics, students typically are required to take one or more prerequisite math courses prior to enrolling in the business statistics course. Despite these math prerequisites, however, many students find it difficult to learn business statistics. In this study, we use an ordered probit…
Yoo, Ji Won; Lee, Dong Ryul; Cha, Young Joo; You, Sung Hyun
2017-01-01
The purpose of the present study was to compare the therapeutic effects of electromyography (EMG) biofeedback augmented by virtual reality (VR) and EMG biofeedback alone on triceps-to-biceps (T:B) muscle activity imbalance and elbow joint movement coordination during a reaching motor task in normal children and children with spastic cerebral palsy (CP). Eighteen children with spastic CP (2 females; mean ± standard deviation = 9.5 ± 1.96 years) and 8 normal children (3 females; mean ± standard deviation = 9.75 ± 2.55 years) were recruited from a local community center. All children with CP first underwent one intensive session of EMG feedback (30 minutes), followed by one session of the EMG-VR feedback (30 minutes) after a 1-week washout period. Clinical tests included elbow extension range of motion (ROM), biceps muscle strength, and the box and block test. EMG T:B muscle activity imbalance and reaching movement acceleration coordination were concurrently determined by EMG and 3-axis accelerometer measurements, respectively. Independent t-tests and one-way repeated-measures analysis of variance (ANOVA) were performed at p < 0.05. The one-way repeated-measures ANOVA revealed significant effects on elbow extension ROM (p = 0.01), biceps muscle strength (p = 0.01), and the box and block test (p = 0.03). It also revealed a significant effect on peak triceps muscle activity (p = 0.01). However, it produced no statistical significance in the composite 3-dimensional movement acceleration coordination data (p = 0.12).
The present study is a first clinical trial that demonstrated the superior benefits of the EMG biofeedback when augmented by virtual reality exercise games in children with spastic CP. The augmented EMG and VR feedback produced better neuromuscular balance control in the elbow joint than the EMG biofeedback alone.
Cytocompatibility, cytotoxicity and genotoxicity analysis of dental implants
NASA Astrophysics Data System (ADS)
Reigosa, M.; Labarta, V.; Molinari, G.; Bernales, D.
2007-11-01
Several types of materials are frequently used for dental prostheses in dental medicine. Different treatments with titanium are the most used. The aim of the present study was to analyze, by means of cytotoxicity and cytocompatibility techniques, the capacity of dental implants to integrate into the bone tissue. Cultures of the UMR 106 cell line, derived from an osteosarcoma, were used for bioassays mainly because they show many of the properties of osteoblasts. Dental implant samples provided by the B&W company were compared with others of recognized trademarks. The first ones contain ASTM titanium (8348 GR2) with acid printing. Cytotoxicity was analyzed by means of lysosome activity, using the neutral red technique and alkaline phosphatase enzyme activity. Cell viability was determined by means of the acridine orange-ethidium bromide technique. One-way ANOVA and Bonferroni and Duncan post-ANOVA tests were used for the statistical analysis. The assays did not show significant differences among the dental implants analyzed. Our findings show that the dental prostheses studied present high biocompatibility, quantified by the bioassays performed. The techniques employed revealed that they can be a useful tool for the analysis of other materials for dental medicine use.
Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Hsu, Hsian-He
2018-01-01
Purpose: We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. Methods: The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and cuboidal, ellipsoidal, Linskey’s, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement were tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included the intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests, with P < 0.05 considered statistically significant. Results: Overall tumor size was 4.80 ± 6.8 mL (mean ± standard deviation). All ICCs were no less than 0.992, suggestive of high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, and ellipsoidal and Linskey’s formulas significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). Conclusion: The ice cream cone method and other two-component formulas, including the ellipsoidal and Linskey’s formulas, allow for estimation of vestibular schwannoma volume more accurately than all one-component formulas. PMID:29438424
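The competing estimators above are all simple closed-form volume formulas. The exact "ice cream cone" formula is not spelled out in this abstract, so the two-component version below (a cone for the intracanalicular portion plus a half-ellipsoid "scoop" for the cisternal portion) is one plausible reading for illustration, not the authors' definition:

```python
import math

def ellipsoid_volume(d1, d2, d3):
    """One-component ellipsoidal estimate from three orthogonal
    diameters: V = (pi/6) * d1 * d2 * d3."""
    return math.pi / 6.0 * d1 * d2 * d3

def ice_cream_cone_volume(cone_d, cone_h, scoop_d1, scoop_d2, scoop_d3):
    """Hypothetical two-component estimate: a cone (base diameter cone_d,
    height cone_h) for the intracanalicular part plus half an ellipsoid
    for the cisternal part. Cone volume (1/3)*pi*r^2*h with r = d/2
    simplifies to pi * d^2 * h / 12."""
    cone = math.pi / 12.0 * cone_d ** 2 * cone_h
    scoop = 0.5 * ellipsoid_volume(scoop_d1, scoop_d2, scoop_d3)
    return cone + scoop
```

Setting all three diameters equal reduces the ellipsoidal formula to the spherical one, which is why the spherical estimate is the least flexible of the one-component models.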
NASA Astrophysics Data System (ADS)
Most, Sebastian; Nowak, Wolfgang; Bijeljic, Branko
2015-04-01
Fickian transport in groundwater flow is the exception rather than the rule. Transport in porous media is frequently simulated via particle methods (i.e. particle tracking random walk (PTRW) or continuous time random walk (CTRW)). These methods formulate transport as a stochastic process of particle position increments. At the pore scale, geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Hence, it is important to get a better understanding of the processes at pore scale. For our analysis we track the positions of 10,000 particles migrating through the pore space over time. The data we use come from micro-CT scans of a homogeneous sandstone and encompass about 10 grain sizes. Based on those images we discretize the pore structure and simulate flow at the pore scale based on the Navier-Stokes equation. This flow field realistically describes flow inside the pore space and we do not need to add artificial dispersion during the transport simulation. Next, we use particle tracking random walk and simulate pore-scale transport. Finally, we use the obtained particle trajectories to do a multivariate statistical analysis of the particle motion at the pore scale. Our analysis is based on copulas. Every multivariate joint distribution is a combination of its univariate marginal distributions. The copula represents the dependence structure of those univariate marginals and is therefore useful to observe correlation and non-Gaussian interactions (i.e. non-Fickian transport). The first goal of this analysis is to better understand the validity regions of commonly made assumptions. We are investigating three different transport distances: 1) The distance where the statistical dependence between particle increments can be modelled as an order-one Markov process.
This would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks starts. 2) The distance where bivariate statistical dependence simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW/CTRW). 3) The distance of complete statistical independence (validity of classical PTRW/CTRW). The second objective is to reveal the characteristic dependencies that influence transport the most. Those dependencies can be very complex. Copulas are highly capable of representing linear as well as non-linear dependence. With that tool we are able to detect persistent characteristics dominating transport even across different scales. The results derived from our experimental data set suggest that there are many more non-Fickian aspects of pore-scale transport than are captured by the univariate statistics of longitudinal displacements. Non-Fickianity can also be found in transverse displacements, and in the relations between increments at different time steps. Moreover, the dependence we find is non-linear (i.e. beyond simple correlation) and persists over long distances. Thus, our results strongly support the further refinement of techniques like correlated PTRW or correlated CTRW towards non-linear statistical relations.
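The rank-based dependence analysis described above can be illustrated with a minimal sketch (synthetic increment series, not the authors' CT-derived trajectories): mapping two displacement-increment series to their normalized ranks gives pseudo-observations of the empirical copula, which isolates the dependence structure from the marginals and exposes relations that linear correlation misses.

```python
import numpy as np

def empirical_copula(u_series, v_series):
    """Map two samples to their normalized ranks (pseudo-observations).

    The joint distribution of the returned pair is the empirical copula,
    which separates the dependence structure from the marginals."""
    n = len(u_series)
    u = (np.argsort(np.argsort(u_series)) + 1) / (n + 1)
    v = (np.argsort(np.argsort(v_series)) + 1) / (n + 1)
    return u, v

# Synthetic particle-increment series with nonlinear dependence:
# dx2 depends on dx1 but is (almost) linearly uncorrelated with it.
rng = np.random.default_rng(0)
dx1 = rng.normal(size=5000)
dx2 = dx1**2 + 0.1 * rng.normal(size=5000)

linear_corr = np.corrcoef(dx1, dx2)[0, 1]   # near zero despite dependence
u, v = empirical_copula(dx1, dx2)
```

The near-zero linear correlation alongside the strong functional dependence is exactly the kind of non-Gaussian interaction a copula-based analysis is meant to detect.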
On the properties of stochastic intermittency in rainfall processes.
Molini, A.; La Barbera, P.; Lanza, L. G.
2002-01-01
In this work we propose a mixed approach to deal with the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictability power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high-resolution pluviometric time series covering one decade and recorded at the tipping-bucket station of the University of Genova is analysed by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution both by virtue of their position within the event and by their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels and its satisfactory agreement with a typical extreme value distribution is shown.
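The first step of such an analysis, extracting the lengths of the no-rain periods from a binary decomposition of the rain signal, can be sketched as follows (toy rain series, not the Genova record):

```python
import numpy as np

def dry_spell_lengths(rain, threshold=0.0):
    """Return the lengths of consecutive no-rain runs in a rain series,
    i.e. the empirical sample underlying the no-rain period distribution."""
    dry = np.asarray(rain) <= threshold
    # indices where the wet/dry state changes
    edges = np.flatnonzero(np.diff(dry.astype(int)) != 0) + 1
    runs = np.split(dry, edges)
    return np.array([len(r) for r in runs if r[0]])

# Toy tipping-bucket-style series (rain depth per time step)
signal = [0, 0, 0, 1.2, 0.4, 0, 0, 3.1, 0, 0, 0, 0, 0.7]
lengths = dry_spell_lengths(signal)   # dry runs of length 3, 2 and 4
```

A histogram of such run lengths at different aggregation levels is what gets compared against an extreme value distribution in the abstract above.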
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romano, J.D.; Woan, G.
Data from the Laser Interferometer Space Antenna (LISA) is expected to be dominated by frequency noise from its lasers. However, the noise from any one laser appears more than once in the data and there are combinations of the data that are insensitive to this noise. These combinations, called time delay interferometry (TDI) variables, have received careful study and point the way to how LISA data analysis may be performed. Here we approach the problem from the direction of statistical inference, and show that these variables are a direct consequence of a principal component analysis of the problem. We present a formal analysis for a simple LISA model and show that there are eigenvectors of the noise covariance matrix that do not depend on laser frequency noise. Importantly, these orthogonal basis vectors correspond to linear combinations of TDI variables. As a result we show that the likelihood function for source parameters using LISA data can be based on TDI combinations of the data without loss of information.
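The core idea, that eigenvectors of the noise covariance matrix can be insensitive to a dominant common noise source, can be demonstrated with a toy model (three synthetic data streams sharing one loud common term standing in for laser frequency noise; this is an illustration, not the paper's LISA model):

```python
import numpy as np

# Three data streams sharing one dominant common noise source plus
# small independent instrument noise.
rng = np.random.default_rng(1)
laser = 100.0 * rng.normal(size=100_000)          # dominant common term
streams = np.vstack([laser + rng.normal(size=laser.size) for _ in range(3)])

cov = np.cov(streams)                             # 3x3 noise covariance
eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues

# The two smallest-eigenvalue eigenvectors are (near-)orthogonal to the
# common-noise direction (1,1,1)/sqrt(3): data combinations insensitive
# to the loud source, analogous to TDI variables.
common = np.ones(3) / np.sqrt(3)
leakage = [abs(eigvecs[:, i] @ common) for i in range(2)]
```

Inference restricted to the small-eigenvalue subspace therefore discards the common noise while keeping the independent-noise information.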
MCNP Output Data Analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-06-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR makes it possible to take into account the detection system's time resolution (which is not possible with MCNP) as well as the detectors' energy response functions and counting statistics in a straightforward way.
Program summary
Program title: MODAR
Catalogue identifier: AEGA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 155 373
No. of bytes in distributed program, including test data, etc.: 14 815 461
Distribution format: tar.gz
Programming language: C++
Computer: Most Unix workstations and PCs
Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under SUSE Linux and Windows XP.
RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures apply when running under ROOT and include ROOT's own consumption.
Classification: 17.6
External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/)
Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel.
Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling the counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep the analysis tool user-friendly, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detectors' finite time resolution and energy response function as well as counting statistics in a straightforward way. Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data.
Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a Dell computer equipped with an Intel Core 2 processor.
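The "smearing" step described above, folding a finite detector time resolution into a time-energy histogram, amounts to a Gaussian convolution along the time axis. A minimal NumPy sketch (toy histogram, not MODAR's ROOT-based implementation):

```python
import numpy as np

def smear_time(hist2d, sigma_bins):
    """Convolve each energy column of a (time x energy) histogram with a
    normalized Gaussian of width sigma_bins along the time axis,
    emulating finite detector time resolution."""
    half = int(4 * sigma_bins) + 1
    t = np.arange(-half, half + 1)
    kernel = np.exp(-0.5 * (t / sigma_bins) ** 2)
    kernel /= kernel.sum()                       # preserve total counts
    return np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, hist2d)

# Toy histogram: a sharp time peak (bin 25) in each of 4 energy bins
h = np.zeros((50, 4))
h[25, :] = 1000.0
smeared = smear_time(h, sigma_bins=2.0)
```

Because the kernel is normalized, the total number of counts is conserved while the peak is broadened, which is the behaviour one wants before adding counting statistics.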
Six Sigma and Introductory Statistics Education
ERIC Educational Resources Information Center
Maleyeff, John; Kaminsky, Frank C.
2002-01-01
A conflict exists between the way statistics is practiced in contemporary business environments and the way statistics is taught in schools of management. While businesses are embracing programs, such as Six Sigma and TQM, that bring statistical methods to the forefront of management decision making, students do not graduate with the skills to…
Understanding Non-Suicidal Self-Injury: Perceptions of School Counselors
ERIC Educational Resources Information Center
Simpson, Chris; Armstrong, Stephen A.; Couch, Lisa; Bore, Samuel K.
2010-01-01
This national exploratory study examined the perceptions of secondary school counselors' (n = 81) understanding of non-suicidal self-injury (NSSI). Two one-way ANOVAs revealed no statistically significant differences between middle and high school counselors on their perceptions of the prevalence of NSSI. Descriptive analyses revealed that a…
Academic Motivations of Pre-Service English Language Teachers
ERIC Educational Resources Information Center
Ariogul, Sibel
2009-01-01
This study examines the academic motivation, in a Turkish context, of Turkish pre-service English teachers to contribute field research. Students (n=287) completed the Academic Motivation Scale (AMS) and a demographic questionnaire. Data were analyzed using descriptive statistics, a one-way ANOVA, independent sample t-test, and Pearson product…
Improving Teaching Effectiveness through the Application of SPC Methodology
ERIC Educational Resources Information Center
Cadden, David; Driscoll, Vincent; Thompson, Mark
2008-01-01
One method used extensively to aid in determining instruction effectiveness is Student Evaluations of Instruction (SEI). This paper examines the use of Statistical Process Control (SPC) charts as a way to correctly measure teaching effectiveness. The field studying SEIs has produced a significant literature. It is not surprising that there is…
SSGP: SNP-set based genomic prediction to incorporate biological information
USDA-ARS?s Scientific Manuscript database
Genomic prediction has emerged as an effective approach in plant and animal breeding and in precision medicine. Much research has been devoted to an improved accuracy in genomic prediction, and one of the potential ways is to incorporate biological information. Due to the statistical and computation...
Applications of Stochastic Analyses for Collaborative Learning and Cognitive Assessment
2007-04-01
models (Visser, Maartje, Raijmakers, & Molenaar, 2002). The second part of this paper illustrates two applications of the methods described in the...clustering three-way data sets. Computational Statistics and Data Analysis, 51 (11), 5368–5376. Visser, I., Maartje, E., Raijmakers, E. J., & Molenaar
Meta-Analysis for Primary and Secondary Data Analysis: The Super-Experiment Metaphor.
ERIC Educational Resources Information Center
Jackson, Sally
1991-01-01
Considers the relation between meta-analysis statistics and analysis of variance statistics. Discusses advantages and disadvantages as a primary data analysis tool. Argues that the two approaches are partial paraphrases of one another. Advocates an integrative approach that introduces the best of meta-analytic thinking into primary analysis…
Statistical Learning Analysis in Neuroscience: Aiming for Transparency
Hanke, Michael; Halchenko, Yaroslav O.; Haxby, James V.; Pollmann, Stefan
2009-01-01
Encouraged by a rise of reciprocal interest between the machine learning and neuroscience communities, several recent studies have demonstrated the explanatory power of statistical learning techniques for the analysis of neural data. In order to facilitate a wider adoption of these methods, neuroscientific research needs to ensure a maximum of transparency to allow for comprehensive evaluation of the employed procedures. We argue that such transparency requires “neuroscience-aware” technology for the performance of multivariate pattern analyses of neural data that can be documented in a comprehensive, yet comprehensible way. Recently, we introduced PyMVPA, a specialized Python framework for machine learning based data analysis that addresses this demand. Here, we review its features and applicability to various neural data modalities. PMID:20582270
ERIC Educational Resources Information Center
Bularzik, Joseph
2007-01-01
Measuring the mass of many pennies has been used as an easy way to generate data for exercises with statistical analysis. In this general chemistry laboratory the densities of pennies are measured by weighing the pennies and using two different methods to measure the volumes. There is much to be discovered by the students on the variability of…
The Complexity of Solar and Geomagnetic Indices
NASA Astrophysics Data System (ADS)
Pesnell, W. Dean
2017-08-01
How far in advance can the sunspot number be predicted with any degree of confidence? Solar cycle predictions are needed to plan long-term space missions. Fleets of satellites circle the Earth collecting science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Statistical and timeseries analyses of the sunspot number are often used to predict solar activity. These methods have not been completely successful as the solar dynamo changes over time and one cycle's sunspots are not a faithful predictor of the next cycle's activity. In some ways, using these techniques is similar to asking whether the stock market can be predicted. It has been shown that the Dow Jones Industrial Average (DJIA) can be more accurately predicted during periods when it obeys certain statistical properties than at other times. The Hurst exponent is one such way to partition the data. Another measure of the complexity of a timeseries is the fractal dimension. We can use these measures of complexity to compare the sunspot number with other solar and geomagnetic indices. Our concentration is on how trends are removed by the various techniques, either internally or externally. Comparisons of the statistical properties of the various solar indices may guide us in understanding how the dynamo manifests in the various indices and the Sun.
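The Hurst exponent mentioned above is commonly estimated by rescaled-range (R/S) analysis; a minimal sketch on synthetic data (white noise, not the sunspot record) looks like this:

```python
import numpy as np

def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    x = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())     # cumulative deviation profile
            r = dev.max() - dev.min()         # range
            s = w.std()                       # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(2)
h_white = hurst_rs(rng.normal(size=4096))     # uncorrelated noise: H near 0.5
```

Note the raw R/S estimator is biased slightly high for short windows; persistent series give H above 0.5 and anti-persistent series below.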
Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics
NASA Astrophysics Data System (ADS)
Abe, Sumiyoshi
2014-11-01
The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
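As a reminder of the standard form (stated here for the reader, not quoted from the abstract), Jeffreys's general rule that the method recovers ties the prior to the Fisher information of the likelihood:

```latex
% Jeffreys's general rule: prior proportional to the square root of the
% Fisher information of the likelihood p(x | \theta).
\pi(\theta) \propto \sqrt{I(\theta)},
\qquad
I(\theta) = \mathbb{E}\!\left[\left(
  \frac{\partial \ln p(x \mid \theta)}{\partial \theta}
\right)^{2}\right].

% For a Gaussian scale parameter \sigma this yields the familiar
% \pi(\sigma) \propto 1/\sigma.
```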
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
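Of the statistics compared above, the Tau family has the simplest core: an all-pairs nonoverlap count between phases. A hedged sketch of that core (Tau without trend correction, on hypothetical phase data; IRD and BC-SMD follow different algorithms):

```python
def tau_nonoverlap(baseline, intervention):
    """Simplest Tau-family statistic (A-versus-B nonoverlap, no trend
    correction): compare every baseline/intervention pair and normalize
    the surplus of improved pairs. Ranges from -1 to 1."""
    pairs = [(a, b) for a in baseline for b in intervention]
    pos = sum(b > a for a, b in pairs)   # intervention point exceeds baseline
    neg = sum(b < a for a, b in pairs)   # intervention point below baseline
    return (pos - neg) / len(pairs)

# Hypothetical single-case phase data, not taken from the article
tau_separated = tau_nonoverlap([2, 3, 2, 4], [5, 6, 7, 6])  # full separation
tau_overlap = tau_nonoverlap([5, 6], [5, 6])                # full overlap
```

Complete separation of phases yields 1.0 and complete overlap yields 0.0, which is the raw-value scale the study compares against dichotomized visual-analysis judgments.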
Alqahtani, Fawaz
2017-01-01
The purpose of this study was to determine the effect of two extraoral computer-aided design (CAD) and computer-aided manufacturing (CAM) systems, in comparison with conventional techniques, on the marginal fit of monolithic CAD/CAM lithium disilicate ceramic crowns. This is an in vitro interventional study. The study was carried out at the Department of Prosthodontics, School of Dentistry, Prince Sattam Bin Abdul-Aziz University, Saudi Arabia, from December 2015 to April 2016. The marginal gaps of 60 lithium disilicate crowns were evaluated by scanning electron microscopy. In total, 20 pressable lithium disilicate (IPS e.max Press [Ivoclar Vivadent]) ceramic crowns were fabricated using the conventional lost-wax technique as a control group. The experimental all-ceramic crowns were produced based on a scanned stone model and milled using two extraoral CAD/CAM systems: the Cerec group was fabricated using the Cerec CAD/CAM system, and the Trios group was fabricated using Trios CAD and milled using Wieland Zenotec CAM. One-way analysis of variance (ANOVA) and the Scheffe post hoc test were used for statistical comparison of the groups (α=0.05). The mean (±standard deviation) marginal gap of each group was as follows: the Control group was 91.15 (±15.35) µm, the Cerec group was 111.07 (±6.33) µm, and the Trios group was 60.17 (±11.09) µm. One-way ANOVA and the Scheffe post hoc test showed a statistically significant difference in the marginal gap between all groups. It can be concluded from the current study that all-ceramic crowns fabricated using the CAD/CAM system show a marginal accuracy that is acceptable in clinical environments. The Trios CAD group displayed the smallest marginal gap.
Lewin, Andrew C; Hausmann, Jennifer C; Miller, Paul E
2017-09-01
The purpose of this prospective study was to describe intraocular pressure (IOP) and examination findings in three tree frog species (Cruziohyla craspedopus [fringe leaf frog], Cruziohyla calcarifer [splendid leaf frog], and Anotheca spinosa [spiny-headed or coronated tree frog]). Thirty-one C. craspedopus, four C. calcarifer, and five A. spinosa were weighed, sexed based on phenotype where possible, and examined using slit-lamp biomicroscopy and indirect ophthalmoscopy. IOP was measured using the TonoVet and TonoLab rebound tonometers while the frogs were held two ways (unrestrained, then restrained). Statistical differences were determined using one-way analysis of variance (ANOVA) and t-tests. Mean ± SD IOP (TonoVet and TonoLab, respectively) was 15.1 ± 2.5 mmHg and 15.6 ± 4.1 mmHg in C. craspedopus; 14.8 ± 1.5 mmHg and 18.8 ± 3.1 mmHg in C. calcarifer; and 9.1 ± 2.1 mmHg and 10.8 ± 1.4 mmHg in A. spinosa. There was no significant difference in IOP in C. craspedopus by eye (Right vs Left), tonometer, or restraint method. IOP in female C. craspedopus was 1-3 mm Hg higher than in males with both devices (P < 0.05). IOP was statistically significantly different between all species for the TonoLab and between Cruziohyla genus frogs and A. spinosa for the TonoVet (P < 0.05). There was no difference in IOP measurements between the TonoVet and TonoLab in C. craspedopus. IOP varied by gender in C. craspedopus and between species, but not by tonometer. Ocular abnormalities were minimal in this group of captive bred frogs.
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
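The row-wise partitioning behind the identity-by-state speed-up can be sketched in a few lines. This is a hedged illustration in Python (ParallABEL itself is R/Rmpi over MPI; threads and the toy genotype matrix here are stand-ins):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def ibs_row(args):
    """One row of the identity-by-state matrix: mean fraction of shared
    alleles (genotypes coded 0/1/2) between individual i and everyone."""
    i, genotypes = args
    diff = np.abs(genotypes - genotypes[i])      # 0, 1 or 2 per SNP
    return i, (1.0 - diff / 2.0).mean(axis=1)

rng = np.random.default_rng(3)
genotypes = rng.integers(0, 3, size=(40, 500))   # 40 individuals x 500 SNPs

# Partition the pairwise work by row and compute rows concurrently,
# mirroring the "pairs of individuals" task group described above.
with ThreadPoolExecutor(max_workers=4) as pool:
    rows = dict(pool.map(ibs_row, ((i, genotypes) for i in range(40))))
ibs = np.vstack([rows[i] for i in range(40)])
```

Because each row is independent, the work scales out cleanly, which is why near-perfect speed-up is achievable for this analysis type.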
Color stability and degree of cure of direct composite restoratives after accelerated aging.
Sarafianou, Aspasia; Iosifidou, Soultana; Papadopoulos, Triantafillos; Eliades, George
2007-01-01
This study evaluated the color changes and amount of remaining C = C bonds (%RDB) in three dental composites after hydrothermal- and photoaging. The materials tested were Estelite sigma, Filtek Supreme and Tetric Ceram. Specimens were fabricated from each material and subjected to L* a* b* colorimetry and FTIR spectroscopy before and after aging. Statistical evaluation of the deltaL*, deltaa*, deltab*, deltaE and %deltaRDB data was performed by one-way ANOVA and Tukey's test. The %RDB data before and after aging were statistically analyzed using two-way ANOVA and Student-Newman-Keuls test. In all cases an alpha = 0.05 significance level was used. No statistically significant differences were found in deltaL*, deltaa*, deltaE and %deltaRDB among the materials tested. Tetric Ceram demonstrated a significant difference in deltab*. All the materials showed visually perceptible (deltaE >1) but clinically acceptable values (deltaE < 3.3). Within each material group, statistically significant differences in %RDB were noticed before and after aging (p < 0.05). Filtek Supreme presented the lowest %RDB before aging, with Tetric Ceram presenting the lowest %RDB after aging (p < 0.05). The %deltaRDB mean values were statistically significantly different among all the groups tested. No correlation was found between deltaE and %deltaRDB.
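The deltaE value referenced above (perceptible above 1, clinically acceptable below 3.3) is the Euclidean distance between L*a*b* triplets; a minimal sketch with hypothetical before/after readings, using the simple CIE76 formula:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference between two L*a*b* triplets."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical colorimeter readings before and after aging
before = (78.0, 1.2, 10.5)
after = (76.5, 1.0, 12.0)
de = delta_e(before, after)

perceptible = de > 1.0    # visually perceptible threshold used above
acceptable = de < 3.3     # clinically acceptable threshold used above
```

The thresholds reproduce the abstract's interpretation: a change can be visible to the eye yet still clinically acceptable.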
Evaluation of noise pollution level in the operating rooms of hospitals: A study in Iran.
Giv, Masoumeh Dorri; Sani, Karim Ghazikhanlou; Alizadeh, Majid; Valinejadi, Ali; Majdabadi, Hesamedin Askari
2017-06-01
Noise pollution in the operating rooms is one of the remaining challenges. Both patients and physicians are exposed to different sound levels during operative cases, many of which can last for hours. This study aims to evaluate the noise pollution in the operating rooms during different surgical procedures. In this cross-sectional study, the sound level in the operating rooms of Hamadan University-affiliated hospitals (10 in total) in Iran during different surgical procedures was measured using a B&K sound meter. The gathered data were compared with national and international standards. Statistical analysis was performed using descriptive statistics, one-way ANOVA, t-tests, and Pearson's correlation test. The noise pollution level in the majority of surgical procedures is higher than nationally and internationally documented standards. The highest level of noise pollution is related to orthopedic procedures, and the lowest to laparoscopic and heart surgery procedures. The highest and lowest registered sound levels during operations were 93 and 55 dB, respectively. Sound levels generated by equipment (69 ± 4.1 dB), trolley movement (66 ± 2.3 dB), and personnel conversations (64 ± 3.9 dB) are the main sources of noise. The noise pollution of operating rooms is higher than available standards, and procedures need to be corrected to achieve proper conditions.
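The one-way ANOVA used here to compare sound levels across procedure types reduces to a ratio of between-group to within-group mean squares; a self-contained sketch on hypothetical dB readings (not the study's data):

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical sound levels (dB) for three procedure types
orthopedic = [90, 93, 91, 92]
laparoscopic = [57, 55, 58, 56]
general = [70, 72, 71, 69]
f_stat = one_way_anova_f(orthopedic, laparoscopic, general)
```

A large F (here in the hundreds) against the F(2, 9) reference distribution signals that at least one procedure type differs in mean sound level.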
Development and validation of an instrument to assess job satisfaction in eye-care personnel.
Paudel, Prakash; Cronjé, Sonja; O'Connor, Patricia M; Khadka, Jyoti; Rao, Gullapalli N; Holden, Brien A
2017-11-01
The aim was to develop and validate an instrument to measure job satisfaction in eye-care personnel and assess the job satisfaction of one-year trained vision technicians in India. A pilot instrument for assessing job satisfaction was developed, based on a literature review and input from a public health expert panel. Rasch analysis was used to assess psychometric properties and to undertake an iterative item reduction. The instrument was then administered to vision technicians in vision centres of Andhra Pradesh in India. Associations between vision technicians' job satisfaction and factors such as age, gender and experience were analysed using t-test and one-way analysis of variance. Rasch analysis confirmed that the 15-item job satisfaction in eye-care personnel (JSEP) was a unidimensional instrument with good fit statistics, measurement precision and absence of differential item functioning. Overall, vision technicians reported high rates of job satisfaction (0.46 logits). Age, gender and experience were not associated with job satisfaction scores. Item score analysis showed non-financial incentives, salary and workload were the most important determinants of job satisfaction. The 15-item JSEP instrument is a valid instrument for assessing job satisfaction among eye-care personnel. Overall, vision technicians in India demonstrated high rates of job satisfaction. © 2016 Optometry Australia.
Derailment-based Fault Tree Analysis on Risk Management of Railway Turnout Systems
NASA Astrophysics Data System (ADS)
Dindar, Serdar; Kaewunruen, Sakdirat; An, Min; Gigante-Barrera, Ángel
2017-10-01
Railway turnouts are fundamental mechanical infrastructures that allow rolling stock to divert from one direction to another. As they comprise a large number of engineering subsystems, e.g. track, signalling and earthworks, these sub-systems can fail through various kinds of failure mechanisms, any of which could be the cause of a catastrophic event. A derailment, one of the undesirable events in railway operation, often results, albeit it occurs rarely, in damage to rolling stock and railway infrastructure and in disrupted service, and has the potential to cause casualties and even loss of lives. As a result, it is quite significant that a well-designed risk analysis is performed to create awareness of hazards and to identify which parts of the systems may be at risk. This study focuses on all types of environment-based failures resulting from the numerous contributing factors noted officially in accident reports. This risk analysis is designed to help industry minimise the occurrence of accidents at railway turnouts. The methodology relies on accurate assessment of derailment likelihood and is based on a statistical multiple-factor-integrated accident rate analysis. The study establishes product risks and faults and shows the impact of potential failure processes using Boolean algebra.
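The Boolean-algebra step of a fault tree combines basic-event probabilities through OR and AND gates up to a top event. A hedged sketch with hypothetical, independent environment-based events (the event names and probabilities are illustrative, not from the accident reports):

```python
def or_gate(*p):
    """Probability that at least one independent basic event occurs."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """Probability that all independent basic events occur."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical basic events feeding a derailment top event:
p_ice = 1e-4           # ice accumulation in the switch blade
p_flood = 5e-5         # flooding of the turnout area
p_detect_fail = 1e-2   # failure of the detection/maintenance barrier

# Top event: an environmental hazard occurs AND the barrier fails.
p_top = and_gate(or_gate(p_ice, p_flood), p_detect_fail)
```

Even crude numbers like these make explicit how a reliable detection barrier suppresses the top-event likelihood by orders of magnitude.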
GOIATO, Marcelo Coelho; dos SANTOS, Daniela Micheline; MORENO, Amália; GENNARI-FILHO, Humberto; PELLIZZER, Eduardo Piza
2011-01-01
The use of ocular prostheses for ophthalmic patients aims to rebuild facial aesthetics and provide an artificial substitute to the visual organ. Natural weathering conditions promote discoloration of artificial irides and many studies have attempted to produce irides with greater chromatic paint durability using different paint materials. Objectives: The present study evaluated the color stability of artificial irides obtained with two techniques (oil painting and digital image) and submitted to microwave polymerization. Material and Methods: Forty samples were fabricated simulating ocular prostheses. Each sample was constituted by one disc of acrylic resin N1 and one disc of colorless acrylic resin with the iris interposed between the discs. The irides in brown and blue color were obtained by oil painting or digital image. The color stability was determined by a reflection spectrophotometer and measurements were taken before and after microwave polymerization. Statistical analysis of the techniques for reproducing artificial irides was performed by applying the normal data distribution test followed by 2-way ANOVA and Tukey HSD test (α=.05). Results: Chromatic alterations occurred in all specimens and statistically significant differences were observed between the oil-painted samples and those obtained by digital imaging. There was no statistical difference between the brown and blue colors. Independently of technique, all samples suffered color alterations after microwave polymerization. Conclusion: The digital imaging technique for reproducing irides presented better color stability after microwave polymerization. PMID:21625733
Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic
2017-02-01
Multivariate statistical process control (MSPC) is increasingly popular given the challenge posed by large multivariate datasets from analytical instruments, such as Raman spectroscopy, for the monitoring of complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Moreover, unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating way, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analyses such as multi-way principal component analysis (MPCA) for MSPC monitoring. Correlation optimized warping (COW) is a popular method for data alignment with satisfactory performance; however, it had never before been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches with varying durations analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operating condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T 2 and Q. As the results illustrate, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition with higher efficacy than traditional diagnosis, which saves time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
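The two control statistics named above, Hotelling's T^2 (variation inside the model subspace) and Q (squared prediction error outside it), can be computed from an ordinary PCA fit on normal-operation data. A hedged NumPy sketch on synthetic data (plain PCA rather than the paper's unfolded MPCA, and random vectors rather than Raman spectra):

```python
import numpy as np

def pca_control_stats(X_noc, X_new, n_components=2):
    """Fit PCA on normal-operation-condition data and return Hotelling's
    T^2 and Q (squared prediction error) for new observations."""
    mu = X_noc.mean(axis=0)
    U, s, Vt = np.linalg.svd(X_noc - mu, full_matrices=False)
    P = Vt[:n_components].T                           # loadings
    lam = (s[:n_components] ** 2) / (len(X_noc) - 1)  # score variances
    Z = (X_new - mu) @ P                              # scores
    t2 = np.sum(Z ** 2 / lam, axis=1)                 # in-model distance
    resid = (X_new - mu) - Z @ P.T
    q = np.sum(resid ** 2, axis=1)                    # off-model distance
    return t2, q

rng = np.random.default_rng(4)
noc = rng.normal(size=(200, 6))       # normal-operation training batches
good = rng.normal(size=(5, 6))        # in-control observations
bad = good + 8.0                      # shifted, out-of-control observations
t2_good, q_good = pca_control_stats(noc, good)
t2_bad, q_bad = pca_control_stats(noc, bad)
```

Charting T^2 and Q against control limits derived from the NOC batches is what flags deviating (e.g. contaminated) batches in the study.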
NASA Astrophysics Data System (ADS)
Clayton, Michelle
Using a mixed methods research design, the author examined the relationships between "highly qualified" status, instructional practices, and students' science achievement for six third grade teachers in three high poverty Louisiana school systems. The study analyzed qualitative and quantitative data for three science classes taught by "highly qualified" teachers and three science classes taught by "non-highly qualified" teachers. The qualitative portion of the study was conducted through classroom observations, teacher interviews, and lesson plan reviews. The qualitative data were coded and triangulated to determine whether the instructional practices of each teacher were more "teacher-centered" or "student-centered." The qualitative data analysis indicated various patterns and consistencies in the instructional practices used by the "highly qualified" and "non-highly qualified" teachers selected for this study. The quantitative portion of the study involved analysis of the students' science achievement data for the six third grade science teachers selected for the study. Science achievement was measured by the third grade Integrated Louisiana Education Assessment Program (iLEAP) scores. A two-way ANOVA indicated that there were statistically significant differences in the mean scores of the three high poverty Louisiana school systems, as well as between the students taught by "highly qualified" and "non-highly qualified" teachers, along with an interaction between the two: F(2, 123) = 46.99, p < 0.01; F(1, 123) = 4.54, p = 0.035; F(2, 123) = 3.73, p = 0.027. A separate one-way ANOVA indicated that statistically significant differences existed between the six participating teachers in the study: F(5, 123) = 20.386, p < 0.01. Tukey's HSD post-hoc tests and homogeneous subset analyses were conducted in order to determine which teachers' scores significantly differed from each other.
Morténius, Helena; Fridlund, Bengt; Marklund, Bertil; Palm, Lars; Baigi, Amir
2012-04-01
To evaluate the long-term utilisation of strategic communication as a factor of importance when changing work practices among primary care staff. In many health care organisations, there is a gap between theory and practice. This gap hinders the provision of optimal evidence-based practice and, in the long term, is unfavourable for patient care. One way of overcoming this barrier is systematically structured communication between the scientific theoretical platform and clinical practice. This longitudinal evaluative study was conducted among a primary care staff cohort. Strategic communication was considered to be the intervention platform and included a network of ambassadors who acted as a component of the implementation. Measurements occurred 7 and 12 years after formation of the cohort. A questionnaire was used to obtain information from participants. In total, 846 employees (70%) agreed to take part in the study. After 12 years, the 352 individuals (60%) who had remained in the organisation were identified and followed up. Descriptive statistics and multivariate analysis were used to analyse the data. Continuous information contributed to significant improvements over time with respect to new ideas and the intention to change work practices. There was a statistically significant synergistic effect on the new way of thinking, that is, willingness to change work practices. During the final two years, the network of ambassadors had created a distinctive image for itself, in the sense that primary care staff members were aware of it and its activities. This awareness was associated with a positive change with regard to new ways of thinking. More years of practice were inversely associated with willingness to change work practices. Strategic communication may lead to a scientific platform that promotes high-quality patient care by means of new methods and research findings.
Gouveia, Thayla Hellen Nunes; Públio, Juliana do Carmo; Ambrosano, Glaucia Maria Bovi; Paulillo, Luís Alexandre Maffei Sartini; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite
2016-01-01
To evaluate the influence of 16% carbamide peroxide (CP) containing different thickeners on the physical characteristics of a nanocomposite resin submitted or not to accelerated artificial aging (AAA). One hundred samples were randomly distributed into two groups (n = 50) according to AAA. Each group was divided into 5 subgroups (n = 10) depending on the bleaching/thickener treatment: CP + carbopol, CP + natrosol, carbopol, natrosol, and no treatment (control). The physical properties tested were color (ΔE), gloss (GU), mean roughness (Ra), and Knoop microhardness (KHN). The resin surface was examined with atomic force microscopy (AFM). Color (ΔE) was assessed with two-way analysis of variance (ANOVA) followed by Tukey's and Dunnett's tests; the roughness values were submitted to Kruskal-Wallis, Dunn's, and Mann-Whitney tests. Data on gloss and KHN were submitted to two-way ANOVA and Tukey's test (α = 0.05). Among the physical properties evaluated, CP + carbopol promoted a reduction in composite microhardness only, thus differing statistically from the controls; no such change was observed for CP + natrosol. The aging process reduced all the physical properties, differing statistically from the non-aged group. CP + carbopol increased the roughness and decreased the gloss of aged resins, whereas natrosol reduced gloss only, which differed statistically from the controls. AFM showed evidence of loss of organic matrix and exposure of filler particles in the aged samples. Therefore, the replacement of carbopol with natrosol maintained the composite microhardness following bleaching. The aging process reduced the physical properties evaluated, and some changes were enhanced by the application of bleaching.
Li, Hongliang; Dai, Jiewen; Si, Jiawen; Zhang, Jianfei; Wang, Minjiao; Shen, Steve Guofang; Yu, Hongbo
2015-01-01
Anterior maxillary segmental distraction (AMSD) is an effective surgical procedure in the treatment of maxillary hypoplasia secondary to cleft lip and palate (CLP). Its unique advantage of preserving velopharyngeal function has made this procedure widely applied. In this study, the application of AMSD was described and its long-term stability explored. Eight patients with severe maxillary hypoplasia secondary to CLP were included in this study. They were treated with AMSD using a rigid external distraction (RED) device. Cephalometric analysis was performed twice at three time points for evaluation: before surgery (T1), after distraction (T2), and 2 years after treatment (T3). One-way analysis of variance was used to assess the differences statistically. All the distractions were completed smoothly, and the maxilla was distracted efficiently. The values of SNA, NA-FH, Ptm-A, U1-PP, overjet and PP (ANS-PNS) increased significantly after the AMSD procedure (P < 0.05), with the mean overjet increasing by 14.28 mm. However, comparison of the cephalometric analyses at T2 and T3 showed no significant difference (P > 0.05). Changes in palatopharyngeal depth and soft palatal length were insignificant. AMSD with a RED device provided an effective way to correct maxillary hypoplasia secondary to CLP: it extended the palatal and arch length, avoided damage to velopharyngeal closure function and reduced the relapse rate. It is a promising and valuable technique for this potentially complicated procedure.
ERIC Educational Resources Information Center
Rose, L. Todd; Rouhani, Parisa; Fischer, Kurt W.
2013-01-01
Our goal is to establish a science of the individual, grounded in dynamic systems, and focused on the analysis of individual variability. Our argument is that individuals behave, learn, and develop in distinctive ways, showing patterns of variability that are not captured by models based on statistical averages. As such, any meaningful attempt to…
Shakespeare, Chekhov and the Emergence of the Transcultured Self in Denmark
ERIC Educational Resources Information Center
Klimenko, Svetlana
2003-01-01
This paper discusses ways in which theatre practices reflect the dynamics of historical development seen from the perspective of transculturation (Epstein, 1995). The analysis is centred on modes of appropriation of Shakespeare and Chekhov in Denmark. The argument relies on a broader statistical investigation into repertoire development, but…
Shahi, Shahriar; Yavari, Hamid R; Rahimi, Saeed; Reyhani, Mohammad F; Kamarroosta, Zahra; Abdolrahimi, Majid
2009-03-01
The aim of this study was to evaluate the effect of RaCe, FlexMaster and ProFile rotary instruments on smear layer formation by scanning electron microscopy. Eighty-four caries-free freshly extracted human single-rooted teeth were selected and divided into three groups, each containing 28 teeth. The teeth were instrumented with rotary instruments sequentially: Group A: ProFile Rotary Instruments; Group B: FlexMaster Rotary Instruments; and Group C: RaCe Rotary Instruments. Instrumentation was performed by the crown-down method and according to the manufacturer's instructions. The specimens were then examined with SEM according to Hülsmann's classification. One-way ANOVA and a post hoc Tukey test were used for statistical analysis. The results showed that there were no statistically significant differences among the three groups in the coronal third (P = 0.39), but at the apical and middle thirds there were statistically significant differences between the RaCe group and the other groups (P < 0.05). Smear layer in the RaCe group was less than that in the ProFile and FlexMaster groups, but the difference between the ProFile group and FlexMaster group was not statistically significant (P > 0.05). It was concluded that RaCe Rotary Instruments produce less smear layer than FlexMaster and ProFile Rotary Instruments.
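The analysis pipeline used in this and several neighboring abstracts (one-way ANOVA followed by a Tukey post hoc test) can be reproduced with SciPy. The group scores below are invented for illustration, not the study's smear-layer data, and `scipy.stats.tukey_hsd` requires SciPy ≥ 1.8.

```python
import numpy as np
from scipy import stats

# Hypothetical smear-layer scores for three instrument groups (n = 28 each);
# the means and spreads are illustrative assumptions.
rng = np.random.default_rng(1)
profile = rng.normal(3.0, 0.5, 28)
flexmaster = rng.normal(3.1, 0.5, 28)
race = rng.normal(1.5, 0.5, 28)

# One-way ANOVA: do the three groups share a common mean score?
f_stat, p_anova = stats.f_oneway(profile, flexmaster, race)

# Tukey's HSD post hoc test: which pairs of groups differ?
posthoc = stats.tukey_hsd(profile, flexmaster, race)
print(f"ANOVA: F = {f_stat:.1f}, p = {p_anova:.2e}")
print(posthoc.pvalue.round(4))   # 3x3 matrix of pairwise p-values
```

The ANOVA only says that some difference exists; the Tukey matrix localizes it to particular pairs while controlling the family-wise error rate.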
Statistical scaling of pore-scale Lagrangian velocities in natural porous media.
Siena, M; Guadagnini, A; Riva, M; Bijeljic, B; Pereira Nunes, J P; Blunt, M J
2014-08-01
We investigate the scaling behavior of sample statistics of pore-scale Lagrangian velocities in two different rock samples, Bentheimer sandstone and Estaillades limestone. The samples are imaged using X-ray computed tomography with micron-scale resolution. The scaling analysis relies on the study of the way qth-order sample structure functions (statistical moments of order q of absolute increments) of Lagrangian velocities depend on separation distances, or lags, traveled along the mean flow direction. In the sandstone block, sample structure functions of all orders exhibit a power-law scaling within a clearly identifiable intermediate range of lags. Sample structure functions associated with the limestone block display two diverse power-law regimes, which we infer to be related to two overlapping spatially correlated structures. In both rocks and for all orders q, we observe linear relationships between logarithmic structure functions of successive orders at all lags (a phenomenon typically known as extended power-law scaling, or extended self-similarity). The scaling behavior of Lagrangian velocities is compared with that exhibited by porosity and specific surface area, which constitute two key pore-scale geometric observables. The statistical scaling of the local velocity field reflects the behavior of these geometric observables, with the occurrence of power-law-scaling regimes within the same range of lags for sample structure functions of Lagrangian velocity, porosity, and specific surface area.
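The qth-order sample structure functions described above are straightforward to compute. A hedged numpy sketch on a synthetic Brownian-like signal (not the rock-sample velocity data):

```python
import numpy as np

def structure_functions(v, lags, orders):
    """qth-order sample structure functions: S_q(l) = mean(|v(x+l) - v(x)|^q)."""
    S = np.empty((len(orders), len(lags)))
    for j, l in enumerate(lags):
        inc = np.abs(v[l:] - v[:-l])            # absolute increments at lag l
        for i, q in enumerate(orders):
            S[i, j] = np.mean(inc ** q)
    return S

# Synthetic signal: cumulative sum of white noise (Brownian-like).
rng = np.random.default_rng(0)
v = np.cumsum(rng.normal(size=20000))
lags = np.array([1, 2, 4, 8, 16, 32, 64])
S = structure_functions(v, lags, orders=[1, 2, 3])

# Extended self-similarity: log S_q vs log S_p should be close to linear,
# with slope zeta_q / zeta_p.
slope_21 = np.polyfit(np.log(S[0]), np.log(S[1]), 1)[0]
print(f"slope of log S_2 vs log S_1 = {slope_21:.2f}")   # ≈ 2 for a Brownian signal
```

For a Brownian signal S_q(l) ∝ l^(q/2), so the log-log slope of S_2 against S_1, the kind of relationship used to diagnose extended self-similarity, should be close to 2.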
Carlioglu, Ayse; Kaygusuz, Ikbal; Karakurt, Feridun; Gumus, Ilknur Inegol; Uysal, Aysel; Kasapoglu, Benan; Armutcu, Ferah; Uysal, Sema; Keskin, Esra Aktepe; Koca, Cemile
2014-11-01
To evaluate platelet-activating factor acetylhydrolase (PAF-AH), oxidized low-density lipoprotein (ox-LDL), paraoxonase 1 (PON1) and arylesterase (ARE) levels, the effects of metformin and Diane-35 (ethinyl oestradiol + cyproterone acetate) therapies on these parameters, and the PON1 polymorphisms among PCOS patients. Ninety patients with PCOS and 30 age- and body mass index-matched healthy controls were included in the study. Patients were divided into three groups: metformin treatment, Diane-35 treatment, and no medication. Treatment with metformin or Diane-35 was continued for 6 months, and all subjects were re-evaluated with clinical and biochemical parameters 6 months later. One-way ANOVA, t test and the non-parametric Mann-Whitney U test were used for statistical analysis. PAF-AH and ox-LDL levels were statistically significantly higher in untreated PCOS patients than in controls, and statistically significantly lower in patients treated with metformin or Diane-35 than in untreated PCOS patients. In contrast, PON1 (not statistically significant) and ARE (statistically significant) levels were lower in untreated PCOS patients than in the control group, and they increased significantly after metformin and Diane-35 treatment. In PCOS patients, serum PON1 levels for the QQ, QR and RR phenotypes were statistically significantly lower than in the control group. In patients with PCOS, proatherogenic markers increase. Treatment of PCOS with metformin or Diane-35 had positive effects on the lipid profile: it increased the PON1 level, which protects against atherosclerosis, and decreased the proatherogenic PAF-AH and ox-LDL levels.
Rain rate intensity model for communication link design across the Indian region
NASA Astrophysics Data System (ADS)
Kilaru, Aravind; Kotamraju, Sarat K.; Avlonitis, Nicholas; Sri Kavya, K. Ch.
2016-07-01
Rain statistical parameters, such as one-minute rain intensity and the number of minute occurrences with their respective percentage of time in a year, have been evaluated for the purpose of communication link design at Ka, Q and V bands as well as for Free-Space Optical (FSO) communication links. To understand the possible outage periods of a communication link due to rainfall and to investigate rainfall patterns, Automatic Weather Station (AWS) rainfall data are analysed owing to their ample presence across India. The climates of the examined AWS regions range from desert to cold climates, heavy-rainfall to variable-rainfall regions, cyclone-affected regions, and mountain and coastal regions. In this way a complete and unbiased picture of the rainfall statistics for the Indian region is obtained. The analysed AWS data give insight into yearly accumulated rainfall, maximum hourly accumulated rainfall, mean hourly accumulated rainfall, number of rainy days and number of rainy hours from 668 AWS locations. Using a probability density function, the one-minute rainfall measurements at KL University are integrated with the AWS measurements to estimate the number of rain occurrences in terms of one-minute rain intensity for annual rainfall accumulations between 100 mm and 5000 mm, giving design engineers insight into the possible one-minute accumulation pattern in an hour for a comprehensive analysis of rainfall influence on a communication link. In this way, low-availability communication links at higher frequencies can be transformed into reliable and economically feasible links for implementing High Throughput Services (HTS).
[Range of Hip Joint Motion and Weight of Lower Limb Function under 3D Dynamic Marker].
Xia, Q; Zhang, M; Gao, D; Xia, W T
2017-12-01
To explore the range of a reasonable weight coefficient for the hip joint in lower limb function. With the hip joints of healthy volunteers under normal conditions or fixed at three different positions, including functional, flexed and extension positions, the movements of the lower limbs were recorded by a LUKOtronic motion capture and analysis system. The degree of lower limb function loss was calculated using the Fugl-Meyer lower limb function assessment form when the hip joints were fixed at the aforementioned positions. One-way analysis of variance and Tamhane's T2 method were used to perform the statistical analysis and calculate the range of a reasonable weight coefficient for the hip joint. There were significant differences between the degree of lower limb function loss with the hip joints fixed at the flexed and extension positions and at the functional position, while the differences between the flexed and extension positions were not statistically significant. At the 95% confidence interval, the reasonable weight coefficient of the hip joint in lower limb function was between 61.05% and 73.34%. Besides confirming the reasonable weight coefficient, the effects of functional and non-functional positions on the degree of lower limb function loss should also be considered in the assessment of hip joint function loss. Copyright© by the Editorial Department of Journal of Forensic Medicine
The effect of kangaroo mother care on mental health of mothers with low birth weight infants.
Badiee, Zohreh; Faramarzi, Salar; MiriZadeh, Tahereh
2014-01-01
The mothers of premature infants are at risk of psychological stress because of separation from their infants. One of the methods influencing maternal mental health in the postpartum period is kangaroo mother care (KMC). This study was conducted to evaluate the effect of KMC of low birth weight infants on maternal mental health. The study was conducted in the Department of Pediatrics of Isfahan University of Medical Sciences, Isfahan, Iran. Premature infants were randomly allocated into two groups. The control group received standard care in an incubator. In the experimental group, three sessions of 60 min of KMC daily for 1 week were practiced. Mental health scores of the mothers were evaluated using the 28-item General Health Questionnaire. Statistical analysis was performed by analysis of covariance using SPSS. The scores of 50 infant-mother pairs were analyzed in total (25 in the KMC group and 25 in the standard care group). Results of the covariance analysis showed positive effects of KMC on maternal mental health scores. There were statistically significant differences between the mean scores of the experimental group and control subjects in the posttest period (P < 0.001). KMC for low birth weight infants is a safe way to improve maternal mental health. Therefore, it is suggested as a useful method that can be recommended for improving the mental health of mothers.
Nonlinear dynamics of the cellular-automaton ``game of Life''
NASA Astrophysics Data System (ADS)
Garcia, J. B. C.; Gomes, M. A. F.; Jyh, T. I.; Ren, T. I.; Sales, T. R. M.
1993-11-01
A statistical analysis of the ``game of Life'' due to Conway [Berlekamp, Conway, and Guy, Winning Ways for Your Mathematical Plays (Academic, New York, 1982), Vol. 2] is reported. The results are based on extensive computer simulations starting with uncorrelated distributions of live sites at t=0. The number n(s,t) of clusters of s live sites at time t, the mean cluster size s̄(t), and the diversity of sizes, among other statistical functions, are obtained. The dependence of the statistical functions on the initial density of live sites is examined. Several scaling relations as well as static and dynamic critical exponents are found.
Fatima, Nikhat; Khan, Aleem A.; Vishwakarma, Sandeep K.
2017-01-01
Background: Growing evidence shows that dental pulp (DP) tissues could be a potential source of adult stem cells for the treatment of devastating neurological diseases and several other conditions. Aims: Exploration of the expression profile of several key molecular markers to evaluate the molecular dynamics in undifferentiated and differentiated DP-derived stem cells (DPSCs) in vitro. Settings and Design: The characteristics and multilineage differentiation ability of DPSCs were determined by cellular and molecular kinetics. DPSCs were further induced to form adherent (ADH) and non-ADH (NADH) neurospheres under serum-free conditions, which were then induced into neurogenic lineage cells and characterized for their molecular and cellular diversity at each stage. Statistical Analysis Used: One-way analysis of variance, Student's t-test, the Livak method for relative quantification, and R programming. Results: Immunophenotypic analysis of DPSCs revealed >80% of cells positive for the mesenchymal markers CD90 and CD105, >70% positive for transferrin receptor (CD71), and >30% for the chemotactic factor receptor CXCR3. These cells also showed mesodermal differentiation, as confirmed by specific staining and molecular analysis. Activation of neuronal lineage markers and neurogenic growth factors was observed during lineage differentiation of cells derived from NADH and ADH spheroids. Greater than 80% of cells were found to express β-tubulin III in both differentiation conditions. Conclusions: The present study reports a cascade of immunophenotypic and molecular markers to characterize neurogenic differentiation of DPSCs under serum-free conditions. These findings motivate future analyses of the clinical applicability of DP-derived cells in regenerative applications. PMID:28566856
NASA Astrophysics Data System (ADS)
Pardimin, H.; Arcana, N.
2018-01-01
Many studies in the field of mathematics education apply the quasi-experimental method with statistical analysis by t-test. Quasi-experiments have the weakness that it is difficult to fulfil “the law of a single independent variable”. The t-test also has a weakness: the generalization of the conclusions obtained is less powerful. This research aimed to find ways to reduce the weaknesses of the quasi-experimental method and to improve the generalization of research results. The method applied in the research was a non-interactive qualitative method of the concept-analysis type. The concepts analysed were the concepts of statistics, educational research methods, and research reports. The result is a way to overcome the weaknesses of quasi-experiments and the t-test: to apply a combination of factorial design and balanced design, which the authors refer to as Factorial-Balanced Design. The advantages of this design are: (1) it almost fulfils “the law of a single independent variable”, so there is no need to test the similarity of academic ability; (2) the sample sizes of the experimental group and the control group become larger and equal, so the design is robust to violations of the assumptions of the ANOVA test.
Escape rates over potential barriers: variational principles and the Hamilton-Jacobi equation
NASA Astrophysics Data System (ADS)
Cortés, Emilio; Espinosa, Francisco
We describe a rigorous formalism to study some extrema statistics problems, like maximum probability events or escape rate processes, by taking into account that the Hamilton-Jacobi equation completes, in a natural way, the required set of boundary conditions of the Euler-Lagrange equation, for this kind of variational problem. We apply this approach to a one-dimensional stochastic process, driven by colored noise, for a double-parabola potential, where we have one stable and one unstable steady states.
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inferences are discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J
2012-05-01
We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used of graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the updating prioritization of a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
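The effect of one additional study on a fixed-effect meta-analysis, the quantity behind the proposed significance contours, can be sketched with the standard inverse-variance formulas. The effect sizes and variances below are invented for illustration, not taken from the paper.

```python
import math

def meta_z(effects, variances):
    """Inverse-variance fixed-effect pooled estimate and its z statistic."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, pooled / se

# Hypothetical log-odds-ratio studies (illustrative values only).
effects = [0.30, 0.45, 0.20, 0.35]
variances = [0.02, 0.05, 0.04, 0.03]
_, z0 = meta_z(effects, variances)

# Add one large null study (effect 0, small variance): z shrinks, but the
# result can remain significant -- the observation behind the contours.
_, z1 = meta_z(effects + [0.0], variances + [0.01])
print(f"z before: {z0:.2f}, after adding a large null study: {z1:.2f}")
```

Here a precisely estimated null study pulls the pooled z down without crossing the 1.96 significance threshold, illustrating how a significance contour can show that no single new study would overturn a given meta-analysis.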
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
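One simple map in the spirit described above, with an approximate inverse, partitions the series values into quantile bins (nodes) and counts transitions between consecutive observations (weighted edges); the inverse is a random walk on the transition matrix. This is an illustrative sketch, not the authors' exact construction.

```python
import numpy as np

def series_to_network(x, n_bins=4):
    """Map a time series to a weighted transition network: nodes are quantile
    bins of the values; edge (i, j) counts transitions from bin i to bin j."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    A = np.zeros((n_bins, n_bins))
    for i, j in zip(labels[:-1], labels[1:]):
        A[i, j] += 1
    return A, edges

def network_to_series(A, edges, length, seed=0):
    """Approximate inverse: a random walk on the transition network that emits
    the midpoint of each visited bin."""
    rng = np.random.default_rng(seed)
    P = A / A.sum(axis=1, keepdims=True)        # row-stochastic transition matrix
    mids = 0.5 * (edges[:-1] + edges[1:])
    state, out = 0, []
    for _ in range(length):
        out.append(mids[state])
        state = rng.choice(len(mids), p=P[state])
    return np.array(out)

# Round trip on a periodic series: the walk reproduces the value statistics,
# though fine temporal detail below the bin resolution is lost.
x = np.sin(np.linspace(0, 20 * np.pi, 1000))
A, edges = series_to_network(x)
y = network_to_series(A, edges, length=500)
```

A periodic series yields a strongly banded transition matrix (mass near the diagonal), while white noise spreads mass uniformly, which is how network statistics can separate dynamic regimes.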
Arctic Ocean Model Intercomparison Using Sound Speed
NASA Astrophysics Data System (ADS)
Dukhovskoy, D. S.; Johnson, M. A.
2002-05-01
The monthly and annual means from three Arctic ocean-sea ice climate model simulations are compared for the period 1979-1997. Sound speed is used to integrate model outputs of temperature and salinity along a section between Barrow and Franz Josef Land. A statistical approach is used to test for differences among the three models for two basic data subsets. We integrated and then analyzed an upper layer between 2 m and 50 m, and also a deep layer from 500 m to the bottom. The deep layer is characterized by low time-variability. No high-frequency signals appear in the deep layer, having been filtered out in the upper layer. There is no seasonal signal in the deep layer, and the monthly means oscillate insignificantly about the long-period mean. For the deep ocean the long-period mean can be considered quasi-constant, at least within the 19-year period of our analysis. Thus we assumed that the deep ocean would be the best choice for comparing the means of the model outputs. The upper (mixed) layer was chosen to contrast the deep layer dynamics. There are distinct seasonal and interannual signals in the sound speed time series in this layer. The mixed layer is a major link in the ocean-air interaction mechanism. Thus, different mean states of the upper layer in the models might cause different responses in other components of the Arctic climate system. The upper layer also strongly reflects any differences in atmospheric forcing. To compare data from the three models we used a one-sample t-test for the population mean, the Wilcoxon one-sample signed-rank test (when the requirement of normality of the tested data is violated), and the one-way ANOVA F-test to test our hypothesis that the model outputs have the same mean sound speed. These statistical approaches showed that the models have different mean characteristics in both the deep and upper layers of the Arctic Ocean.
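The three tests named above are all available in SciPy. A hedged sketch with invented "sound speed anomaly" samples standing in for the model outputs:

```python
import numpy as np
from scipy import stats

# Invented anomaly samples (m/s) for three hypothetical models; the values
# are illustrative assumptions, not actual model output.
rng = np.random.default_rng(42)
model_a = rng.normal(0.0, 1.0, 50)
model_b = rng.normal(0.0, 1.0, 50)
model_c = rng.normal(1.5, 1.0, 50)

# One-sample t-test: is model A's mean anomaly consistent with zero?
t, p_t = stats.ttest_1samp(model_a, popmean=0.0)

# Wilcoxon signed-rank test: the nonparametric fallback when normality fails.
w, p_w = stats.wilcoxon(model_a)

# One-way ANOVA F-test: do the three models share a common mean?
f, p_f = stats.f_oneway(model_a, model_b, model_c)
print(f"t-test p={p_t:.3f}, Wilcoxon p={p_w:.3f}, ANOVA p={p_f:.2e}")
```

With one model shifted relative to the other two, the ANOVA rejects the common-mean hypothesis, while the one-sample tests check each model against a reference value.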
Gender in the allocation of organs in kidney transplants: meta-analysis
Santiago, Erika Vieira Almeida e; Silveira, Micheline Rosa; de Araújo, Vânia Eloisa; Farah, Katia de Paula; Acurcio, Francisco de Assis; Ceccato, Maria das Graças Braga
2015-01-01
OBJECTIVE To analyze whether gender influences the survival of kidney transplant grafts and patients. METHODS Systematic review with meta-analysis of cohort studies available in the Medline (PubMed), LILACS, CENTRAL, and Embase databases, complemented by manual searching and the grey literature. The selection of studies and the collection of data were conducted twice by independent reviewers, and disagreements were settled by a third reviewer. Graft and patient survival rates were evaluated as effectiveness measurements. Meta-analysis was conducted with the Review Manager® 5.2 software, through the application of a random effects model. Recipient, donor, and donor-recipient gender comparisons were evaluated. RESULTS: Twenty-nine studies involving 765,753 patients were included. Regarding graft survival, grafts from male donors showed longer survival than those from female donors only over a 10-year follow-up period. Comparison between recipient genders showed no significant differences in any evaluated follow-up period. In the evaluation of donor-recipient gender combinations, male donor-male recipient transplants were significantly favored. No statistically significant differences in patient survival were observed for gender comparisons in any follow-up period evaluated. CONCLUSIONS The quantitative analysis of the studies suggests that donor or recipient gender, when evaluated in isolation, does not influence patient or graft survival rates. However, the combination of donor and recipient genders may be a determining factor for graft survival. PMID:26465666
NASA Astrophysics Data System (ADS)
Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.
2014-12-01
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study was to analyze differences among hospitalized cancer patients in their perception of exercise and physical activity constraints based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way analysis of variance with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise perception/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/sociocultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study provides information necessary to create a patient-centered healthcare service system by analyzing the exercise perception of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area
NASA Astrophysics Data System (ADS)
Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.
2015-12-01
Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we classify the hyporheic zone based on its geology, geochemistry, and microbiology, and examine their interacting influences on integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.
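The exploratory step named above (principal component analysis across mixed geochemical and microbial variables) can be sketched as follows; the sample values are synthetic stand-ins, not the Hanford freeze-core data:

```python
import numpy as np

# Hypothetical stand-in for the 21 freeze-core samples: rows are samples,
# columns are measured variables (grain size, biomass, pH).
rng = np.random.default_rng(0)
grain = rng.uniform(0, 100, 21)                  # e.g. mud percentage
data = np.column_stack([
    grain,
    0.5 * grain + rng.normal(0, 5, 21),          # biomass tracking grain size
    rng.normal(7.0, 0.3, 21),                    # pH, roughly independent
])

# PCA via SVD on standardized data: the leading component reveals which
# variables co-vary, hinting at a geological control on biogeochemistry.
z = (data - data.mean(axis=0)) / data.std(axis=0)
_, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)                  # variance fraction per PC
scores = z @ vt.T                                # sample coordinates on the PCs
```

Because two of the three synthetic variables are strongly correlated, the first principal component captures well over half of the total variance.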
Deconstructing multivariate decoding for the study of brain function.
Hebart, Martin N; Baker, Chris I
2017-08-04
Multivariate decoding methods were developed originally as tools to enable accurate predictions in real-world applications. The realization that these methods can also be employed to study brain function has led to their widespread adoption in the neurosciences. However, prior to the rise of multivariate decoding, the study of brain function was firmly embedded in a statistical philosophy grounded on univariate methods of data analysis. In this way, multivariate decoding for brain interpretation grew out of two established frameworks: multivariate decoding for predictions in real-world applications, and classical univariate analysis based on the study and interpretation of brain activation. We argue that this led to two confusions, one reflecting a mixture of multivariate decoding for prediction or interpretation, and the other a mixture of the conceptual and statistical philosophies underlying multivariate decoding and classical univariate analysis. Here we attempt to systematically disambiguate multivariate decoding for the study of brain function from the frameworks it grew out of. After elaborating these confusions and their consequences, we describe six, often unappreciated, differences between classical univariate analysis and multivariate decoding. We then focus on how the common interpretation of what is signal and noise changes in multivariate decoding. Finally, we use four examples to illustrate where these confusions may impact the interpretation of neuroimaging data. We conclude with a discussion of potential strategies to help resolve these confusions in interpreting multivariate decoding results, including the potential departure from multivariate decoding methods for the study of brain function. Copyright © 2017. Published by Elsevier Inc.
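The contrast the authors draw between classical activation analysis and multivariate decoding can be illustrated with a minimal decoding loop: a nearest-centroid classifier evaluated by leave-one-out cross-validation on synthetic multi-voxel patterns (all data, labels, and dimensions here are invented for illustration):

```python
import numpy as np

# Synthetic two-condition experiment: 40 trials x 20 voxels, with a weak
# condition-dependent signal embedded in Gaussian noise.
rng = np.random.default_rng(1)
n_trials, n_voxels = 40, 20
labels = np.array([0, 1] * (n_trials // 2))
signal = np.where(labels[:, None] == 0, 0.8, -0.8)   # per-condition shift
patterns = signal + rng.normal(0.0, 1.0, (n_trials, n_voxels))

correct = 0
for i in range(n_trials):                 # leave-one-out cross-validation
    train = np.ones(n_trials, bool)
    train[i] = False
    c0 = patterns[train & (labels == 0)].mean(axis=0)   # class centroids
    c1 = patterns[train & (labels == 1)].mean(axis=0)
    pred = 0 if np.linalg.norm(patterns[i] - c0) < np.linalg.norm(patterns[i] - c1) else 1
    correct += pred == labels[i]
accuracy = correct / n_trials
```

Above-chance cross-validated accuracy is the prediction-oriented quantity decoding delivers; as the paper argues, interpreting it as a statement about brain function is a separate, non-trivial step.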
Effect of Dentin Wetness on the Bond Strength of Universal Adhesives.
Choi, An-Na; Lee, Ji-Hye; Son, Sung-Ae; Jung, Kyoung-Hwa; Kwon, Yong Hoon; Park, Jeong-Kil
2017-10-25
The effects of dentin wetness on the bond strength and adhesive interface morphology of universal adhesives have been investigated using micro-tensile bond strength (μTBS) testing and confocal laser scanning microscopy (CLSM). Seventy-two human third molars were wet ground to expose flat dentin surfaces. They were divided into three groups according to the air-drying time of the dentin surfaces: 0 (without air drying), 5, and 10 s. The dentin surfaces were then treated with three universal adhesives: G-Premio Bond, Single Bond Universal, and All-Bond Universal in self-etch or etch-and-rinse mode. After composite build-up, a μTBS test was performed. One additional tooth was prepared for each group by staining the adhesives with 0.01 wt % of Rhodamine B fluorescent dye for CLSM analysis. The data were analyzed statistically using ANOVA and Tukey's post hoc tests (α = 0.05). Two-way ANOVA showed significant differences among the adhesive systems and dentin moisture conditions. An interaction effect was also observed (p < 0.05). One-way ANOVA showed that All-Bond Universal was the only material influenced by the wetness of the dentin surfaces. Wetness of the dentin surface is a factor influencing the micro-tensile bond strength of universal adhesives.
Effect of silver nano particles on flexural strength of acrylic resins.
Sodagar, Ahmad; Kassaee, Mohammad Zaman; Akhavan, Azam; Javadi, Negar; Arab, Sepideh; Kharazifard, Mohammad Javad
2012-04-01
Poly(methyl methacrylate), PMMA, is widely used for fabrication of removable orthodontic appliances. Silver nano particles (AgNps) have been added to PMMA because of their antimicrobial properties. The aim of this study is to investigate the effect of AgNps on the flexural strength of PMMA. Acrylic liquid containing 0.05% and 0.2% AgNps was prepared for two kinds of acrylic resins: Rapid Repair and Selecta Plus. Two groups without AgNps were used as controls. For each group, flexural strength was measured via the three-point bending method on 15 acrylic blocks. Two-way ANOVA, one-way ANOVA, and Tukey tests were used for statistical analysis. Rapid Repair without AgNps showed the highest flexural strength. Addition of 0.05% AgNps to Rapid Repair significantly decreased its flexural strength, while continuing the addition up to 0.2% restored it nearly to its original level. In contrast, addition of AgNps to Selecta Plus increased its flexural strength, but the addition of 0.05% nano particles was more effective than 0.2%. The effect of AgNps on the flexural strength of PMMA depends on several factors, including the type of acrylic and the concentration of nano particles. Copyright © 2011 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
García-Roncero, Herminio; Caballé-Serrano, Jordi; Cano-Batalla, Jordi; Cabratosa-Termes, Josep; Figueras-Álvarez, Oscar
2015-04-01
In this study, a temporary abutment fixation screw was developed, designed to fracture in a controlled way when an occlusal force sufficient to produce critical micromotion is applied. The purpose of the screw was to protect the osseointegration of immediately loaded single implants. Seven different screw prototypes were examined by fixing titanium abutments to 112 Mozo-Grau external hexagon implants (MG Osseous®; Mozo-Grau, S.A., Valladolid, Spain). Fracture strength was tested at 30° in two subgroups per screw: one under dynamic loading and the other without prior dynamic loading. Dynamic loading was performed in a single-axis chewing simulator using 150,000 load cycles at 50 N. After the normal distribution of the data was verified by the Kolmogorov-Smirnov test, fracture resistance between samples submitted and not submitted to dynamic loading was compared using Student's t-test. Fracture resistance among the different screw designs was compared using one-way analysis of variance. The confidence interval was set at 95%. Fractures occurred in all screws, allowing easy retrieval. Screw Prototypes 2, 5, and 6 failed during dynamic loading and exhibited statistically significant differences from the other prototypes. Prototypes 2, 5, and 6 may offer a useful protective mechanism against occlusal overload in immediately loaded implants.
Song, Minju; Shin, Yooseok; Park, Jeong-Won; Roh, Byoung-Duck
2015-02-01
This study was performed to determine whether combinations of one-bottle self-etch adhesives and composite resins from the same manufacturer yield better bond strengths than combinations of adhesive and resin from different manufacturers. Twenty-five experimental micro-shear bond test groups were formed from combinations of five dentin adhesives and five composite resins, using extracted human molars stored in saline for 24 hr. Testing was performed using the wire-loop method and a universal testing machine. Bond strength data were statistically analyzed using two-way analysis of variance (ANOVA) and Tukey's post hoc test. Two-way ANOVA revealed significant differences for the factors of dentin adhesive and composite resin, as well as a significant interaction effect (p < 0.001). All combinations with Xeno V (Dentsply De Trey) and Clearfil S(3) Bond (Kuraray Dental) adhesives showed no significant differences in micro-shear bond strength, but the other adhesives showed significant differences depending on the composite resin (p < 0.05). Unlike the other adhesives, Xeno V and BondForce (Tokuyama Dental) had higher bond strengths with the same manufacturer's composite resin than with other manufacturers' composite resins. Overall, combinations of adhesive and composite resin from the same manufacturer did not consistently show significantly higher bond strengths than mixed-manufacturer combinations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, M; To, D; Giaddui, T
2016-06-15
Purpose: To investigate the significance of using pinpoint ionization chambers (IC) and RadCalc (RC) in determining the quality of lung SBRT VMAT plans with low dose deviation pass percentage (DDPP) as reported by ScandiDos Delta4 (D4). To quantify the relationship between DDPP and point dose deviations determined by IC (ICDD), RadCalc (RCDD), and median dose deviation reported by D4 (D4DD). Methods: Point dose deviations and D4 DDPP were compiled for 45 SBRT VMAT plans. Eighteen patients were treated on Varian Truebeam linear accelerators (linacs); the remaining 27 were treated on Elekta Synergy linacs with Agility collimators. A one-way analysis of variance (ANOVA) was performed to determine if there were any statistically significant differences between D4DD, ICDD, and RCDD. Tukey's test was used to determine which pair of means was statistically different from each other. Multiple regression analysis was performed to determine if D4DD, ICDD, or RCDD are statistically significant predictors of DDPP. Results: Median DDPP, D4DD, ICDD, and RCDD were 80.5% (47.6%–99.2%), −0.3% (−2.0%–1.6%), 0.2% (−7.5%–6.3%), and 2.9% (−4.0%–19.7%), respectively. The ANOVA showed a statistically significant difference between D4DD, ICDD, and RCDD for a 95% confidence interval (p < 0.001). Tukey's test revealed a statistically significant difference between two pairs of groups, RCDD-D4DD and RCDD-ICDD (p < 0.001), but no difference between ICDD-D4DD (p = 0.485). Multiple regression analysis revealed that ICDD (p = 0.04) and D4DD (p = 0.03) are statistically significant predictors of DDPP with an adjusted r² of 0.115. Conclusion: This study shows ICDD predicts trends in D4 DDPP; however this trend is highly variable as shown by our low r². This work suggests that ICDD can be used as a method to verify DDPP in delivery of lung SBRT VMAT plans. RCDD may not validate low DDPP discovered in D4 QA for small field SBRT treatments.
David C. Chojnacky; Randolph H. Wynne; Christine E. Blinn
2009-01-01
Methodology is lacking to easily map Forest Inventory and Analysis (FIA) inventory statistics for all attribute variables without having to develop separate models and methods for each variable. We developed a mapping method that can directly transfer tabular data to a map on which pixels can be added any way desired to estimate carbon (or any other variable) for a...
Chem-2-Chem: A One-to-One Supportive Learning Environment for Chemistry
NASA Astrophysics Data System (ADS)
Báez-Galib, Rosita; Colón-Cruz, Héctor; Resto, Wilfredo; Rubin, Michael R.
2005-12-01
The Chem-2-Chem (C2C) tutoring-mentoring program was developed at the University of Puerto Rico at Cayey, an undergraduate institution serving Hispanic students, to increase student retention and help students achieve successful general chemistry course outcomes. This program provides a supportive learning environment designed to address students' academic and emotional needs in a holistic way. Advanced chemistry students offered peer-led, personalized, and individualized learning experiences through tutoring and mentoring to approximately 21% of students enrolled in the general chemistry course. Final grades from official class lists of all general chemistry course sections were analyzed using Student's t-test, paired t-test, and χ² analysis. Results during the seven semesters studied show an increase of 29% in successful course outcomes defined as final letter grades of A, B, and C obtained by Chem-2-Chem participants. For each final grade, highly statistically significant differences between participants and nonparticipants were detected. There were also statistically significant differences between successful course outcomes obtained by participants and nonparticipants for each of the semesters studied. This research supports recent trends in chemical education to provide a social context for learning experiences. This peer-led learning strategy can serve as an effective model to achieve excellence in science courses at a wide range of educational institutions.
An Assessment of Oral Hygiene in 7-14-Year-Old Children undergoing Orthodontic Treatment.
Krupińska-Nanys, Magdalena; Zarzecka, Joanna
2015-01-01
The study focused on the increased risk of dental plaque accumulation among children undergoing orthodontic treatment, in consideration of individual hygiene and dietary habits. The study was conducted among 91 children aged 7-14, including 47 girls and 44 boys. The main parameters assessed were the API index, plaque pH, DMF index, and hygiene and dietary habits. Statistical analysis was performed in a Microsoft Excel spreadsheet and the STATISTICA statistical software. The average API index among children wearing a removable appliance was 9 (SD = 13), and among children without appliances 16 (SD = 21). The DMF index for patients using appliances was 5 (SD = 3) and for those without appliances 4 (SD = 2). The average plaque pH was 6 for children with appliances (SD = 0.9) and 6.2 without (SD = 0.3). In patients at higher risk of dental plaque accumulation, correct oral hygiene supported by regular visits to the dentist is one of the best ways to control dental caries. In the fight against caries, the most effective approach is to promote awareness of the problem, foster proper hygiene and nutritional habits, and educate children from a very young age in how to maintain proper oral hygiene.
Effects of Electromagnetic Fields on Automated Blood Cell Measurements.
Vagdatli, Eleni; Konstandinidou, Vasiliki; Adrianakis, Nikolaos; Tsikopoulos, Ioannis; Tsikopoulos, Alexios; Mitsopoulou, Kyriaki
2014-08-01
The aim of this study is to investigate whether the electromagnetic fields associated with mobile phones and/or laptops interfere with blood cell counts of hematology analyzers. Random blood samples were analyzed on an Aperture Impedance hematology analyzer. The analysis was performed in four ways: (A) without the presence of any mobile phone or portable computer in use, (B) with mobile phones in use (B1: one mobile, B4: four mobiles), (C) with portable computers (laptops) in use (C1: one laptop, C3: three laptops), and (D) with four mobile phones and three laptops in use simultaneously. The results obtained demonstrated a statistically significant decrease in neutrophil, erythrocyte, and platelet count and an increase in lymphocyte count, mean corpuscular volume, and red blood cell distribution width, notably in the B4 group. Despite this statistical significance, in clinical practice, only the red blood cell reduction could be taken into account, as the mean difference between the A and B4 group was 60,000 cells/µL. In group D, the analyzer gave odd results after 11 measurements and finally stopped working. The combined and multiple use of mobile phones and computers affects the function of hematology analyzers, leading to false results. Consequently, the use of such electronic devices must be avoided. © 2014 Society for Laboratory Automation and Screening.
A Manual for Readable Writing.
ERIC Educational Resources Information Center
Klare, George R.
One of the ways to handle the increasing demands on readers' skills is to make writing more readable. The problem has two different aspects: predicting how readable writing will be to a reader, and producing writing that is readable to that reader. Prediction is relatively simple, and can be done statistically with readability formulas. Production…
On the Way to Inclusion: The Vision from the "Maternal Family"
ERIC Educational Resources Information Center
Khanzeruk, L. A.
2016-01-01
The present article covers the Ukrainian experience with one of the most complicated aspects of raising a child with a developmental disorder: the child's upbringing in the maternal family. It provides a definition of the notion of the "maternal family" and presents contemporary statistical data on the number of such families in Ukraine. It…
Looking for High Quality Accreditation in Higher Education in Colombia
ERIC Educational Resources Information Center
Pérez Gama, Jesús Alfonso; Vega Vega, Anselmo
2017-01-01
We approach High Quality Accreditation of tertiary education in two ways: one involving a large amount of information, including issues such as self-assessment, high quality, statistics, indicators, surveys, and field work (process engineering) over several periods of time; and the second in relation to the information contained there about…
GPU acceleration of Eulerian-Lagrangian particle-laden turbulent flow simulations
NASA Astrophysics Data System (ADS)
Richter, David; Sweet, James; Thain, Douglas
2017-11-01
The Lagrangian point-particle approximation is a popular numerical technique for representing dispersed phases whose properties can substantially deviate from the local fluid. In many cases, particularly in the limit of one-way coupled systems, large numbers of particles are desired; this may be either because many physical particles are present (e.g. LES of an entire cloud), or because the use of many particles increases statistical convergence (e.g. high-order statistics). Solving the trajectories of very large numbers of particles can be problematic in traditional MPI implementations, however, and this study reports the benefits of using graphical processing units (GPUs) to integrate the particle equations of motion while preserving the original MPI version of the Eulerian flow solver. It is found that GPU acceleration becomes cost effective around one million particles, and performance enhancements of up to 15x can be achieved when O(10⁸) particles are computed on the GPU rather than the CPU cluster. Optimizations and limitations will be discussed, as will prospects for expanding to two- and four-way coupled systems. ONR Grant No. N00014-16-1-2472.
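The one-way coupled point-particle approximation mentioned above amounts to integrating each particle's drag equation against a prescribed fluid velocity field; a minimal serial sketch (frozen one-dimensional flow, invented parameters, no GPU or MPI machinery) is:

```python
import math

# One-way coupled Lagrangian point particles: the fluid velocity field u(x)
# is prescribed (here a frozen sine wave) and Stokes drag relaxes each
# particle toward the local fluid velocity over the response time tau_p.
def advance(particles, tau_p, dt, steps, u=lambda x: math.sin(x)):
    for _ in range(steps):
        for p in particles:                        # p = [position, velocity]
            p[1] += dt * (u(p[0]) - p[1]) / tau_p  # dv/dt = (u - v) / tau_p
            p[0] += dt * p[1]                      # dx/dt = v
    return particles

# Two particles, small tau_p relative to the flow timescale, t_final = 5.
state = advance([[0.5, 0.0], [1.0, 0.2]], tau_p=0.1, dt=0.001, steps=5000)
```

For small Stokes number the particle velocity closely tracks the local fluid velocity, which is the regime in which very large particle counts (and hence GPU offloading) become attractive.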
A Study on the Development of Service Quality Index for Incheon International Airport
NASA Technical Reports Server (NTRS)
Lee, Kang Seok; Lee, Seung Chang; Hong, Soon Kil
2003-01-01
The main purpose of this study is located at developing Ominibus Monitors System(OMS) for internal management, which will enable to establish standards, finding out matters to be improved, and appreciation for its treatment in a systematic way. It is through developing subjective or objective estimation tool with use importance, perceived level, and complex index at international airport by each principal service items. The direction of this study came towards for the purpose of developing a metric analysis tool, utilizing the Quantitative Second Data, Analysing Perceived Data through airport user surveys, systemizing the data collection-input-analysis process, making data image according to graph of results, planning Service Encounter and endowing control attribution, and ensuring competitiveness at the minimal international standards. It is much important to set up a pre-investigation plan on the base of existent foreign literature and actual inspection to international airport. Two tasks have been executed together on the base of this pre-investigation; one is developing subjective estimation standards for departing party, entering party, and airport residence and the other is developing objective standards as complementary methods. The study has processed for the purpose of monitoring services at airports regularly and irregularly through developing software system for operating standards after ensuring credibility and feasibility of estimation standards with substantial and statistical way.
Lou, Jiunn-Horng; Chen, Sheng-Hwang; Yu, Hsing-Yi; Li, Ren-Hau; Yang, Cheng-I; Eng, Cheng-Joo
2010-06-01
Understanding how male nursing students alleviate life stress during their academic career is conducive to their development as successful nursing professionals. This study was designed to understand the personality traits, social support, and life stresses of male nursing students. The respective influences of personality traits and social support on life stress were also explored. The study used a cross-sectional research design. A college in central Taiwan was targeted as the site for data collection. A total of 158 questionnaires were dispatched, with 145 valid copies returned (valid response rate = 91.7%). Structured questionnaires were designed to collect data on participant demographics, personality traits, social support, and life stress. Statistical methods such as descriptive statistics, one-way analysis of variance, and multiple regression analysis were applied to data analysis. Major findings of this study revealed that (a) in general, the personality traits, social support, and life stress of male nursing students scored in the medium to high range, and participants reported encountering more stress from learning and life goals than from interpersonal stress; (b) demographic variables (father's and mother's education level, considered separately) and the personality traits of conscientiousness and family support were found to impact significantly on participants' life stress perceptions; and (c) the only significant predictors of life stress were family support and the education level of participants' fathers and mothers, accounting for about 23.7% of variability. It is suggested that nursing students in each year of their academic career should be exposed to courses geared to reduce life stress perceptions, especially in the areas of learning and career development. Increased family support is an effective way to decrease male nursing students' life stress.
This study could be a reference for the design and application of strategies to reduce the perceived life stress of male nursing students.
Graphical tools for network meta-analysis in STATA.
Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia
2013-01-01
Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.
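The evidence base such tools present is essentially a weighted graph; a minimal sketch of constructing it from a trial list follows (treatment names and trials are invented for illustration, and this is independent of the STATA routines themselves):

```python
# Build the evidence network of a network meta-analysis:
# treatments are nodes, direct head-to-head comparisons are weighted edges.
trials = [
    ("A", "B"), ("A", "B"), ("A", "C"), ("B", "C"), ("B", "D"),
]

nodes = sorted({t for pair in trials for t in pair})
edges = {}
for a, b in trials:
    key = tuple(sorted((a, b)))
    edges[key] = edges.get(key, 0) + 1   # edge weight = number of trials

# A comparison with no direct trials (e.g. A vs D) must be estimated
# indirectly through the connected network.
has_direct_AD = ("A", "D") in edges
```

Node size and edge thickness in the usual network plot are simply these counts (or total sample sizes) rendered graphically.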
Learning style-based teaching harvests a superior comprehension of respiratory physiology.
Anbarasi, M; Rajkumar, G; Krishnakumar, S; Rajendran, P; Venkatesan, R; Dinesh, T; Mohan, J; Venkidusamy, S
2015-09-01
Students entering medical college generally show vast diversity in their school education. It becomes the responsibility of teachers to motivate students and meet the needs of all diversities. One such measure is teaching students in their own preferred learning style. The present study was aimed to incorporate a learning style-based teaching-learning program for medical students and to reveal its significance and utility. Learning styles of students were assessed online using the visual-auditory-kinesthetic (VAK) learning style self-assessment questionnaire. When respiratory physiology was taught, students were divided into three groups, namely, visual (n = 34), auditory (n = 44), and kinesthetic (n = 28), based on their learning style. A fourth group (the traditional group; n = 40) was formed by choosing students randomly from the above three groups. Visual, auditory, and kinesthetic groups were taught following the appropriate teaching-learning strategies. The traditional group was taught via the routine didactic lecture method. The effectiveness of this intervention was evaluated by a pretest and two posttests, posttest 1 immediately after the intervention and posttest 2 after a month. In posttest 1, one-way ANOVA showed a significant statistical difference (P = 0.005). Post hoc analysis showed significance between the kinesthetic group and the traditional group (P = 0.002). One-way ANOVA showed a significant difference in posttest 2 scores (P < 0.0001). Post hoc analysis showed significance between the three learning style-based groups compared with the traditional group [visual vs. traditional groups (P = 0.002), auditory vs. traditional groups (P = 0.03), and kinesthetic vs. traditional groups (P = 0.001)]. This study emphasizes that teaching methods tailored to students' style of learning definitely improve their understanding, performance, and retrieval of the subject. Copyright © 2015 The American Physiological Society.
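The one-way ANOVA used to compare posttest scores reduces to the classic F ratio of between-group to within-group variance; a minimal sketch with invented scores (not the study's data):

```python
# One-way ANOVA F statistic in plain Python: F = MS_between / MS_within,
# the ratio of variance explained by group membership to residual variance.
def one_way_anova_f(groups):
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three hypothetical groups of posttest scores with well-separated means.
f = one_way_anova_f([[70, 75, 72], [80, 82, 85], [60, 62, 65]])
```

A large F relative to the F distribution with (k − 1, n − k) degrees of freedom is what yields the small P values reported above; the post hoc (e.g. Tukey) step then localizes which pairs of groups differ.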
Digital versus conventional techniques for pattern fabrication of implant-supported frameworks.
Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour
2018-01-01
The aim of this experimental study was to compare the retention of frameworks cast from wax patterns fabricated by three different methods. Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). A computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, a three-dimensional printer, and the conventional technique were used for fabrication of the wax patterns. All laboratory procedures were performed by a single expert technician to eliminate intra-operator bias. The wax patterns were cast, finished, and seated on the related abutment analogs. The number of adjustments was recorded and analyzed by the Kruskal-Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement, and a tensile resistance test was used to measure retention. One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. The level of significance was set at P < 0.05. Mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for the CAD/CAM, rapid prototyping, and conventional groups, respectively. One-way ANOVA revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for the CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). The CAD/CAM group required significantly more adjustments (P < 0.001). CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be a valuable help when there are limitations in the retention of single-unit implant restorations.
Phase dependence of the unnormalized second-order photon correlation function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciornea, V.; Bardetski, P.; Macovei, M. A., E-mail: macovei@phys.asm.md
2016-10-15
We investigate the resonant quantum dynamics of a multi-qubit ensemble in a microcavity. Both the quantum-dot subsystem and the microcavity mode are pumped coherently. We find that the microcavity photon statistics depends on the phase difference of the driving lasers, which is not the case for the photon intensity at resonant driving. This way, one can manipulate the two-photon correlations. In particular, higher degrees of photon correlations and, eventually, stronger intensities are obtained. Furthermore, the microcavity photon statistics exhibits steady-state oscillatory behaviors as well as asymmetries.
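The two-photon correlations discussed above are commonly summarized by the second-order correlation function g²(0) = ⟨n(n−1)⟩/⟨n⟩². As a minimal illustration of the estimator (unrelated to the authors' cavity model), Poissonian photon counts, which model coherent light, give g²(0) ≈ 1; bunched light gives values above 1 and antibunched light values below 1:

```python
import numpy as np

rng = np.random.default_rng(3)
# Photon-number samples for coherent light are Poissonian; for such light
# the second-order correlation g2(0) = <n(n-1)>/<n>^2 equals 1.
n = rng.poisson(4.0, 100_000)
g2 = np.mean(n * (n - 1)) / np.mean(n) ** 2
print(f"g2(0) = {g2:.3f}")  # >1 indicates bunching, <1 antibunching
```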
Mollah, Mohammad Manir Hossain; Jamal, Rahman; Mokhtar, Norfilza Mohd; Harun, Roslan; Mollah, Md. Nurul Haque
2015-01-01
Background Identifying genes that are differentially expressed (DE) between two or more conditions with multiple patterns of expression is one of the primary objectives of gene expression data analysis. Several statistical approaches, including one-way analysis of variance (ANOVA), are used to identify DE genes. However, most of these methods provide misleading results for two or more conditions with multiple patterns of expression in the presence of outlying genes. In this paper, an attempt is made to develop a hybrid one-way ANOVA approach that unifies the robustness and efficiency of estimation using the minimum β-divergence method to overcome some problems that arise in the existing robust methods for both small- and large-sample cases with multiple patterns of expression. Results The proposed method relies on a β-weight function, which produces values between 0 and 1. The β-weight function with β = 0.2 is used as a measure of outlier detection. It assigns smaller weights (≥ 0) to outlying expressions and larger weights (≤ 1) to typical expressions. The distribution of the β-weights is used to calculate the cut-off point, which is compared to the observed β-weight of an expression to determine whether that gene expression is an outlier. This weight function plays a key role in unifying the robustness and efficiency of estimation in one-way ANOVA. Conclusion Analyses of simulated gene expression profiles revealed that all eight methods (ANOVA, SAM, LIMMA, EBarrays, eLNN, KW, robust BetaEB and proposed) perform almost identically for m = 2 conditions in the absence of outliers. However, the robust BetaEB method and the proposed method exhibited considerably better performance than the other six methods in the presence of outliers. 
In this case, the BetaEB method exhibited slightly better performance than the proposed method for the small-sample cases, but the proposed method exhibited much better performance than the BetaEB method for both the small- and large-sample cases in the presence of more than 50% outlying genes. The proposed method also exhibited better performance than the other methods for m > 2 conditions with multiple patterns of expression, a setting to which BetaEB has not been extended. Therefore, the proposed approach would be more suitable and reliable, on average, for the identification of DE genes between two or more conditions with multiple patterns of expression. PMID:26413858
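The β-weight function central to the method above maps each expression value into (0, 1], leaving typical expressions near 1 and shrinking outliers toward 0. A toy sketch of the idea follows; the Gaussian-kernel weight and the median/MAD robust centre and scale are my illustrative choices, not the authors' exact minimum β-divergence estimator:

```python
import numpy as np

def beta_weight(x, mu, sigma, beta=0.2):
    """Toy beta-weight: values in (0, 1], near 1 for typical expressions
    and near 0 for outlying ones (a sketch of the idea, not the authors'
    exact minimum beta-divergence estimator)."""
    z = (x - mu) / sigma
    return np.exp(-0.5 * beta * z ** 2)

expr = np.array([4.9, 5.1, 5.0, 5.2, 12.0])    # last value is an outlier
mu = np.median(expr)                            # robust centre (my choice)
sigma = 1.4826 * np.median(np.abs(expr - mu))   # MAD-based robust scale
w = beta_weight(expr, mu, sigma)
print(w)  # typical expressions get weights near 1; the outlier near 0
```

Comparing each observed weight to a cut-off derived from the weight distribution, as the abstract describes, then flags the outlying expressions.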
Infant oral health: Knowledge, attitude and practices of parents in Udaipur, India
Nagarajappa, Ramesh; Kakatkar, Gauri; Sharda, Archana J; Asawa, Kailash; Ramesh, Gayathri; Sandesh, Nagarajappa
2013-01-01
Background: The aim of this study was to assess the infant oral health (IOH) related knowledge, attitudes and practices (KAP) of parents in Udaipur, India. Materials and Methods: A cross-sectional descriptive study was conducted among 470 parents visiting the Department of Pediatrics, Rabindranath Tagore Medical College and Hospital. A 32-item questionnaire covering socio-demographic characteristics and questions pertaining to KAP regarding IOH care was used to collect the data. Descriptive statistics, Student's t-test, one-way analysis of variance, and Scheffe's test were used for the statistical analysis (P ≤ 0.05). Results: Majority of the parents had good knowledge regarding tooth eruption, but had a poor knowledge of cleaning (58.7%) and development of caries (48.5%). Parents in the age group of 25-30 years showed significantly higher mean knowledge (25.90 ± 3.93), attitude (15.71 ± 2.23), and practice (20.09 ± 2.50) scores. Female parents showed a significantly higher mean knowledge (21.45 ± 4.27) and attitude scores (14.97 ± 2.15) than the male parents. Conclusion: Parent's knowledge on IOH care was inadequate. Health professionals, who are the first to come into contact with expectant and new mothers, need to disseminate appropriate and accurate information about oral health-care for infants. PMID:24348626
Al-Qarni, Mohammed A.; Alamri, Mohammed Abdullah; Alshaikh, Yahya A.
2016-01-01
Introduction Eco-friendly or green dentistry can become a reality by effectively designing dental clinics and using more eco-friendly materials in clinical practice. Aim To determine the awareness of eco-friendly dentistry among dental faculty and students in preparation for future implementation. Materials and Methods Knowledge of eco-friendly dentistry was assessed using an 18-item self-administered questionnaire among 160 participants. After baseline data collection, the intervention consisted of educating participants with a PowerPoint presentation, and post-intervention data were then collected for analysis. Statistical analysis was done using Wilcoxon's signed rank test and one-way ANOVA. Results The educational intervention increased knowledge about eco-friendly dentistry, confirming the importance of continuing education. There was a statistically significant gain in knowledge among the participants after the presentation. The gain was highest for the Department of Preventive Dental Sciences (PDS), followed by Substitute Dental Sciences (SDS), no specialty, Maxillofacial Dental Sciences (MDS), and Restorative Dental Sciences (RDS), respectively (F=5.5091, p<0.05). Conclusion Lack of knowledge of green dentistry among the dental fraternity is highly prevalent. This can be remedied by effective training in the respective fields, channelled through the curriculum in an educational set-up. PMID:27891464
Muñoz-García, Daniel; Gil-Martínez, Alfonso; López-López, Almudena; Lopez-de-Uralde-Villanueva, Ibai; La Touche, Roy; Fernández-Carnero, Josué
2016-01-01
Background. Neck pain (NP) is strongly associated with cervico-craniofacial pain (CCFP). The primary aim of the present study was to compare neck pain-related disability, pain catastrophizing, and cervical and mandibular ROM between patients with chronic mechanical NP, patients with CCFP, and asymptomatic subjects. Methods. A total of 64 participants formed three groups. All participants underwent a clinical examination evaluating cervical range of motion and maximum mouth opening, the neck disability index (NDI), and the psychological factor of the Pain Catastrophizing Scale (PCS). Results. There were no statistically significant differences between patients with NP and CCFP for NDI and PCS (P > 0.05). One-way ANOVA revealed significant differences for all ROM measurements. The post hoc analysis showed no statistically significant differences in cervical extension and rotation between the two patient groups (P > 0.05). Pearson correlation analysis showed a moderate positive association between the NDI and the PCS in both patient groups. Conclusion. The CCFP and NP patient groups have similar neck disability levels and limitation of cervical ROM in extension and rotation. In both groups, the NDI was positively correlated with the PCS. PMID:27119020
Rajesh, K S; Zareena; Hegde, Shashikanth; Arun Kumar, M S
2015-01-01
This study was conducted to estimate and compare inorganic salivary calcium, phosphate, magnesium, salivary flow rate, and pH of unstimulated saliva, as well as oral hygiene status, in healthy subjects and in subjects with periodontitis or dental caries, and to correlate salivary calcium level with the number of intact teeth. The study population consisted of 48 systemically healthy subjects aged 18-55 years, divided into three groups: healthy, periodontitis, and dental caries. Oral hygiene index-simplified, probing pocket depth, clinical attachment level, the number of intact teeth, and active carious lesions were recorded. Inorganic salivary calcium, phosphate, and magnesium were estimated spectrophotometrically using a Vitros 5.1 FS analyser. Statistical analysis was performed using the one-way analysis of variance test at the 5% significance level. There was a statistically significant increase in inorganic salivary calcium, phosphate, pH, and flow rate, and significantly poorer oral hygiene status, in the periodontitis group compared with the dental caries and healthy groups. Subjects with increased inorganic salivary calcium, phosphate, pH, and flow rate and poor oral hygiene are at a higher risk of developing periodontitis. Because of their increased remineralization potential, these subjects retain a greater number of intact teeth than the dental caries group.
Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak
2017-01-01
The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups ( n = 31) according to the system used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t -test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual than in the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.
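Canal transportation and centering ratio are typically computed from pre- and post-instrumentation CBCT distances between the root surface and the canal wall. The sketch below implements the widely used Gambill-style formulas; the abstract does not state which exact variant was used, so treat the functions and example distances as illustrative:

```python
def canal_transportation(m_pre, m_post, d_pre, d_post):
    """Gambill-style transportation |(m_pre - m_post) - (d_pre - d_post)|,
    where m/d are the distances (mm) from the mesial/distal root surface
    to the canal wall before and after instrumentation."""
    return abs((m_pre - m_post) - (d_pre - d_post))

def centering_ratio(m_pre, m_post, d_pre, d_post):
    """Ratio of the smaller to the larger dentine removal: 1.0 means a
    perfectly centred preparation; values near 0 mean one-sided removal."""
    mesial, distal = m_pre - m_post, d_pre - d_post
    big = max(mesial, distal)
    return 1.0 if big == 0 else min(mesial, distal) / big

t = canal_transportation(1.00, 0.80, 1.00, 0.90)  # 0.1 mm toward mesial
c = centering_ratio(1.00, 0.80, 1.00, 0.90)       # 0.5: twice as much mesially
print(t, c)
```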
Statistical Analysis for Collision-free Boson Sampling.
Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su
2017-11-10
Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and it has attracted widespread attention as a rapid way to demonstrate quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in boson sampling experiments. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that the proposed scheme is experimentally friendly. Moreover, our broad approach is expected to be generally applicable to investigating multi-particle coherent dynamics beyond boson sampling.
Different CAD/CAM-processing routes for zirconia restorations: influence on fitting accuracy.
Kohorst, Philipp; Junghanns, Janet; Dittmer, Marc P; Borchers, Lothar; Stiesch, Meike
2011-08-01
The aim of the present in vitro study was to evaluate the influence of different processing routes on the fitting accuracy of four-unit zirconia fixed dental prostheses (FDPs) fabricated by computer-aided design/computer-aided manufacturing (CAD/CAM). Three groups of zirconia frameworks with ten specimens each were fabricated. Frameworks of one group (CerconCAM) were produced by means of a laboratory CAM-only system. The other frameworks were made with different CAD/CAM systems; on the one hand by in-laboratory production (CerconCAD/CAM) and on the other hand by centralized production in a milling center (Compartis) after forwarding geometrical data. Frameworks were then veneered with the recommended ceramics, and marginal accuracy was determined using a replica technique. Horizontal marginal discrepancy, vertical marginal discrepancy, absolute marginal discrepancy, and marginal gap were evaluated. Statistical analyses were performed by one-way analysis of variance (ANOVA), with the level of significance chosen at 0.05. Mean horizontal discrepancies ranged between 22 μm (CerconCAM) and 58 μm (Compartis), vertical discrepancies ranged between 63 μm (CerconCAD/CAM) and 162 μm (CerconCAM), and absolute marginal discrepancies ranged between 94 μm (CerconCAD/CAM) and 181 μm (CerconCAM). The marginal gap varied between 72 μm (CerconCAD/CAM) and 112 μm (CerconCAM, Compartis). Statistical analysis revealed that, with all measurements, the marginal accuracy of the zirconia FDPs was significantly influenced by the processing route used (p < 0.05). Within the limitations of this study, all restorations showed a clinically acceptable marginal accuracy; however, the results suggest that the CAD/CAM systems are more precise than the CAM-only system for the manufacture of four-unit FDPs.
Heydari, Payam; Varmazyar, Sakineh; Variani, Ali Safari; Hashemi, Fariba; Ataei, Seyed Sajad
2017-10-01
The maximal oxygen consumption test is the gold standard for measuring cardio-pulmonary fitness. This study aimed to determine the correlation of the Gerkin, Queen's College, George, and Jackson methods in estimating maximal oxygen consumption, and the demographic factors affecting maximal oxygen consumption. This descriptive cross-sectional study was conducted as a census of medical emergency students (n=57) at Qazvin University of Medical Sciences in 2016. The subjects first completed the Physical Activity Readiness Questionnaire (PAR-Q) and demographic characteristics. Eligible subjects were then assessed using the Gerkin treadmill and Queen's College step exercise tests and the non-exercise George and Jackson methods. Data analysis was carried out using the independent t-test, one-way analysis of variance, and Pearson correlation in SPSS. The mean age of participants was 21.69±4.99 years. The mean maximal oxygen consumption from the Gerkin, Queen's College, George, and Jackson tests was 4.17, 3.36, 3.64, and 3.63 liters per minute, respectively. Pearson's test showed a significant correlation among the four tests; the George and Jackson tests had the greatest correlation (r=0.85, p<0.001). One-way analysis of variance and the t-test showed a significant relationship between the independent variables of weight and height and the dependent variable of maximal oxygen consumption in all four tests. There was also a significant relationship with body mass index in the Gerkin and Queen's College tests and with hours of exercise per week in the George and Jackson tests (p<0.001). Given the observed correlations, these tests can potentially replace one another as necessary, so that the non-exercise Jackson test can be used instead of the Gerkin test.
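The agreement among the four estimators was assessed with Pearson's correlation. A minimal SciPy sketch on simulated paired VO2max estimates follows; the strength of the relationship here is invented, chosen only to mimic two methods measuring the same underlying fitness with independent noise:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated paired VO2max estimates (L/min) for 57 subjects: two methods
# measuring the same underlying fitness, each with its own noise.
fitness = rng.normal(3.6, 0.5, 57)
george = fitness + rng.normal(0.0, 0.15, 57)
jackson = fitness + rng.normal(0.0, 0.15, 57)

r, p = stats.pearsonr(george, jackson)
print(f"r = {r:.2f}, p = {p:.2g}")  # a strong, significant correlation
```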
The Grapefruit: An Alternative Arthroscopic Tool Skill Platform.
Molho, David A; Sylvia, Stephen M; Schwartz, Daniel L; Merwin, Sara L; Levy, I Martin
2017-08-01
To establish the construct validity of an arthroscopic training model that teaches arthroscopic tool skills, including triangulation, grasping, precision biting, implant delivery, and ambidexterity, and uses a whole grapefruit as its training platform. For the grapefruit training model (GTM), an arthroscope and arthroscopic instruments were introduced through portals cut in the skin of a whole prepared grapefruit. After institutional review board approval, participants performed a set of tasks inside the grapefruit. Performance on each component was assessed by recording errors, achievement of criteria, and time to completion. A total of 19 medical students, orthopaedic surgery residents, and fellowship-trained orthopaedic surgeons were included in the analysis, divided into 3 groups based on arthroscopic experience. One-way analysis of variance (ANOVA) and the post hoc Tukey test were used for statistical analysis. One-way ANOVA showed significant differences between groups in both time to completion and errors: F(2, 16) = 16.10, P < .001, and F(2, 16) = 17.43, P < .001, respectively. Group A had a longer time to completion and more errors than group B (P = .025, P = .019), and group B had a longer time to completion and more errors than group C (P = .023, P = .018). The GTM is an easily assembled alternative arthroscopic training model that bridges the gap between box trainers, cadavers, and virtual reality simulators. Our findings support its construct validity for teaching basic arthroscopic tool skills; as such, it is a useful addition to the arthroscopic training toolbox. There is a need for validated, low-cost, easily accessible arthroscopic training models. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Pinheiro, Sergio Luiz; da Silva, Caio Cesar; da Silva, Lucas Augusto; Cicotti, Marina P.; Bueno, Carlos Eduardo da Silveira; Fontana, Carlos Eduardo; Pagrion, Letícia R.; Dalmora, Natália P.; Daque, Thaís T.; de Campos, Francisco UF
2018-01-01
Objective: The aim of this study was to evaluate the antimicrobial efficacy of 2.5% sodium hypochlorite, 2% chlorhexidine, and ozonated water on biofilms of Enterococcus faecalis, Streptococcus mutans, and Candida albicans in severely curved mesiobuccal root canals of mandibular molars. Materials and Methods: This was an experimental ex vivo study in a microbiology laboratory. Sixty severely curved mesiobuccal root canals of mandibular molars were contaminated with standard strains of E. faecalis, S. mutans, and C. albicans. The specimens were randomly divided into four groups (n = 15) according to the irrigating solution: SH: 2.5% sodium hypochlorite; CH: 2% chlorhexidine; O3: ozonated water; and control: double-distilled water. The mesiobuccal root canals of all groups were instrumented with the WaveOne Gold Primary reciprocating system. Three cycles of instrumentation with three short in-and-out brushing motions were performed: (1) in the coronal third, (2) in the middle third, and (3) in the apical third of the canal. A ProGlider file was used before the first cycle. Statistical Analysis: Samples were collected for viable bacterial counts before and after instrumentation, and statistical analysis was performed using one-way analysis of variance followed by Tukey's multiple comparison test. Results: All groups showed significant biofilm reduction after irrigation (P < 0.01). After instrumentation, sodium hypochlorite (98.07%), chlorhexidine (98.31%), and ozonated water (98.02%) produced a significantly greater reduction in bacterial counts than double-distilled water (control, 72.98%) (P < 0.01). Conclusion: All irrigants tested in this study showed similar antimicrobial activity. Thus, ozonated water may be an option for microbial reduction in the root canal system. PMID:29657531
Fourier Transform Infrared Imaging analysis of dental pulp inflammatory diseases.
Giorgini, E; Sabbatini, S; Conti, C; Rubini, C; Rocchetti, R; Fioroni, M; Memè, L; Orilisi, G
2017-05-01
Fourier Transform Infrared microspectroscopy characterizes the macromolecular composition and distribution of tissues and cells by studying the interaction between infrared radiation and matter. We therefore exploited this analytical tool in the analysis of inflamed pulps to detect the different biochemical features related to various degrees of inflammation. IR maps of 13 irreversible and 12 hyperplastic pulpitis cases, together with 10 normal pulps, were acquired, compared with histological findings, and submitted to multivariate (HCA, PCA, SIMCA) and statistical (one-way ANOVA) analysis. The fit of convoluted bands allowed the calculation of meaningful band-area ratios (means ± s.d., P < 0.05). The infrared imaging analysis pinpointed higher amounts of water and lower quantities of type I collagen in all inflamed pulps. Specific vibrational markers were defined for irreversible pulpitis (Lipids/Total Biomass, PhII/Total Biomass, CH2/CH3, and Ty/AII) and hyperplastic pulpitis (OH/Total Biomass, Collagen/Total Biomass, and CH3 Collagen/Total Biomass). The study confirmed that FTIR microspectroscopy can discriminate tissues' biological features. The infrared imaging analysis evidenced, in inflamed pulps, alterations in tissue structure and composition. Changes in lipid metabolism, increasing amounts of tyrosine, and the occurrence of phosphorylative processes were highlighted in irreversible pulpitis, while high amounts of water and low quantities of type I collagen were detected in hyperplastic samples. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.
Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M
2016-09-01
Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.
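A numeric companion to the visual checks described above is to test distributional assumptions before committing to an analysis. The snippet below is an illustrative workflow of my own, not the authors' GD tool: it uses a Shapiro-Wilk test and sample skewness to flag data poorly suited to a parametric method:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
sample = rng.lognormal(mean=0.0, sigma=0.8, size=200)  # right-skewed data

# Flag data that fail the assumptions of a parametric analysis; the 0.05
# and |skew| < 1 thresholds are conventional, illustrative cut-offs.
shapiro_stat, shapiro_p = stats.shapiro(sample)
skewness = stats.skew(sample)
parametric_ok = shapiro_p > 0.05 and abs(skewness) < 1.0
print(f"Shapiro p = {shapiro_p:.2g}, skew = {skewness:.2f}, "
      f"parametric assumptions met: {parametric_ok}")
```

Here the lognormal sample fails the check, which in a GD-style review would prompt a transformation or a nonparametric alternative.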
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
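One problem with informal model selection of the kind the paper criticizes is easy to demonstrate by simulation: screening many null predictors and keeping the "most significant" one inflates the false-positive rate far above the nominal level. A small sketch (sample sizes and predictor counts are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, k, reps = 100, 20, 300
hits = 0
for _ in range(reps):
    y = rng.normal(size=n)
    X = rng.normal(size=(n, k))              # k predictors unrelated to y
    pvals = [stats.pearsonr(X[:, j], y)[1] for j in range(k)]
    if min(pvals) < 0.05:                    # "select" the best predictor
        hits += 1
print(hits / reps)  # far above the nominal 5% false-positive rate
```

With 20 independent null predictors the chance that at least one clears p < 0.05 is roughly 1 − 0.95²⁰ ≈ 64%, which is why selection must be accounted for by a formal procedure.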
Effect of surface treatments on the bond strengths of facing composite resins to zirconia copings.
Tsumita, M; Kokubo, Y; Kano, T
2012-09-01
The present study evaluated and compared the bond strength between zirconia and facing composite resin using different surface conditioning methods before and after thermocycling. Four primers, three opaque resins, and two facing composite resins were used, and 10 surface treatment procedures were conducted. The bond strength was measured before and after 4,000 cycles of thermocycling. The mean values of each group were statistically analyzed using one-way analysis of variance (ANOVA). The bond strengths of facing composite resins to zirconia after various treatments varied depending on the primers, opaque resins, body resins, and thermocycling. The application of primers and opaque resins to the zirconia surface after sandblasting is expected to yield strong bond strength of the facing composite resin (Estenia CG&B) even after thermocycling.
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discovering the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-larger reference panels is very cumbersome, as it requires reimputing the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient approach is to impute the summary statistics directly, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false-positive rates of 0.001, 0.01, and 0.05, summary statistics imputation yielded a decrease in statistical power of 9, 43, and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants and identified 34 novel loci, 19 of which replicated using data from the UK Biobank. Additionally, we successfully replicated 55 of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
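The core of summary statistics imputation is a conditional-expectation formula: the z-score of an untyped variant is predicted from typed neighbours through the reference-panel LD matrix. A minimal sketch follows; the ridge constant and the toy LD values are illustrative, and the authors' method additionally accommodates variable sample size across SNVs:

```python
import numpy as np

def impute_z(z_typed, R_tt, r_ut, lam=0.1):
    """Predict the z-score of an untyped SNV from typed neighbours:
    z_u = r_ut' (R_tt + lam*I)^{-1} z_t, where R_tt is the LD (correlation)
    matrix among typed SNVs and r_ut their correlations with the untyped
    SNV; lam is an illustrative ridge term regularising panel-estimated LD."""
    R = R_tt + lam * np.eye(len(z_typed))
    return float(r_ut @ np.linalg.solve(R, z_typed))

# Toy numbers: the untyped SNV is in strong LD with the first typed SNV.
z_t = np.array([4.0, 0.5, -0.2])
R_tt = np.eye(3)              # typed SNVs mutually uncorrelated (toy LD)
r_ut = np.array([0.9, 0.1, 0.0])
z_u = impute_z(z_t, R_tt, r_ut)
print(f"imputed z = {z_u:.2f}")  # inherits most of the neighbour's signal
```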
Hegde, Shashikanth; Chatterjee, Elashri; Rajesh, K S; Kumar, M S Arun
2016-01-01
This study was conducted to estimate and correlate salivary thiocyanate (SCN) levels in periodontally healthy subjects and in smokers, nonsmokers, and gutka-chewers with chronic periodontitis. The study population consisted of 40 systemically healthy subjects aged 18-55 years, divided into four groups: control, smokers, nonsmokers, and gutka-chewers with chronic periodontitis. The gingival index (GI) (Loe and Silness, 1963), probing depth (PD), and clinical attachment loss were assessed. SCN was estimated by ultraviolet spectrophotometry at a wavelength of 447 nm. Statistical analysis was performed using the one-way ANOVA Welch test and Pearson's correlation test in SPSS version 17. Results showed a statistically significant increase in SCN levels in smokers compared with gutka-chewers with chronic periodontitis, controls, and nonsmokers with chronic periodontitis. Significantly greater PD and loss of attachment were seen in the smokers group compared with the other groups. A negative correlation was observed between the GI and thiocyanate levels. The present study revealed a significant increase in SCN levels in smokers with periodontitis compared with nonsmokers.
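The Welch test used here (and in the Alexander-Govern comparison cited elsewhere in this collection) adjusts one-way ANOVA for unequal group variances. SciPy has no built-in Welch ANOVA (it does provide `scipy.stats.alexandergovern` for the related test), so below is a textbook-formula sketch with made-up example data:

```python
import numpy as np
from scipy import stats

def welch_anova(*groups):
    """Welch's heteroscedastic one-way ANOVA (textbook formulas).
    Returns (F, p) with Welch's approximate denominator df."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                    # precision weights
    grand = np.sum(w * m) / np.sum(w)
    num = np.sum(w * (m - grand) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    den = 1 + 2 * (k - 2) / (k ** 2 - 1) * tmp
    df2 = (k ** 2 - 1) / (3 * tmp)
    f = num / den
    return f, stats.f.sf(f, k - 1, df2)

f, p = welch_anova([1, 2, 3, 4, 5], [6, 7, 8, 9, 10], [11, 12, 13, 14, 15])
print(f"Welch F = {f:.1f}, p = {p:.4f}")
```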
NASA Technical Reports Server (NTRS)
Jackson, F. C.
1979-01-01
Two simple microwave radar techniques that are potentially capable of providing routine satellite measurements of the directional spectrum of ocean waves were developed. One technique, the short-pulse technique, makes use of very short pulses to resolve ocean surface wave contrast features in the range direction; the other, the two-frequency correlation technique, makes use of coherency in the transmitted waveform to detect the large ocean wave contrast modulation as a beat or mixing frequency in the power backscattered at two closely separated microwave frequencies. A frequency-domain analysis of the short-pulse and two-frequency systems shows that the two measurement systems are essentially duals; they each operate on the generalized (three-frequency) fourth-order statistical moment of the surface transfer function in different but symmetrical ways, and they both measure the same directional contrast modulation spectrum. A three-dimensional physical-optics solution for the fourth-order moment was obtained for backscatter in the near-vertical, specular regime, assuming Gaussian surface statistics.
NASA Astrophysics Data System (ADS)
Yu, Fu-Yun; Liu, Yu-Hsin
2005-09-01
The potential value of a multiple-choice question-construction instructional strategy for supporting students' learning of physics experiments was examined in this study. Forty-two university freshmen participated for a whole semester. A constant comparison method, adopted to categorize students' qualitative data, indicated that the influences of multiple-choice question construction were evident in several significant ways (promoting constructive and productive study habits; reviewing and reflecting on course-related materials; increasing in-group communication and interaction; breaking passive learning styles and habits, etc.), which, working together, not only enhanced students' comprehension and retention of the acquired knowledge, but also helped instill a sense of empowerment and of learning community among the participants. Analysis of the quantitative data with one-group t-tests, using 3 as the expected mean, further found that students' satisfaction with their past learning experience and their perceptions of this strategy's potential for promoting learning were statistically significant at the 0.0005 level, while learning anxiety was not statistically significant. Suggestions for incorporating question-generation activities in the classroom and topics for future studies are offered.
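The one-group t-test against an expected mean of 3 described above can be sketched with scipy; the ratings below are hypothetical stand-ins for the study's survey data, not the authors' numbers.

```python
import numpy as np
from scipy import stats

# One-group (one-sample) t-test against an expected mean of 3, the scale
# midpoint, as in the analysis above. The ratings are hypothetical.
ratings = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4])
t_stat, p_value = stats.ttest_1samp(ratings, popmean=3)
```

A significant positive `t_stat` would indicate mean satisfaction above the scale midpoint, which is the form of evidence the abstract reports.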
Shin, Nayeon; Jang, Youha; Kang, Younhee
2017-04-01
The purposes of this study were to identify the relationships among perceived parental bonding, illness perception, and anxiety, and to determine the influences of perceived parental bonding and illness perception on anxiety in adult patients with congenital heart disease. A descriptive correlational design with a survey method was utilized. The participants were 143 adult patients with congenital heart disease being cared for in the cardiology out-patient clinic of A medical center. Data were collected using the Parental Bonding Instrument, the Illness Perception Questionnaire Revised Scale, and the Cardiac Anxiety Questionnaire Scale. Data were analyzed using descriptive statistics, independent t-tests, one-way ANOVA, Pearson correlation analysis, and hierarchical regression analysis. There were significant positive relationships of anxiety with maternal overprotection, consequences, and personal control, respectively. Among the predictors, maternal overprotection (β=.45), consequences (β=.26), and personal control (β=-.03) had statistically significant influences on anxiety. Nursing interventions to decrease maternal overprotection and negative consequences, and to enhance personal control, are essential to decrease the anxiety of adult patients with congenital heart disease. © 2017 Korean Society of Nursing Science
Yilmaz, Yucel; Dalmis, Anya; Gurbuz, Taskin; Simsek, Sera
2004-12-01
The aim of this investigation was to compare the tensile strength, microleakage, and scanning electron microscope (SEM) evaluations of stainless steel crowns (SSCs) cemented with different adhesive cements on primary molars. Sixty-three extracted primary first molars were used. Tooth preparations were performed, and crowns were altered and adapted for the purposes of the investigation, then cemented on the prepared teeth using a glass ionomer cement (Aqua Meron), a resin-modified cement (RelyX Luting), or a resin cement (Panavia F). Samples were divided into two groups of 30 each for the tensile strength and microleakage tests; the remaining three samples were used for SEM evaluation. Data were analyzed with one-way ANOVA and the Tukey test. ANOVA revealed significant differences among the groups for both the tensile strength and microleakage tests (p < 0.05). The Tukey test showed a statistically significant difference between Panavia F and RelyX Luting (p < 0.05), but none between the other pairs (p > 0.05). This study showed that the higher the retentive force a crown possessed, the lower the possibility of microleakage.
Solomon, Olga
2015-06-01
'Being autistic' or 'having Autism Spectrum Disorder' implies a limited range of 'being social,' but the in situ organization of interaction, what Maynard and Marlaire (Qual Soc 15(2):177-202, 1992) call the 'interactional substrate,' within which this delimitation unfolds is usually hidden from sight. Analysis of processes constituting different 'interactional substrates' provides a view of how one comes to be known by and to self and others as a certain kind of being who is available (or not) for acting and feeling in certain ways. People diagnosed with Autism Spectrum Disorder (American Psychiatric Association, Diagnostic and statistical manual of mental disorders, 2013) are often described as 'being' impaired in intersubjective understanding of others. But the story of ASD as an impairment of sociality and intersubjectivity becomes more complicated when animals enter into the picture. I consider two interactional substrates: a psychological interview in a mental health clinic, and an animal-assisted activity in a child's neighborhood. I aim to elucidate the practical problems of 'being social' encountered by two children with ASD, both nine-year-old girls, within these two very differently organized interactional substrates. I consider ways in which 'being with' therapy animals provides a way of 'being social' through "sensory modalities of knowing" (Haraway, When species meet, 2008:371).
Methods and apparatuses for information analysis on shared and distributed computing systems
Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA
2011-02-22
Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
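The local/global term-statistics pattern described above can be sketched on a single machine; the documents, worker pool, and helper names below are invented for illustration (the patent targets distributed MPI-style systems, not Python threads).

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Sketch of the described pattern: each worker computes local term statistics
# for its own distinct set of documents; local counts are then merged into a
# global set of term statistics. Document contents are hypothetical.
doc_sets = [
    ["the cat sat", "the dog ran"],
    ["a cat and a dog", "the end"],
]

def local_term_stats(docs):
    """Per-worker step: term frequencies for one distinct document set."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

with ThreadPoolExecutor(max_workers=2) as pool:
    local_stats = list(pool.map(local_term_stats, doc_sets))

# Contribution step: merge local statistics into the global set.
global_stats = Counter()
for local in local_stats:
    global_stats.update(local)

# A "major term set" could then be drawn from the most frequent terms
# of the global vocabulary.
major_terms = [t for t, _ in global_stats.most_common(3)]
```

The key property, mirrored in the test below, is that the merged global statistics equal what a single pass over all documents would produce.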
Harris, K K; Price, A J; Beard, D J; Fitzpatrick, R; Jenkinson, C; Dawson, J
2014-11-01
The objective of this study was to explore the dimensionality of the Oxford Hip Score (OHS) and examine whether self-reported pain and functioning can be distinguished in the form of subscales. This was a secondary analysis of the UK NHS hospital episode statistics/patient-reported outcome measures dataset containing pre-operative OHS scores on 97 487 patients who were undergoing hip replacement surgery. The number of factors proposed for extraction depended on the method employed: Velicer's Minimum Average Partial test and parallel analysis suggested one factor, while Cattell's scree test and the Kaiser-over-1 rule suggested two. Exploratory factor analysis demonstrated that in the two-factor OHS most of the items loaded saliently on one of the two factors. These factors were named 'Pain' and 'Function' and their respective subscales were created. There was some cross-loading of items 8 (pain on standing up from a chair) and 11 (pain during work); these items were assigned to the 'Pain' subscale. The final 'Pain' subscale consisted of items 1, 8, 9, 10, 11 and 12, and the 'Function' subscale consisted of items 2, 3, 4, 5, 6 and 7, with the recommended scoring of the subscales being from 0 (worst) to 100 (best). Cronbach's alpha was 0.855 for the 'Pain' subscale and 0.861 for the 'Function' subscale. A confirmatory factor analysis demonstrated that the two-factor model of the OHS had a better fit; however, neither the one-factor nor the two-factor model was rejected. Factor analyses demonstrated that, in addition to current usage as a single summary scale, separate information on pain and self-reported function can be extracted from the OHS in a meaningful way in the form of subscales. Cite this article: Bone Joint Res 2014;3:305-9. ©2014 The British Editorial Society of Bone & Joint Surgery.
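Two of the factor-retention rules mentioned above, the Kaiser-over-1 rule and parallel analysis, can be sketched with numpy; the simulated 12-item questionnaire with two underlying factors is a hypothetical stand-in for the OHS data.

```python
import numpy as np

# Hypothetical 12-item questionnaire with two underlying factors
# (6 "pain" items, 6 "function" items), simulated for illustration.
rng = np.random.default_rng(1)
n = 1000
pain = rng.normal(size=n)
func = rng.normal(size=n)
items = np.column_stack(
    [pain + 0.5 * rng.normal(size=n) for _ in range(6)]
    + [func + 0.5 * rng.normal(size=n) for _ in range(6)]
)

# Eigenvalues of the item correlation matrix, largest first.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]

# Kaiser-over-1 rule: retain factors with eigenvalue > 1.
n_kaiser = int(np.sum(eigvals > 1))

# Parallel analysis: retain factors whose eigenvalues exceed the mean
# eigenvalues obtained from random data of the same shape.
rand_eigs = np.mean(
    [np.sort(np.linalg.eigvalsh(
        np.corrcoef(rng.normal(size=items.shape), rowvar=False)))[::-1]
     for _ in range(50)],
    axis=0)
n_parallel = int(np.sum(eigvals > rand_eigs))
```

With such clean simulated structure both rules agree on two factors; as the abstract illustrates, on real data the retention methods can disagree, which is why several are usually compared.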
Regional-scale analysis of extreme precipitation from short and fragmented records
NASA Astrophysics Data System (ADS)
Libertino, Andrea; Allamano, Paola; Laio, Francesco; Claps, Pierluigi
2018-02-01
The rain gauge is the oldest and most accurate instrument for rainfall measurement, able to provide long series of reliable data. However, rain gauge records are often plagued by gaps, spatio-temporal discontinuities, and inhomogeneities that can affect their suitability for a statistical assessment of the characteristics of extreme rainfall. Furthermore, the need to discard shorter series to obtain robust estimates leads to ignoring a significant amount of information, which can be essential, especially when estimates for large return periods are sought. This work describes a robust statistical framework for dealing with uneven and fragmented rainfall records over a regional spatial domain. The proposed technique, named "patched kriging", allows one to exploit all the information available from the recorded series, independently of their length, to provide extreme rainfall estimates in ungauged areas. The methodology involves the sequential application of the ordinary kriging equations, producing a homogeneous dataset of synthetic series with uniform lengths. In this way, the errors inherent in any regional statistical estimation can be easily represented in the spatial domain and, where possible, corrected. Furthermore, the homogeneity of the obtained series provides robustness against local artefacts during the parameter-estimation phase. The application to a case study in north-western Italy demonstrates the potential of the methodology and provides a significant basis for discussing its advantages over previous techniques.
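The building block that patched kriging applies sequentially is ordinary kriging. A minimal sketch follows; the gauge coordinates, observed values, and exponential covariance parameters are all hypothetical, and real applications would fit a variogram to the data.

```python
import numpy as np

def ordinary_kriging(xy, z, x0, sill=1.0, corr_range=50.0):
    """Ordinary-kriging prediction at x0 from observations z at coordinates xy.

    Uses an exponential covariance model (an illustrative assumption) and a
    Lagrange multiplier to enforce that the weights sum to one.
    """
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = sill * np.exp(-d / corr_range)                 # gauge-to-gauge covariance
    c0 = sill * np.exp(-np.linalg.norm(xy - x0, axis=1) / corr_range)
    n = len(z)
    # Ordinary-kriging system: [[C, 1], [1^T, 0]] [w; mu] = [c0; 1].
    A = np.block([[C, np.ones((n, 1))], [np.ones((1, n)), np.zeros((1, 1))]])
    b = np.concatenate([c0, [1.0]])
    w = np.linalg.solve(A, b)[:n]
    return w @ z

gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [8.0, 7.0]])
obs = np.array([12.0, 20.0, 15.0, 18.0])               # hypothetical rainfall values
pred = ordinary_kriging(gauges, obs, np.array([5.0, 5.0]))   # ungauged point
exact = ordinary_kriging(gauges, obs, gauges[1])             # at a gauge location
```

A useful sanity check, exploited in the test, is that ordinary kriging is an exact interpolator: predicting at a gauge location returns the observed value.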
[Mobile phones and head tumours: it is time to read and highlight data in a proper way].
Levis, Angelo G; Minicucci, Nadia; Ricci, Paolo; Gennaro, Valerio; Garbisa, Spiridione
2011-01-01
The uncertainty about the relationship between the use of mobile phones (MPs: analogue and digital cellular phones, and cordless phones) and the increase in head tumour risk can be resolved by a critical analysis of the methodological elements of both the positive and the negative studies. The results of Hardell indicate a cause/effect relationship: exposures to MPs, or latencies, of ≥ 10 years increase by up to 100% the risk of tumour on the same side of the head preferred for phone use (ipsilateral tumours) - the only side significantly irradiated - with statistical significance for brain gliomas, meningiomas and acoustic neuromas. On the contrary, studies published under the Interphone project and others produced negative results and are characterised by a substantial underestimation of tumour risk. However, even in the Interphone studies a clear and statistically significant increase in ipsilateral head tumours (gliomas, neuromas and parotid gland tumours) is quite common in people having used MPs since, or for, ≥ 10 years. The meta-analyses by Hardell and other authors, which include only the literature data on ipsilateral tumours in people having used MPs since, or for, ≥ 10 years - and thus also part of the Interphone data - still show statistically significant increases in head tumours.
Effect of Various Sugary Beverages on Salivary pH, Flow Rate, and Oral Clearance Rate amongst Adults
Hans, Rinki; Thomas, Susan; Garla, Bharat; Dagli, Rushabh J; Hans, Manoj Kumar
2016-01-01
Introduction. Diet is a major aetiological factor for dental caries and enamel erosion. This study was undertaken with the aim of assessing the effect of selected locally available beverages on salivary pH, flow rate, and oral clearance rate amongst adults. Materials and Methods. This clinical trial comprised 120 subjects. The test beverages were Pepsi, a fruit drink, coffee, and sweetened milk. Statistical analysis was carried out using SPSS version 17; descriptive statistics, one-way ANOVA, and the post hoc Tukey test were applied. Results. Salivary pH decreased for all the beverages immediately after consumption, and the salivary flow rate increased after their consumption. The oral clearance time of sweetened milk was found to be the shortest at 6.5 minutes, and that of Pepsi was 13 minutes, while the oral clearance times of the fruit drink and coffee were equal at 15 minutes. Conclusion. Although the liquids cleared rapidly from the oral cavity, they had significant cariogenic and erosive potential. Hence, it is advisable to minimise the consumption of such beverages, especially amongst children and young adults, to maintain good oral health. PMID:27051556
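The one-way ANOVA comparing beverage groups can be sketched with scipy; the pH-drop measurements below are hypothetical stand-ins for the study's data, and a post hoc Tukey HSD test would follow a significant omnibus result.

```python
import numpy as np
from scipy import stats

# One-way ANOVA across four beverage groups on salivary pH drop,
# mirroring the analysis above. All measurements are hypothetical.
rng = np.random.default_rng(2)
pepsi  = rng.normal(1.5, 0.3, 30)    # pH drop after Pepsi
fruit  = rng.normal(1.2, 0.3, 30)    # fruit drink
coffee = rng.normal(0.8, 0.3, 30)
milk   = rng.normal(0.4, 0.3, 30)    # sweetened milk

f_stat, p_value = stats.f_oneway(pepsi, fruit, coffee, milk)
# A post hoc Tukey HSD test would then identify which pairs of groups differ.
```

With 30 subjects per group, the omnibus F-test is sensitive to differences of this size, which is why the post hoc pairwise step is needed to localize them.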
Jeyapalan, Karthigeyan; Kumar, Jaya Krishna; Azhagarasan, N. S.
2015-01-01
Aims: The aim was to evaluate and compare the effects of three chemically different, commercially available denture cleansing agents on the surface topography of two different denture base materials. Materials and Methods: Three chemically different denture cleansers (sodium perborate, 1% sodium hypochlorite, 0.2% chlorhexidine gluconate) were used on two denture base materials (acrylic resin and cobalt-chromium alloy), and the changes were evaluated at three time intervals (56 h, 120 h, 240 h). Changes from baseline in surface roughness were recorded quantitatively using a surface profilometer; qualitative surface analyses for all groups were done by scanning electron microscopy (SEM). Statistical Analysis Used: The values obtained were analyzed statistically using one-way ANOVA and the paired t-test. Results: All three denture cleanser solutions showed no statistically significant surface changes on the acrylic resin portions at 56 h, 120 h, and 240 h of immersion. However, on the alloy portions changes were significant at the end of 120 h and 240 h. Conclusion: Of the three denture cleansers used in the study, none produced significant changes on the two denture base materials for short immersion durations, whereas changes were seen as the immersion periods increased. PMID:26538915
MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.
Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin
2015-04-01
Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails a loss in statistical power, and (ii) gene-based analyses may be preferred, e.g. to alleviate the multiple testing problem. Here we present a new method, the multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression) and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
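The extended Simes (GATES-style) combination that MGAS builds on can be sketched as follows. The effective-number-of-tests estimator and the toy p-values are illustrative assumptions; the published software implements the exact procedure.

```python
import numpy as np

# Sketch of a GATES-style gene-based test: SNP p-values within a gene are
# combined into one gene-level p-value, correcting for LD via an "effective
# number of tests" derived from eigenvalues of the SNP correlation matrix.
def effective_tests(corr):
    lam = np.linalg.eigvalsh(corr)
    # Common eigenvalue-based estimator (an illustrative choice).
    return len(lam) - np.sum((lam - 1)[lam > 1])

def gates(p_values, corr):
    order = np.argsort(p_values)
    p_sorted = np.asarray(p_values)[order]
    me_all = effective_tests(corr)
    p_gene = np.inf
    for j in range(1, len(p_sorted) + 1):
        idx = order[:j]
        me_top = effective_tests(corr[np.ix_(idx, idx)])
        p_gene = min(p_gene, me_all * p_sorted[j - 1] / me_top)
    return min(p_gene, 1.0)

# With independent SNPs (identity correlation), GATES reduces to the
# ordinary Simes test: min_j (m * p_(j) / j).
p_snps = np.array([0.004, 0.20, 0.55, 0.81])
p_gene_indep = gates(p_snps, np.eye(4))
```

The reduction to the Simes test under independence is a useful correctness check, since LD only shrinks the effective number of tests.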
A Framework to Support Research on Informal Inferential Reasoning
ERIC Educational Resources Information Center
Zieffler, Andrew; Garfield, Joan; delMas, Robert; Reading, Chris
2008-01-01
Informal inferential reasoning is a relatively recent concept in the research literature. Several research studies have defined this type of cognitive process in slightly different ways. In this paper, a working definition of informal inferential reasoning based on an analysis of the key aspects of statistical inference, and on research from…
ERIC Educational Resources Information Center
Palazotto, Anthony N.; And Others
This report is the result of a pilot program to seek out ways for developing an educational institution's transportation flow. Techniques and resulting statistics are discussed. Suggestions for additional uses of the information obtained are indicated. (Author)
Guidelines for collecting and maintaining archives for genetic monitoring
Jennifer A. Jackson; Linda Laikre; C. Scott Baker; Katherine C. Kendall; F. W. Allendorf; M. K. Schwartz
2011-01-01
Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time...
ERIC Educational Resources Information Center
Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.
2005-01-01
Results obtained with interindividual techniques in a representative sample of a population are not necessarily generalizable to the individual members of this population. In this article the specific condition is presented that must be satisfied to generalize from the interindividual level to the intraindividual level. A way to investigate…
Quench dynamics in superconducting nanojunctions: Metastability and dynamical Yang-Lee zeros
NASA Astrophysics Data System (ADS)
Souto, R. Seoane; Martín-Rodero, A.; Yeyati, A. Levy
2017-10-01
We study the charge transfer dynamics following the formation of a phase- or voltage-biased superconducting nanojunction using a full counting statistics analysis. We demonstrate that the evolution of the zeros of the generating function allows one to identify the population of different many-body states, much in the same way as the accumulation of Yang-Lee zeros of the partition function in equilibrium statistical mechanics is connected to phase transitions. We give an exact expression connecting the dynamical zeros to the charge transfer cumulants and discuss when an approximation based on "dominant" zeros is valid. We show that, for generic values of the parameters, the system gets trapped in a metastable state characterized by a nonequilibrium population of the many-body states which depends on the initial conditions. We study in particular the effect of the switching rates on the dynamics, showing that, in contrast to intuition, the deviation from thermal equilibrium increases for slower rates. In the voltage-biased case the steady state is reached independently of the initial conditions. Our method allows us to obtain accurate results for the steady-state current and noise, in quantitative agreement with steady-state methods developed to describe the multiple Andreev reflection regime. Finally, we discuss the system dynamics after a sudden voltage drop, showing the possibility of tuning the population of many-body states by an appropriate choice of the initial voltage, which provides a feasible experimental way to access the quench dynamics and control the state of the system.
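The connection between the counting-statistics generating function and its zeros can be illustrated with a toy distribution (not the paper's model): for a binomial charge-transfer distribution, chi(z) = (1 - p + p z)^N in the variable z = exp(i*lambda), so all zeros sit at z = -(1 - p)/p.

```python
import numpy as np
from math import comb

# Toy full-counting-statistics example: P(n) binomial, so the generating
# function chi(z) = sum_n P(n) z^n factorizes as (1-p+pz)^N and all its
# zeros lie at z = -(1-p)/p on the negative real axis.
N, p = 3, 0.3
P_n = np.array([comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)])

# Zeros of chi(z) via the polynomial roots (np.roots wants highest degree first).
zeros = np.roots(P_n[::-1])

# First cumulant (mean transferred charge) directly from the distribution.
mean_charge = np.sum(np.arange(N + 1) * P_n)
```

In the paper's setting the zeros move in time and their accumulation signals changes in the many-body state populations; this toy case only shows how the zeros encode the distribution.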
Lopes, Lawrence Gonzaga; Franco, Eduardo Batista; Pereira, José Carlos; Mondelli, Rafael Francisco Lia
2008-01-01
The aim of this study was to evaluate the polymerization shrinkage and shrinkage stress of composites polymerized with LED and quartz-tungsten-halogen (QTH) light sources. The LED was used in conventional mode (CM), and the QTH was used in both conventional and pulse-delay (PD) modes. The composite resins used were Z100, A110, SureFil, and Bisfil 2B (chemically cured). Composite deformation upon polymerization was measured by the strain gauge method, and shrinkage stress was measured by photoelastic analysis. The polymerization shrinkage data were analyzed statistically using two-way ANOVA and the Tukey test (p≤0.05), and the stress data were analyzed by one-way ANOVA and Tukey's test (p≤0.05). The shrinkage and stress means of Bisfil 2B were statistically significantly lower than those of Z100, A110, and SureFil. In general, the PD mode reduced the contraction and stress values compared to CM. The LED generated the same stress as the QTH in conventional mode. Regardless of the activation mode, SureFil produced lower contraction and stress values than the other light-cured resins; conversely, Z100 and A110 produced the greatest contraction and stress values. As expected, the chemically cured resin generated lower shrinkage and stress than the light-cured resins. In conclusion, the PD mode effectively decreased contraction stress for Z100 and A110, and the development of stress in light-cured resins depended on the shrinkage value. PMID:19089287
Single-photon quantum key distribution in the presence of loss
NASA Astrophysics Data System (ADS)
Curty, Marcos; Moroder, Tobias
2007-05-01
We investigate two-way and one-way single-photon quantum key distribution (QKD) protocols in the presence of loss introduced by the quantum channel. Our analysis is based on a simple precondition for secure QKD in each case. In particular, the legitimate users need to prove that there exists no separable state (in the case of two-way QKD), or no quantum state having a symmetric extension (one-way QKD), that is compatible with the available measurement results. We show that both criteria can be formulated as a convex optimization problem known as a semidefinite program, which can be efficiently solved. Moreover, we prove that the solution to the dual optimization corresponds to the evaluation of an optimal witness operator that belongs to the minimal verification set of witness operators for the given two-way (or one-way) QKD protocol. A positive expectation value of this optimal witness operator indicates that no secret key can be distilled from the available measurement results. We apply this analysis to several well-known single-photon QKD protocols under losses.
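The separability precondition above is tested in practice with entanglement criteria. As a self-contained illustration of the simplest such criterion (not the paper's QKD-specific semidefinite program), here is the Peres-Horodecki positive-partial-transpose (PPT) test applied to a two-qubit Werner state, a standard textbook example.

```python
import numpy as np

# Peres-Horodecki (PPT) test: for two qubits, a state is separable if and
# only if its partial transpose has no negative eigenvalues.
def werner(p):
    """Werner state: singlet fraction p mixed with white noise."""
    psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)  # singlet
    return p * np.outer(psi_minus, psi_minus) + (1 - p) * np.eye(4) / 4

def partial_transpose(rho):
    """Transpose the second qubit of a 4x4 two-qubit density matrix."""
    return rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

def is_ppt(rho):
    return np.min(np.linalg.eigvalsh(partial_transpose(rho))) >= -1e-12

entangled = not is_ppt(werner(0.5))      # p > 1/3: NPT, hence entangled
separable_ok = is_ppt(werner(0.2))       # p <= 1/3: PPT (separable for 2 qubits)
```

In the paper's framework, an SDP solver searches over all states compatible with the observed statistics; the PPT condition is one of the constraints such a program can impose.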
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, F.W.; Pagano, C.; Schneiderman, B.
1959-07-01
Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily in the data interpretation; others, however, present more difficulty. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing the significant effects of the factors involved. (auth)
Kordi, Masoumeh; Riyazi, Sahar; Lotfalizade, Marziyeh; Shakeri, Mohammad Taghi; Suny, Hoseyn Jafari
2018-01-01
Screening for fetal anomalies is a necessary measure in antenatal care, and screening programmes aim to empower individuals to make an informed choice. This study was conducted to compare the effect of group and face-to-face education on informed choice and decisional conflict among pregnant women regarding screening for fetal abnormalities. This clinical trial was carried out on 240 pregnant women at less than 10 weeks of gestation in health care centers in Mashhad city in 2014. An individual/midwifery information form, an informed choice questionnaire, and the decisional conflict scale were used for data collection. The face-to-face and group education courses were held in two weekly sessions for the intervention groups during two consecutive weeks, and usual care was provided for the control group. The rate of informed choice and decisional conflict was measured in the pregnant women before education and again at weeks 20-22 of pregnancy in all three groups. Data were analysed using SPSS statistical software (version 16) with the Chi-square test, Kruskal-Wallis test, Wilcoxon test, Mann-Whitney U-test, one-way analysis of variance, and Tukey's range test; P < 0.05 was considered significant. The results showed a statistically significant difference among the three groups in the frequency of informed choice regarding screening for fetal abnormalities (P = 0.001): after the intervention, 62 participants (77.5%) in the face-to-face education group, 64 (80%) in the group education class, and 20 (25%) in the control group made an informed choice regarding the screening tests, but there was no statistically significant difference between the individual and group education classes.
Similarly, after the intervention there was a statistically significant difference in the mean score of the decisional conflict scale among the pregnant women in the three groups (P = 0.001). Given the effectiveness of both group and face-to-face education in increasing informed choice and reducing decisional conflict regarding screening tests, either education method may be employed, according to the conditions of the clinical environment, to encourage women to undergo the screening tests.
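The chi-square comparison of informed-choice frequencies can be reproduced from the counts reported above (62/80, 64/80, and 20/80); the group sizes of 80 follow from the stated percentages.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Informed choice (yes/no) by study arm, using the frequencies reported above.
table = np.array([
    [62, 80 - 62],   # face-to-face education
    [64, 80 - 64],   # group education
    [20, 80 - 20],   # usual care (control)
])
chi2_stat, p_value, dof, expected = chi2_contingency(table)
```

The large discrepancy between the control group and the two education groups drives the highly significant result, consistent with the reported P = 0.001.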
Toward improved analysis of concentration data: Embracing nondetects.
Shoari, Niloofar; Dubé, Jean-Sébastien
2018-03-01
Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, routine statistical protocols cannot be directly applied because of challenges arising from nondetects, or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that management policies regarding the analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to three noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
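A minimal maximum-likelihood sketch of the censoring-aware approach the article recommends over substitution: detects contribute the density, and nondetects contribute the probability mass below the detection limit. The data are simulated and the lognormal model is an illustrative assumption.

```python
import numpy as np
from scipy import optimize, stats

# Simulated lognormal concentrations with a detection limit (left-censoring).
rng = np.random.default_rng(3)
mu_true, sigma_true, dl = 1.0, 0.8, 1.5
x = rng.lognormal(mu_true, sigma_true, size=500)
detected = x >= dl
logs = np.log(x[detected])
n_cens = int(np.sum(~detected))

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                 # keep sigma positive
    # Detects: log-density of the normal model on the log scale.
    ll = np.sum(stats.norm.logpdf(logs, mu, sigma))
    # Nondetects: log-probability of falling below the detection limit.
    ll += n_cens * stats.norm.logcdf(np.log(dl), mu, sigma)
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Unlike substituting nondetects with DL/2 or similar constants, this likelihood uses the censored observations without inventing values for them, which is why it avoids the bias the article warns about.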
Shrout, Patrick E; Rodgers, Joseph L
2018-01-04
Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
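A basic version of the power analysis the authors call for can be sketched with the standard normal-approximation formula for a two-sample t-test; the effect sizes chosen below are conventional benchmarks, not values from the paper.

```python
from math import ceil
from scipy.stats import norm

# Per-group sample size for a two-sample t-test detecting a standardized
# effect d at significance alpha with the given power (normal approximation).
def n_per_group(d, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(2 * (z_a + z_b) ** 2 / d ** 2)

n_medium = n_per_group(0.5)   # "medium" effect, d = 0.5
n_small = n_per_group(0.2)    # "small" effect, d = 0.2
```

The steep growth of the required sample size as the effect shrinks (63 vs 393 per group) is one concrete reason single underpowered replications are a weak basis for judging whether an effect exists.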
In situ and in-transit analysis of cosmological simulations
Friesen, Brian; Almgren, Ann; Lukic, Zarija; ...
2016-08-24
Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent ‘offline’ analysis. We demonstrate this approach in the compressible gasdynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own (‘in situ’). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other ‘sidecar’ group, which post-processes it while the simulation continues (‘in-transit’). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.
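The in-transit pattern described above can be sketched in miniature with threads and a queue standing in for the disjoint MPI groups; the simulation, hand-off interval, and summary statistic below are toy placeholders, not Nyx's actual workflow.

```python
import queue
import threading

# Toy "in-transit" sketch: one worker advances the simulation and periodically
# ships data to an asynchronous "sidecar" worker that reduces it to small
# summary statistics (here, a mean), mimicking the producer/consumer split.
data_queue = queue.Queue()
summaries = []

def simulation(n_steps):
    state = 0.0
    for step in range(n_steps):
        state += 1.0                        # stand-in for a timestep
        if step % 2 == 0:                   # periodically hand off data
            data_queue.put((step, [state, state * 2]))
    data_queue.put(None)                    # signal completion

def sidecar():
    while True:
        item = data_queue.get()
        if item is None:
            break
        step, data = item
        # Reduce the raw data to a summary much smaller than the data itself.
        summaries.append((step, sum(data) / len(data)))

t_sim = threading.Thread(target=simulation, args=(6,))
t_ana = threading.Thread(target=sidecar)
t_sim.start(); t_ana.start()
t_sim.join(); t_ana.join()
```

The essential design point carried over from the paper is asynchrony: the simulation never waits for the analysis except at the hand-off, so post-processing overlaps with computation.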
Krawczyk, Michał
2015-01-01
In this project I investigate the use and possible misuse of p values in papers published in five (high-ranked) journals in experimental psychology. I use a data set of over 135'000 p values from more than five thousand papers. I inspect (1) the way in which the p values are reported and (2) their distribution. The main findings are as follows: first, it appears that some authors choose the mode of reporting their results in an arbitrary way. Moreover, they often end up doing it in such a way that makes their findings seem more statistically significant than they really are (which is well known to improve the chances for publication). Specifically, they frequently report p values "just above" significance thresholds directly, whereas other values are reported by means of inequalities (e.g. "p<.1"); they round the p values down more eagerly than up; and they appear to choose between the significance thresholds and between one- and two-sided tests only after seeing the data. Further, about 9.2% of reported p values are inconsistent with their underlying statistics (e.g. F or t) and it appears that there are "too many" "just significant" values. One interpretation of this is that researchers tend to choose the model or include/discard observations to bring the p value to the right side of the threshold.
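The consistency check between a reported p value and its underlying statistic can be mechanized. A minimal sketch: recompute a two-sided p from a t statistic using a large-df normal approximation and flag reported values that disagree beyond a tolerance. The approximation and tolerance are illustrative assumptions, not the paper's actual screening procedure.

```python
import math

def two_sided_p_from_t(t_stat):
    # Large-df normal approximation to the t distribution
    return math.erfc(abs(t_stat) / math.sqrt(2))

def is_consistent(t_stat, reported_p, tol=0.01):
    # Flag reported p values that disagree with the recomputed one
    return abs(two_sided_p_from_t(t_stat) - reported_p) <= tol

print(round(two_sided_p_from_t(2.0), 4))  # 0.0455
print(is_consistent(2.0, 0.046))          # True: matches up to rounding
print(is_consistent(2.0, 0.20))           # False: inconsistent with the statistic
```

A production checker would use the exact t distribution with the reported degrees of freedom; the point here is only that reported (statistic, p) pairs are redundant and therefore auditable.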
ERIC Educational Resources Information Center
Underwood, Sonia M.; Reyes-Gastelum, David; Cooper, Melanie M.
2015-01-01
Longitudinal studies can provide significant insights into how students develop competence in a topic or subject area over time. However, there are many barriers, such as retention of students in the study and the complexity of data analysis, that make these studies rare. Here, we present how a statistical framework, discrete-time survival…
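Discrete-time survival analysis of the kind mentioned above estimates, for each period, the hazard of the event among subjects still at risk. A minimal stdlib sketch, with a hypothetical record format of (last observed period, event flag):

```python
def discrete_hazards(records, max_t):
    # records: (last observed period, 1 if the event occurred, 0 if censored)
    hazards = []
    for t in range(1, max_t + 1):
        at_risk = sum(1 for time, _ in records if time >= t)
        events = sum(1 for time, ev in records if time == t and ev == 1)
        hazards.append(events / at_risk if at_risk else 0.0)
    return hazards

# Four students: events in periods 1, 2, 3; one censored in period 2
print(discrete_hazards([(1, 1), (2, 1), (2, 0), (3, 1)], 3))
```

Censored students count toward the risk set up to their last observed period but never as events, which is what makes the framework suited to studies with student attrition.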
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Summary: Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
Statistical detection of patterns in unidimensional distributions by continuous wavelet transforms
NASA Astrophysics Data System (ADS)
Baluev, R. V.
2018-04-01
Objective detection of specific patterns in statistical distributions, like groupings or gaps or abrupt transitions between different subsets, is a task with a rich range of applications in astronomy: Milky Way stellar population analysis, investigations of the exoplanets diversity, Solar System minor bodies statistics, extragalactic studies, etc. We adapt the powerful technique of the wavelet transforms to this generalized task, making a strong emphasis on the assessment of the patterns detection significance. Among other things, our method also involves optimal minimum-noise wavelets and minimum-noise reconstruction of the distribution density function. Based on this development, we construct a self-closed algorithmic pipeline aimed to process statistical samples. It is currently applicable to single-dimensional distributions only, but it is flexible enough to undergo further generalizations and development.
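As a toy illustration of wavelet-based pattern detection in a one-dimensional sample (not the optimal minimum-noise wavelets of the paper), a Mexican-hat continuous wavelet transform evaluated directly on the sample points responds strongly where the points cluster and vanishes where there are none:

```python
import math

def mexican_hat(t):
    # Second derivative of a Gaussian (unnormalized 'Mexican hat' wavelet)
    return (1 - t * t) * math.exp(-t * t / 2)

def cwt_at(samples, location, scale):
    # CWT of an empirical point distribution at one (location, scale)
    return sum(mexican_hat((x - location) / scale)
               for x in samples) / math.sqrt(scale)

clustered = [-0.1, -0.05, 0.0, 0.05, 0.1]
# Response is much larger at the cluster than far away from it
print(cwt_at(clustered, 0.0, 1.0) > abs(cwt_at(clustered, 5.0, 1.0)))  # True
```

Assessing the statistical significance of such a response against noise fluctuations is the hard part the paper addresses; this sketch only shows the transform itself.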
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Arguments of confidentiality, data ownership, representativeness, and others make it a difficult task to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O; Liang, Jingjing; Young, J Hunter; Franceschini, Nora; Smith, Jennifer A; Yanek, Lisa R; Sun, Yan V; Edwards, Todd L; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K; Chakravati, Aravinda; Cooper, Richard S; Redline, Susan
2015-01-08
Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple, even distinct, traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systemically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, either correlated, independent, continuous, or binary traits, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10⁻⁸) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10⁻⁷) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study cross-phenotype (CP) associations by using summary statistics from GWASs of multiple phenotypes. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
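The paper's method additionally handles correlated traits and heterogeneity; as a simpler, standard point of comparison, combining per-trait summary statistics for one variant can be done with a weighted Stouffer combination of z-scores, assuming independent traits. The z-scores below are invented for illustration.

```python
import math

def stouffer_z(z_scores, weights=None):
    # Combine per-trait z-scores into a single test statistic
    # (valid under an independence assumption the paper relaxes)
    if weights is None:
        weights = [1.0] * len(z_scores)
    num = sum(w * z for w, z in zip(weights, z_scores))
    den = math.sqrt(sum(w * w for w in weights))
    return num / den

# z-scores for one variant across three traits, none genome-wide significant alone
z_combined = stouffer_z([1.8, 2.1, 1.5])
print(round(z_combined, 3))  # 3.118
```

This illustrates why multi-trait analysis gains power: three individually modest signals combine into one clearly stronger statistic.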
Effect Size as the Essential Statistic in Developing Methods for mTBI Diagnosis.
Gibson, Douglas Brandt
2015-01-01
The descriptive statistic known as "effect size" measures the distinguishability of two sets of data. Distinguishability is at the core of diagnosis. This article is intended to point out the importance of effect size in the development of effective diagnostics for mild traumatic brain injury and the applicability of the effect size statistic in comparing diagnostic efficiency across the main proposed TBI diagnostic methods: psychological, physiological, biochemical, and radiologic. Comparing diagnostic approaches is difficult because different researchers in different fields have different approaches to measuring efficacy. Converting diverse measures to effect sizes, as is done in meta-analysis, is a relatively easy way to make studies comparable.
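Converting group comparisons to a common effect size, as the article advocates, can be as simple as a pooled-standard-deviation Cohen's d. The sample values below are invented for illustration:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    # Pooled-standard-deviation Cohen's d: distinguishability of two samples
    na, nb = len(a), len(b)
    pooled = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
              / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled

control = [10, 12, 11, 13, 12]   # hypothetical scores on some diagnostic measure
injured = [14, 15, 13, 16, 15]
print(round(cohens_d(injured, control), 2))  # 2.63
```

Because d is unitless, a psychological test, a biomarker assay, and an imaging measure can all be placed on this one scale and compared directly.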
ERIC Educational Resources Information Center
Onchwari, Jacqueline
2010-01-01
This article reports a study that investigated preservice and inservice early childhood teachers' perceived levels of preparedness to handle stress in early childhood and elementary education students. A survey that included vignettes was used to collect data. Data were analyzed both qualitatively and statistically, using one-way ANOVA, "t"-test,…
Modeling Socially Desirable Responding and Its Effects
ERIC Educational Resources Information Center
Ziegler, Matthias; Buehner, Markus
2009-01-01
The impact of socially desirable responding or faking on noncognitive assessments remains an issue of strong debate. One of the main reasons for the controversy is the lack of a statistical method to model such response sets. This article introduces a new way to model faking based on the assumption that faking occurs due to an interaction between…
Statistical Assessment of Estimated Transformations in Observed-Score Equating
ERIC Educational Resources Information Center
Wiberg, Marie; González, Jorge
2016-01-01
Equating methods make use of an appropriate transformation function to map the scores of one test form into the scale of another so that scores are comparable and can be used interchangeably. The equating literature shows that the ways of judging the success of an equating (i.e., the score transformation) might differ depending on the adopted…
Official Statistics as Curriculum: Biopolitics and the United States "Census in Schools" Program
ERIC Educational Resources Information Center
Berdayes, Vicente
2008-01-01
One way that people learn to recognize themselves as members of a nation-state is by participating in the ritual of a national census. In the United States acquaintance with such enumerations is cultivated during childhood by the federal government's "Census in Schools" (CIS) program, which distributes a variety of educational materials…
The Behavioral and Social Sciences Survey: Mathematical Sciences and Social Sciences.
ERIC Educational Resources Information Center
Kruskal, William, Ed.
This book, one of a series prepared in connection with the Behavioral and Social Sciences Survey (BASS) conducted between 1967 and 1969, deals with problems of statistics, mathematics, and computation as they related to the social sciences. Chapter 1 shows how these subjects help in their own ways for studying learning behavior with irregular…
Deep Space Navigation with Noncoherent Tracking Data
NASA Technical Reports Server (NTRS)
Ellis, J.
1983-01-01
Navigation capabilities of noncoherent tracking data are evaluated for interplanetary cruise phase and planetary (Venus) flyby orbit determination. Results of a formal covariance analysis are presented which show that a combination of one-way Doppler and delta DOR yields orbit accuracies comparable to conventional two-way Doppler tracking. For the interplanetary cruise phase, a tracking cycle consisting of a 3-hour Doppler pass and delta DOR (differential one-way range) from two baselines (one observation per overlap) acquired 3 times a month results in 100-km orbit determination accuracy. For reconstruction of a Venus flyby orbit, 10 days tracking at encounter consisting of continuous one-way Doppler and delta DOR sampled at one observation per overlap is sufficient to satisfy the accuracy requirements.
Mobarak, E H
2011-01-01
To evaluate the influence of 2% and 5% chlorhexidine (CHX) pretreatment on bond durability of a self-etching adhesive to normal (ND) and caries-affected (AD) dentin after 2 years of aging in artificial saliva and under simulated intrapulpal pressure (IPP). One hundred twenty freshly extracted carious teeth were ground to expose ND and AD. Specimens were distributed into three equal groups (n=40) according to whether the dentin substrates were pretreated with 2% or 5% CHX or with water (control). Clearfil SE Bond (Kuraray) was applied to both substrates and composite cylinders (0.9 mm diameter and 0.7 mm height) were formed. Pretreatment and bonding were done while the specimens were subjected to 15 mm Hg IPP. After curing, specimens were aged in artificial saliva at 37°C and under IPP at 20 mm Hg until being tested after 24 hours or 2 years (n=20/group). Microshear bond strength was evaluated. Failure modes were determined using a scanning electron microscope (SEM) at 400× magnification. Data were statistically analyzed using three-way analysis of variance (ANOVA), one-way ANOVA, and t-tests (p<0.05). Additional specimens (n=5/group) were prepared to evaluate interfacial silver precipitation. For the 24-hour groups, there were no significant differences among the ND groups and AD groups. For ND aged specimens, the 5% CHX group had the highest value followed by the 2% CHX and control groups, although the difference was statistically insignificant. For AD aged specimens, the 5% CHX group revealed statistically higher bond values compared to the 2% CHX and control groups. Fracture modes were predominantly adhesive and mixed. Different interfacial silver depositions were recorded. Two percent or 5% CHX pretreatment has no adverse effect on the 24-hour bonding to ND and AD. Five percent CHX was able to diminish the loss in bonding to AD after 2 years of aging in artificial saliva and under simulated IPP.
[Spanish doctoral theses in emergency medicine (1978-2013)].
Fernández-Guerrero, Inés María
2015-01-01
To quantitatively analyze the production of Spanish doctoral theses in emergency medicine. Quantitative synthesis of productivity indicators for 214 doctoral theses in emergency medicine found in the database (TESEO) for Spanish universities from 1978 to 2013. We processed the data in 3 ways as follows: compilation of descriptive statistics, regression analysis (correlation coefficients of determination), and modeling of linear trend (time-series analysis). Most of the thesis supervisors (84.1%) only oversaw a single project. No major supervisor of 10 or more theses was identified. Analysis of cosupervision indicated there were 1.6 supervisors per thesis. The theses were defended in 67 departments (both general and specialist departments) because no emergency medicine departments had been established. The most productive universities were 2 large ones (Universitat de Barcelona and Universidad Complutense de Madrid) and 3 medium-sized ones (Universidad de Granada, Universitat Autónoma de Barcelona, and Universidad de La Laguna). Productivity over time, analyzed as the trend over 2-year periods in the time series, was expressed as a polynomial function with a coefficient of determination of R2 = 0.80. Spanish doctoral research in emergency medicine has grown markedly. Work has been done in various university departments in different disciplines and specialties. The findings confirm that emergency medicine is a disciplinary field.
Fadil, Mouhcine; Farah, Abdellah; Ihssane, Bouchaib; Haloui, Taoufik; Lebrazi, Sara; Zghari, Badreddine; Rachiq, Saâd
2016-01-01
To investigate the effect of environmental factors such as light and shade on essential oil yield and morphological traits of Moroccan Myrtus communis, a chemometric study was conducted on 20 individuals growing under two contrasting light environments. The study of individuals' parameters by principal component analysis has shown that essential oil yield, altitude, and leaves thickness were positively correlated between them and negatively correlated with plants height, leaves length and leaves width. Principal component analysis and hierarchical cluster analysis have also shown that the individuals of each sampling site were grouped separately. The one-way ANOVA test has confirmed the effect of light and shade on essential oil yield and morphological parameters by showing a statistically significant difference between them from the shaded side to the sunny one. Finally, the multiple linear model containing main, interaction and quadratic terms was chosen for the modeling of essential oil yield in terms of morphological parameters. Sun plants have a small height, small leaves length and width, but they are thicker and richer in essential oil than shade plants which have shown almost the opposite. The highlighted multiple linear model can be used to predict essential oil yield in the studied area.
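The one-way ANOVA used throughout these studies reduces to a ratio of between-group to within-group mean squares. A stdlib sketch with hypothetical essential-oil yields for sun and shade plants (the values are invented):

```python
from statistics import mean

def one_way_anova_f(*groups):
    # F = (between-group mean square) / (within-group mean square)
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

sun_yield = [1.2, 1.4, 1.3]     # hypothetical oil yields (%)
shade_yield = [0.8, 0.9, 0.7]
print(round(one_way_anova_f(sun_yield, shade_yield), 1))  # 37.5
```

A large F indicates that the group means differ by more than within-group scatter would explain; the p value then comes from the F distribution with (k-1, n-k) degrees of freedom.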
Briand, Valérie; Escolano, Sylvie; Journot, Valérie; Massougbodji, Achille; Cot, Michel; Tubert-Bitter, Pascale
2015-01-01
Since there is no ideal candidate to replace sulfadoxine–pyrimethamine (SP) for intermittent preventive treatment (IPTp), alternatives need to be evaluated on the basis of their benefit–risk ratio. We reanalyzed the first Beninese trial on mefloquine (MQ) versus SP for IPTp using a multiple outcome approach, which allowed the joint assessment of efficacy and tolerability. Overall superiority of MQ to SP was defined as superiority on at least one efficacy outcome (low birth weight [LBW], placental malaria, or maternal anemia), non-inferiority on all of them as well as on tolerability defined as cutaneous or neuropsychiatric adverse events (AEs) or low compliance with the treatment. The analysis included 1,601 women. MQ was found to be overall superior to SP (P = 0.004). Performing several sensitivity analyses to handle both missing data and stillbirths provided similar results. Using MQ for IPTp as an example, we show that a multiple outcome analysis is a pragmatic way to assess the benefits/disadvantages of one drug compared with another. In the current context of a lack of antimalarials that could be used for IPTp, such a statistical approach could be widely used by institutional policy makers for future recommendations regarding the prevention of malaria in pregnancy (MiP). PMID:26055735
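The trial's overall-superiority rule can be written down directly: the new drug wins if it is superior on at least one efficacy outcome and non-inferior on all outcomes, tolerability included. A schematic encoding (the per-outcome booleans would come from the actual superiority and non-inferiority tests; the values here are invented):

```python
def overall_superior(superior_on, non_inferior_on):
    # superior_on / non_inferior_on: outcome-name -> bool maps from the
    # individual superiority and non-inferiority tests
    return any(superior_on.values()) and all(non_inferior_on.values())

superior = {"low_birth_weight": True, "placental_malaria": False,
            "maternal_anemia": False}
non_inferior = {"low_birth_weight": True, "placental_malaria": True,
                "maternal_anemia": True, "tolerability": True}
print(overall_superior(superior, non_inferior))  # True
```

The multiplicity adjustment for testing several outcomes jointly is the substantive statistical work the paper performs; the rule itself is this simple conjunction.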
Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.
Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J
2017-10-20
This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to non-invasively differentiate benign from malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in clinic.
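Texture features of the kind applied to the QUS parametric maps are typically derived from a gray-level co-occurrence matrix (GLCM). A minimal sketch of one such feature (homogeneity) on tiny quantized images; the images and the 4-level quantization are invented for illustration:

```python
def glcm(image, dx=1, dy=0, levels=4):
    # Co-occurrence counts of gray levels for one pixel offset
    counts = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[image[y][x]][image[y2][x2]] += 1
    return counts

def homogeneity(counts):
    # Weights each co-occurrence by the closeness of the two gray levels
    total = sum(sum(row) for row in counts)
    return sum(c / (1 + abs(i - j))
               for i, row in enumerate(counts)
               for j, c in enumerate(row)) / total

flat = [[1, 1], [1, 1]]        # uniform texture
stripes = [[0, 1], [0, 1]]     # alternating texture
print(homogeneity(glcm(flat)), homogeneity(glcm(stripes)))  # 1.0 0.5
```

Contrast, correlation, and energy are analogous weighted sums over the same matrix, which is why a single GLCM yields a whole family of biomarkers.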
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term, however to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.
Yung, Emmanuel; Wong, Michael; Williams, Haddie; Mache, Kyle
2014-08-01
Randomized clinical trial. Objectives: To compare the blood pressure (BP) and heart rate (HR) response of healthy volunteers to posteriorly directed (anterior-to-posterior [AP]) pressure applied to the cervical spine versus placebo. Manual therapists employ cervical spine AP mobilizations for various cervical-shoulder pain conditions. However, there is a paucity of literature describing the procedure, cardiovascular response, and safety profile. Thirty-nine (25 female) healthy participants (mean ± SD age, 24.7 ± 1.9 years) were randomly assigned to 1 of 2 groups. Group 1 received a placebo, consisting of light touch applied to the right C6 costal process. Group 2 received AP pressure at the same location. Blood pressure and HR were measured prior to, during, and after the application of AP pressure. One-way analysis of variance and paired-difference statistics were used for data analysis. There was no statistically significant difference between groups for mean systolic BP, mean diastolic BP, and mean HR (P >.05) for all time points. Within-group comparisons indicated statistically significant differences between baseline and post-AP pressure HR (-2.8 bpm; 95% confidence interval: -4.6, -1.1) and between baseline and post-AP pressure systolic BP (-2.4 mmHg; 95% confidence interval: -3.7, -1.0) in the AP group, and between baseline and postplacebo systolic BP (-2.6 mmHg; 95% confidence interval: -4.2, -1.0) in the placebo group. No participants reported any adverse reactions or side effects within 24 hours of testing. AP pressure caused a statistically significant physiologic response that resulted in a minor drop in HR (without causing asystole or vasodepression) after the procedure, whereas this cardiovascular change did not occur for those in the placebo group. Within both groups, there was a small but statistically significant reduction in systolic BP following the procedure.
Parallel processing of genomics data
NASA Astrophysics Data System (ADS)
Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario
2016-10-01
The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, have made possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per single experiment, thus the analysis of this enormous flow of data poses several challenges in term of data storage, preprocessing, and analysis. To face those issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the parallel preprocessing and statistical analysis of genomics data, able to face high dimension of data and resulting in good response time. The proposed system is able to find statistically significant biological markers able to discriminate classes of patients that respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
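The map-style parallelism described here can be sketched with Python's executor pools; the per-sample normalization below is an invented stand-in for the paper's actual preprocessing steps, and the thread pool stands in for whatever parallel backend a real pipeline would use.

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

def preprocess(sample):
    # Stand-in preprocessing step: center each sample at zero mean
    m = mean(sample)
    return [x - m for x in sample]

samples = [[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]]  # toy per-patient measurements
with ThreadPoolExecutor(max_workers=2) as pool:
    cleaned = list(pool.map(preprocess, samples))
print(cleaned[0])  # [-1.0, 0.0, 1.0]
```

Because samples are independent, the preprocessing is embarrassingly parallel, which is why such pipelines scale well in the speed-up experiments the paper reports.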
Statistics without Tears: Complex Statistics with Simple Arithmetic
ERIC Educational Resources Information Center
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
Pazira, Parvin; Rostami Haji-Abadi, Mahdi; Zolaktaf, Vahid; Sabahi, Mohammadfarzan; Pazira, Toomaj
2016-06-08
In relation to statistical analysis, studies to determine the validity, reliability, objectivity and precision of new measuring devices are usually incomplete, due in part to using only the correlation coefficient and ignoring the data dispersion. The aim of this study was to demonstrate the best way to determine the validity, reliability, objectivity and accuracy of an electro-inclinometer or other measuring devices. Another purpose of this study is to answer the question of whether reliability and objectivity represent accuracy of measuring devices. The validity of an electro-inclinometer was examined by mechanical and geometric methods. The objectivity and reliability of the device was assessed by calculating Cronbach's alpha for repeated measurements by three raters and by measurements on the same person by mechanical goniometer and the electro-inclinometer. Measurements were performed on "hip flexion with the extended knee" and "shoulder abduction with the extended elbow." The raters measured every angle three times within an interval of two hours. The three-way ANOVA was used to determine accuracy. The results of mechanical and geometric analysis showed that validity of the electro-inclinometer was 1.00 and level of error was less than one degree. Objectivity and reliability of electro-inclinometer was 0.999, while objectivity of mechanical goniometer was in the range of 0.802 to 0.966 and the reliability was 0.760 to 0.961. For hip flexion, the difference between raters in joints angle measurement by electro-inclinometer and mechanical goniometer was 1.74 and 16.33 degree (P<0.05), respectively. The differences for shoulder abduction measurement by electro-inclinometer and goniometer were 0.35 and 4.40 degree (P<0.05). Although both the objectivity and reliability are acceptable, the results showed that measurement error was very high in the mechanical goniometer. Therefore, it can be concluded that objectivity and reliability alone cannot determine the accuracy of a device and it is preferable to use other statistical methods to compare and evaluate the accuracy of these two devices.
1987-11-01
differential qualitative (DQ) analysis, which solves the task, providing explanations suitable for use by design systems, automated diagnosis, intelligent tutoring systems, and explanation based...comparative analysis as an important component; the explanation is used in many different ways. One method of automated design is the principled...
Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences
Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric
2016-01-01
Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566
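The spike-and-slab prior in the BSEM-SSP approach places a narrow "spike" near zero (cross-loading effectively absent) and a diffuse "slab" (cross-loading present) on each candidate loading. A density sketch with illustrative mixing weight and scales (the specific values are assumptions, not those of the paper):

```python
import math

def normal_pdf(x, sd):
    # Density of a zero-mean normal with standard deviation sd
    return math.exp(-x * x / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def spike_and_slab_pdf(x, w=0.5, spike_sd=0.01, slab_sd=1.0):
    # Mixture: w * narrow spike at zero + (1 - w) * diffuse slab
    return w * normal_pdf(x, spike_sd) + (1 - w) * normal_pdf(x, slab_sd)

# Prior mass concentrates sharply at zero, yet large loadings stay plausible
print(spike_and_slab_pdf(0.0) > 100 * spike_and_slab_pdf(1.0))  # True
```

The contrast with the ridge-style prior (BSEM-RP) is that the spike component lets posterior inference effectively switch a cross-loading off, which is what turns loading-pattern determination into a variable selection problem.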
Moutia, Mouna; Seghrouchni, Fouad; Abouelazz, Omar; Elouaddari, Anass; Al Jahid, Abdellah; Elhou, Abdelhalim; Nadifi, Sellama; Jamal Eddine, Jamal; Habti, Norddine; Badou, Abdallah
2016-09-29
Allium sativum L. (A.S.) "garlic", one of the most interesting medicinal plants, has been suggested to contain compounds that could be beneficial in numerous pathological situations including cancer. In this work, we aimed to assess the immunomodulatory effect of A.S. preparation on human peripheral blood mononuclear cells from healthy individuals. Nontoxic doses of A.S. were identified using MTT assay. Effects on CD4+ or CD8+ T lymphocyte proliferation were studied using flow cytometry. The effect of A.S. on cytokine gene expression was studied using qRT-PCR. Finally, qualitative analysis of A.S. was performed by HPLC approach. Data were analyzed statistically by one-way ANOVA test. The nontoxic doses of A.S. preparation did not affect neither spontaneous nor TCR-mediated CD4+ or CD8+ T lymphocyte proliferation. Interestingly, A.S. exhibited a statistically significant regulation of IL-17 gene expression, a cytokine involved in several inflammatory and autoimmune diseases. In contrast, the expression of IL-4, an anti-inflammatory cytokine, was unaffected. Qualitative analysis of A.S. ethanol preparation indicated the presence of three polyphenol bioactive compounds, which are catechin, vanillic acid and ferulic acid. The specific inhibition of the pro-inflammatory cytokine, IL-17 without affecting cell proliferation in human PBMCs by the Allium sativum L. preparation suggests a potential valuable effect of the compounds present in this plant for the treatment of inflammatory diseases and cancer, where IL-17 is highly expressed. The individual contribution of these three compounds to this global effect will be assessed.
Mazloum, Vahid; Rahnama, Nader; Khayambashi, Khalil
2014-01-01
Background: Pain and limited range of motion (ROM) are the crucial subsequent results of joint hemorrhages in individuals with bleeding disorders and hemophilia. Exercise interventions are particularly recommended in treatment of such patients. The purpose of this study was to detect the influences of conventional exercise therapy and hydrotherapy on the knee joint complications in patients with hemophilia. Methods: A total of 40 patients with hemophilia A were randomized into one of three groups: Therapeutic exercise (N = 13), hydrotherapy (N = 14) or control (N = 13). While the first two groups followed their specific programs for 4 weeks, routine life-style was maintained by subjects in the control group in this period. To evaluate the pain level and knee ROM the visual analog scale and standard goniometer were utilized, respectively. The outcome was measured at baseline and after completing the prescribed protocols. Data analysis was performed using one-way analysis of variance and Scheffe statistical tests (P < 0.05). Results: Both experimental groups showed significantly greater improvements in pain level (P < 0.001) and knee flexion and extension ROM (P < 0.001) than the control group. Although the pain was significantly (P < 0.01) more alleviated with hydrotherapy than with exercise therapy, the difference in ROM improvement was not statistically significant (P > 0.05). Conclusions: Using hydrotherapy in addition to usual rehabilitation training can result in beneficial effects in terms of pain and knee joint ROM. However, it appears that hydrotherapy is more effective in reducing pain. PMID:24554996
Statistical representation of a spray as a point process
NASA Astrophysics Data System (ADS)
Subramaniam, S.
2000-10-01
The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed.
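As an illustrative sketch of the finite-point-process view (not the paper's derivation), one can simulate an ensemble of spray realizations in which the droplet count is itself random, and recover a single-point statistic such as the mean number density from the ensemble. All modeling choices below (Poisson counts, uniform positions, the parameter values) are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of spray realizations in a unit box: the droplet count N is
# itself a random variable (Poisson here), and droplet positions are
# uniform. These are illustrative modeling assumptions.
mean_count = 50.0          # expected droplets per realization
n_real = 400               # ensemble size
volume = 1.0               # unit box

counts = rng.poisson(mean_count, size=n_real)
# droplet positions for one realization (not needed for the density estimate)
positions = rng.uniform(0.0, 1.0, size=(counts[0], 3))

# Single-point statistic: mean number density estimated over the ensemble,
# not from any single realization.
density_est = counts.mean() / volume
print(f"estimated number density: {density_est:.2f} (true {mean_count / volume})")
```

The point of the sketch is that the number density is an ensemble property; initializing a single DNS realization from it implicitly assumes a particular ensemble, which is the caveat the abstract raises.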
Samadzadeh, Gholam Reza; Rigi, Tahereh; Ganjali, Ali Reza
2013-01-01
Background Retrieving valuable and up-to-date information from the internet has become vital for researchers and scholars, because every day thousands, perhaps millions, of scientific works are published as digital resources on the internet, and researchers cannot ignore this great resource when looking for related documents for their literature search, which may not be found in any library. Given the variety of documents presented on the internet, search engines are among the most effective tools for finding information. Objectives The aim of this study was to evaluate three criteria, recall, preciseness, and importance, for four search engines, PubMed, Science Direct, Google Scholar, and the federated search of the Iranian National Medical Digital Library, in the field of addiction (prevention and treatment), in order to select the most effective search engine for literature research. Materials and Methods This was a cross-sectional study in which four popular search engines in the medical sciences were evaluated. Medical Subject Headings (MeSH) was used to select keywords. We entered the given keywords into the search engines and evaluated the first 10 entries of each search. Direct observation was used as the means of data collection, and the data were analyzed by descriptive statistics (number, percentage, and mean) and inferential statistics, one-way analysis of variance (ANOVA) and post hoc Tukey tests, in SPSS 15 statistical software. P < 0.05 was considered statistically significant. Results The search engines performed differently with regard to the evaluated criteria. The P value was 0.004 for preciseness and 0.002 for importance, indicating significant differences among the search engines. PubMed, Science Direct, and Google Scholar were the best in recall, preciseness, and importance, respectively.
Conclusions As literature research is one of the most important stages of research, researchers, especially Substance-Related Disorders scholars, should use the search engines with the best recall, preciseness, and importance in their subject field to reach desirable results, rather than depending on just one search engine. PMID:24971257
Enhancement of MS Signal Processing For Improved Cancer Biomarker Discovery
NASA Astrophysics Data System (ADS)
Si, Qian
Technological advances in proteomics have shown great potential for detecting cancer at the earliest stages. One approach uses time-of-flight mass spectrometry to identify biomarkers, early disease indicators related to the cancer. Pattern analysis of time-of-flight mass spectra from blood and tissue samples offers great hope for identifying potential biomarkers among the complex mixture of biological and chemical samples for early cancer detection. One of the key issues is the preprocessing of raw mass spectra. Several challenges must be addressed: unknown noise characteristics associated with large data volumes, high variability in mass spectrometry measurements, a poorly understood signal background, and so on. This dissertation focuses on developing statistical algorithms and creating data-mining tools for computationally improved signal processing of mass spectrometry data. I introduce a more accurate estimate of the noise model and a semi-supervised method of mass spectrum data processing that requires little prior knowledge about the data.
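One common preprocessing step of the kind described above is baseline subtraction. The sketch below uses a moving-median baseline estimate on a synthetic spectrum; it is a generic illustration of the idea, not the dissertation's algorithm, and the window size and synthetic peak shapes are assumptions:

```python
import numpy as np

def subtract_baseline(spectrum, window=51):
    """Estimate a slowly varying baseline with a moving median and
    subtract it; a common first step in mass-spectrum preprocessing."""
    pad = window // 2
    padded = np.pad(spectrum, pad, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(spectrum))])
    return spectrum - baseline, baseline

# Synthetic spectrum: a drifting baseline plus two narrow Gaussian peaks.
x = np.linspace(0.0, 1.0, 500)
peaks = (40 * np.exp(-((x - 0.3) / 0.01) ** 2)
         + 25 * np.exp(-((x - 0.7) / 0.01) ** 2))
drift = 10 + 8 * x
spectrum = peaks + drift
corrected, baseline = subtract_baseline(spectrum)

# Off-peak, the corrected signal should sit near zero while the peaks
# keep their height.
print("max peak after correction:", round(float(corrected.max()), 1))
```

The moving median tracks the drift but barely responds to narrow peaks, which is why it works as a crude baseline model.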
Women's health and women's work in health services: what statistics tell us.
Hedman, B; Herner, E
1988-01-01
This article draws together statistical information in several broad areas that relate to women's health, women's reproductive activities and women's occupations in Sweden. The statistical analysis reflects the major changes that have occurred in Swedish society and that have had a major impact on the health and well-being, as well as on the social participation rate, of women. Much of the data is drawn from a recent special effort at Statistics Sweden aimed at influencing the classification, collection and presentation of statistical data in all fields in such a way that family, working, education, health and other conditions of women can be more readily and equitably compared with those of men. In addition, social changes have seen the shifting of the responsibility for health care from the unpaid duties of women in the home to health care institutions, where female employees predominate. These trends are also discussed.
Health-related Quality of Life and Mental Health in the Process of Active and Passive Ageing.
Dajak, Lidija; Mastilica, Miroslav; Orešković, Stjepan; Vuletić, Gorka
2016-12-01
To analyse the differences in self-estimated quality of life depending on ageing type (passive vs. active), health-related quality of life was measured with the SF-36 survey, which provides multi-dimensional criteria of health and quality of life. The SF-36 represents a theoretically based and empirically validated operationalization of two overall health concepts, physical and mental health, and their two general manifestations, functioning and well-being. A total of 200 participants aged 55 to 92 were included in the study: 148 women and 52 men. Depending on ageing type, participants were divided into two categories, passive ageing (n=100) and active ageing (n=100), and a detailed analysis was performed for these groups. Statistical analysis included descriptive statistics, the chi-square test, Spearman's correlation coefficient, and the Mann-Whitney U test. On all dimensions of health, participants in the active-ageing category achieved higher scores, indicating better health and better functioning. Statistically significant differences between the groups were found on the following dimensions: general health, pain, energy and vitality, social functioning, and limitations due to emotional difficulties. The chi-square test confirmed differences between the groups; the largest difference was in the response categories related to health deterioration (χ2=10.391; df=4; p=0.034). Participants in the active-ageing group reported significantly less health deterioration compared with the previous year (26% of the active group stated that their health was somewhat worse and only 2% that it was significantly worse, compared with 36% and 9%, respectively, in the passive group). The difference in means for mental health by ageing type was also tested and was not statistically significant (p>0.05). On all dimensions, participants in the active-ageing category achieved higher scores, indicating better health and better functioning.
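The two-group comparisons above rely on the Mann-Whitney U test. A minimal sketch with hypothetical SF-36-style scores (illustrative numbers, not the study's data) looks like:

```python
from scipy import stats

# Hypothetical SF-36-style scores (0-100) for the two ageing groups;
# illustrative values only, not the study's data.
active  = [78, 82, 75, 88, 80, 77, 85, 79, 83, 81]
passive = [62, 58, 65, 60, 55, 63, 59, 61, 57, 64]

# Mann-Whitney U: nonparametric comparison of two independent samples,
# appropriate when normality cannot be assumed.
u_stat, p_value = stats.mannwhitneyu(active, passive, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4g}")
```

Because every active-group score exceeds every passive-group score in this toy data, U equals its maximum value (n1 x n2 = 100) and the p-value is very small.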
Travelogue--a newcomer encounters statistics and the computer.
Bruce, Peter
2011-11-01
Computer-intensive methods have revolutionized statistics, giving rise to new areas of analysis and expertise in predictive analytics, image processing, pattern recognition, machine learning, genomic analysis, and more. Interest naturally centers on the new capabilities the computer allows the analyst to bring to the table. This article, instead, focuses on an account of how computer-based resampling methods, with their relative simplicity and transparency, enticed one individual, untutored in statistics or mathematics, on a long journey into learning statistics, then teaching it, then starting an educational institution.
Woldegebriel, Michael; Vivó-Truyols, Gabriel
2016-10-04
A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all the uncertainties involved, from instrumentation up to data analysis, in a single model yielding the probability that the compound of interest is present or absent in the sample. This approach differs from classical methods in two ways. First, it is probabilistic (instead of deterministic); hence, it computes the probability that the compound is (or is not) present in a sample. Second, it answers the hypothesis "the compound is present", as opposed to answering the question "the compound feature is present". This second difference implies a shift in the way data analysis is tackled, since the probability of interfering compounds (i.e., isomers and isobaric compounds) is also taken into account.
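The probabilistic flavor of this approach can be illustrated with a toy Bayes-rule calculation. The Gaussian intensity models and the uniform prior below are illustrative assumptions, not the authors' full model, which integrates many more uncertainty sources:

```python
import math

def posterior_present(observed_intensity, prior_present=0.5):
    """Toy posterior probability that a compound is present, given one
    observed peak intensity. Gaussian likelihoods are assumed purely for
    illustration: 'absent' = baseline noise, 'present' = real signal."""
    def gauss(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    like_present = gauss(observed_intensity, mu=100.0, sigma=20.0)  # signal model
    like_absent  = gauss(observed_intensity, mu=10.0,  sigma=5.0)   # noise model
    num = like_present * prior_present
    den = num + like_absent * (1.0 - prior_present)
    return num / den

print(posterior_present(90.0))   # intensity near the signal model -> close to 1
print(posterior_present(12.0))   # intensity near the noise model  -> close to 0
```

The output is a probability of presence rather than a yes/no detection, which is the shift the abstract describes.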
NASA Astrophysics Data System (ADS)
Korenchenko, Anna E.; Vorontsov, Alexander G.; Gelchinski, Boris R.; Sannikov, Grigorii P.
2018-04-01
We discuss the problem of dimer formation during the homogeneous nucleation of atomic metal vapor in an inert gas environment. We simulated nucleation with molecular dynamics and carried out a statistical analysis of double- and triple-atomic collisions as the two routes to long-lived diatomic complex formation. A close pair of atoms whose lifetime exceeds the mean time interval between atom-atom collisions is called a long-lived diatomic complex. We found that double- and triple-atomic collisions gave approximately the same probabilities of long-lived diatomic complex formation, but the internal energy of the resulting state was essentially lower in the second case. Some diatomic complexes formed in three-particle collisions are stable enough to be a critical nucleus.
Mayrink, Gabriela; Sawazaki, Renato; Asprino, Luciana; de Moraes, Márcio; Fernandes Moreira, Roger William
2011-11-01
To compare the traditional method of mounting dental casts on a semiadjustable articulator with the new method suggested by Wolford and Galiano, analyzing the inclination of the maxillary occlusal plane in relation to the FHP. Two casts were obtained from each of 10 patients. One was used for mounting on a traditional articulator using a face-bow transfer system, and the other for mounting on an Occlusal Plane Indicator (OPI) platform using the SAM articulator. An analysis of the accuracy of the mounted models was then performed: the angle made by the occlusal plane and the FHP on the cephalogram should equal the angle between the occlusal plane and the upper member of the articulator. The measures were tabulated in Microsoft Excel(®) and analyzed using one-way analysis of variance. Statistically, the results did not reveal significant differences among the measures. The OPI and face-bow methods present similar results, but more studies are needed to verify the accuracy of the maxillary cant in the OPI or to develop new techniques able to overcome the disadvantages of each technique. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Khode, Rajiv Tarachand; Shenoi, Pratima Ramakrishna; Kubde, Rajesh R.; Makade, Chetana S.; Wadekar, Kanchan D.; Khode, Priyanka Tarachand
2017-01-01
Aims: This study evaluated the effect of infection control barriers on the light intensity (LI) of a light-curing unit (LCU) and the microhardness of composite. Materials and Methods: Four different disposable barriers (n = 30) were tested against the control. LI for each barrier was measured with a lux meter. One hundred and fifty Teflon molds were equally divided into five groups of thirty each. Composite was filled in bulk in these molds and cured without and with a barrier. Microhardness was evaluated on the top and bottom surfaces of each composite specimen with a microhardness testing machine, and the hardness ratio (HR) was derived. Statistical Analysis Used: One-way analysis of variance, Tukey's honestly significant difference test, and paired t-test using SPSS version 18 software. Results: All barriers significantly reduced the baseline LI of the LCU (P < 0.0001), but only Cure Elastic Steri-Shield and latex cut glove pieces (LCGP) significantly reduced the microhardness of the composite (P < 0.05). However, the HR indicated inadequate curing only with LCGP. Conclusions: Although all tested barriers significantly reduced the LI, none except LCGP markedly affected the degree of cure of the composite. PMID:29279622
The leap from ROI to SROI: Farther than expected?
Gargani, John
2017-10-01
Social return on investment (SROI) is a popular method for evaluating the impact that organizations have on society and the environment. It has its roots in finance, where return on investment (ROI) is used to evaluate investments. Over the past ten years, SROI has made the leap from a tool for building private wealth to one that advances the public good. Has it landed us in a better place? To answer the question, I describe the general approach to financial analysis, how it is applied to financial decisions, and how it has been adapted to evaluate impact. I then consider the strengths and weaknesses of SROI, and suggest how, by pushing beyond the constraints of financial analysis, it can give stakeholders voice and provide evidence of success from diverse perspectives. Along the way, I propose a conceptual model for value, a foundational concept in SROI that has been criticized by some as underdeveloped, and I include a technical appendix that identifies potential sources of statistical bias in SROI estimates. I conclude by acknowledging our growing need to incorporate efficiency as one of multiple success criteria and the role that SROI, properly implemented, can play. Copyright © 2017 Elsevier Ltd. All rights reserved.
Colour stability of aesthetic brackets: ceramic and plastic.
Filho, Hibernon Lopes; Maia, Lúcio Henrique; Araújo, Marcus V; Eliast, Carlos Nelson; Ruellas, Antônio Carlos O
2013-05-01
The colour stability of aesthetic brackets may differ according to their composition, morphology and surface properties, which may consequently influence their aesthetic performance. To assess the colour stability of aesthetic brackets (ceramic and plastic) after simulated aging and staining, twelve commercially manufactured ceramic brackets and four different plastic brackets were assessed. To determine possible colour change (ΔE*ab) and the value on the NBS (National Bureau of Standards) unit system, spectrophotometric colour measurements for CIE L*, a* and b* were taken before and after the brackets were aged and stained. Statistical analysis was undertaken using one-way analysis of variance (ANOVA) and a Tukey multiple comparison test (alpha = 0.05). The colour change between the various (ceramic and plastic) materials was not significant (p > 0.05), but varied significantly (p < 0.001) between brackets of the same composition or crystalline structure and among commercial brands. Colour stability cannot be confirmed simply by knowing the type of material and its crystalline composition or structure.
Nikam, Lalita H; Gadkari, Jayshree V
2012-01-01
The effect of age, gender, and body mass index (BMI) on visual reaction time (VRT) and auditory reaction time (ART) was studied in 30 males and 30 females aged 18-20 years, along with 30 males and 30 females aged 65-75 years. Statistical analysis of the data by one-way ANOVA with post hoc Tukey HSD tests showed that BMI, VRT, and ART were significantly higher in old than in young individuals. Females had higher BMI and longer reaction times than males. There was a significant positive correlation between BMI and reaction times (VRT and ART) in both males and females by Pearson correlation analysis. Older individuals should be more careful and vigilant about injuries and falls due to increased reaction time. Longer reaction times and higher BMI in females could be attributed to fluid and salt retention due to female sex hormones affecting sensorimotor coordination.
Aluckal, Eby; Ismail, Asif; Paulose, Anoopa; Lakshmanan, Sanju; Balakrishnan, M S; Mathew, Benoy; M, Vikneshan; Kunnilathu, Abraham
2017-01-01
Objectives: The objectives of this study were to evaluate the antimicrobial activity and total antioxidant capacity (TAC) of licorice in the saliva of HIV/AIDS patients. Materials and Methods: Saliva specimens were collected from 20 people living with HIV/AIDS with CD4 counts <500 cells/mm3 in Mangalore city, India. A combination of amoxicillin-clavulanic acid and nystatin was taken as the positive control and normal saline as the negative control. The TAC was evaluated spectrophotometrically at 695 nm using the phosphomolybdenum method. Results were compared using one-way analysis of variance followed by Tukey's post hoc analysis in SPSS 19. Results: Glycyrrhiza glabra showed a statistically significant reduction (P < 0.05) in total Candida count. The TAC of G. glabra was found to be 4.467 mM/L. Conclusions: G. glabra extracts showed good anticandidal activity and also a high antioxidant capacity, which reduces the oxidative stress of HIV-infected people. PMID:29284971
Lee, Seung Hee; Jang, Hyung Suk; Yang, Young Hee
2016-10-01
This study was done to investigate factors influencing successful aging in middle-aged women. A convenience sample of 103 middle-aged women was selected from the community. Data were collected using a structured questionnaire and analyzed using descriptive statistics, two-sample t-test, one-way ANOVA, Kruskal Wallis test, Pearson correlations, Spearman correlations and multiple regression analysis with the SPSS/WIN 22.0 program. Results of regression analysis showed that significant factors influencing successful aging were post-traumatic growth and social support. This regression model explained 48% of the variance in successful aging. Findings show that the concept 'post-traumatic growth' is an important factor influencing successful aging in middle-aged women. In addition, social support from friends/co-workers had greater influence on successful aging than social support from family. Thus, we need to consider the positive impact of post-traumatic growth and increase the chances of social participation in a successful aging program for middle-aged women.
Comparative evaluation of tensile strength of Gutta-percha cones with a herbal disinfectant.
Mahali, Raghunandhan Raju; Dola, Binoy; Tanikonda, Rambabu; Peddireddi, Suresh
2015-01-01
To evaluate and compare the tensile strength values, and the influence of taper on tensile strength, of gutta-percha (GP) cones after disinfection with sodium hypochlorite (SH) and Aloe vera gel (AV). Sixty GP cones of size 110, 2% taper, 60 F3 ProTaper GP cones, and 60 GP cones of size 30, 6% taper were obtained from sealed packs as three different groups. Experimental groups were disinfected with 5.25% SH and 90% AV gel; the control group was not. Tensile strengths of the GP cones were measured using a universal testing machine. The mean tensile strength values for Groups IA, IIA, and IIIA were 11.8 MPa, 8.69 MPa, and 9.24 MPa, respectively. Results were subjected to statistical analysis using one-way analysis of variance and Tukey's post hoc test. The 5.25% SH solution decreased the tensile strength of GP cones, whereas 90% AV gel did not significantly alter it. Ninety percent Aloe vera gel as a disinfectant does not alter the tensile strength of GP cones.
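The ANOVA-plus-Tukey analysis used here can be sketched with SciPy's built-in Tukey HSD. The tensile-strength numbers below are hypothetical stand-ins for the three disinfection conditions, not the study's data:

```python
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

# Hypothetical tensile-strength values (MPa) for three disinfection
# conditions; illustrative numbers, not the study's measurements.
control = np.array([11.9, 11.7, 12.0, 11.8, 11.6])
naocl   = np.array([8.6, 8.8, 8.5, 8.7, 8.9])       # 5.25% sodium hypochlorite
aloe    = np.array([11.5, 11.8, 11.6, 11.7, 11.4])  # 90% Aloe vera gel

# Omnibus test, then pairwise Tukey honestly-significant-difference test.
f_stat, p_value = f_oneway(control, naocl, aloe)
res = tukey_hsd(control, naocl, aloe)

print(f"ANOVA: F = {f_stat:.1f}, p = {p_value:.3g}")
print(res)   # pairwise confidence intervals and p-values
```

In this toy data, the hypochlorite group differs sharply from the control while the Aloe vera group does not, mirroring the qualitative finding in the abstract.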
Global alliances effect in coalition forming
NASA Astrophysics Data System (ADS)
Vinogradova, Galina; Galam, Serge
2014-11-01
Coalition forming is investigated among countries, which are coupled with short-range interactions, under the influence of externally set opposing global alliances. The model extends a recent Natural Model of coalition forming inspired by Statistical Physics, where instabilities are a consequence of decentralized maximization of the individual benefits of actors. In contrast to physics, where spins can only evaluate the immediate cost/benefit of a flip of orientation, countries have a long horizon of rationality, associated with the ability to envision a way up to a better configuration even at the cost of passing through intermediate losing states. The stabilizing effect is produced through polarization by the global alliances of either a particular unique global interest factor or multiple simultaneous ones. This model provides a versatile theoretical tool for the analysis of real cases and the design of novel strategies. Such analysis is provided for several real cases, including the Eurozone. The results shed new light on the understanding of the complex phenomenon of planned stabilization in coalition forming.
Analyzing Hidden Semantics in Social Bookmarking of Open Educational Resources
NASA Astrophysics Data System (ADS)
Minguillón, Julià
Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags to describe them. This is especially interesting in the field of open educational resources, as Delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovery and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool such as principal component analysis to discover which tags create clusters that can be semantically interpreted. We compare the obtained results with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for open educational resources.
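A minimal sketch of the proposed analysis, applying PCA (via SVD) to a hypothetical resource-by-tag matrix; the tags and assignments are invented for illustration and are not the paper's dataset:

```python
import numpy as np

# Hypothetical bookmark matrix: rows = resources, columns = tags,
# entries = 1 if users applied the tag to the resource (invented data).
tags = ["opencourseware", "oer", "video", "lecture", "physics", "chemistry"]
X = np.array([
    [1, 1, 0, 0, 1, 0],
    [1, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 0, 1],
    [0, 0, 1, 1, 0, 1],
    [1, 1, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 1],
], dtype=float)

# PCA via SVD of the column-centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)

print("explained variance ratio:", np.round(explained, 3))
# Tag loadings on the first principal component: tags with large
# same-sign loadings tend to co-occur and may form a semantic cluster.
for tag, loading in zip(tags, Vt[0]):
    print(f"{tag:15s} {loading:+.2f}")
```

Inspecting the sign structure of the leading components is one way to surface candidate tag clusters for manual semantic interpretation.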
Performance Analysis of an Inter-Relay Co-operation in FSO Communication System
NASA Astrophysics Data System (ADS)
Khanna, Himanshu; Aggarwal, Mona; Ahuja, Swaran
2018-04-01
In this work, we analyze the outage and error performance of a one-way inter-relay assisted free-space optical link. We assume there is no direct link between the source and destination nodes, and we study the feasibility of such a system configuration. We consider the influence of path loss, atmospheric turbulence, and pointing error impairments, and investigate the effect of these parameters on system performance. The turbulence-induced fading is modeled by independent but not necessarily identically distributed gamma-gamma fading statistics. Closed-form expressions for the outage probability and the probability of error are derived and illustrated by numerical plots. It is concluded that the absence of a line-of-sight path between the source and destination nodes does not lead to significant performance degradation. Moreover, for the system model under consideration, interconnected relaying provides better error performance than non-interconnected relaying and dual-hop serial relaying techniques.
Butcher, Jason T.; Stewart, Paul M.; Simon, Thomas P.
2003-01-01
Ninety-four sites were used to analyze the effects of two different classification strategies on the Benthic Community Index (BCI). The first, a priori classification, reflected the wetland status of the streams; the second, a posteriori classification, used a bio-environmental analysis to select classification variables. Both classifications were examined by measuring classification strength and testing differences in metric values with respect to group membership. The a priori (wetland) classification strength (83.3%) was greater than the a posteriori (bio-environmental) classification strength (76.8%). Both classifications found one metric that had significant differences between groups. The original index was modified to reflect the wetland classification by re-calibrating the scoring criteria for percent Crustacea and Mollusca. A proposed refinement to the original Benthic Community Index is suggested. This study shows the importance of using hypothesis-driven classifications, as well as exploratory statistical analysis, to evaluate alternative ways to reveal environmental variability in biological assessment tools.
Research of second harmonic generation images based on texture analysis
NASA Astrophysics Data System (ADS)
Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan
2014-09-01
Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarify the principles of texture analysis, including statistical, transform, structural, and model-based methods, and give examples of its applications, reviewing studies of the technique. Moreover, we apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified as normal or abnormal. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for clinical diagnosis of scar types. Finally, future developments of texture analysis in SHG images are discussed.
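A minimal 8-neighbour LBP feature extractor (pure NumPy, omitting the wavelet stage) might look like the sketch below. It illustrates the general LBP technique, not the authors' implementation, and the toy striped "image" is an assumption:

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbour local binary pattern: each interior pixel gets a
    byte whose bits record whether each neighbour is >= the centre pixel."""
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code

def lbp_histogram(img):
    """256-bin normalized LBP histogram used as a texture feature vector."""
    codes = lbp8(img)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

# Toy 'image' with horizontal stripes: the histogram concentrates on a
# few codes, which is the texture signature a classifier would consume.
img = np.tile(np.array([[0], [255]]), (8, 16)).astype(np.int32)
feat = lbp_histogram(img)
print("number of distinct LBP codes:", np.count_nonzero(feat))
```

In a full pipeline, such histograms (possibly computed on wavelet subbands) would feed a classifier that separates normal from abnormal scar textures.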
Study of photon correlation techniques for processing of laser velocimeter signals
NASA Technical Reports Server (NTRS)
Mayo, W. T., Jr.
1977-01-01
The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross-correlation, autocorrelation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
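The autocorrelation estimate at the heart of photon correlation can be sketched on a synthetic Poisson photon-count record. The cosine-modulated rate below imitates a dual-scatter LV burst; the sample interval and modulation frequency are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic photon counts: Poisson samples whose rate is modulated at a
# 'Doppler' frequency, an illustrative stand-in for a dual-scatter signal.
n = 4096
dt = 1e-6                        # sample interval, s (assumed)
f_doppler = 5e4                  # modulation frequency, Hz (assumed)
t = np.arange(n) * dt
rate = 5.0 * (1.0 + 0.8 * np.cos(2 * np.pi * f_doppler * t))
counts = rng.poisson(rate)

# Biased autocorrelation estimate of the mean-removed counts.
x = counts - counts.mean()
acf = np.correlate(x, x, mode="full")[n - 1:] / n
acf /= acf[0]                    # normalize so acf[0] == 1

# The modulation period should reappear as a peak near lag 1/(f*dt) = 20.
lag_peak = 1 + np.argmax(acf[1:40])
print("first autocorrelation peak at lag:", lag_peak)
```

Recovering the modulation period from the autocorrelation of raw photon counts, despite shot noise, is the basic idea the processor design exploits.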
The Effect of Diode Laser With Different Parameters on Root Fracture During Irrigation Procedure.
Karataş, Ertuğrul; Arslan, Hakan; Topçuoğlu, Hüseyin Sinan; Yılmaz, Cenk Burak; Yeter, Kübra Yesildal; Ayrancı, Leyla Benan
2016-06-01
The aim of this study was to compare the effect of a single diode laser application and agitation of EDTA with a diode laser, using different parameters and time intervals, on root fracture. Ninety mandibular incisors were instrumented, except those in the negative control group. The specimens were randomly divided into 10 groups according to the final irrigation procedure: (G1) non-instrumented; (G2) distilled water; (G3) 15% EDTA; (G4) ultrasonically agitated EDTA; (G5) single 1.5 W/100 Hz diode laser; (G6) single 3 W/100 Hz diode laser; (G7) 1.5 W/100 Hz diode laser agitation of EDTA for 20 s; (G8) 1.5 W/100 Hz diode laser agitation of EDTA for 40 s; (G9) 3 W/100 Hz diode laser agitation of EDTA for 20 s; and (G10) 3 W/100 Hz diode laser agitation of EDTA for 40 s. The specimens were filled and mounted in acrylic resin, and a compression strength test was performed on each specimen. Statistical analysis was carried out using one-way ANOVA and Tukey's post hoc tests (α = 0.05). The analysis revealed statistically significant differences among the groups (P < 0.05). Laser-agitated irrigation with a 3 W/100 Hz diode laser for both 20 s and 40 s decreased the fracture resistance of teeth. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
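The one-way ANOVA used here (and in several other studies indexed below) reduces to a ratio of between-group to within-group variance; a minimal NumPy sketch, with made-up groups rather than the study's data, is:

```python
import numpy as np

def one_way_anova(*groups):
    """One-way fixed-effects ANOVA: F statistic plus its degrees of
    freedom (between-groups, within-groups)."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)
    n_total = sum(g.size for g in groups)
    grand_mean = np.concatenate(groups).mean()
    # Between-group sum of squares: group size times squared deviation
    # of the group mean from the grand mean.
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from group means.
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n_total - k
    f = (ss_between / df_b) / (ss_within / df_w)
    return f, df_b, df_w

# Three illustrative groups with equal spreads but shifted means.
f, df_b, df_w = one_way_anova([1, 2, 3], [2, 3, 4], [3, 4, 5])
```

A significant F is then followed by a multiple-comparison procedure such as Tukey's HSD to locate which group pairs differ.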
How Do You Determine Whether The Earth Is Warming Up?
NASA Astrophysics Data System (ADS)
Restrepo, J. M.; Comeau, D.; Flaschka, H.
2012-12-01
How does one determine whether the extreme summer temperatures in the northeastern US, or in Moscow during the summer of 2010, were an extreme weather fluctuation or the result of a systematic global climate warming trend? Only under exceptional circumstances can one determine whether an observed climate signal belongs to a particular statistical distribution. In fact, observed climate signals are rarely "statistical," and thus there is usually no way to rigorously obtain enough field data to produce a trend or tendency based upon data alone. Furthermore, this type of data is often multi-scale. We propose a trend or tendency methodology that makes no parametric or statistical assumption. The most important feature of this trend strategy is that it is defined in very precise mathematical terms. The tendency is easily understood and practical, and its algorithmic realization is fairly robust. In addition to providing a trend, the methodology can be adapted to generate surrogate statistical models, useful in reduced filtering schemes of time-dependent processes.
Sekar, Vadhana; Kumar, Ranjith; Nandini, Suresh; Ballal, Suma; Velmurugan, Natanasabapathy
2016-01-01
The purpose of this study was to assess the role of cross section on cyclic fatigue resistance of One Shape, Revo-S SU, and Mtwo rotary files in continuous rotation and reciprocating motion in dynamic testing model. A total of 90 new rotary One Shape, Revo-S SU, and Mtwo files (ISO size 25, taper 0.06, length 25 mm) were subjected to continuous rotation or reciprocating motion. A cyclic fatigue testing device was fabricated with 60° angle of curvature and 5 mm radius. The dynamic testing of these files was performed using an electric motor which permitted the reproduction of pecking motion. All instruments were rotated or reciprocated until fracture occurred. The time taken for each instrument to fracture was recorded. All the fractured files were analyzed under a scanning electron microscope (SEM) to detect the mode of fracture. Statistical analysis was performed using one-way ANOVA, followed by Tukey's honestly significant difference post hoc test. The time taken for instruments in reciprocating motion to fail under cyclic loading was significantly longer when compared with groups in continuous rotary motion. There was a statistically significant difference between Mtwo rotary and the other two groups in both continuous and reciprocating motion. One Shape rotary files recorded significantly longer duration to fracture resistance when compared with Revo-S SU files in both continuous and reciprocating motion. SEM observations showed that the instruments of all groups had undergone a ductile mode of fracture. Reciprocating motion improved the cyclic fatigue resistance of all tested groups.
Statistical significance test for transition matrices of atmospheric Markov chains
NASA Technical Reports Server (NTRS)
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
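A toy version of such a Monte Carlo significance test can be written directly: shuffle the regime sequence to destroy temporal structure while preserving regime frequencies, and count how often the shuffled transition probabilities reach the observed ones (a simplified sketch, not the paper's exact procedure):

```python
import numpy as np

def transition_matrix(seq, k):
    """Row-normalized transition counts of an integer regime sequence."""
    counts = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

def mc_transition_pvalues(seq, k, n_sim=2000, rng=None):
    """For each transition i -> j, the fraction of shuffled sequences
    whose transition probability reaches the observed one.  Small values
    flag significantly frequent ('most likely') transitions; values near
    1 flag significantly avoided ('most unlikely') ones."""
    rng = np.random.default_rng(rng)
    seq = np.asarray(seq)
    observed = transition_matrix(seq, k)
    exceed = np.zeros((k, k))
    for _ in range(n_sim):
        exceed += transition_matrix(rng.permutation(seq), k) >= observed
    return exceed / n_sim
```

For a strongly persistent two-regime sequence, the self-transitions come out significant while the shuffled null does not reproduce them.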
Integrating the statistical analysis of spatial data in ecology
A. M. Liebhold; J. Gurevitch
2002-01-01
In many areas of ecology there is an increasing emphasis on spatial relationships. Often ecologists are interested in new ways of analyzing data with the objective of quantifying spatial patterns, and in designing surveys and experiments in light of the recognition that there may be underlying spatial pattern in biotic responses. In doing so, ecologists have adopted a...
Toward Theological Inclusivism: The Effects of a World Religions Course in a Mormon University
ERIC Educational Resources Information Center
Properzi, Mauro
2017-01-01
Inclusivist, exclusivist, and pluralist attitudes toward other religions interact in complex ways within the Mormon faith. Hence, a course on the world's religions at LDS-sponsored Brigham Young University presents an interesting case study in this context. Through survey data and statistical analysis this article attempts to examine the effect of…
Vapor Pressure Data Analysis and Statistics
2016-12-01
sublimation for solids), volatility, and entropy of volatilization. Vapor pressure can be reported several different ways, including tables of experimental ...account the variation in heat of vaporization with temperature, and accurately describes data over broad experimental ranges, thereby enabling...pressure is incorrect at temperatures far below the experimental temperature limit; the calculated vapor pressure becomes undefined when the
Color Charts, Esthetics, and Subjective Randomness
ERIC Educational Resources Information Center
Sanderson, Yasmine B.
2012-01-01
Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…
Predicting Body Fat Using Data on the BMI
ERIC Educational Resources Information Center
Mills, Terence C.
2005-01-01
A data set contained in the "Journal of Statistical Education's" data archive provides a way of exploring regression analysis at a variety of teaching levels. An appropriate functional form for the relationship between percentage body fat and the BMI is shown to be the semi-logarithmic, with variation in the BMI accounting for a little over half…
Quimby, Jessica M; Dowers, Kristy; Herndon, Andrea K; Randall, Elissa K
2017-08-01
Objectives The objective was to describe the ultrasonographic characteristics of cats with stable chronic kidney disease (CKD) and to determine whether these differ significantly from those of cats with pyelonephritis (Pyelo) and ureteral obstruction (UO), to aid clinical assessment during a uremic crisis. Methods Sixty-six cats with stable CKD were prospectively enrolled, as well as normal control cats (n = 10), cats with a clinical diagnosis of Pyelo (n = 13) and cats with UO confirmed by surgical resolution (n = 11). Renal ultrasound was performed and routine still images and cine loops were obtained. Analysis included the degree of pelvic dilation and the presence and degree of ureteral dilation. Measurements were compared between groups using non-parametric one-way ANOVA with Dunn's post hoc analysis. Results In total, 66.6% of CKD cats had measurable renal pelvic dilation, compared with 30.0% of normal cats, 84.6% of Pyelo cats and 100% of UO cats. There was no statistically significant difference in renal pelvic widths between CKD cats and normal cats, or between CKD cats and Pyelo cats. In almost all measurement categories, UO cats had significantly greater renal pelvic widths than CKD cats and normal cats (P < 0.05), but not Pyelo cats. Six percent of stable CKD cats had measurable proximal ureteral dilation on one or both sides vs 46.2% of Pyelo cats and 81.8% of UO cats. There was no statistically significant difference in proximal ureteral width between normal and CKD cats, or between Pyelo and UO cats. There was a statistically significant difference in proximal ureteral width between CKD and Pyelo cats, CKD and UO cats, normal and UO cats, and normal and Pyelo cats. Conclusions and relevance No significant difference in renal pelvic widths between CKD cats and Pyelo cats was seen. These data suggest that CKD cats should have baseline ultrasonography performed so that abnormalities documented during a uremic crisis can be better interpreted.
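The non-parametric one-way ANOVA referred to here is conventionally the Kruskal-Wallis test; with SciPy it looks like the following (the group data are synthetic stand-ins, and Dunn's post hoc test is not in SciPy itself, so a pairwise Mann-Whitney U comparison is shown instead):

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

# Synthetic renal pelvic widths (mm) standing in for three groups.
rng = np.random.default_rng(42)
normal_cats = rng.normal(1.0, 0.3, 10)
ckd_cats = rng.normal(1.5, 0.5, 20)
uo_cats = rng.normal(6.0, 1.5, 11)

# Kruskal-Wallis: the rank-based analogue of one-way ANOVA.
h, p = kruskal(normal_cats, ckd_cats, uo_cats)

# Pairwise follow-up (Dunn's test would need e.g. scikit-posthocs;
# a Mann-Whitney U comparison between two groups is shown here).
u, p_pair = mannwhitneyu(normal_cats, uo_cats)
```

Because the groups have unequal sizes and likely non-normal width distributions, a rank-based test is the appropriate choice here.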
Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.
Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J
2015-01-01
The design of QSAR/QSPR models is a challenging problem, in which the selection of the most relevant descriptors constitutes a key step. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, while chemical knowledge is left out of the analysis. As a result, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. An approach that integrates domain experts' knowledge into the selection process is therefore needed to increase confidence in the final set of descriptors. In this paper we propose a software tool, named Visual and Interactive DEscriptor ANalysis (VIDEAN), that combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of the data, aided by statistical tools and metrics based on information theory. Coordinated visual representations capture different relationships and interactions among descriptors, target properties and candidate subsets of descriptors. The capabilities of the proposed software were assessed through different scenarios, which reveal how an expert can use the tool to choose one subset of descriptors from a group of candidate subsets, or to modify existing descriptor subsets and even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy and high statistical performance in a visual exploratory way.
Therefore, the resulting tool allows the integration of a chemist's expertise into the descriptor selection process with low cognitive effort, in contrast with the alternative of an ad hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.
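One of the information-theoretic metrics such a tool can rank descriptors by is mutual information with the target property; a simple histogram-based estimator (a generic sketch, not VIDEAN's implementation) is:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram plug-in estimate (in nats) of the mutual information
    between a descriptor x and a target property y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # skip empty cells (log 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# A relevant descriptor (x) and an irrelevant one (z) for a target y.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.1 * rng.normal(size=500)
z = rng.normal(size=500)
```

Ranking candidate descriptors by this score, and penalizing redundancy among them, is one standard way to obtain small, interpretable subsets.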
Patil, Narendra P; Dandekar, Minal; Nadiger, Ramesh K; Guttal, Satyabodh S
2010-09-01
The aim of this study was to evaluate the shear bond strength of porcelain to a laser-welded titanium surface and to determine the mode of bond failure through scanning electron microscopy (SEM) and energy dispersive spectrophotometry (EDS). Forty-five cast rectangular titanium specimens with dimensions of 10 mm x 8 mm x 1 mm were tested. Thirty specimens had a perforation of 2 mm diameter in the centre; these were randomly divided into Groups A and B. The perforations in the Group B specimens were repaired by laser welding using Cp Grade II titanium wire. The remaining 15 specimens were taken as the control group. All the test specimens were layered with low-fusing porcelain and tested for shear bond strength. The debonded specimens were subjected to SEM and EDS. Data were analysed with one-way analysis of variance (ANOVA) and Student's t-test for comparison among the different groups. One-way ANOVA showed no statistically significant difference in shear bond strength values at the 5% level of significance. The mean shear bond strength values for the control group, Group A and Group B were 8.4 +/- 0.5 MPa, 8.1 +/- 0.4 MPa and 8.3 +/- 0.3 MPa respectively. SEM/EDS analysis of the specimens showed mixed and cohesive types of bond failure. Within the limitations of the study, laser welding did not have any effect on the shear bond strength of porcelain bonded to titanium.
Li, Hongliang; Dai, Jiewen; Si, Jiawen; Zhang, Jianfei; Wang, Minjiao; Shen, Steve Guofang; Yu, Hongbo
2015-01-01
Anterior maxillary segmental distraction (AMSD) is an effective surgical procedure for the treatment of maxillary hypoplasia secondary to cleft lip and palate (CLP). Its unique advantage of preserving velopharyngeal function has made this procedure widely applied. In this study, the application of AMSD is described and its long-term stability explored. Eight patients with severe maxillary hypoplasia secondary to CLP were included and treated with AMSD using a rigid external distraction (RED) device. Cephalometric analysis was performed twice at three time points: before surgery (T1), after distraction (T2), and 2 years after treatment (T3). One-way analysis of variance was used to assess the differences statistically. All the distractions were completed smoothly, and the maxilla was distracted efficiently. The values of SNA, NA-FH, Ptm-A, U1-PP, overjet and PP (ANS-PNS) increased significantly after the AMSD procedure (P < 0.05), with the mean overjet increasing by 14.28 mm. However, comparison of the cephalometric analyses at T2 and T3 showed no significant difference (P > 0.05). Changes in palatopharyngeal depth and soft palatal length were insignificant. AMSD with the RED device provided an effective way to correct maxillary hypoplasia secondary to CLP: it extended the palatal and arch length, avoided damage to velopharyngeal closure function and reduced the relapse rate. It is a promising and valuable technique for this potentially complicated procedure. PMID:26629107
Morozov, Andrey K; Colosi, John A
2017-09-01
Underwater sound scattering by a rough sea surface, ice, or a rough elastic bottom is studied. The study includes both the scattering from the rough boundary and the elastic effects in the solid layer. A coupled mode matrix is approximated by a linear function of one random perturbation parameter such as the ice-thickness or a perturbation of the surface position. A full two-way coupled mode solution is used to derive the stochastic differential equation for the second order statistics in a Markov approximation.
Thermal equilibrium and statistical thermometers in special relativity.
Cubero, David; Casado-Pascual, Jesús; Dunkel, Jörn; Talkner, Peter; Hänggi, Peter
2007-10-26
There is an intense debate in the recent literature about the correct generalization of Maxwell's velocity distribution in special relativity. The most frequently discussed candidate distributions include the Jüttner function as well as modifications thereof. Here we report results from fully relativistic one-dimensional molecular dynamics simulations that resolve the ambiguity. The numerical evidence unequivocally favors the Jüttner distribution. Moreover, our simulations illustrate that the concept of "thermal equilibrium" extends naturally to special relativity only if a many-particle system is spatially confined. They make evident that "temperature" can be statistically defined and measured in an observer-frame-independent way.
H K, Sowmya; T S, Subhash; Goel, Beena Rani; T N, Nandini; Bhandi, Shilpa H
2014-02-01
Decreased apical extrusion of debris and less residual debris in the apical third have strong implications for a decreased incidence of postoperative inflammation and pain. Thus, the aim of this study was to quantitatively assess the apical extrusion of debris and the intracanal debris in the apical third during root canal instrumentation using hand instruments and three different types of rotary instruments. Sixty freshly extracted single-rooted human teeth were randomly divided into four groups. Canal preparation was done using the step-back technique with hand instrumentation, the crown-down technique with ProTaper and K3, and a hybrid technique with LightSpeed LSX. Irrigation was done with NaOCl, EDTA, and normal saline, and the EndoVac system was used for final irrigation. The apically extruded debris was collected on pre-weighed Millipore plastic filter disks and weighed using a microbalance. The teeth were then submitted to histological processing. Sections from the apical third were analyzed with a trinocular research microscope coupled to a computer, where the images were captured and analyzed using Image-Pro Plus V4.1.0.0 software. The mean weight of extruded debris for each group and the intracanal debris in the root canal were statistically analyzed by a Kruskal-Wallis one-way analysis of variance and the Mann-Whitney U test. The results showed that hand instrumentation with K files produced the highest amount of apically extruded debris compared with ProTaper, K3 and LightSpeed LSX, and that there was no statistically significant difference between the groups in the presence of intracanal debris in the apical third. Based on the results, all instrumentation techniques produced debris extrusion. The engine-driven Ni-Ti systems extruded significantly less apical debris than hand instrumentation. There was no statistically significant difference between the groups in relation to the presence of intracanal debris in the apical third.
Analysis of One-Way Laser Ranging Data to LRO, Time Transfer and Clock Characterization
NASA Technical Reports Server (NTRS)
Bauer, S.; Hussmann, H.; Oberst, J.; Dirkx, D.; Mao, D.; Neumann, G. A.; Mazarico, E.; Torrence, M. H.; McGarry, J. F.; Smith, D. E.;
2016-01-01
We processed and analyzed one-way laser ranging data from International Laser Ranging Service ground stations to NASA's Lunar Reconnaissance Orbiter (LRO), obtained from June 13, 2009 until September 30, 2014. We pair and analyze the one-way range observables from station laser fire and spacecraft laser arrival times by using nominal LRO orbit models based on the GRAIL gravity field. We apply corrections for instrument range walk, as well as for atmospheric and relativistic effects. In total we derived a tracking data volume of approximately 3000 hours featuring 64 million Full Rate and 1.5 million Normal Point observations. From a statistical analysis of the dataset we evaluate the experiment and the ground station performance. We observe a laser ranging measurement precision of 12.3 centimeters in the case of the Full Rate data, which surpasses the LOLA (Lunar Orbiter Laser Altimeter) timestamp precision of 15 centimeters. Averaging to Normal Point data further improves the measurement precision to 5.6 centimeters. We characterized the LRO clock with fits throughout the mission and estimated the rate at 6.9 x 10^-8, the aging at 1.6 x 10^-12 per day, and the change of aging at 2.3 x 10^-14 per day squared over all mission phases. The fits also provide referencing of onboard time to the TDB (Barycentric Dynamical Time) time scale at a precision of 166 nanoseconds over two and 256 nanoseconds over all mission phases, representing ground-to-space time transfer. Furthermore we measure ground station clock differences from the fits as well as from simultaneous passes, which we use for ground-to-ground time transfer from common-view observations. We observed relative offsets ranging from 33 to 560 nanoseconds and relative rates ranging from 2 x 10^-13 to 6 x 10^-12 between the ground station clocks during selected mission phases.
We study the results from the different methods and discuss their applicability for time transfer.
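The clock characterization described above amounts to fitting a low-order polynomial to clock offsets: the linear coefficient gives the rate, twice the quadratic coefficient the aging, and six times the cubic coefficient the change of aging. A toy reconstruction with synthetic data (the parameter values are those quoted in the abstract; the daily sampling and 0.2 ns noise level are invented assumptions) is:

```python
import numpy as np

# Clock parameters quoted in the abstract: rate, aging per day,
# change of aging per day squared.
rate, aging, d_aging = 6.9e-8, 1.6e-12, 2.3e-14

# Hypothetical daily clock offsets (seconds) over ~5 years, with an
# assumed 0.2 ns measurement noise on each offset.
t = np.arange(1900.0)                     # days since epoch
offset = rate * t + 0.5 * aging * t**2 + (d_aging / 6.0) * t**3
offset += np.random.default_rng(1).normal(0.0, 2e-10, t.size)

# A cubic fit recovers the clock parameters from its coefficients.
c3, c2, c1, c0 = np.polyfit(t, offset, 3)
est_rate, est_aging, est_d_aging = c1, 2 * c2, 6 * c3
```

The actual LRO analysis is of course far more involved (orbit modeling, range walk, relativistic corrections), but the polynomial clock model itself has this simple form.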
Assessment of Tooth Wear Among Glass Factory Workers: WHO 2013 Oral Health Survey
Bhat, Nagesh; Asawa, Kailash; Tak, Mridula; Bapat, Salil; Gupta, Vivek Vardhan
2015-01-01
Background Glass factory workers are often exposed to a hazardous environment that harms oral health and, subsequently, general health. We planned to determine the effects of the particulates present in this milieu on tooth wear among workers. Aim To assess tooth wear among glass factory workers in Jaipur, Rajasthan, India. Settings and Design A descriptive cross-sectional survey was conducted among 936 glass workers in Jaipur, Rajasthan, India from January-June 2014. Materials and Methods A survey proforma was designed for tooth wear evaluation with the help of the WHO Oral Health Assessment form 2013 (for adults). Information regarding oral health practices, adverse habits, dietary habits and demographic details was gathered, and clinical parameters were recorded. Statistical Analysis The chi-square test, t-test, one-way analysis of variance and stepwise multiple linear regression were used. Results The most prevalent form of erosion was enamel erosion (589, 62.93%), with few subjects showing deeper dentinal erosion; the difference was statistically significant (p=0.001). Dental erosion was higher among males than females. Years of experience and educational status were identified as the best predictors of dental erosion. Conclusion There was considerable evidence of dental erosion among the factory workers. Given the lack of attention to social, cultural and health aspects, a professional approach with regular dental care services for the detection of early symptoms and the planning of preventive strategies is warranted. PMID:26436050
The effect of kangaroo mother care on mental health of mothers with low birth weight infants
Badiee, Zohreh; Faramarzi, Salar; MiriZadeh, Tahereh
2014-01-01
Background: Mothers of premature infants are at risk of psychological stress because of separation from their infants. One method that influences maternal mental health in the postpartum period is kangaroo mother care (KMC). This study was conducted to evaluate the effect of KMC of low birth weight infants on maternal mental health. Materials and Methods: The study was conducted in the Department of Pediatrics of Isfahan University of Medical Sciences, Isfahan, Iran. Premature infants were randomly allocated into two groups. The control group received standard care in an incubator. In the experimental group, three 60-min sessions of KMC were practiced daily for 1 week. Mental health scores of the mothers were evaluated using the 28-item General Health Questionnaire. Statistical analysis was performed by analysis of covariance using SPSS. Results: The scores of 50 infant-mother pairs were analyzed in total (25 in the KMC group and 25 in the standard care group). The covariance analysis showed positive effects of KMC on maternal mental health scores: there were statistically significant differences between the mean scores of the experimental and control groups in the posttest period (P < 0.001). Conclusion: KMC for low birth weight infants is a safe way to improve maternal mental health. It is therefore suggested as a useful method that can be recommended for improving the mental health of mothers. PMID:25371871
Reddy, Nagam Raja; Reddy, Jakranpally Sathya; Padmaja, Bramha Josyula Indira; Reddy, Budigi Madan Mohan; Sunil, Motupalli; Reddy, Bommireddy Tejeswar
2016-01-01
Aims: To evaluate the accuracy of dies made from dual arch impressions using different sectional dual arch trays, combinations of elastomeric impression materials, and sequences of pouring the dies. Subjects and Methods: The dual arch impressions were divided into three groups depending on the combination of impression materials used, and each group was subdivided into four subgroups. A sample size of 8 per subgroup yielded a total of 96 impressions in three groups of 32 each (Groups I, II, and III). Group I comprised impressions made using monophase (M) impression material, Group II impressions made using a combination of heavy body and light body (HL), and Group III impressions made using a combination of putty and light body (PL). The dies obtained were evaluated with a travelling microscope to measure the buccolingual width of the tooth at the margin, using the sharp corners of the notches as reference points. Statistical Analysis Used: Descriptive analysis, namely mean and standard deviation, and the one-way analysis of variance test. Results: The results indicate that, though the difference was not statistically significant, the metal dual arch trays performed better than the plastic trays in reproducing die dimensions. Conclusions: Dies poured from combination heavy body and light body impressions using plastic or metal dual arch trays showed the least variation in buccolingual dimension from the master model. PMID:27141172
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the websites of the journals (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of their descriptive and inferential statistics. Descriptive statistics were evaluated on the basis of the reporting of the method of description and of central tendencies. Inferential statistics were evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most commonly used statistical method was one-way ANOVA (58.4%). Information regarding the checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles, most commonly the use of a two-group test for three or more groups. Articles published in these two Indian pharmacology journals are not devoid of statistical errors.
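The mean ± SEM versus mean ± SD confusion flagged here comes down to the fact that the SD describes the spread of the data while the SEM (= SD/sqrt(n)) describes the precision of the estimated mean; reporting SEM makes variability look smaller. A quick illustration with made-up values:

```python
import numpy as np

values = np.array([4.1, 5.0, 4.6, 5.3, 4.8, 5.1, 4.4, 4.9])
n = values.size
sd = values.std(ddof=1)        # spread of the individual observations
sem = sd / np.sqrt(n)          # uncertainty of the estimated mean
print(f"mean +/- SD : {values.mean():.2f} +/- {sd:.2f}")
print(f"mean +/- SEM: {values.mean():.2f} +/- {sem:.2f}")
```

Since the SEM shrinks with sample size while the SD does not, the two quantities answer different questions and are not interchangeable in a results table.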
ERIC Educational Resources Information Center
Kroeker, Leonard P.
The problem of blocking on a status variable was investigated. The one-way fixed-effects analysis of variance, analysis of covariance, and generalized randomized block designs each treat the blocking problem in a different way. In order to compare these designs, it is necessary to restrict attention to experimental situations in which observations…
Martini, Paolo; Risso, Davide; Sales, Gabriele; Romualdi, Chiara; Lanfranchi, Gerolamo; Cagnin, Stefano
2011-04-11
In recent decades, microarray technology has spread, leading to a dramatic increase in publicly available datasets. The first statistical tools developed focused on the identification of significantly differentially expressed genes. Later, researchers moved toward the systematic integration of gene expression profiles with additional biological information, such as chromosomal location, ontological annotations, or sequence features. The analysis of gene expression linked to the physical location of genes on chromosomes allows the identification of transcriptionally imbalanced regions, while Gene Set Analysis focuses on the detection of coordinated changes in transcriptional levels among sets of biologically related genes. In this field, meta-analysis offers the possibility of comparing different studies addressing the same biological question, fully exploiting public gene expression datasets. We describe STEPath, a method that starts from gene expression profiles and integrates the analysis of imbalanced regions as an a priori step before performing gene set analysis. The application of STEPath in individual studies produced gene set scores weighted by chromosomal activation. As a final step, we propose a way to compare these scores across different studies (meta-analysis) on related biological issues. One complication with meta-analysis is batch effects, which occur because molecular measurements are affected by laboratory conditions, reagent lots, and personnel differences. Major problems occur when batch effects are correlated with an outcome of interest and lead to incorrect conclusions. We evaluated the power of combining chromosome mapping and gene set enrichment analysis by performing the analysis on a dataset of leukaemia (an example of an individual study) and on a dataset of skeletal muscle diseases (the meta-analysis approach).
In leukaemia, we identified the Hox gene set, a gene set closely related to the pathology that other gene set analysis algorithms do not identify, while the meta-analysis approach on muscular disease discriminated between related pathologies and correlated similar ones from different studies. STEPath is a new method that integrates gene expression profiles, genomic co-expressed regions, and information about the biological function of genes. The use of STEPath-computed gene set scores overcomes batch effects in meta-analysis approaches, allowing the direct comparison of different pathologies and different studies at the level of gene set activation.
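The general notion of a gene set activation score can be illustrated with a common, simple scheme: standardize each member gene across samples, then average within the set. This sketch shows only the generic idea; the function name and the mean-z-score rule are illustrative assumptions, not the actual STEPath algorithm:

```python
import statistics

def gene_set_score(expr, gene_set):
    """Per-sample activation score for a gene set: mean z-score of member genes.
    (Illustrative only; not the STEPath scoring scheme.)"""
    zs = []
    for g in gene_set:
        vals = expr[g]
        mu, sd = statistics.fmean(vals), statistics.stdev(vals)
        # Standardize this gene across samples.
        zs.append([(v - mu) / sd for v in vals])
    n_samples = len(zs[0])
    # Average the member-gene z-scores within each sample.
    return [statistics.fmean(z[i] for z in zs) for i in range(n_samples)]

# Hypothetical two-gene Hox set measured in three samples.
expr = {"HOXA9": [1.0, 2.0, 3.0], "HOXA10": [2.0, 4.0, 6.0]}
scores = gene_set_score(expr, ["HOXA9", "HOXA10"])
```

Because each gene is standardized before averaging, per-gene scale differences (a typical batch-dependent artefact) drop out of the set score.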
Attitudes towards schizophrenia on YouTube: A content analysis of Finnish and Greek videos.
Athanasopoulou, Christina; Suni, Sanna; Hätönen, Heli; Apostolakis, Ioannis; Lionis, Christos; Välimäki, Maritta
2016-01-01
To investigate attitudes towards schizophrenia and people with schizophrenia presented in YouTube videos. We searched YouTube using the search terms "schizophrenia" and "psychosis" in Finnish and in Greek on April 3rd, 2013. The first 20 videos from each search (N = 80) were retrieved. Deductive content analysis was first applied for coding and data interpretation, followed by descriptive statistical analysis. A total of 52 videos were analyzed (65%). The majority of the videos were in the "Music" category (50%, n = 26). Most of the videos (83%, n = 43) tended to present schizophrenia in a negative way, while fewer than a fifth (17%, n = 9) presented it in a positive or neutral way. Specifically, the most common negative attitude towards schizophrenia was dangerousness (29%, n = 15), while the most often identified positive attitude was objective, medically appropriate beliefs (21%, n = 11). All attitudes identified were similarly present in the Finnish and Greek videos, without any statistically significant difference. Negative presentations of schizophrenia are the most likely to be accessed when searching YouTube for schizophrenia in Finnish or Greek. More research is needed to investigate to what extent, if any, YouTube viewers' attitudes are affected by the videos they watch.
Agents with left and right dominant hemispheres and quantum statistics
NASA Astrophysics Data System (ADS)
Ezhov, Alexandr A.; Khrennikov, Andrei Yu.
2005-01-01
We present a multiagent model illustrating the emergence of two different quantum statistics, Bose-Einstein and Fermi-Dirac, in a friendly population of individuals with right-brain dominance and in a competitive population of individuals with left-brain dominance, respectively. In doing so, we argue that Lefebvre’s “algebra of conscience” can be used in a natural way to describe the decision-making strategies of agents simulating people with different brain dominance. We suggest that the emergence of the two principal statistical distributions can illustrate different types of society organization and can also be used to simulate market phenomena and psychic disorders in which a switching of hemisphere dominance is involved.
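The two statistics named in the abstract are the standard occupation-number distributions; a minimal sketch, in units where the Boltzmann constant is 1 (the multiagent dynamics themselves are not reproduced here):

```python
import math

def bose_einstein(e, mu=0.0, t=1.0):
    """Mean occupation number for bosons; requires e > mu."""
    return 1.0 / (math.exp((e - mu) / t) - 1.0)

def fermi_dirac(e, mu=0.0, t=1.0):
    """Mean occupation number for fermions; always bounded by 1."""
    return 1.0 / (math.exp((e - mu) / t) + 1.0)

# At (e - mu)/t = 1: bosons cluster more heavily than fermions in the same state.
be = bose_einstein(1.0)  # about 0.582
fd = fermi_dirac(1.0)    # about 0.269
```

The sign flip in the denominator is the whole difference: the minus sign lets bosonic occupancy grow without bound as e approaches mu ("friendly" clustering), while the plus sign caps fermionic occupancy at one per state ("competitive" exclusion), mirroring the two populations in the model.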
A Civilian/Military Trauma Institute: National Trauma Coordinating Center
2015-12-01
zip codes was used in “proximity to violence” analysis. Data were analyzed using SPSS (version 20.0, SPSS Inc., Chicago, IL). Multivariable linear...number of adverse events and serious events was not statistically higher in one group, the incidence of deep venous thrombosis (DVT) was statistically ...subjects the lack of statistical difference on multivariate analysis may be related to an underpowered sample size. It was recommended that the
Assessment of Online Patient Education Materials from Major Dermatologic Associations
John, Ann M.; John, Elizabeth S.; Hansberry, David R.
2016-01-01
Objective: Patients increasingly use the internet to find medical information regarding their conditions and treatments. Physicians often supplement visits with written education materials. Online patient education materials from major dermatologic associations should be written at appropriate reading levels to optimize utility for patients. The purpose of this study is to assess online patient education materials from major dermatologic associations and determine if they are written at the fourth to sixth grade level recommended by the American Medical Association and National Institutes of Health. Design: This is a descriptive and correlational design. Setting: Academic institution. Participants/measurements: Patient education materials from eight major dermatology websites were downloaded and assessed using 10 readability scales. A one-way analysis of variance and Tukey’s Honestly Significant Difference post hoc analysis were performed to determine the difference in readability levels between websites. Results: Two hundred and sixty patient education materials were assessed. Collectively, patient education materials were written at a mean grade level of 11.13, with 65.8 percent of articles written above a tenth grade level and no articles written at the American Medical Association/National Institutes of Health recommended grade levels. Analysis of variance demonstrated a significant difference between websites for each reading scale (p<0.001), which was confirmed with Tukey’s Honestly Significant Difference post hoc analysis. Conclusion: Online patient education materials from major dermatologic association websites are written well above recommended reading levels. Associations should consider revising patient education materials to allow more effective patient comprehension. (J Clin Aesthet Dermatol. 2016;9(9):23–28.) PMID:27878059
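The core computation behind a one-way analysis of variance like the one reported here can be sketched from first principles (F statistic only; the p-value lookup and the Tukey post hoc step are omitted):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over k groups of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean deviations.
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's own mean.
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    msb = ssb / (k - 1)   # between-group mean square
    msw = ssw / (n - k)   # within-group mean square
    return msb / msw

# Hypothetical grade-level scores for three websites.
f = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [5.0, 6.0, 7.0]])
```

A large F means the spread between website means dominates the spread within each website, which is what the significant result (p<0.001) in the abstract reflects.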
Malvania, Ekta A; Ajithkrishnan, C G
2011-01-01
Anxiety is a subjective state of feeling. Dental anxiety is often reported as a cause of irregular dental attendance, delay in seeking dental care, or even avoidance of dental care, resulting in poor oral health-related quality of life. To assess the prevalence and socio-demographic correlates of dental anxiety among a group of adult patients attending a dental institution in Vadodara, Gujarat. A total of 150 adult patients waiting in the out-patient Department of Oral Diagnosis of K.M. Shah Dental College and Hospital were included in the study. Subjects were selected by convenience sampling. Dental anxiety was assessed using the Modified Dental Anxiety Scale (MDAS) and a self-designed, semi-structured questionnaire incorporating various demographic variables and the type and nature of dental treatment. Statistical analysis was done using SPSS version 16. Descriptive analysis, the unpaired t-test, the one-way analysis of variance (ANOVA) test, and multiple logistic regression were applied. Forty-six percent of the participants were dentally anxious. Females were found to be significantly more anxious than males. Subjects residing in villages had significantly higher scores than those residing in the city. Age, education, type of dental treatment, and previous dental visits were not significantly associated with dental anxiety. However, subjects with a past negative dental experience were found to be significantly more anxious. The study shows that dental anxiety was high among the study subjects. It is recommended that this issue be given due importance and addressed in a practical and meaningful manner.
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
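The time-series basics listed here (sampling, Fourier transform, associated statistics) can be sketched with a naive discrete Fourier transform; `periodogram` below is an illustrative O(N^2) implementation for short series, not an FFT:

```python
import cmath
import math

def periodogram(x):
    """Power spectrum of a real time series via a naive DFT (bins 0..N/2)."""
    n = len(x)
    powers = []
    for k in range(n // 2 + 1):
        # k-th DFT coefficient: correlate the series with a complex exponential.
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        powers.append(abs(s) ** 2 / n)
    return powers

# A pure sinusoid at 5 cycles per record should put all its power in bin k = 5.
n = 64
signal = [math.sin(2 * math.pi * 5 * j / n) for j in range(n)]
spec = periodogram(signal)
peak = max(range(1, len(spec)), key=spec.__getitem__)
```

For real asteroseismic data the sampling is irregular and gapped, so in practice a Lomb-Scargle periodogram replaces this evenly-sampled DFT; the estimation principle is the same.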
NASA Astrophysics Data System (ADS)
Goldstein, Sheldon; Tumulka, Roderich; Zanghì, Nino
2016-07-01
According to statistical mechanics, microstates of an isolated physical system (say, a gas in a box) at time t0 in a given macrostate of less-than-maximal entropy typically evolve in such a way that the entropy at time t increases with |t - t0| in both time directions. In order to account for the observed entropy increase in only one time direction, the thermodynamic arrow of time, one usually appeals to the hypothesis that the initial state of the Universe was one of very low entropy. In certain recent models of cosmology, however, no hypothesis about the initial state of the Universe is invoked. We discuss how the emergence of a thermodynamic arrow of time in such models can nevertheless be compatible with the above-mentioned consequence of statistical mechanics, appearances to the contrary notwithstanding.
Higher certainty of the laser-induced damage threshold test with a redistributing data treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, Lars; Mrohs, Marius; Gyamfi, Mark
2015-10-15
As a consequence of its statistical nature, the measurement of the laser-induced damage threshold always carries the risk of over- or underestimating the real threshold value. As one of the established measurement procedures, the S-on-1 (and 1-on-1) tests outlined in the corresponding ISO standard 21254 yield results that depend on the number of data points and their distribution over the fluence scale. With the limited space on a test sample as well as the requirements on test site separation and beam sizes, the amount of data from one test is restricted. This paper reports on a way to treat damage test data in order to reduce the statistical error and therefore the measurement uncertainty. Three simple assumptions allow for the assignment of one data point to multiple data bins and therefore virtually increase the available database.
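The final step of a standard S-on-1 evaluation fits the per-bin damage probabilities against fluence and extrapolates to zero probability. A minimal sketch of that step only; the paper's redistributing bin assignment is not reproduced, and the data points are hypothetical:

```python
def threshold_from_bins(points):
    """Least-squares line through (fluence, damage probability) points;
    returns the zero-probability crossing as the damage threshold estimate."""
    n = len(points)
    sx = sum(f for f, _ in points)
    sy = sum(p for _, p in points)
    sxx = sum(f * f for f, _ in points)
    sxy = sum(f * p for f, p in points)
    # Ordinary least-squares slope and intercept.
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    # Fluence at which the fitted damage probability reaches zero.
    return -intercept / slope

# Hypothetical binned probabilities rising linearly above 2 J/cm^2.
t = threshold_from_bins([(3.0, 0.25), (4.0, 0.5), (5.0, 0.75), (6.0, 1.0)])
```

Assigning each shot to multiple fluence bins, as the paper proposes, increases the effective number of (fluence, probability) points entering this fit and thereby tightens the extrapolated threshold.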
A Primer on Multivariate Analysis of Variance (MANOVA) for Behavioral Scientists
ERIC Educational Resources Information Center
Warne, Russell T.
2014-01-01
Reviews of statistical procedures (e.g., Bangert & Baumberger, 2005; Kieffer, Reese, & Thompson, 2001; Warne, Lazo, Ramos, & Ritter, 2012) show that one of the most common multivariate statistical methods in psychological research is multivariate analysis of variance (MANOVA). However, MANOVA and its associated procedures are often not…
Impulse Response Operators for Structural Complexes
1990-05-12
systems of the complex. The statistical energy analysis (SEA) is one such device [13, 14]. The rendering of SEA from equation (21) and/or (25) lies...Propagation.] 13. L. Cremer, M. Heckl, and E. E. Ungar 1973 Structure-Borne Sound (Springer Verlag). 14. R. H. Lyon 1975 Statistical Energy Analysis of
Espinàs, J A; Riba, M D; Borràs, J M; Sánchez, V
1995-01-01
The study of the relationship between self-reported morbidity, health status, and health care utilization presents methodological problems due to the variety of illnesses and medical conditions that one individual may report. In this article, correspondence analysis was used to analyse these relationships. Data from the Spanish National Health Survey pertaining to the region of Catalonia were studied. Statistical analysis included multiple correspondence analysis (MCA) followed by cluster analysis. The first factor extracted is defined by self-assessed health perception; the second, by limitation of activities; and the third is related to self-reported morbidity caused by chronic and acute health problems. The fourth and fifth factors capture residual variability and missing values. Acute problems are more related to a perception of poor health, while chronic problems are related to a perception of fair health. It may also be possible to distinguish self-reported morbidity due to relapses of chronic diseases from true acute health problems. Cluster analysis classified individuals into four groups: 1) healthy people; 2) people who assess their health as poor and those with acute health problems; 3) people with chronic health problems, limited activity, and a perception of fair health; and 4) missing values. Correspondence analysis is a useful tool for analyzing qualitative variables like those in a health survey.
ERIC Educational Resources Information Center
Luciano, David
2012-01-01
This study examined the relationship of acculturation strategy and social supports to acculturative stress and academic performance among Hispanic/Latino/a college students. The sample of approximately 522 students was recruited at the City College of The City University of New York. Various statistical methods, including one-way ANOVAs,…
ERIC Educational Resources Information Center
Sadd, James; Morello-Frosch, Rachel; Pastor, Manuel; Matsuoka, Martha; Prichard, Michele; Carter, Vanessa
2014-01-01
Environmental justice advocates often argue that environmental hazards and their health effects vary by neighborhood, income, and race. To assess these patterns and advance preventive policy, their colleagues in the research world often use complex and methodologically sophisticated statistical and geospatial techniques. One way to bridge the gap…
Efficient Scores, Variance Decompositions and Monte Carlo Swindles.
1984-08-28
Then a version of Pythagoras' theorem gives the variance decomposition (6.1) var_{P0}(T) = var_{P0}(S) + var_{P0}(T - S). One way to see this is to note...complete sufficient statistics for (β, σ), and that the standardized residuals (y - Xβ̂)/σ̂ are ancillary. Basu's sufficiency-ancillarity theorem
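The variance decomposition above underlies the classic Monte Carlo swindle: if S has a known expectation and T - S is uncorrelated with S (ancillary against the complete sufficient statistic, by Basu's theorem), then averaging T - S instead of T removes the var(S) term. A minimal sketch for the sample median of normal data, where S is the sample mean with known expectation 0:

```python
import random
import statistics

def swindle_median(n=5, reps=2000, seed=1):
    """Compare per-replication variances of the naive estimator of E[median]
    (the median itself) and the swindled estimator (median - mean)."""
    rng = random.Random(seed)
    naive, swindled = [], []
    for _ in range(reps):
        x = [rng.gauss(0.0, 1.0) for _ in range(n)]
        med = statistics.median(x)
        mean = statistics.fmean(x)
        naive.append(med)            # T: plain Monte Carlo draw
        swindled.append(med - mean)  # T - S: mean's expectation (0) is known exactly
    return statistics.variance(naive), statistics.variance(swindled)

v_naive, v_swindle = swindle_median()
```

For normal samples cov(mean, median) = var(mean) = 1/n, so the swindle cuts the per-draw variance by exactly 1/n and every Monte Carlo replication does the work of several.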
ERIC Educational Resources Information Center
Madheswaran, S.
2007-01-01
Policy makers confronted with the need to introduce health and safety regulations often wonder how to value the benefits of these regulations. One way that a monetary value could be placed on reductions in health risks, including risk of death, is through understanding how people are compensated for the different risks they take. While there is an…