2015-08-01
the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to...Business Research Methods and Statistics using SPSS. P432. 11 IBM SPSS Statistics. (2012) 12 Burns, R.B., Burns, R.A. (2008) ‘Business Research
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
SimHap GUI: An intuitive graphical user interface for genetic association analysis
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-01-01
Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
Linkage analysis of systolic blood pressure: a score statistic and computer implementation
Wang, Kai; Peng, Yingwei
2003-01-01
A genome-wide linkage analysis was conducted on systolic blood pressure using a score statistic. The randomly selected Replicate 34 of the simulated data was used. The score statistic was applied to the sibships derived from the general pedigrees. An add-on R program to GENEHUNTER was developed for this analysis and is freely available. PMID:14975145
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band
Using independent component analysis for electrical impedance tomography
NASA Astrophysics Data System (ADS)
Yan, Peimin; Mo, Yulong
2004-05-01
Independent component analysis (ICA) is a way to resolve signals into independent components based on the statistical characteristics of the signals. It is a method for factoring probability densities of measured signals into a set of densities that are as statistically independent as possible under the assumptions of a linear model. Electrical impedance tomography (EIT) is used to detect variations of the electric conductivity of the human body. Because there are variations of the conductivity distributions inside the body, EIT presents multi-channel data. In order to obtain all of the information contained in different locations of tissue, it is necessary to image the individual conductivity distributions. In this paper we consider applying ICA to EIT on the signal subspace (individual conductivity distribution). Using ICA, the signal subspace is then decomposed into statistically independent components. The individual conductivity distribution can be reconstructed by the sensitivity theorem. Computer simulations show that the full information contained in the multi-conductivity distribution can be obtained by this method.
Usman, Mohammad N.; Umar, Muhammad D.
2018-01-01
Background: Recent studies have revealed that pharmacists have interest in conducting research. However, lack of confidence is a major barrier. Objective: This study evaluated pharmacists’ self-perceived competence and confidence to plan and conduct health-related research. Method: This cross-sectional study was conducted during the 89th Annual National Conference of the Pharmaceutical Society of Nigeria in November 2016. An adapted questionnaire was validated and administered to 200 pharmacist delegates during the conference. Result: Overall, 127 questionnaires were included in the analysis. At least 80% of the pharmacists had previous health-related research experience. Pharmacists’ competence and confidence scores were lowest for research skills such as using software for statistical analysis, choosing and applying appropriate inferential statistical tests and methods, and outlining a detailed statistical plan to be used in data analysis. The highest competence and confidence scores were observed for conception of a research idea, literature search, and critical appraisal of literature. Pharmacists with previous research experience had higher competence and confidence scores than those with no previous research experience (p<0.05). The only predictor of moderate-to-extreme self-competence and confidence was having at least one journal article publication during the last 5 years. Conclusion: Nigerian pharmacists indicated interest in participating in health-related research. However, self-competence and confidence to plan and conduct research were low, particularly for skills related to statistical analysis. Training programs and the building of a Pharmacy Practice Research Network are recommended to enhance pharmacists’ research capacity. PMID:29619141
A Mechanical Power Flow Capability for the Finite Element Code NASTRAN
1989-07-01
perimental methods: statistical energy analysis, the finite element method, and a finite element analogy using heat conduction equations. Experimental...weights and inertias of the transducers attached to an experimental structure may produce accuracy problems. Statistical energy analysis (SEA) is a...405-422 (1987). 8. Lyon, R.L., Statistical Energy Analysis of Dynamical Systems, The M.I.T. Press, (1975). 9. Mickol, J.D., and R.J. Bernhard, "An
General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies
Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong
2013-01-01
We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variant tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by instead calculating score statistics, which only require fitting the null model for each study, and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual-level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
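The score-aggregation scheme this abstract describes can be sketched in a few lines. The snippet below is a hypothetical minimal illustration, not the authors' implementation: each study contributes a burden score U_k = w'S_k and its variance V_k = w'Σ_k w, and a fixed-effect meta-analytic Z is formed by summing scores and variances across studies.

```python
# Hypothetical sketch of fixed-effect meta-analysis of burden score statistics.
# Per study: scores S (one entry per variant), cov Σ (between-variant
# covariance), and variant weights w. Studies combine by simple summation.
import math

def burden_score(scores, cov, weights):
    """Burden score U = w'S and its variance V = w'Σw for one study."""
    u = sum(w * s for w, s in zip(weights, scores))
    v = sum(weights[i] * cov[i][j] * weights[j]
            for i in range(len(weights)) for j in range(len(weights)))
    return u, v

def meta_burden_z(studies, weights):
    """Combine per-study (scores, cov) pairs into one meta-analytic Z."""
    total_u = total_v = 0.0
    for scores, cov in studies:
        u, v = burden_score(scores, cov, weights)
        total_u += u
        total_v += v
    return total_u / math.sqrt(total_v)
```

Only these summary quantities cross study boundaries, which is the point the abstract makes: no individual-level genotypes need to be pooled.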
A Data Warehouse Architecture for DoD Healthcare Performance Measurements.
1999-09-01
design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics. With the DoD healthcare...framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse...21 F. INABILITY TO CONDUCT HEALTHCARE ANALYSIS
10 CFR 431.445 - Determination of small electric motor efficiency.
Code of Federal Regulations, 2012 CFR
2012-01-01
... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3... statistical analysis, computer simulation or modeling, and other analytic evaluation of performance data on.... (ii) If requested by the Department, the manufacturer shall conduct simulations to predict the...
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
ERIC Educational Resources Information Center
Kadel, Robert
2004-01-01
To her surprise, Ms. Logan had just conducted a statistical analysis of her 10th grade biology students' quiz scores. The results indicated that she needed to reinforce mitosis before the students took the high-school proficiency test in three weeks, as required by the state. "Oh! That's easy!" She exclaimed. Teachers like Ms. Logan are…
Child-Centered Play Therapy in the Schools: Review and Meta-Analysis
ERIC Educational Resources Information Center
Ray, Dee C.; Armstrong, Stephen A.; Balkin, Richard S.; Jayne, Kimberly M.
2015-01-01
The authors conducted a meta-analysis and systematic review that examined 23 studies evaluating the effectiveness of child centered play therapy (CCPT) conducted in elementary schools. Meta-analysis results were explored using a random effects model for mean difference and mean gain effect size estimates. Results revealed statistically significant…
Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar
2014-01-01
Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among post graduate students in Periodontics in India. A total of 391 post graduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of the research protocol were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Nearly 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with the MS-EXCEL spreadsheet for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, while 3.0% were successful in performing statistical analysis of their studies on their own. Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but the level of understanding and application is still basic and needs to be addressed.
The Use of Meta-Analytic Statistical Significance Testing
ERIC Educational Resources Information Center
Polanin, Joshua R.; Pigott, Terri D.
2015-01-01
Meta-analysis multiplicity, the concept of conducting multiple tests of statistical significance within one review, is an underdeveloped literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…
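As background for the multiplicity issue raised above, two standard corrections for a family of p-values can be sketched briefly. This is an illustrative stdlib sketch, not the authors' proposal:

```python
# Illustrative multiplicity corrections for m simultaneous significance tests.
def bonferroni(pvals):
    """Bonferroni-adjusted p-values: p_adj = min(1, m * p)."""
    m = len(pvals)
    return [min(1.0, m * p) for p in pvals]

def holm(pvals):
    """Holm step-down adjustment, returned in the original order.
    Uniformly less conservative than Bonferroni with the same Type I control."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adj[i] = min(1.0, running_max)
    return adj
```

Both procedures control the family-wise Type I error rate at the nominal level; power differs, which is the trade-off the abstract alludes to.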
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
Forest statistics for New Hampshire
Thomas S. Frieswyk; Anne M. Malley
1985-01-01
This is a statistical report on the fourth forest survey of New Hampshire conducted in 1982-83 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that...
Multiple comparison analysis testing in ANOVA.
McHugh, Mary L
2011-01-01
The Analysis of Variance (ANOVA) test has long been an important tool for researchers conducting studies on multiple experimental groups and one or more control groups. However, ANOVA cannot provide detailed information on differences among the various study groups, or on complex combinations of study groups. To fully understand group differences in an ANOVA, researchers must conduct tests of the differences between particular pairs of experimental and control groups. Tests conducted on subsets of data tested previously in another analysis are called post hoc tests. The class of post hoc tests that provides this type of detailed information for ANOVA results is called "multiple comparison analysis" tests. The most commonly used multiple comparison analysis statistics include the following tests: Tukey, Newman-Keuls, Scheffé, Bonferroni and Dunnett. These statistical tools each have specific uses, advantages and disadvantages. Some are best used for testing theory while others are useful in generating new theory. Selection of the appropriate post hoc test will provide researchers with the most detailed information while limiting Type 1 errors due to alpha inflation.
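The post hoc logic described above can be made concrete with a small pure-Python sketch. This is a hypothetical illustration with invented data, not an implementation of the named procedures: the pairwise statistic shown is the pooled-variance t used in Fisher's LSD, to which a Bonferroni adjustment of the per-comparison alpha can then be applied.

```python
# Sketch: one-way ANOVA F, then pairwise t statistics on the pooled
# within-group variance (MSW), as a basis for post hoc comparisons.
import math
from itertools import combinations

def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k)), k - 1, n - k

def pairwise_t(groups):
    """Pairwise t statistics using the pooled MSW (Fisher's LSD form)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    msw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups) / (n - k)
    out = {}
    for i, j in combinations(range(k), 2):
        gi, gj = groups[i], groups[j]
        diff = sum(gi) / len(gi) - sum(gj) / len(gj)
        se = math.sqrt(msw * (1 / len(gi) + 1 / len(gj)))
        out[(i, j)] = diff / se
    return out
```

With k groups there are k(k-1)/2 pairs, which is exactly the alpha-inflation problem the article's multiple comparison tests are designed to control.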
ERIC Educational Resources Information Center
Zheng, Henry Y.; Stewart, Alice A.
This study explores data envelopment analysis (DEA) as a tool for assessing and benchmarking the performance of public research universities. Using national databases such as those maintained by the National Science Foundation and the National Center for Education Statistics, DEA analysis was conducted of the research and instructional outcomes…
Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar
2014-01-01
Background: Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. Aim: The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among post graduate students in Periodontics in India. Materials and Methods: A total of 391 post graduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of the research protocol were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Results: Nearly 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with the MS-EXCEL spreadsheet for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, while 3.0% were successful in performing statistical analysis of their studies on their own. Conclusion: Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but the level of understanding and application is still basic and needs to be addressed. PMID:24744547
Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian
2011-11-30
We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.
ERIC Educational Resources Information Center
Shafiq, M. Najeeb; Myers, John P.
2014-01-01
This study examines the Swedish national educational voucher scheme and changes in social cohesion. We conduct a statistical analysis using data from the 1999 and 2009 rounds of the International Association for the Evaluation of Educational Achievement's civic education study of 14-year-old students and their attitudes toward the rights of ethnic…
DNA Damage and Genetic Instability as Harbingers of Prostate Cancer
2013-01-01
incidence of prostate cancer as compared to placebo. Primary analysis of this trial indicated no statistically significant effect of selenium...Identification, isolation, staining, processing, and statistical analysis of slides for ERG and PTEN markers (aim 1) and interpretation of these results...participating in this study being conducted under Investigational New Drug #29829 from the Food and Drug Administration. STANDARD TREATMENT Patients
Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge
2016-05-04
Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge
2016-01-01
Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460
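The multiple-linear-regression modelling this record describes was done in SPSS, but the underlying technique is package-independent and can be sketched directly. The snippet below is a hypothetical illustration: ordinary least squares fitted by solving the normal equations (X'X)b = X'y with Gaussian elimination; the data used to exercise it are invented, not the study's 85 mortar mixes.

```python
# Sketch: OLS multiple linear regression via the normal equations.
def solve(a, b):
    """Gaussian elimination with partial pivoting; solves a x = b on copies."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ols(xrows, y):
    """Fit y ≈ b0 + b1*x1 + ... by least squares; returns [b0, b1, ...]."""
    X = [[1.0] + list(row) for row in xrows]
    p = len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(p)]
           for i in range(p)]
    xty = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(p)]
    return solve(xtx, xty)
```

The fitted coefficients play the role the study assigns to each mix component: the sign and magnitude of b_j indicate how that component shifts thermal conductivity or compressive strength, other components held fixed.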
Forest statistics for Vermont: 1973 and 1983
Thomas S. Frieswyk; Anne M. Malley
1985-01-01
A statistical report on the fourth forest survey of Vermont conducted in 1982-1983 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that the state has...
Forest statistics for Delaware: 1986 and 1999
Douglas M. Griffith; Richard H. Widmann; Richard H. Widmann
2001-01-01
A statistical report on the fourth forest inventory of Delaware conducted in 1999 by the Forest Inventory and Analysis Unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there are...
Forest statistics for West Virginia: 1989 and 2000
Douglas M. Griffith; Richard H. Widmann
2003-01-01
A statistical report on the fifth forest inventory of West Virginia conducted in 2000 by the Forest Inventory and Analysis unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there...
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
Conducting Simulation Studies in the R Programming Environment.
Hallgren, Kevin A
2013-10-12
Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
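The bootstrapping idea in the paper's example (c) is compact enough to sketch here. The paper itself works in R; the illustration below is a minimal stdlib Python sketch of the same percentile-bootstrap idea, with invented data:

```python
# Sketch: percentile bootstrap confidence interval for a statistic.
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, reps=2000, alpha=0.05, seed=0):
    """Resample with replacement, recompute the statistic each time, and
    take the alpha/2 and 1-alpha/2 empirical quantiles as the interval."""
    rng = random.Random(seed)  # seeded for reproducibility of the sketch
    n = len(data)
    boots = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(reps))
    lo = boots[int((alpha / 2) * reps)]
    hi = boots[int((1 - alpha / 2) * reps) - 1]
    return lo, hi
```

Swapping `stat` for another estimator (median, a regression coefficient) gives the same interval machinery with no new derivation, which is the appeal of simulation-based inference the paper describes.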
Gordon, J.D.; Schroder, L.J.; Morden-Moore, A. L.; Bowersox, V.C.
1995-01-01
Separate experiments by the U.S. Geological Survey (USGS) and the Illinois State Water Survey Central Analytical Laboratory (CAL) independently assessed the stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures. The USGS experiment represented a test of sample stability under a diverse range of conditions, whereas the CAL experiment was a controlled test of sample stability. In the experiment by the USGS, a statistically significant (α = 0.05) relation between [H+] and time was found for the composited filtered, natural, wet-deposition solution when all reported values are included in the analysis. However, if two outlying pH values most likely representing measurement error are excluded from the analysis, the change in [H+] over time was not statistically significant. In the experiment by the CAL, randomly selected samples were reanalyzed between July 1984 and February 1991. The original analysis and reanalysis pairs revealed that [H+] differences, although very small, were statistically different from zero, whereas specific-conductance differences were not. Nevertheless, the results of the CAL reanalysis project indicate there appears to be no consistent, chemically significant degradation in sample integrity with regard to [H+] and specific conductance while samples are stored at room temperature at the CAL. Based on the results of the CAL and USGS studies, short-term (45-60 day) stability of [H+] and specific conductance in natural filtered wet-deposition samples that are shipped and stored unchilled at ambient temperatures was satisfactory.
Statistical analysis of vehicle crashes in Mississippi based on crash data from 2010 to 2014.
DOT National Transportation Integrated Search
2017-08-15
Traffic crash data from 2010 to 2014 were collected by Mississippi Department of Transportation (MDOT) and extracted for the study. Three tasks were conducted in this study: (1) geographic distribution of crashes; (2) descriptive statistics of crash ...
Analysis of Statistical Methods Currently used in Toxicology Journals
Na, Jihye; Yang, Hyeri
2014-01-01
Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted based on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and the mode being 6 and 3 & 6, respectively. Mean (105/113, 93%) was dominantly used to measure central tendency, and standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why the methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being most popular (52/93, 56%), yet few studies conducted either a normality or an equal variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health. PMID:25343012
Analysis of Statistical Methods Currently used in Toxicology Journals.
Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min
2014-09-01
Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted based on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described the methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and the mode being 6 and 3 & 6, respectively. Mean (105/113, 93%) was dominantly used to measure central tendency, and standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why the methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being most popular (52/93, 56%), yet few studies conducted either a normality or an equal variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.
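The descriptive measures tallied in this survey (mean, SD, SEM) are quick to state precisely. A minimal stdlib sketch, using an invented endpoint with the papers' modal sample size of n = 6:

```python
# Sketch: the three descriptive statistics most often reported in the
# surveyed toxicology papers.
import math

def describe(xs):
    """Return (mean, sample standard deviation, standard error of the mean)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))  # n-1: sample SD
    sem = sd / math.sqrt(n)
    return mean, sd, sem
```

Note the distinction the survey highlights: SD describes spread in the data, while SEM (SD/√n) describes precision of the mean estimate, so reporting SEM without n understates variability.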
Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok
2017-09-01
This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.
75 FR 160 - Paperwork Reduction Act; 30-Day Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... Cohen, ONDCP, Office of Research and Data Analysis. Dated: December 29, 2009. Daniel R. Petersen, Deputy... support statistical trend analysis. Frequency: Ten sites will each conduct two cycles of surveys from 250...
Rákosi, Csilla
2018-01-22
This paper proposes the use of the tools of statistical meta-analysis as a method of conflict resolution with respect to experiments in cognitive linguistics. With the help of statistical meta-analysis, the effect size of similar experiments can be compared, a well-founded and robust synthesis of the experimental data can be achieved, and possible causes of any divergence(s) in the outcomes can be revealed. This application of statistical meta-analysis offers a novel method of how diverging evidence can be dealt with. The workability of this idea is exemplified by a case study dealing with a series of experiments conducted as non-exact replications of Thibodeau and Boroditsky (PLoS ONE 6(2):e16782, 2011. https://doi.org/10.1371/journal.pone.0016782 ).
Periodontal disease and carotid atherosclerosis: A meta-analysis of 17,330 participants.
Zeng, Xian-Tao; Leng, Wei-Dong; Lam, Yat-Yin; Yan, Bryan P; Wei, Xue-Mei; Weng, Hong; Kwong, Joey S W
2016-01-15
The association between periodontal disease and carotid atherosclerosis has been evaluated primarily in single-center studies, and whether periodontal disease is an independent risk factor of carotid atherosclerosis remains uncertain. This meta-analysis aimed to evaluate the association between periodontal disease and carotid atherosclerosis. We searched PubMed and Embase for relevant observational studies up to February 20, 2015. Two authors independently extracted data from included studies, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was assessed by the chi-squared test (P<0.1 for statistical significance) and quantified by the I(2) statistic. Data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software. Fifteen observational studies involving 17,330 participants were included in the meta-analysis. The overall pooled result showed that periodontal disease was associated with carotid atherosclerosis (OR: 1.27, 95% CI: 1.14-1.41; P<0.001) but statistical heterogeneity was substantial (I(2)=78.90%). Subgroup analysis of adjusted smoking and diabetes mellitus showed borderline significance (OR: 1.08; 95% CI: 1.00-1.18; P=0.05). Sensitivity and cumulative analyses both indicated that our results were robust. Findings of our meta-analysis indicated that the presence of periodontal disease was associated with carotid atherosclerosis; however, further large-scale, well-conducted clinical studies are needed to explore the precise risk of developing carotid atherosclerosis in patients with periodontal disease. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
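The inverse-variance pooling and I² heterogeneity statistic reported above reduce to simple arithmetic; a sketch with hypothetical study-level odds ratios (not the 15 included studies):

```python
import math

# Hypothetical study-level odds ratios with 95% CIs, as (OR, lo, hi).
studies = [(1.4, 1.1, 1.8), (1.1, 0.9, 1.4), (1.6, 1.2, 2.2)]

log_or = [math.log(or_) for or_, lo, hi in studies]
# Standard error recovered from the CI width: (ln hi - ln lo) / (2 * 1.96).
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights

pooled_log = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)
pooled_or = math.exp(pooled_log)

# Cochran's Q and the I^2 statistic used in the abstract to quantify
# between-study heterogeneity.
q = sum(wi * (yi - pooled_log) ** 2 for wi, yi in zip(w, log_or))
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
print(f"pooled OR={pooled_or:.2f}, Q={q:.2f}, I2={i2:.1f}%")
```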
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use. PMID:26254160
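The model search AutoVAR automates can be illustrated with a numpy-only sketch: fit a VAR(p) by least squares for several lags and rank the fits by AIC. The data and coefficients below are simulated, and statsmodels' VAR class provides a full implementation of this step:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a bivariate time series (e.g., two EMA mood variables) from a
# known VAR(1) process; coefficients are illustrative, not from the paper.
A = np.array([[0.5, 0.2], [0.0, 0.4]])
T = 200
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(0, 1, 2)

def var_aic(y, p):
    """Least-squares fit of a VAR(p); returns an AIC value, mimicking the
    model-selection step that AutoVAR performs across its search space."""
    T, k = y.shape
    rows = T - p
    # Regressors: lagged values y[t-1], ..., y[t-p], plus an intercept.
    X = np.hstack([y[p - i - 1 : T - i - 1] for i in range(p)])
    X = np.hstack([np.ones((rows, 1)), X])
    Y = y[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / rows  # residual covariance
    return rows * np.log(np.linalg.det(sigma)) + 2 * B.size

aics = {p: var_aic(y, p) for p in (1, 2, 3)}
best = min(aics, key=aics.get)
print("AIC by lag:", aics, "-> selected lag", best)
```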
ERIC Educational Resources Information Center
Ling, Guo
2017-01-01
The author conducted sampling and statistical analysis of papers on education policy research collected by the China National Knowledge Infrastructure in the period from the years 2004--2013. Under the current state of education policy research in China, the number of papers correlates positively with the year; the papers are concentrated in…
ERIC Educational Resources Information Center
Xia, Tian; Shumin, Zhang; Yifeng, Wu
2016-01-01
We utilized cross tabulation statistics, word frequency counts, and content analysis of research output to conduct a bibliometric study, and used CiteSpace software to depict a knowledge map for research on entrepreneurship education in China from 2004 to 2013. The study shows that, in this duration, the study of Chinese entrepreneurship education…
Hu, Xiangdong; Liu, Yujiang; Qian, Linxue
2017-10-01
Real-time elastography (RTE) and shear wave elastography (SWE) are noninvasive and readily available imaging techniques that measure tissue strain, and it has been reported that the sensitivity and specificity of elastography in differentiating between benign and malignant thyroid nodules are better than those of conventional technologies. Relevant articles were searched in multiple databases; the comparison of elasticity index (EI) was conducted with Review Manager 5.0. Forest plots of sensitivity and specificity and SROC curves of RTE and SWE were produced with STATA 10.0 software. In addition, sensitivity analysis and bias analysis of the studies were conducted to examine the quality of the articles, and to estimate possible publication bias, a funnel plot was used and the Egger test was conducted. Finally, 22 articles that satisfied the inclusion criteria were included in this study. After eliminating ineligible cases, 2106 benign and 613 malignant nodules remained. The meta-analysis suggested that the difference in EI between benign and malignant nodules was statistically significant (SMD = 2.11, 95% CI [1.67, 2.55], P < .00001). The overall sensitivities of RTE and SWE were roughly comparable, whereas the difference in specificities between these 2 methods was statistically significant. In addition, a statistically significant difference in AUC between RTE and SWE was observed (P < .01). The specificity of RTE was significantly higher than that of SWE, which suggests that, compared with SWE, RTE may be more accurate in differentiating benign and malignant thyroid nodules.
Primer of statistics in dental research: part I.
Shintani, Ayumi
2014-01-01
Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal for many dental researchers, in part due to statistical authorities' limitations in explaining statistical principles to health researchers without elaborate mathematical concepts. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-value and confidence interval, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow to cover the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.
Evaluation of the Kinetic Property of Single-Molecule Junctions by Tunneling Current Measurements.
Harashima, Takanori; Hasegawa, Yusuke; Kiguchi, Manabu; Nishino, Tomoaki
2018-01-01
We investigated the formation and breaking of single-molecule junctions of two kinds of dithiol molecules by time-resolved tunneling current measurements in a metal nanogap. The resulting current trajectory was statistically analyzed to determine the single-molecule conductance and, more importantly, to reveal the kinetic property of the single-molecular junction. These results suggested that combining a measurement of the single-molecule conductance and statistical analysis is a promising method to uncover the kinetic properties of the single-molecule junction.
NASA Astrophysics Data System (ADS)
Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan
2015-09-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.
Grbovic, Vesna; Jurisic-Skevin, Aleksandra; Djukic, Svetlana; Stefanović, Srdjan; Nurkovic, Jasmin
2016-01-01
[Purpose] Painful diabetic polyneuropathy occurs as a complication in 16% of all patients with diabetes mellitus. [Subjects and Methods] A clinical, prospective, open-label randomized intervention study was conducted in 60 adult patients with type 2 diabetes mellitus and distal sensorimotor diabetic neuropathy, divided into two groups of 30 patients. Patients in group A were treated with combined physical procedures, and patients in group B were treated with alpha-lipoic acid. [Results] There were statistically significant improvements in terminal latency and the amplitude of the action potential in group A patients, while group B patients showed statistically significant improvements in conduction velocity and terminal latency of n. peroneus. Both groups also showed statistically significant improvements in conduction velocity and terminal latency, reflected in significant improvements in the electrophysiological parameters (conduction velocity, amplitude, and latency) of the motor and sensory nerves (n. peroneus, n. suralis). [Conclusion] These results present further evidence justifying the use of physical agents in the treatment of diabetic sensorimotor polyneuropathy. PMID:27065527
Conducting Multilevel Analyses in Medical Education
ERIC Educational Resources Information Center
Zyphur, Michael J.; Kaplan, Seth A.; Islam, Gazi; Barsky, Adam P.; Franklin, Michael S.
2008-01-01
A significant body of education literature has begun using multilevel statistical models to examine data that reside at multiple levels of analysis. In order to provide a primer for medical education researchers, the current work gives a brief overview of some issues associated with multilevel statistical modeling. To provide an example of this…
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876
Physiological Efficacy of a Lightweight Ambient Air Cooling Unit for Various Applications
1993-10-01
[Figure captions: 10. Mean skin temperature during continuous work. 11. Thermal comfort rating during continuous work.] Ratings of perceived exertion (RPE) and thermal comfort (TC) were taken every 10 min. Statistical analysis using a 3-way analysis of variance (ANOVA) was conducted... may account for the fact that no statistically significant differences were seen for thermal comfort and ratings of perceived exertion between the IC
Connell, J.F.; Bailey, Z.C.
1989-01-01
A total of 338 single-well aquifer tests from Bear Creek and Melton Valley, Tennessee were statistically grouped to estimate hydraulic conductivities for the geologic formations in the valleys. A cross-sectional simulation model linked to a regression model was used to further refine the statistical estimates for each of the formations and to improve understanding of ground-water flow in Bear Creek Valley. Median hydraulic-conductivity values were used as initial values in the model. Model-calculated estimates of hydraulic conductivity were generally lower than the statistical estimates. Simulations indicate that (1) the Pumpkin Valley Shale controls groundwater flow between Pine Ridge and Bear Creek; (2) all the recharge on Chestnut Ridge discharges to the Maynardville Limestone; (3) the formations having smaller hydraulic gradients may have a greater tendency for flow along strike; (4) local hydraulic conditions in the Maynardville Limestone cause inaccurate model-calculated estimates of hydraulic conductivity; and (5) the conductivity of deep bedrock neither affects the results of the model nor does it add information on the flow system. Improved model performance would require: (1) more water level data for the Copper Ridge Dolomite; (2) improved estimates of hydraulic conductivity in the Copper Ridge Dolomite and Maynardville Limestone; and (3) more water level data and aquifer tests in deep bedrock. (USGS)
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
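A percentile-bootstrap power estimate of the kind described can be sketched in plain numpy. This is a simplified illustration of the approach, not the bmem package itself, and all parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def indirect(x, m, y):
    # a-path: slope of M on X; b-path: slope of Y on M, controlling for X.
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]
    return a * b

def mediation_power(n=100, a=0.4, b=0.4, reps=100, boots=100):
    """Monte Carlo power for the indirect effect a*b using a percentile
    bootstrap CI: the fraction of simulated datasets whose 95% bootstrap
    interval excludes zero."""
    hits = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)  # mediator
        y = b * m + rng.normal(size=n)  # outcome
        est = np.empty(boots)
        for i in range(boots):
            idx = rng.integers(0, n, n)  # resample cases with replacement
            est[i] = indirect(x[idx], m[idx], y[idx])
        lo, hi = np.percentile(est, [2.5, 97.5])
        hits += (lo > 0) or (hi < 0)
    return hits / reps

power = mediation_power()
print("estimated power:", power)
```

Skewed or heavy-tailed residuals can be substituted for the normal draws to reproduce the nonnormal conditions the paper emphasizes.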
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
NASA Astrophysics Data System (ADS)
Reuter, Matthew; Tschudi, Stephen
When investigating the electrical response properties of molecules, experiments often measure conductance whereas computation predicts transmission probabilities. Although the Landauer-Büttiker theory relates the two in the limit of coherent scattering through the molecule, a direct comparison between experiment and computation can still be difficult: experimental data (specifically those from break junctions) are statistical, whereas computational results are deterministic. Many studies compare the most probable experimental conductance with computation, but such an analysis discards almost all of the experimental statistics. In this work we develop tools to decipher the Landauer-Büttiker transmission function directly from experimental statistics and then apply them to enable a fairer comparison between experimental and computational results.
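The Landauer-Büttiker relation that links the two quantities is G = G0 · T with G0 = 2e²/h; a minimal numeric sketch, in which the transmission value and histogram spread are illustrative assumptions rather than values from the work:

```python
import numpy as np

# Conductance quantum G0 = 2e^2 / h, from the exact SI constants.
e = 1.602176634e-19   # elementary charge, C
h = 6.62607015e-34    # Planck constant, J s
G0 = 2 * e**2 / h     # ~77.48 microsiemens

# For a single-molecule junction with one dominant channel of
# transmission T, the coherent conductance is G = G0 * T.
T = 1e-3              # illustrative transmission probability
G = G0 * T

# A break-junction experiment yields statistics over many junctions; a
# simulated log-normal spread of transmissions mimics such a histogram,
# whose peak recovers the most probable transmission.
rng = np.random.default_rng(3)
samples = G0 * 10 ** rng.normal(np.log10(T), 0.3, 5000)
peak = 10 ** np.median(np.log10(samples / G0))
print(f"G0 = {G0:.4e} S, most probable T ~ {peak:.1e}")
```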
ERIC Educational Resources Information Center
Parsad, Basmat; Lewis, Laurie
This study, conducted through the Postsecondary Quick Information System (PEQIS) of the National Center for Education Statistics, was designed to provide current national estimates of the prevalence and characteristics of remedial courses and enrollments in degree-granting 2-year and 4-year postsecondary institutions that enrolled freshmen in fall…
Finding Balance at the Elusive Mean
ERIC Educational Resources Information Center
Hudson, Rick A.
2012-01-01
Data analysis plays an important role in people's lives. Citizens need to be able to conduct critical analyses of statistical information in the work place, in their personal lives, and when portrayed by the media. However, becoming a conscientious consumer of statistics is a gradual process. The experiences that students have with data in the…
Forest wildlife statistics for New Hampshire - 1983
Robert T. Brooks; Thomas S. Frieswyk; Anne M. Malley
1987-01-01
This is a statistical report on the first forest wildlife habitat survey of New Hampshire conducted in 1982-83 by the Forest Inventory, Analysis, and Economics Unit, Northeastern Forest Experiment Station, U.S. Department o f Agriculture, Broomall, Pennsylvania. Results are displayed in 58 tables covering forest area, ownership, land pattern, mast potential, standing...
Systems Analysis of NASA Aviation Safety Program: Final Report
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen
2013-01-01
A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.
A study of the feasibility of statistical analysis of airport performance simulation
NASA Technical Reports Server (NTRS)
Myers, R. H.
1982-01-01
The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.
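The kind of power computation described, with a non-Gaussian response, can be sketched by Monte Carlo; the gamma distribution and the shift size are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def anova_power(shift, n=30, reps=500, alpha=0.05):
    """Monte Carlo power of one-way ANOVA when the response is skewed
    (gamma-distributed), echoing the study's concern that capacity is
    non-Gaussian. All numbers are illustrative."""
    hits = 0
    for _ in range(reps):
        cond_a = rng.gamma(2.0, 2.0, n)          # baseline condition
        cond_b = rng.gamma(2.0, 2.0, n) + shift  # mean shifted by `shift`
        _, p = stats.f_oneway(cond_a, cond_b)
        hits += p < alpha
    return hits / reps

p0 = anova_power(0.0)  # no true difference: rejection rate near alpha
p1 = anova_power(1.0)  # true shift: rejection rate estimates power
print(f"type I rate ~ {p0:.3f}, power at shift 1.0 ~ {p1:.3f}")
```

Comparing the no-shift rejection rate against the nominal alpha is exactly how the robustness of ANOVA to a non-Gaussian distribution can be checked.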
The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused in a particular type of primary data. Most available softwares have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More important, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
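The fixed-effect and random-effects calculations that such a spreadsheet guide implements can be sketched in a few lines of Python. This is a generic inverse-variance pooling with the DerSimonian-Laird between-study variance estimate, not the authors' Excel template, and the three study effects and variances below are invented numbers:

```python
import math

def meta_analysis(effects, variances, random_effects=False):
    """Inverse-variance pooling of per-study effect estimates.

    With random_effects=True, the DerSimonian-Laird tau^2 estimate
    widens the weights to absorb between-study heterogeneity.
    Returns the pooled estimate and its 95% confidence interval.
    """
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    if random_effects:
        # Cochran's Q and the DL estimate of between-study variance
        q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)
        w = [1.0 / (v + tau2) for v in variances]
        pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical prevalence estimates with their variances
est, ci = meta_analysis([0.10, 0.15, 0.12], [0.0004, 0.0009, 0.0001])
```

The same per-study inputs drive the forest plot: each study is drawn at its effect with a 1.96-standard-error whisker, and the pooled diamond spans the confidence interval returned here.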
SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series
Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory
2018-03-07
This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
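The 7Q10 statistic this report centers on can be illustrated with a small sketch: take the minimum 7-day moving-average flow of each year, fit a distribution to those annual minima, and read off the once-in-10-years low. SWToolbox uses its own fitting procedures (low-flow frequency analysis commonly uses log-Pearson Type III); the sketch below substitutes a simple log-normal fit, and the two shortened "years" of daily flows are made-up numbers:

```python
import math
from statistics import NormalDist, mean, stdev

def seven_day_minima(daily_by_year):
    """Annual minimum of the 7-day moving-average flow."""
    minima = []
    for flows in daily_by_year:
        avgs = [sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6)]
        minima.append(min(avgs))
    return minima

def q7_10(daily_by_year):
    """Sketch of a 7Q10 estimate: fit a log-normal distribution to the
    annual 7-day minima and return its 10th percentile, i.e. the flow
    with a 10% chance of not being exceeded in any given year."""
    logs = [math.log(m) for m in seven_day_minima(daily_by_year)]
    return math.exp(NormalDist(mean(logs), stdev(logs)).inv_cdf(0.10))

# Two hypothetical (shortened) years of daily flow records
years = [[12, 11, 10, 9, 9, 10, 11, 13, 15],
         [8, 8, 7, 7, 8, 9, 10, 11, 12]]
low = q7_10(years)
```

A real analysis would use complete water years of daily discharge and many years of record; the moving-average-then-annual-minimum step is the part all 7-day low-flow statistics share.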
Point pattern analysis of FIA data
Chris Woodall
2002-01-01
Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...
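One elementary point-pattern statistic of the kind used on such stem maps is the Clark-Evans nearest-neighbour index: the ratio of the observed mean nearest-neighbour distance to the value expected under complete spatial randomness. A minimal sketch, ignoring edge corrections; the 2 x 2 stem map below is hypothetical, not FIA data:

```python
import math

def clark_evans(points, area):
    """Clark-Evans nearest-neighbour index R for a point pattern.

    R = observed mean nearest-neighbour distance divided by the CSR
    expectation 0.5 / sqrt(density).  R < 1 suggests clustering,
    R > 1 regularity (edge effects are ignored in this sketch).
    """
    n = len(points)
    nn = []
    for i, (x, y) in enumerate(points):
        d = min(math.hypot(x - a, y - b)
                for j, (a, b) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# Hypothetical 10 m x 10 m stem map: a regular 2 x 2 grid of trees
grid = [(2.5, 2.5), (2.5, 7.5), (7.5, 2.5), (7.5, 7.5)]
r = clark_evans(grid, area=100.0)
```

For the evenly spaced grid the index comes out above 1, matching the intuition that planted or thinned stands tend toward regularity while regeneration clumps score below 1.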
Advanced microwave soil moisture studies. [Big Sioux River Basin, Iowa
NASA Technical Reports Server (NTRS)
Dalsted, K. J.; Harlan, J. C.
1983-01-01
Comparisons of low-level L-band brightness temperature (TB) and thermal infrared (TIR) data, together with the following data sets: a soil map and land cover data, direct soil moisture measurements, and a computer-generated contour map, were statistically evaluated using regression analysis and linear discriminant analysis. Regression analysis of footprint data shows that statistical groupings of ground variables (soil features and land cover) hold promise for qualitative assessment of soil moisture and for reducing variance within the sampling space. Dry conditions appear to be more conducive to producing meaningful statistics than wet conditions. Regression analysis using field-averaged TB and TIR data did not approach the higher R² values obtained using within-field variations. The linear discriminant analysis indicates some capacity to distinguish categories, with the results being somewhat better on a field basis than on a footprint basis.
The Content of Statistical Requirements for Authors in Biomedical Research Journals
Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang
2016-01-01
Background: Robust statistical design, sound statistical analysis, and standardized presentation are important for enhancing the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers can give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each included journal or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of the statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation, respectively. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, state particular statistical requirements. Most of these requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation" and "statistical methods and the reasons." Conclusions: Statistical requirements for authors are becoming increasingly comprehensive. 
Statistical requirements for authors remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would provide stronger evidence and make critical appraisal of that evidence more accessible. PMID:27748343
16 CFR 1000.26 - Directorate for Epidemiology.
Code of Federal Regulations, 2010 CFR
2010-01-01
... things, incidents associated with consumer products, based on news clips, medical examiner reports, hotline reports, Internet complaints, and referrals. The Hazard Analysis Division conducts statistical...
Twitter Use in Libraries: An Exploratory Analysis
ERIC Educational Resources Information Center
Aharony, Noa
2010-01-01
Microblogging is a relatively new phenomenon in online social networking that has become increasingly prevalent in the last few years. This study explores the use of Twitter in public and academic libraries to understand microblogging patterns. Analysis of the tweets was conducted in two phases: (1) statistical descriptive analysis and (2) content…
Efficiency Analysis of Public Universities in Thailand
ERIC Educational Resources Information Center
Kantabutra, Saranya; Tang, John C. S.
2010-01-01
This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…
Mental-behavioral health data: 2001 NHIS.
Lied, Terry R
2004-01-01
These data highlights are based on analysis of the 2001 National Health Interview Survey (NHIS) public use data (http://www.cdc. gov/nchs/nhis.htm). NHIS is a multi-purpose survey conducted by the National Center for Health Statistics, Centers for Disease Control and Prevention. NHIS has been conducted continuously since 1957.
NASA Astrophysics Data System (ADS)
Delyana, H.; Rismen, S.; Handayani, S.
2018-04-01
This research is a development study using the 4-D design model (define, design, develop, and disseminate). The define stage comprised several needs analyses: syllabus analysis, textbook analysis, student-characteristics analysis, and literature analysis. The textbook analysis showed that students still have difficulty understanding the two textbooks they are required to own; the books' presentation does not help students learn independently and discover concepts on their own, and they provide no guidance on data processing with the R software. The developed module was judged valid by the experts. Field trials were then conducted to determine its practicality and effectiveness. The trial involved four randomly selected students of the Mathematics Education Study Program of STKIP PGRI who had not yet taken the Basic Statistics course. The practicality aspects considered were ease of use, time efficiency, ease of interpretation, and equivalence, with scores of 3.7, 3.79, 3.7, and 3.78, respectively. Based on the trial results, students considered the module very practical to use in learning, meaning the developed module can be used by students in Elementary Statistics learning.
The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.
Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve
2013-09-01
The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. 
This will minimise analytical bias and conforms to current best practice in conducting clinical trials.
Hu, Xiangdong; Liu, Yujiang; Qian, Linxue
2017-01-01
Background: Real-time elastography (RTE) and shear wave elastography (SWE) are noninvasive and readily available imaging techniques that measure tissue strain, and it has been reported that the sensitivity and specificity of elastography in differentiating between benign and malignant thyroid nodules are better than those of conventional technologies. Methods: Relevant articles were searched in multiple databases; the comparison of the elasticity index (EI) was conducted with Review Manager 5.0. Forest plots of sensitivity and specificity and the SROC curves of RTE and SWE were produced with STATA 10.0 software. In addition, sensitivity and bias analyses of the studies were conducted to examine the quality of the articles, and to estimate possible publication bias a funnel plot was used and the Egger test was conducted. Results: Twenty-two articles that satisfied the inclusion criteria were included in this study. After ineligible records were excluded, there were 2106 benign and 613 malignant nodules. The meta-analysis suggested that the difference in EI between benign and malignant nodules was statistically significant (SMD = 2.11, 95% CI [1.67, 2.55], P < .00001). The overall sensitivities of RTE and SWE were roughly comparable, whereas the difference in specificity between the 2 methods was statistically significant. In addition, a statistically significant difference in AUC was observed between RTE and SWE (P < .01). Conclusion: The specificity of RTE was statistically higher than that of SWE, which suggests that, compared with SWE, RTE may be more accurate in differentiating benign from malignant thyroid nodules. PMID:29068996
Statistics for demodulation RFI in inverting operational amplifier circuits
NASA Astrophysics Data System (ADS)
Sutu, Y.-H.; Whalen, J. J.
An investigation was conducted to determine statistical variations in RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. On the basis of the results obtained, three major recommendations for future investigations are presented: conduct additional measurements of demodulation RFI in inverting amplifiers, employ an automatic measurement system, and conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.
ERIC Educational Resources Information Center
Armijo, Michael; Lundy-Wagner, Valerie; Merrill, Elizabeth
2012-01-01
This paper asks how doctoral students understand the use of race variables in statistical modeling. More specifically, it examines how doctoral students at two universities are trained to define, operationalize, and analyze race variables. The authors interviewed students and instructors in addition to conducting a document analysis of their texts…
A Computer Evolution in Teaching Undergraduate Time Series
ERIC Educational Resources Information Center
Hodgess, Erin M.
2004-01-01
In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package; R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…
1997-06-01
career success for academy graduates relative to officers commissioned from other sources. Favoritism occurs if high-ranking officers who are service... career success as a naval officer? 6 The thesis investigates several databases in an effort to paint a complete statistical picture of naval officer...including both public and private sector career success was conducted by the Standard & Poor’s Corporation with a related analysis by Professor Michael Useem
Ion Channel Conductance Measurements on a Silicon-Based Platform
2006-01-01
calculated using the molecular dynamics code, GROMACS . Reasonable agreement is obtained in the simulated versus measured conductance over the range of...measurements of the lipid giga-seal characteristics have been performed, including AC conductance measurements and statistical analysis in order to...Dynamics kernel self-consistently coupled to Poisson equations using a P3M force field scheme and the GROMACS description of protein structure and
Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun
2008-05-28
Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association analysis, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently demand high computing power. It is possible to develop parallel algorithms and code to perform the calculations on a high-performance computing (HPC) system. However, most commonly used statistical packages for genetic studies are non-parallel. Alternatively, one may use grid computing technology and its packages to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the use of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute nodes, FBAT jobs ran about 14.4-15.9 times faster, and Unphased jobs 1.1-18.6 times faster, compared with the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.
Diagnosis checking of statistical analysis in RCTs indexed in PubMed.
Lee, Paul H; Tse, Andy C Y
2017-11-01
Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs) and for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether its assumptions hold. Our objective was to review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle the assumptions of statistical analysis. We identified all RCTs published in December 2014 in PubMed-indexed journals using the Cochrane highly sensitive search strategy, with the journals' 2014 impact factors used as proxies for their quality, and reviewed the type of statistical analysis used and whether its assumptions were tested. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnostic checking was rarely conducted: only 20%, 8·6% and 7% checked the assumptions of the generalized linear model, the Cox proportional hazards model and the multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnostic checking. Diagnostic checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines on reporting diagnostic checks of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
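A concrete example of the diagnosis checking this review looks for: before relying on a normal-theory crude analysis, test the outcome's distribution. The sketch below uses the Jarque-Bera statistic (approximately chi-square with 2 degrees of freedom under normality; 5% critical value about 5.99) on two deterministic quantile-based "samples". It is a generic illustration of assumption checking, not the review's protocol:

```python
import math
from statistics import NormalDist

def jarque_bera(xs):
    """Jarque-Bera normality statistic: n/6 * (S^2 + K^2/4), where S is
    the sample skewness and K the excess kurtosis.  Under normality it
    is approximately chi-square with 2 df (5% critical value ~5.99)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    s = m3 / m2 ** 1.5          # skewness
    k = m4 / m2 ** 2 - 3.0      # excess kurtosis
    return n / 6.0 * (s ** 2 + k ** 2 / 4.0)

def looks_normal(xs, critical=5.99):
    return jarque_bera(xs) < critical

# Deterministic "samples": normal quantiles vs. exponential quantiles
n = 200
normal_sample = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
skewed_sample = [-math.log(1 - (i + 0.5) / n) for i in range(n)]
```

If `looks_normal` fails, a reasonable fallback is a rank-based method or a transformation, and the check itself belongs in the trial report.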
Bayesian statistics: estimating plant demographic parameters
James S. Clark; Michael Lavine
2001-01-01
There are times when external information should be brought to bear on an ecological analysis; experiments are never conducted in a knowledge-free context. The inference we draw from an observation may depend on everything else we know about the process. Bayesian analysis is a method that brings outside evidence into the analysis of experimental and observational data...
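The idea of bringing outside evidence into an analysis is easiest to see in a conjugate sketch: encode prior knowledge about, say, a seedling survival probability as a Beta distribution and update it with new binomial counts. The prior parameters and counts below are invented for illustration, not taken from the study:

```python
def posterior_survival(prior_a, prior_b, survived, died):
    """Beta-Binomial update: a Beta(a, b) prior on a survival
    probability combined with binomial data yields the posterior
    Beta(a + survived, b + died); return the posterior mean."""
    a, b = prior_a + survived, prior_b + died
    return a / (a + b)

# Outside evidence (say, earlier studies) encoded as Beta(8, 2);
# a new plot with 3 of 10 seedlings surviving pulls the estimate down.
prior_mean = 8 / (8 + 2)
post_mean = posterior_survival(8, 2, 3, 7)
```

The posterior mean lands between the prior mean (0.8) and the new data's raw rate (0.3), weighted by their effective sample sizes, which is exactly the "everything else we know" effect the passage describes.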
Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; ...
2014-11-17
In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.
Phantom Effects in Multilevel Compositional Analysis: Problems and Solutions
ERIC Educational Resources Information Center
Pokropek, Artur
2015-01-01
This article combines statistical and applied research perspective showing problems that might arise when measurement error in multilevel compositional effects analysis is ignored. This article focuses on data where independent variables are constructed measures. Simulation studies are conducted evaluating methods that could overcome the…
ERIC Educational Resources Information Center
Mansfield, Wendy; Farris, Elizabeth
This report provides results of a Fast Response Survey System (FRSS) study conducted by the National Center for Education Statistics for the Office for Civil Rights (OCR). The OCR wanted input for their decision-making process on possible modifications to their biennial survey of a national sample of public school districts (PSDs). The survey, the…
Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting
2018-08-30
Semen sojae praeparatum, with its homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometric methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum; 21 chemicals were selected and characterized by liquid chromatography with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy of "statistical analysis-based fingerprint establishment", which should be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of the working handbook, which uses data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some additional techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, and straight-line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
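Kendall's rank correlation coefficient, one of the trend-estimation techniques named above, can be computed against time in a few lines. This sketch is the simple tau-a form (pair counting without a tie correction), applied to an invented series of monthly problem-report counts rather than the handbook's data:

```python
def kendall_tau(xs):
    """Kendall rank correlation of a series against its time index
    (the Mann-Kendall style trend statistic).  tau = (concordant -
    discordant) / total pairs; tau near +1 indicates a rising trend,
    near -1 a falling one.  Ties contribute zero (tau-a, no
    tie correction)."""
    n = len(xs)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            if xs[j] > xs[i]:
                s += 1
            elif xs[j] < xs[i]:
                s -= 1
    return 2.0 * s / (n * (n - 1))

# Hypothetical monthly problem-report counts: a mostly rising series
reports = [3, 4, 4, 6, 5, 7, 8, 8, 10]
tau = kendall_tau(reports)
```

Because it uses only the ranks, the statistic is robust to the small samples and occasional zero-report months the handbook warns about, though a significance test still needs the variance of S under the null.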
Hussain, Bilal; Sultana, Tayyaba; Sultana, Salma; Al-Ghanim, Khalid Abdullah; Masoud, Muhammad Shahreef; Mahboob, Shahid
2018-04-01
Cirrhinus mrigala, Labeo rohita, and Catla catla are economically important fish for human consumption in Pakistan, but industrial and sewage pollution has drastically reduced their population in the River Chenab. Statistics are an important tool to analyze and interpret comet assay results. The specific aims of the study were to determine the DNA damage in Cirrhinus mrigala, Labeo rohita, and Catla catla due to chemical pollution and to assess the validity of statistical analyses to determine the viability of the comet assay for a possible use with these freshwater fish species as a good indicator of pollution load and habitat degradation. Comet assay results indicated a significant (P < 0.05) degree of DNA fragmentation in Cirrhinus mrigala followed by Labeo rohita and Catla catla in respect to comet head diameter, comet tail length, and % DNA damage. Regression analysis and correlation matrices conducted among the parameters of the comet assay affirmed the precision and the legitimacy of the results. The present study, therefore, strongly recommends that genotoxicological studies conduct appropriate analysis of the various components of comet assays to offer better interpretation of the assay data.
Patterson, Megan S; Goodson, Patricia
2017-05-01
Compulsive exercise, a form of unhealthy exercise often associated with prioritizing exercise and feeling guilty when exercise is missed, is a common precursor to and symptom of eating disorders. College-aged women are at high risk of exercising compulsively compared with other groups. Social network analysis (SNA) is a theoretical perspective and methodology that allows researchers to observe the effects of relational dynamics on people's behaviors. SNA was used to assess the relationship between compulsive exercise and body dissatisfaction, physical activity, and network variables. Descriptive statistics were computed using SPSS, and quadratic assignment procedure (QAP) analyses were conducted using UCINET. QAP regression analysis revealed a statistically significant model (R² = .375, P < .0001) predicting compulsive exercise behavior. Physical activity, body dissatisfaction, and network variables were statistically significant predictors in the QAP regression model. In our sample, women who are connected to "important" or "powerful" people in their network are likely to have higher compulsive exercise scores. This result gives healthcare practitioners key target points for intervention within similar groups of women. For scholars researching eating disorders and associated behaviors, this study supports examining group dynamics and network structure in conjunction with body dissatisfaction and exercise frequency.
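The quadratic assignment procedure used here tests the association between two relational matrices by jointly permuting the rows and columns of one matrix and comparing the recomputed correlation with the observed one. A minimal sketch (the 5-actor matrix is invented, and UCINET's implementation differs in details):

```python
import random

def qap_pvalue(a, b, permutations=500, seed=2):
    """QAP sketch: Pearson correlation between the off-diagonal cells
    of two n x n relational matrices, with significance judged against
    correlations after jointly permuting rows and columns of b."""
    n = len(a)
    off = [(i, j) for i in range(n) for j in range(n) if i != j]

    def corr(m1, m2):
        xs = [m1[i][j] for i, j in off]
        ys = [m2[i][j] for i, j in off]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs)
               * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den

    observed = corr(a, b)
    rng = random.Random(seed)
    hits = 0
    for _ in range(permutations):
        p = list(range(n))
        rng.shuffle(p)  # relabel actors, preserving b's structure
        pb = [[b[p[i]][p[j]] for j in range(n)] for i in range(n)]
        if corr(a, pb) >= observed:
            hits += 1
    return observed, hits / permutations

# Hypothetical 5-actor relational matrix, compared with itself
net = [[abs(i - j) for j in range(5)] for i in range(5)]
obs, p = qap_pvalue(net, net)
```

Permuting actors rather than individual cells is the point of QAP: it respects the row/column dependence of network data that ordinary regression assumptions ignore.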
The multiple imputation method: a case study involving secondary data analysis.
Walani, Salimah R; Cleland, Charles M
2015-05-01
This study illustrates, using a secondary data analysis as an example, the use of the multiple imputation method to replace missing data. Most large public datasets have missing data, which must be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. The data source was the 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained-equation method, used imputation diagnostics procedures, and conducted regression analysis of the imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. Multiple imputation procedures were used to replace missing values in a large dataset with 29,059 observations, and five multiply imputed datasets were created. Imputation diagnostics using time series and density plots showed that the imputation was successful. The authors also present an example of using the multiply imputed datasets in a regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained-equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to apply this technique to large datasets. The authors recommend that nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
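The chained-equation idea can be sketched for a single incomplete variable: regress it on an observed covariate, then fill each gap with the regression prediction plus randomly resampled residual noise, repeating to obtain several completed datasets whose estimates are later pooled (Rubin's rules). This is a toy single-equation version, not the authors' model, and the wage-experience pairs below are invented:

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def multiply_impute(pairs, m=5, seed=1):
    """Sketch of multiple imputation for a y-variable with gaps: fit
    y ~ x on complete cases, then draw each missing y as the regression
    prediction plus a resampled residual.  Returns m completed
    datasets; analyses run on each are then pooled."""
    complete = [(x, y) for x, y in pairs if y is not None]
    a, b = fit_line(*zip(*complete))
    residuals = [y - (a + b * x) for x, y in complete]
    rng = random.Random(seed)
    datasets = []
    for _ in range(m):
        filled = [(x, y if y is not None
                   else a + b * x + rng.choice(residuals))
                  for x, y in pairs]
        datasets.append(filled)
    return datasets

# Hypothetical wage-vs-experience data with two missing wages
data = [(1, 10.0), (2, 12.5), (3, 14.5), (4, None), (5, 19.0), (6, None)]
imputed = multiply_impute(data)
```

Adding residual noise (rather than imputing the bare prediction) is what preserves the data's variance; running the substantive regression on all five datasets and combining the estimates is what propagates the imputation uncertainty.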
NASA Astrophysics Data System (ADS)
Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.
2014-12-01
The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
Statistical issues in the design, conduct and analysis of two large safety studies.
Gaffney, Michael
2016-10-01
The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Naumovich, E. N.; Kharton, V. V.; Yaremchenko, A. A.; Patrakeev, M. V.; Kellerman, D. G.; Logvinovich, D. I.; Kozhevnikov, V. L.
2006-08-01
A statistical thermodynamic approach to analyze defect thermodynamics in strongly nonideal solid solutions was proposed and validated by a case study focused on the oxygen intercalation processes in mixed-conducting LaGa0.65Mg0.15Ni0.20O3-δ perovskite. The oxygen nonstoichiometry of Ni-doped lanthanum gallate, measured by coulometric titration and thermogravimetric analysis at 923-1223 K in the oxygen partial pressure range 5×10⁻⁵ to 0.9 atm, indicates the coexistence of Ni2+, Ni3+, and Ni4+ oxidation states. The formation of tetravalent nickel was also confirmed by the magnetic susceptibility data at 77-600 K, and by the analysis of p-type electronic conductivity and the Seebeck coefficient as functions of the oxygen pressure at 1023-1223 K. The oxygen thermodynamics and the partial ionic and hole conductivities are strongly affected by the point-defect interactions, primarily the Coulombic repulsion between oxygen vacancies and/or electron holes and the vacancy association with Mg2+ cations. These factors can be analyzed by introducing the defect interaction energy in the concentration-dependent part of defect chemical potentials expressed by the discrete Fermi-Dirac distribution, and taking into account the probabilities of local configurations calculated via binomial distributions.
Marketing of Personalized Cancer Care on the Web: An Analysis of Internet Websites
Cronin, Angel; Bair, Elizabeth; Lindeman, Neal; Viswanath, Vish; Janeway, Katherine A.
2015-01-01
Internet marketing may accelerate the use of care based on genomic or tumor-derived data. However, online marketing may be detrimental if it endorses products of unproven benefit. We conducted an analysis of Internet websites to identify personalized cancer medicine (PCM) products and claims. A Delphi Panel categorized PCM as standard or nonstandard based on evidence of clinical utility. Fifty-five websites, sponsored by commercial entities, academic institutions, physicians, research institutes, and organizations, that marketed PCM included somatic (58%) and germline (20%) analysis, interpretive services (15%), and physicians/institutions offering personalized care (44%). Of 32 sites offering somatic analysis, 56% included specific test information (range 1–152 tests). All statistical tests were two-sided, and comparisons of website content were conducted using McNemar’s test. More websites contained information about the benefits than limitations of PCM (85% vs 27%, P < .001). Websites specifying somatic analysis were statistically significantly more likely to market one or more nonstandard tests as compared with standard tests (88% vs 44%, P = .04). PMID:25745021
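The website comparisons above rely on McNemar's test for paired proportions. A minimal sketch of the asymptotic form of the test (the study may well have used the exact version); the discordant-cell counts here are hypothetical, since the abstract reports only the marginal percentages:

```python
import math

def mcnemar(b, c):
    """Asymptotic McNemar chi-square for paired binary data.
    b and c are the two discordant cell counts; the p-value comes
    from the chi-square distribution with 1 df."""
    stat = (b - c) ** 2 / (b + c)
    p = math.erfc(math.sqrt(stat / 2))  # chi2(1) survival function
    return stat, p

# Hypothetical discordant counts: sites mentioning benefits but not
# limitations (b) vs. limitations but not benefits (c)
stat, p = mcnemar(b=33, c=1)
print(f"chi2 = {stat:.2f}, p = {p:.2g}")
```

Only the discordant pairs enter the statistic; concordant sites (those mentioning both or neither) carry no information about the difference in proportions.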
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale (texture) is proposed by multivariate statistical analysis based on instrumental measurement and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework method is verified by establishing a standard reference scale for the texture attribute (hardness) with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R² = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for quantitative standard reference scale establishment on food texture characteristics.
ERIC Educational Resources Information Center
Cooper, Harris; Patall, Erika A.
2009-01-01
The authors describe the relative benefits of conducting meta-analyses with (a) individual participant data (IPD) gathered from the constituent studies and (b) aggregated data (AD), or the group-level statistics (in particular, effect sizes) that appear in reports of a study's results. Given that both IPD and AD are equally available,…
HOW WELL ARE HYDRAULIC CONDUCTIVITY VARIATIONS APPROXIMATED BY ADDITIVE STABLE PROCESSES? (R826171)
Analysis of the higher statistical moments of a hydraulic conductivity (K) and an intrinsic permeability (k) data set leads to the conclusion that the increments of the data and the logs of the data are not governed by Levy-stable or Gaussian dis...
Effective Analysis of Reaction Time Data
ERIC Educational Resources Information Center
Whelan, Robert
2008-01-01
Most analyses of reaction time (RT) data are conducted by using the statistical techniques with which psychologists are most familiar, such as analysis of variance on the sample mean. Unfortunately, these methods are usually inappropriate for RT data, because they have little power to detect genuine differences in RT between conditions. In…
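One robust alternative commonly recommended for skewed reaction-time distributions is the trimmed mean, which discards the tails before averaging. A minimal illustration with invented RT values, not a reproduction of the article's analyses:

```python
def trimmed_mean(xs, prop=0.2):
    """Mean after removing the lowest and highest `prop` fraction of values."""
    xs = sorted(xs)
    k = int(len(xs) * prop)
    kept = xs[k:len(xs) - k]
    return sum(kept) / len(kept)

# Hypothetical right-skewed RTs (ms): a few slow trials inflate the mean
rts = [420, 450, 455, 460, 470, 480, 490, 510, 900, 1400]
mean = sum(rts) / len(rts)
print(f"mean = {mean:.0f} ms, 20% trimmed mean = {trimmed_mean(rts):.0f} ms")
```

The ordinary mean here is pulled far above the bulk of the distribution by two slow trials, while the trimmed mean stays near the typical response, which is the sensitivity problem the abstract alludes to.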
DOT National Transportation Integrated Search
2013-04-01
We analyzed the use of energy by Alaska's transportation sectors to assess the impact of sudden fuel price changes. We conducted three types of analysis: 1) Development of broad energy use statistics for each transportation sector, including t...
An Interinstitutional Analysis of Faculty Teaching Load.
ERIC Educational Resources Information Center
Ahrens, Stephen W.
A two-year interinstitutional study among 15 cooperating universities was conducted to determine whether significant differences exist in teaching loads among the selected universities as measured by student credit hours produced by full-time equivalent faculty. The statistical model was a multivariate analysis of variance with fixed effects and…
Guidelines for Using the "Q" Test in Meta-Analysis
ERIC Educational Resources Information Center
Maeda, Yukiko; Harwell, Michael R.
2016-01-01
The "Q" test is regularly used in meta-analysis to examine variation in effect sizes. However, the assumptions of "Q" are unlikely to be satisfied in practice prompting methodological researchers to conduct computer simulation studies examining its statistical properties. Narrative summaries of this literature are available but…
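Cochran's Q, the statistic the abstract examines, is computed directly from effect sizes and their sampling variances, and the related I² index rescales it as a percentage of variation beyond chance. A minimal sketch with hypothetical inputs:

```python
def q_and_i2(effects, variances):
    """Cochran's Q and Higgins' I^2 for k effect sizes, using
    inverse-variance weights."""
    w = [1 / v for v in variances]
    theta = sum(wi * e for wi, e in zip(w, effects)) / sum(w)  # fixed-effect mean
    q = sum(wi * (e - theta) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical effect sizes (d) and sampling variances from 4 studies
q, i2 = q_and_i2([0.30, 0.45, 0.10, 0.60], [0.02, 0.03, 0.025, 0.04])
print(f"Q = {q:.2f} on 3 df, I^2 = {i2:.0f}%")
```

Under homogeneity Q follows (approximately) a chi-square distribution with k − 1 degrees of freedom; the simulation literature the abstract summarizes exists precisely because that approximation can fail with few or small studies.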
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary –omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims to both simplify analysis for investigators new to metabolomics, as well as provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer’s workflow is specifically tailored to the unique characteristics and idiosyncrasies of postprocessed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy to understand statistically significant and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study in which samples were collected from mice exposed to gamma radiation was analyzed. 
MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings of process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
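The simulation component described above, using Monte Carlo draws to baseline and predict an index score, can be sketched in a few lines. The driver weights and distributions below are entirely hypothetical and do not reproduce the ACSI structural model:

```python
import random
import statistics

random.seed(42)  # reproducible draws

def simulate_index(n=10_000):
    """Crude Monte Carlo: an overall index as a weighted sum of three
    uncertain driver scores on a 0-100 scale (weights are assumptions)."""
    draws = []
    for _ in range(n):
        quality = random.gauss(80, 5)
        expectations = random.gauss(75, 6)
        value = random.gauss(70, 8)
        draws.append(0.5 * quality + 0.3 * expectations + 0.2 * value)
    return draws

scores = simulate_index()
print(f"baseline index ~ {statistics.mean(scores):.1f} "
      f"(sd {statistics.stdev(scores):.1f})")
```

Sensitivity analysis then amounts to perturbing one driver's distribution at a time and observing the shift in the simulated baseline, which is the kind of dashboard output the dissertation describes.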
Babu, Giridhara R; Murthy, G V S; Ana, Yamuna; Patel, Prital; Deepa, R; Neelon, Sara E Benjamin; Kinra, Sanjay; Reddy, K Srinath
2018-01-01
AIM: To perform a meta-analysis of the association of obesity with hypertension and type 2 diabetes mellitus (T2DM) among adults in India. METHODS: To conduct the meta-analysis, we performed a comprehensive electronic literature search in PubMed, CINAHL Plus, and Google Scholar. We restricted the analysis to studies with documentation of some measure of obesity, namely: body mass index, waist-hip ratio, waist circumference; and diagnosis of hypertension or diagnosis of T2DM. By obtaining summary estimates of all included studies, the meta-analysis was performed using both RevMan version 5 and the "metan" command in STATA version 11. Heterogeneity was measured by the I² statistic. Funnel plot analysis was conducted to assess study publication bias. RESULTS: Of the 956 studies screened, 18 met the eligibility criteria. The pooled odds ratio between obesity and hypertension was 3.82 (95%CI: 3.39 to 4.25). The heterogeneity around this estimate (I² statistic) was 0%, indicating low variability. The pooled odds ratio from the included studies showed a statistically significant association between obesity and T2DM (OR = 1.14, 95%CI: 1.04 to 1.24) with a high degree of variability. CONCLUSION: Despite methodological differences, obesity showed a significant, potentially plausible association with hypertension and T2DM in studies conducted in India. Being a modifiable risk factor, our study informs setting policy priority and intervention efforts to prevent debilitating complications. PMID:29359028
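The inverse-variance pooling behind summary estimates like OR = 3.82 can be sketched directly; the 2×2 tables below are hypothetical, and tools such as RevMan and metan also offer Mantel-Haenszel and random-effects variants not shown here:

```python
import math

def pooled_or(tables):
    """Fixed-effect inverse-variance pooling of odds ratios on the log
    scale. Each table is (a, b, c, d): exposed cases, exposed controls,
    unexposed cases, unexposed controls."""
    logs, ws = [], []
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of log OR
        logs.append(log_or)
        ws.append(1 / var)
    pooled = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    se = math.sqrt(1 / sum(ws))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    return math.exp(pooled), math.exp(lo), math.exp(hi)

# Hypothetical obesity/hypertension 2x2 tables from three studies
or_, lo, hi = pooled_or([(120, 80, 60, 140), (90, 70, 50, 110), (200, 150, 100, 180)])
print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Pooling on the log scale keeps the odds ratio's multiplicative structure symmetric, which is why the confidence interval is exponentiated only at the end.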
Feng, Yanguo; Cheng, Dejun; Zhang, Chaofeng; Li, Yuchun; Zhang, Zhiying; Wang, Juan; Feng, Xiao
2017-02-01
Accumulating studies have reported inconsistent associations between ErbB4 single nucleotide polymorphisms (SNPs) and predisposition to schizophrenia. To better interpret this issue, here we conducted a meta-analysis using published case-control studies. We conducted a systematic search of MEDLINE (Pubmed), Embase (Ovid), and Web of Science (Thomson-Reuters) to identify relevant references. The association between ErbB4 SNPs and schizophrenia was assessed by odds ratios (ORs) and 95% confidence intervals (CIs). Between-study heterogeneity was evaluated by the I-squared (I²) statistic and Cochran's Q test. To appraise the stability of results, we employed sensitivity analysis by omitting one study at a time. To assess the potential publication bias, we conducted trim and fill analysis. Seven studies published in English comprising 3162 cases and 4264 controls were included in this meta-analysis. Meta-analyses showed that rs707284 is statistically significantly associated with schizophrenia susceptibility among Asian and Caucasian populations under the allelic model (OR = 0.91, 95% CI: 0.83-0.99, P = 0.035). Additionally, a marginal association (P < 0.1) was observed between rs707284 and schizophrenia risk among Asian and Caucasian populations under the recessive (OR = 0.85, 95% CI: 0.72-1.01, P = 0.065) and homozygous (OR = 0.84, 95% CI: 0.68-1.03, P = 0.094) models. In the Asian subgroup, rs707284 was also noted to be marginally associated with schizophrenia under the recessive model (OR = 0.84, 95% CI: 0.70-1.00, P = 0.053). However, no statistically significant association was found between rs839523, rs7598440, rs3748962, and rs2371276 and schizophrenia risk. This meta-analysis suggested that rs707284 may be a potential ErbB4 SNP associated with susceptibility to schizophrenia. Nevertheless, due to the limited sample size in this meta-analysis, more large-scale association studies are still needed to confirm the results.
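The sensitivity analysis described above, omitting one study at a time, amounts to a leave-one-out recomputation of the pooled estimate. A minimal fixed-effect sketch with hypothetical per-study log odds ratios (not the values from the included studies):

```python
import math

def fixed_effect(log_ors, variances):
    """Inverse-variance weighted mean of log odds ratios."""
    w = [1 / v for v in variances]
    return sum(wi * l for wi, l in zip(w, log_ors)) / sum(w)

def leave_one_out(log_ors, variances):
    """Recompute the pooled OR with each study omitted in turn."""
    results = []
    for i in range(len(log_ors)):
        rest_l = log_ors[:i] + log_ors[i + 1:]
        rest_v = variances[:i] + variances[i + 1:]
        results.append(math.exp(fixed_effect(rest_l, rest_v)))
    return results

# Hypothetical per-study log ORs and variances for one SNP
log_ors = [-0.11, -0.08, -0.15, -0.05, -0.02, -0.12, -0.09]
variances = [0.004, 0.006, 0.005, 0.009, 0.012, 0.007, 0.008]
for i, pooled in enumerate(leave_one_out(log_ors, variances), 1):
    print(f"omit study {i}: pooled OR = {pooled:.3f}")
```

If any single omission moves the pooled OR across 1.0 or changes its significance, that study is driving the result, which is the instability such analyses are designed to expose.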
Overholser, Brian R; Sowinski, Kevin M
2007-12-01
Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
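The descriptive/inferential split described above can be illustrated in a few lines. The sample values are invented, and the normal-approximation interval stands in for the t-based interval a sample this small would properly use:

```python
import statistics

sample = [5.1, 4.8, 6.2, 5.5, 5.9, 4.9, 5.3, 6.0, 5.6, 5.2]

# Descriptive: summarize the sample itself
mean = statistics.mean(sample)
sd = statistics.stdev(sample)
print(f"n = {len(sample)}, mean = {mean:.2f}, sd = {sd:.2f}")

# Inferential: use the sample to bound the unknown population mean
se = sd / len(sample) ** 0.5
print(f"approx. 95% CI for population mean: "
      f"{mean - 1.96 * se:.2f} to {mean + 1.96 * se:.2f}")
```

The first block says something only about these ten observations; the second makes a probabilistic claim about the population they were drawn from, which is the distinction the review develops.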
Irani, Morvarid; Amirian, Malihe; Sadeghi, Ramin; Lez, Justine Le; Latifnejad Roudsari, Robab
2017-08-29
To evaluate the effect of folate and folate plus zinc supplementation on endocrine parameters and sperm characteristics in subfertile men. We conducted a systematic review and meta-analysis. Electronic databases of Medline, Scopus, Google Scholar and Persian databases (SID, Iran medex, Magiran, Medlib, Iran doc) were searched from 1966 to December 2016 using a set of relevant keywords including "folate or folic acid AND (infertility, infertile, sterility)". All available randomized controlled trials (RCTs), conducted on samples of subfertile men with semen analyses, who took oral folic acid or folate plus zinc, were included. Data collected included endocrine parameters and sperm characteristics. Statistical analyses were done by Comprehensive Meta-analysis Version 2. In total, seven studies were included. Six studies had sufficient data for meta-analysis. Sperm concentration was statistically higher in men supplemented with folate than with placebo (P < .001). However, folate supplementation alone did not seem to be more effective than the placebo on sperm morphology (P = .056) and motility (P = .652). Folate plus zinc supplementation did not show any statistically different effect on serum testosterone (P = .86), inhibin B (P = .84), FSH (P = .054), and sperm motility (P = .169) as compared to the placebo. Yet, folate plus zinc showed a statistically greater effect on sperm concentration (P < .001), morphology (P < .001), and serum folate level (P < .001) as compared to placebo. Folate plus zinc supplementation has a positive effect on sperm characteristics in subfertile men. However, these results should be interpreted with caution due to the important heterogeneity of the studies included in this meta-analysis. Further trials are still needed to confirm the current findings.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E
2014-04-01
This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.
Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve
2013-10-01
The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into considerations aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. 
A statistical analysis plan for the ARISE study has been developed and is available in the public domain prior to the completion of recruitment into the study. This will minimise analytic bias and conform to current best practice in conducting clinical trials. © 2013 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
ERIC Educational Resources Information Center
Knight, Jennifer L.
This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…
Alternative types of molecule-decorated atomic chains in Au–CO–Au single-molecule junctions
Balogh, Zoltán; Makk, Péter; Halbritter, András
2015-01-01
We investigate the formation and evolution of Au–CO single-molecule break junctions. The conductance histogram exhibits two distinct molecular configurations, which are further investigated by a combined statistical analysis. According to conditional histogram and correlation analysis these molecular configurations show strong anticorrelations with each other and with pure Au monoatomic junctions and atomic chains. We identify molecular precursor configurations with somewhat higher conductance, which are formed prior to single-molecule junctions. According to detailed length analysis two distinct types of molecule-affected chain-formation processes are observed, and we compare these results to former theoretical calculations considering bridge- and atop-type molecular configurations where the latter has reduced conductance due to destructive Fano interference. PMID:26199840
Content Analysis of Papers Submitted to "Communications in Information Literacy," 2007-2013
ERIC Educational Resources Information Center
Hollister, Christopher V.
2014-01-01
The author conducted a content analysis of papers submitted to the journal, "Communications in Information Literacy," from the years 2007-2013. The purpose was to investigate and report on the overall quality characteristics of a statistically significant sample of papers submitted to a single-topic, open access, library and information…
The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.
ERIC Educational Resources Information Center
Hummel, Thomas J.
An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…
Analysis of Superintendent Longevity in Large School Districts: A Qualitative Study
ERIC Educational Resources Information Center
Mouton, Nikki Golar
2013-01-01
School district leadership matters, as evidenced by a meta-analysis of 27 reports and 1,210 districts conducted by Waters and Marzano (2006) which highlights a statistically significant correlation between district leadership and student achievement. Because this relationship is significant, it is important for school districts to have effective…
Determinants of Linear Judgment: A Meta-Analysis of Lens Model Studies
ERIC Educational Resources Information Center
Karelaia, Natalia; Hogarth, Robin M.
2008-01-01
The mathematical representation of E. Brunswik's (1952) lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly 5 decades. Specifically, the authors analyzed statistics of the "lens model equation" (L. R. Tucker, 1964) associated with 249 different…
Computers and Student Learning: Interpreting the Multivariate Analysis of PISA 2000
ERIC Educational Resources Information Center
Bielefeldt, Talbot
2005-01-01
In November 2004, economists Thomas Fuchs and Ludger Woessmann published a statistical analysis of the relationship between technology and student achievement using year 2000 data from the Programme for International Student Assessment (PISA). The 2000 PISA was the first in a series of triennial assessments of 15-year-olds conducted by the…
Applications of Nonlinear Principal Components Analysis to Behavioral Data.
ERIC Educational Resources Information Center
Hicks, Marilyn Maginley
1981-01-01
An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)
A Meta-Analysis of Writing Instruction for Students in the Elementary Grades
ERIC Educational Resources Information Center
Graham, Steve; McKeown, Debra; Kiuhara, Sharlene; Harris, Karen R.
2012-01-01
In an effort to identify effective instructional practices for teaching writing to elementary grade students, we conducted a meta-analysis of the writing intervention literature, focusing our efforts on true and quasi-experiments. We located 115 documents that included the statistics for computing an effect size (ES). We calculated an average…
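The average effect size mentioned above is conventionally computed with inverse-variance weighting, so that more precise studies count for more. A minimal sketch in stdlib Python; the effect sizes and variances are made-up illustrative numbers, not values from this meta-analysis:

```python
import math

def fixed_effect_mean(effects, variances):
    """Inverse-variance weighted (fixed-effect) mean effect size and its SE."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * es for w, es in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, se

# Hypothetical effect sizes (standardized mean differences) and their variances
pooled, se = fixed_effect_mean([0.30, 0.55, 0.42], [0.02, 0.05, 0.04])
print(round(pooled, 3), round(se, 3))  # 0.384 0.103
```

A random-effects model would add a between-study variance component to each weight; the fixed-effect form above is the simplest starting point.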
Analysis of Employment Flow of Landscape Architecture Graduates in Agricultural Universities
ERIC Educational Resources Information Center
Yao, Xia; He, Linchun
2012-01-01
A statistical analysis of employment flow of landscape architecture graduates was conducted on the employment data of graduates major in landscape architecture in 2008 to 2011. The employment flow of graduates was to be admitted to graduate students, industrial direction and regional distribution, etc. Then, the features of talent flow and factors…
ERIC Educational Resources Information Center
Brint, Steven; Yoshikawa, Sarah R. K.; Rotondi, Matthew B.; Viggiano, Tiffany; Maldonado, John
2016-01-01
Press reports and industry statistics both give incomplete pictures of the outcomes of the Great Recession for U.S. four-year colleges and universities. To address these gaps, we conducted a statistical analysis of all articles that appeared in Lexis-Nexis on a sample of more than 300 U.S. colleges and universities during the Recession years. We…
Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti
2016-07-01
A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J.; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T.; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti
2016-01-01
Motivation: A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. Results: We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability and implementation: Code is available at https://github.com/aalto-ics-kepaco Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153689
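metaCCA itself works from summary statistics with covariance shrinkage, but the canonical correlation technique it extends can be illustrated in the simplest multivariate case: one genotype variable against two phenotypes, where the squared canonical correlation has a closed form. A hedged stdlib sketch, with invented data (not from these studies):

```python
from statistics import mean, pvariance

def cov(a, b):
    """Population covariance of two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def canonical_corr_1d(x, y1, y2):
    """Canonical correlation between one variable x and the pair (y1, y2).

    In this 1-vs-2 case, r^2 = c' Cyy^{-1} c / var(x), where
    c = (cov(x, y1), cov(x, y2)) and Cyy is the 2x2 phenotype covariance.
    """
    c1, c2 = cov(x, y1), cov(x, y2)
    a, b, d = pvariance(y1), cov(y1, y2), pvariance(y2)
    det = a * d - b * b
    # c' Cyy^{-1} c, with the 2x2 matrix inverse written out explicitly
    quad = (d * c1 * c1 - 2 * b * c1 * c2 + a * c2 * c2) / det
    return (quad / pvariance(x)) ** 0.5

x = [1, 2, 3, 4, 5]       # invented genotype dosages
y2 = [2, 1, 4, 3, 5]      # invented second phenotype
r = canonical_corr_1d(x, x, y2)  # y1 = x, so x lies in span(y1, y2): r is 1
```

In the general setting the canonical correlations are singular values of the whitened cross-covariance matrix, which is what requires the shrinkage step when covariances are estimated from summary statistics.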
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of literature searches, methodological quality, and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework; of these, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of these. Only 4.65% of NMAs described the details of handling multi-group trials, and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00-8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
Tilson, Julie K; Marshall, Katie; Tam, Jodi J; Fetters, Linda
2016-04-22
A primary barrier to the implementation of evidence based practice (EBP) in physical therapy is therapists' limited ability to understand and interpret statistics. Physical therapists demonstrate limited skills and report low self-efficacy for interpreting results of statistical procedures. While standards for physical therapist education include statistics, little empirical evidence is available to inform what should constitute such curricula. The purpose of this study was to conduct a census of the statistical terms and study designs used in physical therapy literature and to use the results to make recommendations for curricular development in physical therapist education. We conducted a bibliometric analysis of 14 peer-reviewed journals associated with the American Physical Therapy Association over 12 months (Oct 2011-Sept 2012). Trained raters recorded every statistical term appearing in identified systematic reviews, primary research reports, and case series and case reports. Investigator-reported study design was also recorded. Terms representing the same statistical test or concept were combined into a single, representative term. Cumulative percentage was used to identify the most common representative statistical terms. Common representative terms were organized into eight categories to inform curricular design. Of 485 articles reviewed, 391 met the inclusion criteria. These 391 articles used 532 different terms which were combined into 321 representative terms; 13.1 (sd = 8.0) terms per article. Eighty-one representative terms constituted 90% of all representative term occurrences. Of the remaining 240 representative terms, 105 (44%) were used in only one article. The most common study design was prospective cohort (32.5%). Physical therapy literature contains a large number of statistical terms and concepts for readers to navigate. However, in the year sampled, 81 representative terms accounted for 90% of all occurrences. 
These "common representative terms" can be used to inform curricula to promote physical therapists' skills, competency, and confidence in interpreting statistics in their professional literature. We make specific recommendations for curriculum development informed by our findings.
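The cumulative-percentage procedure the authors describe (tally representative terms, sort by frequency, keep terms until 90% of occurrences are covered) can be sketched as follows; the term list is hypothetical, not the study's data:

```python
from collections import Counter

def terms_covering(term_occurrences, coverage=0.90):
    """Smallest prefix of terms, by descending frequency, reaching coverage."""
    counts = Counter(term_occurrences)
    total = sum(counts.values())
    covered, chosen = 0, []
    for term, n in counts.most_common():
        chosen.append(term)
        covered += n
        if covered / total >= coverage:
            break
    return chosen

# Hypothetical flattened list of term occurrences across articles
occurrences = ["mean"] * 50 + ["sd"] * 30 + ["anova"] * 15 + ["kappa"] * 5
print(terms_covering(occurrences))  # ['mean', 'sd', 'anova'] covers 95 of 100
```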
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
NASA Astrophysics Data System (ADS)
Sergis, Antonis; Hardalupas, Yannis
2011-05-01
This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.
Sergis, Antonis; Hardalupas, Yannis
2011-05-19
This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
2011-01-01
This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932
Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P
2016-06-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format, conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple-testing problem. One way to correct optimally for these issues while maintaining the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both analyses yielded similar results, although the non-parametric analysis was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that this is a viable approach for EIT images of neural activity.
Packham, B; Barnes, G; dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-01-01
Electrical impedance tomography (EIT) allows the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple-testing problem. One way to correct optimally for these issues while maintaining the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both analyses yielded similar results, although the non-parametric analysis was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that this is a viable approach for EIT images of neural activity. PMID:27203477
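The non-parametric validation referred to above is typically done by permutation. A minimal sketch of a one-sample sign-flip test for a single voxel, assuming the null hypothesis of zero-mean activation; the per-rat values are invented for illustration:

```python
import random

def sign_flip_pvalue(values, n_perm=10000, seed=0):
    """One-sample permutation test: p-value for mean(values) != 0,
    built by randomly flipping the sign of each subject's value."""
    rng = random.Random(seed)
    observed = abs(sum(values))
    hits = 0
    for _ in range(n_perm):
        flipped = sum(v if rng.random() < 0.5 else -v for v in values)
        if abs(flipped) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0

activation = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]  # invented per-rat values
print(sign_flip_pvalue(activation))  # small p: consistent group activation
```

Across thousands of voxels, the maximum statistic over each permuted image would be recorded instead, giving the family-wise corrected threshold that random field theory approximates parametrically.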
An Exploratory Data Analysis System for Support in Medical Decision-Making
Copeland, J. A.; Hamel, B.; Bourne, J. R.
1979-01-01
An experimental system was developed to allow retrieval and analysis of data collected during a study of neurobehavioral correlates of renal disease. After retrieving data organized in a relational data base, simple bivariate statistics of parametric and nonparametric nature could be conducted. An “exploratory” mode in which the system provided guidance in selection of appropriate statistical analyses was also available to the user. The system traversed a decision tree using the inherent qualities of the data (e.g., the identity and number of patients, tests, and time epochs) to search for the appropriate analyses to employ.
Mixed-Methods Research in the Discipline of Nursing.
Beck, Cheryl Tatano; Harrison, Lisa
2016-01-01
In this review article, we examined the prevalence and characteristics of 294 mixed-methods studies in the discipline of nursing. Creswell and Plano Clark's typology was most frequently used, along with concurrent timing. Bivariate statistics were most often the highest level of statistical analysis reported in the results. For qualitative data analysis, content analysis was most frequently used. The majority of nurse researchers did not specifically address the purpose, paradigm, typology, priority, timing, interaction, or integration of their mixed-methods studies. Strategies are suggested for improving the design, conduct, and reporting of mixed-methods studies in the discipline of nursing.
Statistical evidence of strain induced breaking of metallic point contacts
NASA Astrophysics Data System (ADS)
Alwan, Monzer; Candoni, Nadine; Dumas, Philippe; Klein, Hubert R.
2013-06-01
A scanning tunneling microscopy in break junction regime and a mechanically controllable break junction are used to acquire thousands of conductance-elongation curves by stretching until breaking and re-connecting Au junctions. From a robust statistical analysis performed on large sets of experiments, parameters such as lifetime, elongation and occurrence probabilities are extracted. The analysis of results obtained for different stretching speeds of the electrodes indicates that the breaking mechanism of di- and mono-atomic junction is identical, and that the junctions undergo atomic rearrangement during their stretching and at the moment of breaking.
[Analysis the epidemiological features of 3,258 patients with allergic rhinitis in Yichang City].
Chen, Bo; Zhang, Zhimao; Pei, Zhi; Chen, Shihan; Du, Zhimei; Lan, Yan; Han, Bei; Qi, Qi
2015-02-01
To investigate the epidemiological features of patients with allergic rhinitis (AR) in Yichang city and propose effective prevention and control measures, data on allergic rhinitis in the city proper from 2010 to 2013 were collected, entered into a database, and analyzed statistically. In recent years, the number of AR patients in this area has increased year by year. Spring and winter were the peak seasons of onset. Patients were predominantly young men. There were statistically significant differences by age, area, and gender (P < 0.01). Allergy history and related diseases differed significantly by gender composition (P < 0.05). Allergens and the degree of positivity differed significantly by gender and age structure (P < 0.01). Health education should be conducted, the environment optimized, poor habits changed, and timely, standardized medical treatment sought.
Use of check lists in assessing the statistical content of medical studies.
Gardner, M J; Machin, D; Campbell, M J
1986-01-01
Two check lists are used routinely in the statistical assessment of manuscripts submitted to the "BMJ." One is for papers of a general nature and the other specifically for reports on clinical trials. Each check list includes questions on the design, conduct, analysis, and presentation of studies, and answers to these contribute to the overall statistical evaluation. Only a small proportion of submitted papers are assessed statistically, and these are selected at the refereeing or editorial stage. Examination of the use of the check lists showed that most papers contained statistical failings, many of which could easily be remedied. It is recommended that the check lists should be used by statistical referees, editorial staff, and authors and also during the design stage of studies. PMID:3082452
Marketing of personalized cancer care on the web: an analysis of Internet websites.
Gray, Stacy W; Cronin, Angel; Bair, Elizabeth; Lindeman, Neal; Viswanath, Vish; Janeway, Katherine A
2015-05-01
Internet marketing may accelerate the use of care based on genomic or tumor-derived data. However, online marketing may be detrimental if it endorses products of unproven benefit. We conducted an analysis of Internet websites to identify personalized cancer medicine (PCM) products and claims. A Delphi Panel categorized PCM as standard or nonstandard based on evidence of clinical utility. Fifty-five websites, sponsored by commercial entities, academic institutions, physicians, research institutes, and organizations, that marketed PCM included somatic (58%) and germline (20%) analysis, interpretive services (15%), and physicians/institutions offering personalized care (44%). Of 32 sites offering somatic analysis, 56% included specific test information (range 1-152 tests). All statistical tests were two-sided, and comparisons of website content were conducted using McNemar's test. More websites contained information about the benefits than limitations of PCM (85% vs 27%, P < .001). Websites specifying somatic analysis were statistically significantly more likely to market one or more nonstandard tests as compared with standard tests (88% vs 44%, P = .04). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Robustness of Multiple Objective Decision Analysis Preference Functions
2002-06-01
p, p′: the probability of some event. p_i, q_i: the probability of event i. Π: an aggregation of proportional data used in calculating a test statistic. …statistical tests of the significance of the term, and is also conducted in a multivariate framework rather than the ROSA univariate approach. …The residual error is ê = y − ŷ (45). The coefficient provides a ready indicator of the contribution of the associated variable, and statistical tests
Expert Planning Processes in Writing
1990-12-17
For example, here are some target ideas from the immunity text: "protozoa attack red blood cells," "agglutinin clumps pathogens together," "vaccination…" …majors may have already understood (e.g., vaccination is an injection of a virus), we conducted a second analysis scoring only for the "hard" concepts…page texts, a clear and an unclear version on each of two topics, autism and statistics. The clear autism and the unclear statistics texts were
How Do Microfinance Programs Contribute to Poverty Reduction
2016-09-01
areas have experienced statistically higher incidents of crime tied to class conflict. Land tax systems under the British were also responsible for…countries. This low delinquency rate is credited to the lack of alternative opportunities that are available to the poor. According to Muhammad… [table omitted] Figure 2. Program Duration and Objective Poverty. The statistical analysis conducted by Chowdhury, Gosh and Wright finds
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...
12 & 15 passenger vans tire pressure study : preliminary results
DOT National Transportation Integrated Search
2005-05-01
A study was conducted by the National Highway Traffic Safety Administration's (NHTSA's) National Center for Statistics and Analysis (NCSA) to determine the extent of underinflation and observe the tire condition in 12- and 15-passenger vans. This Res...
Landscape preference assessment of Louisiana river landscapes: a methodological study
Michael S. Lee
1979-01-01
The study pertains to the development of an assessment system for the analysis of visual preference attributed to Louisiana river landscapes. The assessment system was utilized in the evaluation of 20 Louisiana river scenes. Individuals were tested for their free choice preference for the same scenes. A statistical analysis was conducted to examine the relationship...
Number of Black Children in Extreme Poverty Hits Record High. Analysis Background.
ERIC Educational Resources Information Center
Children's Defense Fund, Washington, DC.
To examine the experiences of black children and poverty, researchers conducted a computer analysis of data from the U.S. Census Bureau's Current Population Survey, the source of official government poverty statistics. The data are through 2001. Results indicated that nearly 1 million black children were living in extreme poverty, with after-tax…
Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment
ERIC Educational Resources Information Center
Cui, Ying; Mousavi, Amin
2015-01-01
The current study applied the person-fit statistic, l_z, to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l_z, were removed. The…
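The l_z statistic has a simple closed form under item response models: the log-likelihood of a response pattern, standardized by its expected value and variance. A hedged sketch under a Rasch model, with invented item difficulties and ability (not the study's estimates):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def lz(responses, theta, difficulties):
    """Standardized log-likelihood person-fit statistic l_z.

    Large negative values flag response patterns that misfit the model.
    """
    l0, e, v = 0.0, 0.0, 0.0
    for u, b in zip(responses, difficulties):
        p = rasch_p(theta, b)
        q = 1.0 - p
        l0 += u * math.log(p) + (1 - u) * math.log(q)     # observed log-likelihood
        e += p * math.log(p) + q * math.log(q)            # its expectation
        v += p * q * math.log(p / q) ** 2                 # its variance
    return (l0 - e) / math.sqrt(v)

expected = lz([1, 1, 1, 0, 0], 0.0, [-2, -1, 0, 1, 2])   # easy right, hard wrong
aberrant = lz([0, 0, 0, 1, 1], 0.0, [-2, -1, 0, 1, 2])   # easy wrong, hard right
# aberrant is strongly negative, flagging that pattern as misfitting
```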
Reply to discussion: ground water response to forest harvest: implications or hillslope stability
Amod Dhakal; Roy C. Sidle; A.C. Johnson; R.T. Edwards
2008-01-01
Dhakal and Sidle (this volume) have requested clarification of some of the rationales and approaches used in the analyses described by Johnson et al. (2007). Here we further describe hydrologic conditions typical of southeast Alaska and elaborate on an accepted methodology for conducting analysis of covariance (ANCOVA). We discuss Dhakal and Sidle...
ERIC Educational Resources Information Center
Raker, Jeffrey R.; Holme, Thomas A.
2014-01-01
A cluster analysis was conducted with a set of survey data on chemistry faculty familiarity with 13 assessment terms. Cluster groupings suggest a high, middle, and low overall familiarity with the terminology and an independent high and low familiarity with terms related to fundamental statistics. The six resultant clusters were found to be…
A Content Analysis of Dissertations in the Field of Educational Technology: The Case of Turkey
ERIC Educational Resources Information Center
Durak, Gurhan; Cankaya, Serkan; Yunkul, Eyup; Misirli, Zeynel Abidin
2018-01-01
The present study aimed at conducting content analysis on dissertations carried out so far in the field of Educational Technology in Turkey. A total of 137 dissertations were examined to determine the key words, academic discipline, research areas, theoretical frameworks, research designs and models, statistical analyses, data collection tools,…
Villagómez-Ornelas, Paloma; Hernández-López, Pedro; Carrasco-Enríquez, Brenda; Barrios-Sánchez, Karina; Pérez-Escamilla, Rafael; Melgar-Quiñónez, Hugo
2014-01-01
This article validates the statistical consistency of two food security scales: the Mexican Food Security Scale (EMSA) and the Latin American and Caribbean Food Security Scale (ELCSA). Validity tests were conducted to verify that both scales are consistent instruments, composed of independent, properly calibrated and adequately sorted items arranged along a continuum of severity. The following tests were performed: sorting of items; Cronbach's alpha analysis; parallelism of prevalence curves; Rasch models; and sensitivity analysis through hypothesis tests of mean differences. The tests showed that both scales meet the required attributes and are robust statistical instruments for food security measurement. This is relevant given that the lack-of-access-to-food indicator, included in multidimensional poverty measurement in Mexico, is calculated with the EMSA.
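Of the tests listed, Cronbach's alpha is the easiest to make concrete: it compares the sum of item variances with the variance of total scores. A stdlib sketch with invented item responses, not EMSA/ELCSA data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)
    item_vars = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Hypothetical 3-item scale scored by 4 respondents
items = [
    [1, 2, 3, 4],
    [1, 2, 3, 3],
    [2, 2, 4, 4],
]
print(round(cronbach_alpha(items), 3))  # 0.962, high internal consistency
```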
Handhayanti, Ludwy; Rustina, Yeni; Budiati, Tri
Premature infants tend to lose heat quickly. This loss can be aggravated when they undergo an invasive procedure involving a venous puncture. This research used a crossover design, conducting two intervention tests to compare two different treatments on the same sample. The research involved two groups of 18 premature infants each. Data were analyzed using an independent t test. Interventions conducted in an open incubator showed a p value of .001, statistically associated with heat loss in premature infants. In contrast, for the radiant warmer the p value of .001 referred to a different range of heat gain before and after the venous puncture. The radiant warmer protected the premature infants from hypothermia during the invasive procedure. However, it is inadvisable for routine care of newborn infants since it can increase insensible water loss.
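The independent t test used here can be sketched in stdlib Python; this is Welch's unequal-variance form, and the temperature readings are invented for illustration, not the study's data:

```python
from statistics import mean, variance
import math

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = variance(a) / na, variance(b) / nb   # sample variance / n
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (na - 1) + vb ** 2 / (nb - 1))
    return t, df

# Hypothetical axillary temperatures (degrees C) under two warming methods
incubator = [36.1, 36.3, 35.9, 36.0, 36.2]
radiant = [36.5, 36.7, 36.6, 36.4, 36.8]
t, df = welch_t(incubator, radiant)
print(round(t, 3), round(df, 3))  # -5.0 8.0
```

The p value would then come from the t distribution with df degrees of freedom, which stdlib Python does not provide; `scipy.stats.ttest_ind(a, b, equal_var=False)` performs the whole test in one call.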
History of water quality parameters - a study on the Sinos River/Brazil.
Konzen, G B; Figueiredo, J A S; Quevedo, D M
2015-05-01
Water is increasingly becoming a valuable resource, constituting one of the central themes of environmental, economic and social discussions. The Sinos River, located in southern Brazil, is the main river of the Sinos River Basin and a source of drinking water for a highly populated region. Given its size and importance, a study following up the water quality of this river, considered by some experts to be one of the most polluted in Brazil, is necessary; the great value of the present study lies in its historical analysis of indicators. In this sense, we sought to address the management of water resources by performing a historical analysis of the Water Quality Index (WQI) of the Sinos River using statistical methods. Methodologically, the study performs a time-series analysis of monitoring data on parameters measured at fixed points and varying over time, using statistical tools. The data refer to analyses of the water quality of the Sinos River (WQI) from the State Environmental Protection Agency Henrique Luiz Roessler (Fundação Estadual de Proteção Ambiental Henrique Luiz Roessler, FEPAM) covering the period between 2000 and 2008, complemented by a theoretical analysis focusing on the management of water resources. The statistical study of the WQI and its parameters proved effective, confirming its value as a tool for the management of water resources. The descriptive analysis of the WQI and its parameters showed that the water quality of the Sinos River is concerningly low, which reaffirms that it is one of the most polluted rivers in Brazil. It should be noted that there was an overall difficulty in obtaining data with appropriate periodicity, as well as a long complete series, which limited the conduct of statistical studies such as the present one.
Cheng, Tao; Zhu, Chen; Guo, Yongyuan; Shi, Sifeng; Chen, Desheng; Zhang, Xianlong
2014-11-01
The impact of patellar denervation with electrocautery in total knee arthroplasty (TKA) on post-operative outcomes has been under debate. This study aims to conduct a meta-analysis and systematic review comparing the benefits and risks of circumpatellar electrocautery with those of non-electrocautery in primary TKA. Comparative and randomized clinical studies were identified through an electronic search of articles dated up to September 2012 in PubMed, EMBASE, Scopus, and the Cochrane databases. Six studies covering a total of 849 knees were analysed. A random-effects model was fitted using the inverse-variance method for continuous variables and the Mantel-Haenszel method for dichotomous variables. There was no significant difference in the incidence of anterior knee pain between the electrocautery and non-electrocautery groups. In terms of patellar score and Knee Society Score, circumpatellar electrocautery improved clinical outcomes compared with non-electrocautery in TKA. The statistical differences favoured the electrocautery group but have minimal clinical significance. In addition, the overall complications showed no statistically significant difference between the two groups. This study shows no strong evidence either for or against electrocautery compared with non-electrocautery in TKA. Therapeutic study (systematic review and meta-analysis), Level III.
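The inverse-variance random-effects pooling mentioned above can be sketched as follows. The effect sizes and variances are made up for illustration, and the DerSimonian-Laird heterogeneity estimator is one common choice (the abstract does not say which estimator the authors used):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                        # inverse-variance weights
    theta_fe = np.sum(w * effects) / np.sum(w)  # fixed-effect estimate
    q = np.sum(w * (effects - theta_fe) ** 2)   # Cochran's Q heterogeneity
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_re = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical mean differences in patellar score from three studies
pooled, se, tau2 = dersimonian_laird([2.0, 3.5, 1.0], [0.5, 0.8, 0.4])
```

The pooled estimate always lies within the range of the study effects, and tau2 > 0 signals between-study heterogeneity beyond sampling error.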
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile of all interventions (e.g. efficacy and safety). The recent development of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over recent years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, the assumptions involved and the steps for performing the analysis. PMID:28503228
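The core idea of gathering indirect evidence can be illustrated with the simplest building block of a network meta-analysis, the Bucher adjusted indirect comparison: two treatments A and C compared through a common comparator B. The log odds ratios below are hypothetical:

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Bucher adjusted indirect comparison of A vs. C via common comparator B:
    d_AC = d_AB - d_CB, with the variances of the two estimates adding."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Hypothetical log odds ratios of treatments A and C versus placebo B
d_ac, se_ac = bucher_indirect(d_ab=-0.69, se_ab=0.20, d_cb=-0.22, se_cb=0.25)
z = d_ac / se_ac  # Wald statistic for the indirect A vs. C contrast
```

Note how the indirect estimate is less precise than either direct estimate (the variances add); a full network meta-analysis combines many such paths with direct evidence in one model, under the consistency assumption.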
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
Analysis of the Database of Theses and Dissertations from DME/UFSCAR about Astronomy Education
NASA Astrophysics Data System (ADS)
Rodrigues Ferreira, Orlando; Voelzke, Marcos Rincon
2013-11-01
The paper presents a brief analysis of the "Database of Theses and Dissertations about Astronomy Education" from the Department of Teaching Methodology (DME) of the Federal University of São Carlos (UFSCar). This kind of study made it possible to develop new analyses and statistical data, as well as to rank the Brazilian institutions that produce academic work in the area.
van Gelder, P.H.A.J.M.; Nijs, M.
2011-01-01
Decisions about pharmacotherapy are taken by medical doctors and authorities based on comparative studies of the use of medications. In studies on fertility treatments in particular, methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some typical statistical flaws, illustrated by a number of example studies published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, units of analysis, or the definition of the primary endpoints. Other errors were found in p-value and power calculations or in critical p-value definitions. Interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care. PMID:24753877
Students' attitudes towards learning statistics
NASA Astrophysics Data System (ADS)
Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah
2015-05-01
A positive attitude towards learning is vital in order to master the core content of the subject matter under study, and learning statistics, especially at the university level, is no exception. Therefore, this study investigates students' attitudes towards learning statistics. Six variables or constructs have been identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used is a questionnaire adopted and adapted from the reliable Survey of Attitudes Towards Statistics (SATS©) instrument. The study was conducted among undergraduate engineering students at a university on the East Coast of Malaysia. The respondents were students from different faculties taking the applied statistics course. The results are analysed descriptively and contribute to a descriptive understanding of students' attitudes towards the teaching and learning of statistics.
Comparative analysis of positive and negative attitudes toward statistics
NASA Astrophysics Data System (ADS)
Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah
2015-02-01
Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during a statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students with negative attitudes toward statistics, by contrast, may feel discouraged, especially in group assignments, are at risk of failure, are often highly emotional, and struggle to move forward. Therefore, this study investigates students' attitudes towards learning statistics. Six latent constructs were used to measure these attitudes: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP). The respondents were students from different faculties taking the applied statistics course. The analysis found the questionnaire to be acceptable, and the proposed relationships among the constructs were investigated. Students showed full effort to master the statistics course, found it enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.
Bayesian Sensitivity Analysis of Statistical Models with Missing Data
ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG
2013-01-01
Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718
Zhao, Ren-Wu; Guo, Zhi-Qiang; Zhang, Ru-Xin
2016-06-01
A growing number of molecular epidemiological studies have been conducted to evaluate the association between human papillomavirus (HPV) infection and the malignancy of sinonasal inverted papilloma (SNIP). However, the results remain inconclusive. Here, a meta-analysis was conducted to quantitatively assess this association. Case-control studies investigating SNIP tissues for the presence of HPV DNA were identified. The odds ratios (ORs) and 95% confidence intervals (CIs) were calculated by the Mantel-Haenszel method. An assessment of publication bias and a sensitivity analysis were also performed. We calculated a pooled OR of 2.16 (95% CI=1.46-3.21, P<0.001) without statistically significant heterogeneity or publication bias. Stratification by HPV type showed a stronger association for patients with high-risk HPV (hrHPV) types, HPV-16, HPV-18, and HPV-16/18 infection (OR=8.8 [95% CI: 4.73-16.38], 8.04 [95% CI: 3.34-19.39], 18.57 [95% CI: 4.56-75.70], and 26.24 [4.35-158.47], respectively). When only PCR-based studies were used, the pooled ORs for patients with hrHPV, HPV-16, and HPV-18 infection still reached statistical significance. However, Egger's test reflected significant publication bias in the HPV-16 sub-analysis (P=0.06), and the adjusted OR was no longer statistically significant (OR=1.65, 95%CI: 0.58-4.63). These results suggest that HPV infection, especially hrHPV (HPV-18), is significantly associated with malignant SNIP. Copyright © 2016 Elsevier B.V. All rights reserved.
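The Mantel-Haenszel pooling used above can be sketched for a pair of hypothetical 2x2 case-control tables (not the study's data):

```python
def mantel_haenszel_or(tables):
    """Pooled odds ratio across 2x2 tables given as (a, b, c, d), where
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical case-control studies of HPV DNA in SNIP tissue
studies = [(12, 8, 20, 40), (9, 6, 15, 30)]
or_mh = mantel_haenszel_or(studies)
```

Unlike a crude OR computed from the summed tables, the Mantel-Haenszel estimate weights each stratum, which protects against confounding by study.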
Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth
This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.
Comparative analysis on the selection of number of clusters in community detection
NASA Astrophysics Data System (ADS)
Kawamoto, Tatsuro; Kabashima, Yoshiyuki
2018-02-01
We conduct a comparative analysis of various estimates of the number of clusters in community detection. An exhaustive comparison requires testing all possible combinations of frameworks, algorithms, and assessment criteria. In this paper we focus on the framework based on the stochastic block model, and investigate the performance of greedy algorithms, statistical inference, and spectral methods. For the assessment criteria, we consider modularity, the map equation, Bethe free energy, prediction errors, and isolated eigenvalues. From the analysis, the tendencies of the assessment criteria and algorithms to overfit and underfit become apparent. In addition, we propose the alluvial diagram as a suitable tool to visualize statistical inference results and to help determine the number of clusters.
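Among the assessment criteria compared above, modularity is the easiest to state compactly. A minimal sketch (toy graph, plain Python) evaluates Newman modularity for two candidate partitions:

```python
import itertools

def modularity(adj, communities):
    """Newman modularity Q for an undirected graph given as an adjacency
    dict {node: set(neighbours)} and a partition as a list of node sets."""
    m = sum(len(nbrs) for nbrs in adj.values()) / 2  # number of edges
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    q = 0.0
    for comm in communities:
        for i, j in itertools.product(comm, repeat=2):
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - degree[i] * degree[j] / (2 * m)
    return q / (2 * m)

# Two triangles joined by a single bridge edge
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
q_good = modularity(adj, [{0, 1, 2}, {3, 4, 5}])  # natural split
q_bad = modularity(adj, [{0, 1, 2, 3, 4, 5}])     # everything in one cluster
```

The natural two-community split scores higher than the trivial partition, illustrating why maximizing Q is one criterion for choosing the number of clusters, and also why it can overfit: finer splits of random structure can sometimes raise Q as well.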
How Will DSM-5 Affect Autism Diagnosis? A Systematic Literature Review and Meta-Analysis
ERIC Educational Resources Information Center
Kulage, Kristine M.; Smaldone, Arlene M.; Cohn, Elizabeth G.
2014-01-01
We conducted a systematic review and meta-analysis to determine the effect of changes to the Diagnostic and Statistical Manual (DSM)-5 on autism spectrum disorder (ASD) and explore policy implications. We identified 418 studies; 14 met inclusion criteria. Studies consistently reported decreases in ASD diagnosis (range 7.3-68.4%) using DSM-5…
A Secondary Analysis of the Impact of School Management Practices on School Performance
ERIC Educational Resources Information Center
Talbert, Dale A.
2009-01-01
The purpose of this study was to conduct a secondary analysis of the impact of school management practices on school performance, utilizing a survey design with Schools and Staffing Survey (SASS) data collected by the National Center for Education Statistics (NCES) of the U.S. Department of Education, 1999-2000. The study identifies those school management…
Vermont's use-value appraisal property tax program: a forest inventory and analysis
Paul E. Sendak; Donald F. Dennis; Donald F. Dennis
1989-01-01
A statistical report and analysis of the timberland enrolled in the Vermont Use Value Appraisal (UVA) property tax program. The study was conducted using data collected in the fourth forest survey of Vermont (1983). Estimates are presented on land area, timber volumes, tree quality, numbers of live trees, and biomass for timberland enrolled in the UVA program and for...
ERIC Educational Resources Information Center
Bernard, Robert M.; Borokhovski, Eugene; Schmid, Richard F.; Tamim, Rana M.
2014-01-01
This article contains a second-order meta-analysis and an exploration of bias in the technology integration literature in higher education. Thirteen meta-analyses, dated from 2000 to 2014 were selected to be included based on the questions asked and the presence of adequate statistical information to conduct a quantitative synthesis. The weighted…
ERIC Educational Resources Information Center
Douglas, Jeff; Kim, Hae-Rim; Roussos, Louis; Stout, William; Zhang, Jinming
An extensive nonparametric dimensionality analysis of latent structure was conducted on three forms of the Law School Admission Test (LSAT) (December 1991, June 1992, and October 1992) using the DIMTEST model in confirmatory analyses and using DIMTEST, FAC, DETECT, HCA, PROX, and a genetic algorithm in exploratory analyses. Results indicate that…
Goto, Masami; Abe, Osamu; Hata, Junichi; Fukunaga, Issei; Shimoji, Keigo; Kunimatsu, Akira; Gomi, Tsutomu
2017-02-01
Background: Diffusion tensor imaging (DTI) is a magnetic resonance imaging (MRI) technique that reflects the Brownian motion of water molecules constrained within brain tissue. Fractional anisotropy (FA) is one of the most commonly measured DTI parameters, and can be applied to quantitative analysis of white matter as tract-based spatial statistics (TBSS) and voxel-wise analysis. Purpose: To show an association between metallic implants and the results of statistical analysis (voxel-wise group comparison and TBSS) for FA mapping in DTI of healthy adults. Material and Methods: Sixteen healthy volunteers were scanned with 3-Tesla MRI. A magnetic keeper type of dental implant was used as the metallic implant. DTI was acquired three times in each participant: (i) without a magnetic keeper (FAnon1); (ii) with a magnetic keeper (FAimp); and (iii) without a magnetic keeper (FAnon2), as a reproducibility check on FAnon1. Group comparisons with a paired t-test were performed for FAnon1 vs. FAnon2 and for FAnon1 vs. FAimp. Results: Regions of significantly reduced and increased local FA values were revealed by voxel-wise group comparison analysis (P < 0.05, corrected with family-wise error), but not by TBSS. Conclusion: Metallic implants outside the field of view produce artifacts that affect statistical analysis (voxel-wise group comparisons) for FA mapping. When statistical analysis for FA mapping is conducted, researchers should note any dental implants present in the mouths of the participants.
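The paired comparison design described above can be sketched with SciPy; the FA values and the per-participant implant effect below are simulated assumptions, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical mean FA per participant (n = 16), scanned without and with
# a dental implant; the implant is assumed to lower FA by a small,
# participant-dependent amount
fa_without = rng.normal(loc=0.45, scale=0.02, size=16)
shift = np.linspace(-0.005, 0.025, 16)  # assumed artifact effect per subject
fa_with = fa_without - shift

t_stat, p_value = stats.ttest_rel(fa_without, fa_with)
```

The paired test works on within-subject differences, so between-subject FA variability cancels out, which is why even a small systematic artifact can reach significance.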
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy for accounting for confounders in observational studies. Research has shown that published articles are often of poor statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be lower in surgical journals than in medical journals. Finally, this work will seek to identify predictors of high-quality reporting. The review will examine the top five general surgical and medical journals, based on the 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazards analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in the relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. 
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
William H. McWilliams; Richard A. Birdsey
1986-01-01
The Forest Inventory and Analysis unit of the Southern Forest Experiment Station (Forest Survey) conducts periodic inventories, about every 10 years, of the forest resources of Alabama, Arkansas, Louisiana, Mississippi, East Oklahoma, Tennessee, and East Texas. Appendix tables present summaries of timberland area, growing-stock volume, ownership class,...
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
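Two of the simplest approximations discussed above can be stated directly: a range-based SD estimate (commonly range/4 under approximate normality) and a mean estimate from the median and quartiles. The exact formulas evaluated by the review may differ from these common rules of thumb; they are shown for illustration:

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from the range, assuming roughly normal data
    and a moderate sample size (the common range/4 rule)."""
    return (maximum - minimum) / 4

def mean_from_quartiles(q1, median, q3):
    """Approximate the mean from the median and quartiles, assuming only
    these summary statistics were reported."""
    return (q1 + median + q3) / 3

# Hypothetical trial reporting only min/max and quartiles
sd_est = sd_from_range(10, 50)
mean_est = mean_from_quartiles(20, 28, 42)
```

Both approximations let a trial with incomplete summary statistics contribute to the meta-analysis instead of being omitted, at the cost of some imprecision; the review's illustrative analyses weigh exactly that trade-off.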
Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.
2009-01-01
In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212
Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M
2017-02-01
Background: The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial with masked outcome assessment. The trial will test whether the lying-flat head position, initiated within 12 h of onset of acute ischemic stroke involving the anterior circulation, increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries with the ability to perform early transcranial Doppler. A feature of this trial is that patients were randomized to a given position according to the month of admission to hospital. Objective: To outline in detail the predetermined statistical analysis plan for the HEADPOST Pilot study. Methods: All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item appropriate descriptive statistical analyses are planned, with comparisons made between randomized groups. For the outcomes, the statistical comparisons to be made between groups are planned and described. Results: This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study so as to be transparent, available, verifiable, and predetermined before data lock. Conclusions: We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration: The study is registered as HEADPOST-Pilot, ClinicalTrials.gov identifier NCT01706094.
Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-06-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
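The adaptive rank truncated product idea behind sARTP can be illustrated with a toy Python sketch. This is a hypothetical, simplified version operating directly on individual SNP p-values under an independence null; the actual sARTP method works from summary statistics together with genotype correlation estimated from a reference panel, which this sketch omits.

```python
import numpy as np

def rtp(pvals, k):
    """Rank truncated product statistic: sum of logs of the k smallest p-values."""
    return np.sort(np.log(pvals))[:k].sum()

def adaptive_rtp(pvals, ks=(1, 2, 5), n_sim=2000, seed=0):
    """Adaptive RTP: take the best (smallest) empirical p-value over candidate
    truncation points, then calibrate that minimum against its own null
    distribution, here simulated under an independence assumption."""
    rng = np.random.default_rng(seed)
    m = len(pvals)
    ks = [k for k in ks if k <= m]
    null = rng.uniform(size=(n_sim, m))                  # null p-values
    obs = np.array([rtp(pvals, k) for k in ks])
    sim = np.array([[rtp(row, k) for k in ks] for row in null])
    p_obs = (sim <= obs).mean(axis=0)                    # per-k empirical p-values
    p_sim = np.array([(sim <= s).mean(axis=0) for s in sim])
    # overall p-value of the adaptive (minimum-over-k) statistic
    return (p_sim.min(axis=1) <= p_obs.min()).mean()
```

For a p-value set containing one strong signal, `adaptive_rtp` returns a small overall p-value; real SNP data would additionally require modeling linkage disequilibrium rather than the uniform independence null used here.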
Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao
2009-01-01
Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have advanced methodologies for the analysis of DNA, RNA, protein, and low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlight various sources of experimental variation in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provide guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
Amali, Amin; Mahdi, Parvane; Karimi Yazdi, Alireza; Khorsandi Ashtiyani, Mohammad Taghi; Yazdani, Nasrin; Vakili, Varasteh; Pourbakht, Akram
2014-01-01
Vestibular involvement has long been observed in otosclerotic patients. Among the vestibular structures, the saccule has the closest anatomical proximity to the sclerotic foci, making it the vestibular structure most prone to be affected during the otosclerosis process. The aim of this study was to investigate saccular function in patients suffering from otosclerosis by means of the Vestibular Evoked Myogenic Potential (VEMP). The sample consisted of 30 otosclerosis patients and 20 control subjects. All participants underwent audiometric and VEMP testing. Analysis of the test results revealed that the mean values of the Air-Conducted Pure Tone Average (AC-PTA) and Bone-Conducted Pure Tone Average (BC-PTA) in patients were 45.28 ± 15.57 and 19.68 ± 10.91, respectively, and the calculated 4-frequency Air Bone Gap (ABG) was 25.64 ± 9.95. The VEMP response was absent in 14 (28.57%) otosclerotic ears. A statistically significant increase in the latency of p13 was found in the affected ears (P=0.004); differences in n23 latency did not reach a statistically significant level (P=0.112). The difference in p13-n23 amplitude between the two study groups was statistically significant (P=0.009), indicating that patients with otosclerosis had lower amplitudes. This study suggests that, due to the direct biotoxic effect of materials released from the otosclerosis foci on saccular receptors, there may be vestibular dysfunction in otosclerotic patients.
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting an NMA. PMID:25541687
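The frequentist weighted-least-squares core shared by NMA packages such as netmeta can be sketched in a few lines. This is an illustrative fixed-effect version with hypothetical helper names (it omits random effects and multi-arm correlation handling), not the code of any of the three packages:

```python
import numpy as np

def network_meta_fixed(effects, variances, design):
    """Fixed-effect network meta-analysis by weighted least squares.
    design: one row per pairwise comparison, one column per treatment,
    with -1 for the baseline arm and +1 for the comparator arm."""
    X = np.asarray(design, dtype=float)
    y = np.asarray(effects, dtype=float)
    W = np.diag(1.0 / np.asarray(variances, dtype=float))
    Xr = X[:, 1:]                      # drop treatment 0 for identifiability
    beta = np.linalg.solve(Xr.T @ W @ Xr, Xr.T @ W @ y)
    return beta                        # effect of each treatment vs. treatment 0
```

For a consistent A-B-C loop (B vs A = 1, C vs B = 1, C vs A = 2, equal variances), the network estimates versus A come out exactly 1 and 2, because the direct and indirect evidence agree.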
Ker, Katharine; Prieto-Merino, David; Sprigg, Nikola; Mahmood, Abda; Bath, Philip; Kang Law, Zhe; Flaherty, Katie; Roberts, Ian
2017-01-01
Introduction: The Antifibrinolytic Trialists Collaboration aims to increase knowledge about the effectiveness and safety of antifibrinolytic treatment by conducting individual patient data (IPD) meta-analyses of randomised trials. This article presents the statistical analysis plan for an IPD meta-analysis of the effects of antifibrinolytics for acute intracranial haemorrhage. Methods: The protocol for the IPD meta-analysis has been registered with PROSPERO (CRD42016052155). We will conduct an individual patient data meta-analysis of randomised controlled trials with 1000 patients or more assessing the effects of antifibrinolytics in acute intracranial haemorrhage. We will assess the effect on two co-primary outcomes: 1) death in hospital at end of trial follow-up, and 2) death in hospital or dependency at end of trial follow-up. The co-primary outcomes will be limited to patients treated within three hours of injury or stroke onset. We will report treatment effects using odds ratios and 95% confidence intervals. We will use logistic regression models to examine how the effect of antifibrinolytics varies by time to treatment, severity of intracranial bleeding, and age. We will also examine the effect of antifibrinolytics on secondary outcomes including death, dependency, vascular occlusive events, seizures, and neurological outcomes. Secondary outcomes will be assessed in all patients irrespective of time of treatment. All analyses will be conducted on an intention-to-treat basis. Conclusions: This IPD meta-analysis will examine important clinical questions about the effects of antifibrinolytic treatment in patients with intracranial haemorrhage that cannot be answered using aggregate data. With IPD we can examine how effects vary by time to treatment, bleeding severity, and age, to gain better understanding of the balance of benefit and harms on which to base recommendations for practice.
Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines
NASA Astrophysics Data System (ADS)
Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.
2016-12-01
Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.
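The summary hazard ratio computation described, pooling study estimates by inverse variance with a random-effects fallback, can be sketched as follows. This is a generic DerSimonian-Laird implementation for illustration, not the authors' exact analysis code:

```python
import math

def pool_hazard_ratios(log_hrs, ses):
    """Inverse-variance meta-analysis of log hazard ratios with a
    DerSimonian-Laird random-effects adjustment when heterogeneity is present.
    Returns the summary HR and its 95% confidence interval."""
    k = len(log_hrs)
    w = [1.0 / s**2 for s in ses]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_hrs))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0       # between-study variance
    wr = [1.0 / (s**2 + tau2) for s in ses]             # random-effects weights
    est = sum(wi * y for wi, y in zip(wr, log_hrs)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return math.exp(est), (math.exp(est - 1.96 * se), math.exp(est + 1.96 * se))
```

When the studies are homogeneous, Q falls below its degrees of freedom, tau-squared is truncated to zero, and the estimate reduces to the fixed-effect result, mirroring the heterogeneity-based model choice described in the abstract.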
Statistical analysis of time transfer data from Timation 2. [US Naval Observatory and Australia
NASA Technical Reports Server (NTRS)
Luck, J. M.; Morgan, P.
1974-01-01
Between July 1973 and January 1974, three time transfer experiments using the Timation 2 satellite were conducted to measure time differences between the U.S. Naval Observatory and Australia. Statistical tests showed that the results are unaffected by the satellite's position with respect to the sunrise/sunset line or by its closest approach azimuth at the Australian station. Further tests revealed that forward predictions of time scale differences, based on the measurements, can be made with high confidence.
ERIC Educational Resources Information Center
Rampey, B.D.; Lutkus, Anthony D.; Weiner, Arlene W.; Rahman, Taslima
2006-01-01
The National Indian Education Study is a two-part study designed to describe the condition of education for American Indian/Alaska Native students in the United States. The study was conducted by the National Center for Education Statistics for the U.S. Department of Education, with the support of the Office of Indian Education. This report, Part…
A clinical research analytics toolkit for cohort study.
Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue
2012-01-01
This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.
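A statistical-method recommender of the kind the toolkit describes could, at its simplest, be a rule table keyed on design features. The rules and names below are purely illustrative assumptions, not the toolkit's actual logic:

```python
def recommend_test(outcome, n_groups=2, paired=False):
    """Toy rule table mapping study-design features to a conventional test.
    Illustrative only -- a real toolkit would also check sample size,
    distributional assumptions, and confounding structure."""
    if outcome == "continuous":
        if n_groups == 2:
            return "paired t-test" if paired else "two-sample t-test"
        return "one-way ANOVA"
    if outcome == "binary":
        return "McNemar test" if paired else "chi-squared test"
    return "log-rank test"  # default for time-to-event outcomes
```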
Mishra, Lora; Pattnaik, Prajna; Kumar, Manoj; Aggarwal, Sonia; Misra, Satya Ranjan
2016-01-01
Aim: The present study was conducted with an aim to determine the number and trends of published articles in the International Endodontic Journal (IEJ) and Journal of Endodontics (JOE) from 2009 to 2014. Settings and Designs: A retrospective observational study was conducted for the IEJ and JOE. Subjects and Methods: All issues of the IEJ and JOE were electronically and hand searched for the following parameters: number of papers, publication year, affiliated organizations, and countries. Statistical Analysis Used: The data were organized and analyzed using SPSS version 21.0; descriptive statistics were used. Results: A total of 872 articles were analyzed in the IEJ and 1606 in the JOE. Brazil had the largest number of articles (170), mainly in the IEJ, and the USA (350) in the JOE. Indians published more of their research in the JOE than the IEJ. Conclusions: Original articles in endodontic publication from different universities in India have considerably increased, showing that research is becoming more important. PMID:27795645
Analysis of Wood Structure Connections Using Cylindrical Steel and Carbon Fiber Dowel Pins
NASA Astrophysics Data System (ADS)
Vodiannikov, Mikhail A.; Kashevarova, Galina G., Dr.
2017-06-01
In this paper, the results of the statistical analysis of corrosion processes and moisture saturation of glued laminated timber structures and their joints in corrosive environment are shown. This paper includes calculation results for dowel connections of wood structures using steel and carbon fiber reinforced plastic cylindrical dowel pins in accordance with applicable regulatory documents by means of finite element analysis in ANSYS software, as well as experimental findings. Dependence diagrams are shown; comparative analysis of the results obtained is conducted.
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
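As a minimal concrete instance of the design-of-experiments machinery mentioned above, a full factorial design simply enumerates every combination of factor levels. This is a sketch; real DOE tools add randomization, blocking, and fractional designs:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Full factorial design matrix: one run for every combination of
    factor levels. For k two-level factors this yields 2**k runs."""
    return list(product(*levels_per_factor))
```

For example, three two-level factors coded -1/+1 give the familiar 2^3 = 8-run design.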
Li, Jinling; He, Ming; Han, Wei; Gu, Yifan
2009-05-30
An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions.
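The principal component analysis step used in source-apportionment studies like this one can be sketched via the singular value decomposition. This is a generic textbook implementation for illustration, not the authors' actual statistical workflow:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the column-centred data matrix.
    Returns component scores, loadings, and explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T
    explained = s**2 / (s**2).sum()
    return scores, loadings, explained[:n_components]
```

In a heavy-metal dataset, metals loading strongly on the same component would be interpreted as sharing a source (anthropogenic versus parent material), as in the abstract above.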
Profile Of 'Original Articles' Published In 2016 By The Journal Of Ayub Medical College, Pakistan.
Shaikh, Masood Ali
2018-01-01
Journal of Ayub Medical College (JAMC) is the only Medline-indexed biomedical journal of Pakistan that is edited and published by a medical college. Assessing the trends of study designs employed, statistical methods used, and statistical analysis software used in the articles of medical journals helps in understanding the sophistication of published research. The objective of this descriptive study was to assess all original articles published by JAMC in the year 2016. JAMC published 147 original articles in the year 2016. The most commonly used study design was the cross-sectional study, with 64 (43.5%) articles reporting its use. Statistical tests involving bivariate analysis were most common, reported by 73 (49.6%) articles. Use of SPSS software was reported by 109 (74.1%) of articles. Most (138; 93.9%) of the original articles published were based on studies conducted in Pakistan. The number and sophistication of analyses reported in JAMC increased from 2014 to 2016.
NASA Astrophysics Data System (ADS)
Wardhani, D. K.; Azmi, D. S.; Purnamasari, W. D.
2017-06-01
RW 3 Sukun, Malang, was one of the kampongs that won the kampong environment competition and had managed to preserve the character of the kampong. The community of RW 3 Sukun undertakes various activities to manage the environment by optimizing the use of kampong space. Although RW 3 Sukun has conducted environmental management activities, several locations in the kampong space remain less well maintained. The purpose of this research was to determine the relation between environmental management and the quality of kampong space in RW 3 Sukun. This research used both qualitative and quantitative approaches. The quantitative research applied descriptive statistical analysis to assess the quality of kampong space through weighting, scoring, and map overlays; it also analyzed the relation between environmental management and the quality of kampong space using typology analysis and Pearson correlation analysis. The qualitative research covered the analysis of environmental management and of its relation to the quality of kampong space. The results indicate that environmental management in RW 3 Sukun is related to the quality of kampong space.
In vitro burn model illustrating heat conduction patterns using compressed thermal papers.
Lee, Jun Yong; Jung, Sung-No; Kwon, Ho
2015-01-01
To date, heat conduction from heat sources to tissue has been estimated by complex mathematical modeling. In the present study, we developed an intuitive in vitro skin burn model that illustrates heat conduction patterns inside the skin. It was composed of tightly compressed thermal papers held in compression frames. Heat flow through the model left a trace by changing the color of the thermal papers. These were digitized and three-dimensionally reconstituted to reproduce the heat conduction patterns in the skin. For standardization, we validated K91HG-CE thermal paper using a printout test and bivariate correlation analysis. We measured the papers' physical properties and calculated the estimated depth of heat conduction using Fourier's equation. We validated our model by producing contact burns of 5, 10, 15, 20, and 30 seconds with a heated brass comb on porcine skin and on the burn model, then comparing the burn wounds with the heat conduction traces. The heat conduction pattern correlation analysis (intraclass correlation coefficient: 0.846, p < 0.001) and the heat conduction depth correlation analysis (intraclass correlation coefficient: 0.93, p < 0.001) showed statistically significant high correlations between the porcine burn wound and our model. Our model showed good correlation with porcine skin burn injury and replicated its heat conduction patterns. © 2014 by the Wound Healing Society.
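The Fourier-equation depth estimate mentioned can be illustrated with the standard one-dimensional transient-conduction approximation, in which the thermal penetration depth grows as the square root of contact time. The material property values in the example are illustrative assumptions, not the paper's measured values:

```python
import math

def penetration_depth(k, rho, c, t):
    """Approximate thermal penetration depth delta ~ 2*sqrt(alpha*t) (metres),
    a common engineering estimate from 1-D transient conduction."""
    alpha = k / (rho * c)          # thermal diffusivity, m^2/s
    return 2.0 * math.sqrt(alpha * t)

# Illustrative (assumed) properties for a paper-like solid:
# k = 0.1 W/(m*K), rho = 800 kg/m^3, c = 1300 J/(kg*K)
```

Under this scaling, quadrupling the contact time doubles the estimated depth, which is why the graded 5-to-30-second contact burns probe progressively deeper layers of the model.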
Educational games in geriatric medicine education: a systematic review
2010-01-01
Objective To systematically review the medical literature to assess the effect of geriatric educational games on the satisfaction, knowledge, beliefs, attitudes and behaviors of health care professionals. Methods We conducted a systematic review following the Cochrane Collaboration methodology, including an electronic search of 10 electronic databases. We included randomized controlled trials (RCT) and controlled clinical trials (CCT) and excluded single-arm studies. Populations of interest included members (practitioners or students) of the health care professions. Outcomes of interest were participants' satisfaction, knowledge, beliefs, attitudes, and behaviors. Results We included 8 studies evaluating 5 geriatric role playing games, all conducted in the United States. All studies suffered from one or more methodological limitations, but the overall quality of evidence was acceptable. None of the studies assessed the effects of the games on beliefs or behaviors. None of the 8 studies reported a statistically significant difference between the 2 groups in terms of change in attitude. One study assessed the impact on knowledge and found no statistically significant difference between the 2 groups. Two studies found levels of satisfaction among participants to be high. We did not conduct a planned meta-analysis because the included studies either reported no statistical data or reported different summary statistics. Conclusion The available evidence does not support the use of role playing interventions in geriatric medical education with the aim of improving attitudes towards the elderly. PMID:20416055
10 CFR 431.17 - Determination of efficiency.
Code of Federal Regulations, 2014 CFR
2014-01-01
... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...
10 CFR 431.17 - Determination of efficiency.
Code of Federal Regulations, 2012 CFR
2012-01-01
... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...
DOT National Transportation Integrated Search
2016-08-01
This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee provided oversight : of the process. The research process included extensive statistical analyses of test data supplied by SCDOT. : A total of 2,789 AC tes...
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
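The rejection-ABC idea for recovering a mean and standard deviation from reported quantiles can be sketched as below. This is a minimal illustration assuming normally distributed data and quartile summaries; the paper's method handles several summary-statistic sets and distributions:

```python
import numpy as np

def abc_mean_sd(median, q1, q3, n, n_sim=20000, keep=500, seed=1):
    """Rejection ABC: draw (mu, sigma) from broad priors, simulate samples of
    size n from a normal model, and keep the parameter draws whose simulated
    (q1, median, q3) lie closest to the reported summaries."""
    rng = np.random.default_rng(seed)
    iqr = q3 - q1
    mu = rng.uniform(median - 2 * iqr, median + 2 * iqr, n_sim)     # prior
    sigma = rng.uniform(1e-3, 4 * iqr, n_sim)                       # prior
    sims = rng.standard_normal((n_sim, n)) * sigma[:, None] + mu[:, None]
    stats = np.percentile(sims, [25, 50, 75], axis=1).T
    dist = np.abs(stats - np.array([q1, median, q3])).sum(axis=1)
    best = np.argsort(dist)[:keep]      # accepted draws (approximate posterior)
    return mu[best].mean(), sigma[best].mean()
```

Given quartiles consistent with a Normal(10, 2) sample, the accepted draws concentrate near mu = 10 and sigma = 2, which is the flexibility the abstract highlights for skewed or heavy-tailed models as well (one would simply change the simulator).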
Maldonado, Alejandra; Laugisch, Oliver; Bürgin, Walter; Sculean, Anton; Eick, Sigrun
2018-06-22
Considering the increasing number of elderly people, dementia has gained an important role in today's society. Although the contributing factors for dementia have not been fully understood, chronic periodontitis (CP) seems to have a possible link to dementia. To conduct a systematic review including meta-analysis in order to assess potential differences in clinical periodontal variables between patients with dementia and non-demented individuals, the following focused question was evaluated: is periodontitis associated with dementia? Electronic searches in two databases, MEDLINE and EMBASE, were conducted. Meta-analysis was performed with the collected data in order to detect statistically significant differences in clinical periodontal variables between the dementia group and the cognitively normal controls. Forty-two articles remained for full-text reading. Finally, seven articles met the inclusion criteria and only five studies provided data suitable for meta-analysis. Periodontal probing depth (PPD), bleeding on probing (BOP), gingival bleeding index (GBI), clinical attachment level (CAL), and plaque index (PI) were included as periodontal variables in the meta-analysis. Each variable revealed a statistically significant difference between the groups. In an attempt to reveal an overall difference between the periodontal variables in dementia patients and non-demented individuals, the chosen variables were transformed into common units, which resulted in a statistically significant overall difference (p < 0.00001). The current findings indicate that, compared to systemically healthy individuals, demented patients show significantly worse clinical periodontal variables. However, further epidemiological studies including high numbers of participants, using exact definitions for both dementia and chronic periodontitis, and adjusting for confounders are warranted. These findings appear to support the putative link between CP and dementia.
Consequently, the need for periodontal screening and treatment of elderly demented people should be emphasized.
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is a kind of high-spatial-resolution (2 meters GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistic of an image using an Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistic is subsequently recorded as an important metadata item for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For the pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence for cloud statistic determination. For the post-processing analysis, the box-counting fractal method is implemented. In other words, the cloud statistic is first determined via pre-processing analysis, and the correctness of the cloud statistic for each spectral band is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, has successfully extracted the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
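Because threshold choice is central to the proposed ACCA pipeline, here is a compact histogram-based implementation of Otsu's method. This is a generic version for illustration; the paper's implementation details may differ:

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Otsu's method: pick the threshold that maximises the between-class
    variance of the image histogram."""
    hist, edges = np.histogram(np.asarray(image).ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                    # class-0 (below threshold) probability
    m = np.cumsum(p * centers)           # cumulative mean
    mt = m[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sb2 = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))  # between-class variance
    sb2 = np.nan_to_num(sb2)             # degenerate splits contribute nothing
    return centers[np.argmax(sb2)]
```

On a strongly bimodal image (e.g., bright cloud against darker land or sea), the returned threshold falls between the two intensity modes, separating candidate cloudy pixels from the rest.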
Factors affecting job satisfaction in nurse faculty: a meta-analysis.
Gormley, Denise K
2003-04-01
Evidence in the literature suggests job satisfaction can make a difference in keeping qualified workers on the job, but little research has been conducted focusing specifically on nursing faculty. Several studies have examined nurse faculty satisfaction in relationship to one or two influencing factors. These factors include professional autonomy, leader role expectations, organizational climate, perceived role conflict and role ambiguity, leadership behaviors, and organizational characteristics. This meta-analysis attempts to synthesize the various studies conducted on job satisfaction in nursing faculty and analyze which influencing factors have the greatest effect. The procedure used for this meta-analysis consisted of reviewing studies to identify factors influencing job satisfaction, research questions, sample size reported, instruments used for measurement of job satisfaction and influencing factors, and results of statistical analysis.
A Statistical Analysis of Brain Morphology Using Wild Bootstrapping
Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.
2008-01-01
Methods for the analysis of brain morphology, including voxel-based and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t statistics) or their associated p values at each of those voxels; 2) correcting for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure based on a resampling method called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of a given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate its application to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
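The wild bootstrap idea can be sketched as follows; this is a simplified, unstudentized variant applied to a single heteroscedastic regression on synthetic data, not the authors' voxel-wise procedure:

```python
import numpy as np

def wild_bootstrap_pvalue(x, y, n_boot=999, seed=0):
    """Wild-bootstrap test of H0: slope = 0 in y = a + b*x + e.
    Residuals are resampled with Rademacher (+/-1) multipliers,
    preserving each point's error scale, so the test remains valid
    under heteroscedastic errors."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(x), x])
    b_obs = np.linalg.lstsq(X, y, rcond=None)[0][1]
    resid0 = y - y.mean()            # residuals from the null (intercept-only) fit
    count = 0
    for _ in range(n_boot):
        # Regenerate data under H0 with sign-flipped residuals
        y_star = y.mean() + resid0 * rng.choice([-1.0, 1.0], size=len(y))
        b_star = np.linalg.lstsq(X, y_star, rcond=None)[0][1]
        if abs(b_star) >= abs(b_obs):
            count += 1
    return (count + 1) / (n_boot + 1)

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(size=200) * (1 + np.abs(x))  # heteroscedastic noise
p = wild_bootstrap_pvalue(x, y)
```

In the paper's setting, a test of this kind is run at every surface voxel, after which the family-wise error rate is controlled across voxels.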
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three statistical analysis techniques, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
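The combination of regression analysis and control charting for flagging thermocouple failures might be sketched as follows; the data, fault size, and 3-sigma rule are illustrative assumptions, not AGR/NDMAS specifics:

```python
import numpy as np

def regression_control_limits(tc_readings, fuel_temp_calc, k=3.0):
    """Fit fuel_temp_calc ~ thermocouple reading by least squares and
    return the coefficients, residuals, and k-sigma control limits.
    Residuals outside the limits flag possible thermocouple drift."""
    X = np.column_stack([np.ones(len(tc_readings)), tc_readings])
    beta, *_ = np.linalg.lstsq(X, fuel_temp_calc, rcond=None)
    resid = fuel_temp_calc - X @ beta
    sigma = resid.std(ddof=2)
    return beta, resid, (-k * sigma, k * sigma)

rng = np.random.default_rng(3)
tc = rng.normal(900.0, 20.0, 200)               # thermocouple readings, degC
fuel = 1.1 * tc + 50.0 + rng.normal(0, 2.0, 200)  # calculated fuel temperature
fuel[150] += 30.0                               # injected sensor fault
beta, resid, (lo, hi) = regression_control_limits(tc, fuel)
flags = np.where((resid < lo) | (resid > hi))[0]
```

The flagged index corresponds to the injected fault, mimicking how out-of-limit residuals warn of instrument failure before the data are qualified.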
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
Summary This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults, and associations between quality and statistically significant weight loss outcomes, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Nine methodological quality indicators were assessed (mean number met = 6.3): study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity. Indicators most commonly met included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature on behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
Gestational surrogacy: Viewpoint of Iranian infertile women
Rahmani, Azad; Sattarzadeh, Nilofar; Gholizadeh, Leila; Sheikhalipour, Zahra; Allahbakhshian, Atefeh; Hassankhani, Hadi
2011-01-01
BACKGROUND: Surrogacy is a popular form of assisted reproductive technology, of which only the gestational form is approved by most religious scholars in Iran. Little evidence exists about Iranian infertile women's viewpoint regarding gestational surrogacy. AIM: To assess the viewpoint of Iranian infertile women toward gestational surrogacy. SETTING AND DESIGN: This descriptive study was conducted at the infertility clinic of Tabriz University of Medical Sciences, Iran. MATERIALS AND METHODS: The study sample consisted of 238 infertile women who were selected using the eligible sampling method. Data were collected using a researcher-developed questionnaire that included 25 items based on a five-point Likert scale. STATISTICAL ANALYSIS: Data analysis was conducted with SPSS statistical software using descriptive statistics. RESULTS: The viewpoint of 214 women (89.9%) was positive. Thirty-six women (15.1%) considered gestational surrogacy against their religious beliefs; 170 women (71.4%) did not regard the commissioning couple as owners of the baby; 160 women (67.2%) said that children born through surrogacy had better not know about it; and 174 women (73.1%) believed that children born through surrogacy will face mental problems. CONCLUSION: Iranian infertile women have a positive viewpoint regarding surrogacy. However, further efforts are needed to increase the acceptability of surrogacy among infertile women. PMID:22346081
NASA Astrophysics Data System (ADS)
Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.
2018-03-01
Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
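The EnKF analysis step at the core of such assimilation experiments can be sketched as follows (a textbook stochastic-EnKF update on a one-dimensional toy state, not the authors' hydrologic setup):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_sd, H, seed=0):
    """Stochastic EnKF analysis step: nudge each ensemble member toward
    a perturbed observation using the ensemble-estimated covariance.
    ensemble: (n_members, n_state); H: (n_obs, n_state) observation operator."""
    rng = np.random.default_rng(seed)
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    HX = ensemble @ H.T                             # ensemble in obs space
    HXp = HX - HX.mean(axis=0)
    P_hh = HXp.T @ HXp / (n - 1) + np.eye(H.shape[0]) * obs_err_sd**2
    P_xh = X.T @ HXp / (n - 1)
    K = P_xh @ np.linalg.inv(P_hh)                  # Kalman gain
    obs_pert = obs + rng.normal(0, obs_err_sd, size=(n, H.shape[0]))
    return ensemble + (obs_pert - HX) @ K.T

# 1-D toy: prior ensemble around 2.0, accurate observation at 1.0
prior = np.random.default_rng(2).normal(2.0, 0.5, size=(100, 1))
H = np.array([[1.0]])
post = enkf_update(prior, np.array([1.0]), 0.2, H)
```

The posterior ensemble mean moves toward the observation and its spread shrinks, which is the behaviour the paper's pre-processing stage tunes (e.g., ensemble size, error settings) via factorial design.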
NASA Technical Reports Server (NTRS)
Hegsted, D. M.
1975-01-01
A prototype balance study was conducted on Earth prior to the balance studies conducted in Skylab itself. Daily dietary intake data for six minerals and nitrogen, along with fecal and urinary outputs, were collected for each of three astronauts. Essential statistical issues are identified, showing which quantities need to be estimated and establishing the scope of inference associated with alternative variance estimates. Procedures are exhibited for obtaining the final variability due to errors of measurement and the total error (measurement plus biological variability).
2009-02-01
data was linearly fit, and the slope yielded the Seebeck coefficient. A small resistor was epoxied to the top of the sample, and the opposite end...space probes in its radioisotope thermoelectric generators (RTGs) and is of current interest to automobile manufacturers to supply additional power... resistivity or conductivity, thermal conductivity, and Seebeck coefficient. These required measurements are demanding, especially the thermal
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
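Spatial autocorrelation of residuals, which this review flags as rarely considered, is commonly quantified with Moran's I; below is a minimal sketch on a toy chain of neighbourhoods (the weights matrix and attribute values are illustrative assumptions):

```python
import numpy as np

def morans_i(values, W):
    """Moran's I statistic for spatial autocorrelation.
    values: (n,) attribute per area (e.g., food outlet count);
    W: (n, n) spatial weights, W[i, j] > 0 for neighbours, diagonal 0."""
    n = len(values)
    z = values - values.mean()
    num = n * (z @ W @ z)      # cross-products between neighbouring areas
    den = W.sum() * (z @ z)    # total variation, scaled by total weight
    return num / den

# Toy 1-D chain of 6 neighbourhoods with adjacency weights
W = np.zeros((6, 6))
for i in range(5):
    W[i, i + 1] = W[i + 1, i] = 1.0

# Similar values sit next to each other -> positive spatial autocorrelation
clustered = np.array([1.0, 1.2, 1.1, 5.0, 5.2, 4.9])
I_clustered = morans_i(clustered, W)
```

A clearly positive I on outlet counts or model residuals would signal the spatially correlated errors the authors warn can invalidate ordinary regression results.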
[Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].
Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing
2015-10-01
Roads are typically man-made objects in urban areas. Road extraction from high-resolution images has important applications for urban planning and transportation development. However, due to confusable spectral characteristics, it is difficult to distinguish roads from other objects merely by using traditional classification methods that depend mainly on spectral information. Edges are an important feature for identifying linear objects (e.g., roads), and the distribution patterns of edges vary greatly among different objects, so it is crucial to merge edge statistical information with spectral information. In this study, a new method that combines spectral information and edge statistical features is proposed. First, edge detection is conducted using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from an edge statistical model that measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, an SVM algorithm is used to classify the image, and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional method. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially on high-resolution images.
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
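Horn's parallel analysis, one of the validated procedures described here, can be sketched in a few lines of numpy (a simplified re-implementation demonstrated on simulated two-factor data, not the authors' SPSS/SAS programs):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, percentile=95, seed=0):
    """Horn's parallel analysis: retain a component only if its
    eigenvalue exceeds the chosen percentile of eigenvalues obtained
    from random normal data of the same size (n cases x k variables)."""
    rng = np.random.default_rng(seed)
    n, k = data.shape
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sim = np.empty((n_sims, k))
    for s in range(n_sims):
        r = rng.normal(size=(n, k))
        sim[s] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresholds = np.percentile(sim, percentile, axis=0)
    return int(np.sum(obs_eigs > thresholds)), obs_eigs, thresholds

# Toy data: 6 variables driven by 2 independent factors plus noise
rng = np.random.default_rng(4)
f = rng.normal(size=(300, 2))
loadings = np.array([[0.8, 0.0], [0.8, 0.0], [0.8, 0.0],
                     [0.0, 0.8], [0.0, 0.8], [0.0, 0.8]])
data = f @ loadings.T + rng.normal(scale=0.5, size=(300, 6))
n_comp, _, _ = parallel_analysis(data)
```

Unlike the eigenvalues-greater-than-one rule, the random-data benchmark here correctly recovers the two underlying components.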
O'Connor, Brian P
2004-02-01
Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.
Factors Influencing Stress, Burnout, and Retention of Secondary Teachers
ERIC Educational Resources Information Center
Fisher, Molly H.
2011-01-01
This study examines the stress, burnout, satisfaction, and preventive coping skills of nearly 400 secondary teachers to determine variables contributing to these major factors influencing teachers. Analysis of Variance (ANOVA) found that burnout levels differ significantly between new and experienced teachers,…
Emergent Readers' Social Interaction Styles and Their Comprehension Processes during Buddy Reading
ERIC Educational Resources Information Center
Christ, Tanya; Wang, X. Christine; Chiu, Ming Ming
2015-01-01
To examine the relations between emergent readers' social interaction styles and their comprehension processes, we adapted sociocultural and transactional views of learning and reading, and conducted statistical discourse analysis of 1,359 conversation turns transcribed from 14 preschoolers' 40 buddy reading events. Results show that interaction…
AQAK: A Library Anxiety Scale for Undergraduate Students
ERIC Educational Resources Information Center
Anwar, Mumtaz A.; Al-Qallaf, Charlene L.; Al-Kandari, Noriah M.; Al-Ansari, Husain A.
2012-01-01
The library environment has drastically changed since 1992 when Bostick's Library Anxiety Scale was developed. This project aimed to develop a scale specifically for undergraduate students. A three-stage study was conducted, using students of Kuwait University. A variety of statistical measures, including factor analysis, were used to process the…
Faculty Perceptions of Transition Personnel Preparation in Saudi Arabia
ERIC Educational Resources Information Center
Alhossan, Bandar A.; Trainor, Audrey A.
2017-01-01
This study investigated to what extent faculty members include and value transition curricula in special education preparation programs in Saudi Arabia. A web-based survey was conducted and sent to special education professors across 20 universities. Descriptive statistics and a t-test analysis generated three main findings: (a) Institutions…
A global estimate of the Earth's magnetic crustal thickness
NASA Astrophysics Data System (ADS)
Vervelidou, Foteini; Thébault, Erwan
2014-05-01
The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm. Therefore, the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the Spherical Harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in Spherical Harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) to recent lithospheric field models derived from CHAMP and airborne measurements, and we finally developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness, and a power-law value that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 Spherical Harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate the long wavelengths of the magnetic crustal thickness on a global scale, which are not accessible directly from the magnetic measurements due to the masking core field.
We then discuss these results and compare them to the results we obtained by conducting a similar spectral analysis, this time in Cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to crustal thickness global maps derived from additional geophysical data (Purucker et al., 2002).
ERIC Educational Resources Information Center
Asahina, Roberta R.
A two-fold statistical analysis examined the creative development of the 15-second television commercial, providing a follow-up to a similar study conducted in 1986. Study 1 of the present analysis examined 335 actual 15-second spots extracted from 30 hours of network daytime and primetime programming in the fourth and first quarters of 1988-1989.…
Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W
2015-10-01
Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses, as it can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
Analysis of the sleep quality of elderly people using biomedical signals.
Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A
2015-01-01
This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signals and identify significant variations. Once these changes have been identified, all significant data are compared, and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlations and significant differences, a statistical analysis was performed, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037), and the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
Climate drivers on malaria transmission in Arunachal Pradesh, India.
Upadhyayula, Suryanaryana Murty; Mutheneni, Srinivasa Rao; Chenna, Sumana; Parasaram, Vaideesh; Kadiri, Madhusudhan Rao
2015-01-01
The present study was conducted during the years 2006 to 2012 and provides information on the prevalence of malaria and its relationship with various climatic factors in the East Siang district of Arunachal Pradesh, India. Correlation analysis, Principal Component Analysis, and Hotelling's T² statistic are adopted to understand the effect of weather variables on malaria transmission. The epidemiological study shows that malaria is mostly caused by the parasite Plasmodium vivax, followed by Plasmodium falciparum. It is noted that the intensity of malaria cases declined gradually from 2006 to 2012. Malaria transmission was greater during the rainy season than in the summer and winter seasons. Further, analysis with Principal Component Analysis and Hotelling's T² statistic revealed that climatic variables such as temperature and rainfall are the most influential factors for the high rate of malaria transmission in the East Siang district of Arunachal Pradesh.
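Hotelling's T² scoring of multivariate climate observations, as used in studies of this kind, might look like the following sketch (the variables, means, and injected anomaly are invented for illustration):

```python
import numpy as np

def hotelling_t2(X):
    """Hotelling's T² score for each observation: the squared
    Mahalanobis distance from the sample mean, which flags
    multivariate outliers (e.g., unusual temperature/rainfall
    combinations) even when no single variable is extreme."""
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    S_inv = np.linalg.inv(S)
    d = X - mu
    # t2[i] = d[i] @ S_inv @ d[i] for every row i
    return np.einsum("ij,jk,ik->i", d, S_inv, d)

rng = np.random.default_rng(5)
# 100 months of (temperature degC, rainfall mm)
climate = rng.normal([28.0, 120.0], [2.0, 30.0], size=(100, 2))
climate[42] = [40.0, 400.0]          # one anomalous month
t2 = hotelling_t2(climate)
```

Months with large T² values are candidate outbreak-risk periods that warrant closer epidemiological scrutiny.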
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies, and it continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community.
Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies, so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults, and provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D
2017-01-01
If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
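Random-effects pooling on the recommended logit scale for the C-statistic can be sketched as follows, using DerSimonian–Laird estimation on hypothetical study results (the C-statistics and standard errors below are invented for illustration):

```python
import numpy as np

def dl_meta_analysis(estimates, variances):
    """DerSimonian-Laird random-effects pooling of study-level
    estimates (here: logit-transformed C-statistics)."""
    w = 1.0 / variances
    fixed = np.sum(w * estimates) / np.sum(w)
    Q = np.sum(w * (estimates - fixed) ** 2)      # heterogeneity statistic
    df = len(estimates) - 1
    # Method-of-moments between-study variance, truncated at zero
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * estimates) / np.sum(w_star)
    return pooled, tau2

def logit(p):
    return np.log(p / (1 - p))

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical C-statistics from 5 validation studies, SEs on the logit scale
c_stats = np.array([0.72, 0.75, 0.70, 0.78, 0.74])
se_logit = np.array([0.08, 0.10, 0.07, 0.12, 0.09])
pooled_logit, tau2 = dl_meta_analysis(logit(c_stats), se_logit**2)
pooled_c = inv_logit(pooled_logit)   # back-transform to the C-statistic scale
```

Pooling on the logit scale and back-transforming respects the skewed between-study distribution of the C-statistic that the simulation study identified.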
Jahn, I; Foraita, R
2008-01-01
In Germany, gender-sensitive approaches are part of the guidelines for good epidemiological practice as well as health reporting, and they are increasingly demanded in order to implement the gender mainstreaming strategy in research funding by the federation and federal states. This paper focuses on methodological aspects of data analysis, using as an empirical example the health report of Bremen, a population-based cross-sectional study. Health reporting requires analysis and reporting methods that can, on the one hand, uncover sex/gender aspects of a question and, on the other hand, consider how results can be adequately communicated. The core question is: what consequences does the differing inclusion of the category sex in different statistical analyses have on the results when identifying potential target groups? As evaluation methods, logistic regressions as well as a two-stage procedure were conducted exploratively; the latter combines graphical models with CHAID decision trees and allows complex results to be visualised. Both methods are analysed stratified by sex/gender as well as adjusted for sex/gender, and compared with each other. As a result, only stratified analyses are able to detect differences between the sexes and within the sex/gender groups, as long as one cannot resort to prior knowledge. Adjusted analyses can detect sex/gender differences only if interaction terms have been included in the model. Results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, the question of whether a statistical method is gender-sensitive can only be answered in light of concrete research questions and known conditions. Often, an appropriate statistical procedure can be chosen after conducting separate analyses for women and men.
Future gender studies deserve innovative study designs as well as conceptual distinctiveness with regard to the biological and the sociocultural elements of the category sex/gender.
Kazdal, Hizir; Kanat, Ayhan; Aydin, Mehmet Dumlu; Yazar, Ugur; Guvercin, Ali Riza; Calik, Muhammet; Gundogdu, Betul
2017-01-01
Context: Sudden death from subarachnoid hemorrhage (SAH) is not uncommon. Aims: The goal of this study was to elucidate the effect of the cervical spinal roots and the related dorsal root ganglia (DRGs) on cardiorespiratory arrest following SAH. Settings and Design: This was an experimental study conducted on rabbits. Materials and Methods: The study was conducted on 22 rabbits, randomly divided into three groups: control (n = 5), physiologic serum saline (SS; n = 6), and SAH (n = 11). Experimental SAH was induced. Seven of the 11 rabbits with SAH died within the first 2 weeks. After 20 days, the remaining animals were sacrificed. The anterior spinal arteries, arteriae nervorum of the cervical nerve roots (C6–C8), DRGs, and lungs were examined histopathologically and estimated stereologically. Statistical Analysis Used: Statistical analysis was performed using PASW Statistics 18.0 for Windows (SPSS Inc., Chicago, Illinois, USA). Intergroup differences were assessed using one-way ANOVA. Statistical significance was set at P < 0.05. Results: In the SAH group, severe anterior spinal artery (ASA) and arteriae nervorum vasospasm, axonal and neuronal degeneration, and neuronal apoptosis were observed histopathologically. Vasospasm of the ASA did not occur in the SS and control groups. There was a statistically significant increase in degenerated neuron density in the SAH group compared with the control and SS groups (P < 0.05). Cardiorespiratory disturbances, arrest, and lung edema developed more commonly in animals in the SAH group. Conclusion: Interestingly, C6–C8 DRG degeneration was secondary to the vasospasm of the ASA following SAH. Cardiorespiratory disturbances or arrest can be explained by these mechanisms. PMID:28250634
Statistical analysis of Turbine Engine Diagnostic (TED) field test data
NASA Astrophysics Data System (ADS)
Taylor, Malcolm S.; Monyak, John T.
1994-11-01
During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.
Badran, M; Morsy, R; Soliman, H; Elnimr, T
2016-01-01
The metabolism of trace elements has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuing increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. The study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). Blood serum was digested and then used to determine the levels of 24 trace elements by inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical methods, based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA), were used to analyze the data. The results showed significant changes in FBG and in the levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in the blood serum of Type 2 diabetic patients relative to healthy controls. The multivariate statistical techniques were effective in reducing the experimental variables and grouped the trace elements in patients into three clusters. PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the factors most closely related to Type 2 diabetes. On the basis of this study, the contributions of trace element content in Type 2 diabetic patients can therefore be determined and specified through correlation relationships and multivariate statistical analysis, which confirms that alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.
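The PCA step described above can be sketched with plain numpy SVD. The data here are simulated from a hypothetical two-factor structure purely to illustrate the procedure; the element names and loadings are illustrative assumptions, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical serum trace-element data: 40 patients x 4 elements (Mg, Fe, Cu, Zn).
# Two latent factors induce the between-element correlations.
latent = rng.normal(size=(40, 2))
loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
X = latent @ loadings.T + 0.2 * rng.normal(size=(40, 4))

# Principal component analysis via SVD of the standardized data
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance explained per component
scores = Xs @ Vt.T                # sample coordinates on the components
```

With a genuine two-factor structure, the first two components carry almost all the variance, mirroring the dimension reduction reported in the abstract.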
[Interactive workshops as a dissemination strategy in psychology].
Martínez-Martínez, Kalina Isela; Carrascosa-Venegas, César; Ayala-Velázquez, Héctor
2003-01-01
To assess whether interactive workshops are an effective strategy for promoting a psychological intervention model among healthcare providers, to treat problem drinkers. The study was conducted between the years 1999 and 2000, among 206 healthcare providers at seven Instituto Mexicano del Seguro Social (Mexican Institute of Social Security, IMSS) clinics. Study subjects were selected by hospital executive officers. The study design is a quasi-experimental pre-test/post-test study. Data on providers' attitudes, interests, and knowledge were collected using a questionnaire. After that, interactive workshops were conducted, and the same questionnaire was applied again at the end of the workshops. Statistical analysis was carried out using Student's t test for matched samples. Statistically significant differences were found in participants' knowledge on alcoholism t (206, 205) = -9.234, p = 0.001, as well as in their interest t (206, 205) = -2.318, p = 0.021. Interactive workshops are an effective tool to disseminate the Guided Self-Help Program conducted in IMSS clinics. Healthcare providers can become change-inducing/promoting agents of psychological innovations.
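The matched-samples t test used in this study can be sketched as follows; the pre/post scores are hypothetical illustrative values, not the IMSS questionnaire data:

```python
import numpy as np
from scipy import stats

# Hypothetical pre- and post-workshop knowledge scores for the same ten providers
# (illustrative data only).
pre  = np.array([12, 15, 11, 14, 13, 10, 16, 12, 14, 11])
post = np.array([15, 18, 14, 16, 15, 13, 19, 14, 17, 13])

# Student's t test for matched (paired) samples
t_stat, p_value = stats.ttest_rel(pre, post)
```

A negative t statistic with a small p-value indicates that post-workshop scores are systematically higher, the same direction as the reported result.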
NASA Astrophysics Data System (ADS)
Yoo, Donghoon; Lee, Joohyun; Lee, Byeongchan; Kwon, Suyong; Koo, Junemo
2018-02-01
The Transient Hot-Wire Method (THWM) was developed to measure the absolute thermal conductivity of gases, liquids, melts, and solids with low uncertainty. The majority of nanofluid researchers used THWM to measure the thermal conductivity of test fluids. Several reasons have been suggested for the discrepancies in these types of measurements, including nanofluid generation, nanofluid stability, and measurement challenges. The details of the transient hot-wire method such as the test cell size, the temperature coefficient of resistance (TCR) and the sampling number are further investigated to improve the accuracy and consistency of the measurements of different researchers. It was observed that smaller test apparatuses were better because they can delay the onset of natural convection. TCR values of a coated platinum wire were measured and statistically analyzed to reduce the uncertainty in thermal conductivity measurements. For validation, ethylene glycol (EG) and water thermal conductivity were measured and analyzed in the temperature range between 280 and 310 K. Furthermore, a detailed statistical analysis was conducted for such measurements, and the results confirmed the minimum number of samples required to achieve the desired resolution and precision of the measurements. It is further proposed that researchers fully report the information related to their measurements to validate the measurements and to avoid future inconsistent nanofluid data.
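In the transient hot-wire method, the wire's temperature rise follows dT(t) ≈ (q / 4πk)·ln(t) + C within the conduction-only time window, so thermal conductivity is recovered from the slope of dT versus ln t. The sketch below uses synthetic water-like data (k ≈ 0.60 W/(m·K), q = 1.0 W/m are illustrative assumptions):

```python
import numpy as np

q = 1.0                                # heat input per unit wire length, W/m
k_true = 0.60                          # assumed true conductivity, W/(m K)
t = np.linspace(0.1, 1.0, 50)          # s, before natural convection sets in

# Synthetic temperature rise with small measurement noise
dT = q / (4 * np.pi * k_true) * np.log(t) + 2.0
dT += np.random.default_rng(1).normal(0.0, 1e-4, t.size)

# Linear fit of dT against ln(t); slope = q / (4*pi*k)
slope, intercept = np.polyfit(np.log(t), dT, 1)
k_est = q / (4 * np.pi * slope)
```

Restricting the fit window is what motivates the abstract's observation that smaller cells, which delay convection onset, give more reliable slopes.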
Chan, Y; Walmsley, R P
1997-12-01
When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
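The comparison the update describes, the nonparametric H test against the parametric F test on the same three groups, can be sketched directly; the group values are hypothetical ratio-level outcomes for illustration only:

```python
from scipy import stats

# Hypothetical ratio-level outcomes (e.g., degrees of motion) for three treatments
g1 = [85, 90, 88, 92, 87]
g2 = [78, 80, 75, 82, 79]
g3 = [95, 98, 94, 97, 96]

# Kruskal-Wallis one-way analysis of variance by ranks (no normality assumption)
h_stat, p_h = stats.kruskal(g1, g2, g3)

# Parametric one-way F test (more powerful when its assumptions hold)
f_stat, p_f = stats.f_oneway(g1, g2, g3)
```

With clearly separated groups both tests reject; the practical choice between them rests on whether the F test's distributional assumptions are defensible for the clinical data at hand.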
Atomic structure of water/Au, Ag, Cu and Pt atomic junctions.
Li, Yu; Kaneko, Satoshi; Fujii, Shintaro; Nishino, Tomoaki; Kiguchi, Manabu
2017-02-08
Much progress has been made in understanding the transport properties of atomic-scale conductors. We prepared atomic-scale metal contacts of Cu, Ag, Au and Pt using a mechanically controllable break junction method at 10 K in a cryogenic vacuum. Water molecules were exposed to the metal atomic contacts and the effect of molecular adsorption was investigated by electronic conductance measurements. Statistical analysis of the electronic conductance showed that the water molecule(s) interacted with the surface of the inert Au contact and the reactive Cu and Pt contacts, where molecular adsorption decreased the electronic conductance. A clear conductance signature of water adsorption was not apparent at the Ag contact. Detailed analysis of the conductance behaviour during a contact-stretching process indicated that metal atomic wires were formed for the Au and Pt contacts. The formation of an Au atomic wire consisting of low coordination number atoms leads to increased reactivity of the inert Au surface towards the adsorption of water.
A methodological analysis of chaplaincy research: 2000-2009.
Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F
2011-01-01
The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.
Bibliometric Analysis of Journal of Clinical and Diagnostic Research (Dentistry Section; 2007-2014)
Basavaraj, P; Singla, Ashish; Singh, Khushboo; Kundu, Hansa; Vashishtha, Vaibhav; Pandita, Venisha; Malhi, Ravneet
2015-01-01
Background: The role of scientific journals in the diffusion of data concerning research in the field of Public Health Dentistry is of premier importance. Bibliometric analysis involves the analysis of publications, reflecting the type of research work. Aim: The present study was conducted with the aim of determining the number and trends of articles published in the Journal of Clinical and Diagnostic Research (JCDR) from Feb. 2007 to Oct. 2014. Settings and Design: A retrospective observational study was conducted for JCDR. Materials and Methods: All issues of JCDR were electronically searched for the parameters: study design, area of interest of research, state/college where the research was conducted, authorship pattern, source of articles published each year, changing study trends, disease under study, and publication bias. Statistical Analysis Used: The data were organized and analyzed using SPSS version 21.0; descriptive statistics were used. Results: Bibliometric analysis was done for 601 articles of JCDR published from Feb. 2007 to Oct. 2014. The total number of articles published under the Dentistry section increased tremendously, from a mere 2 articles in 2007 to 328 articles in 2014. The majority of the study designs published were case reports (42.6%), followed by cross-sectional studies (24.8%). Of the articles, 96.3% were from India. The majority of the articles published had multiple authors (65.2%) and came from educational institutes (98.4%). The trends of the articles published indicated that case reports/series formed the major bulk (others = 59.1%), followed by research studies (21.3%). Conclusion: It was concluded that most articles published were case reports, followed by research studies, indicating an inclination towards better-quality methodology. The SJR and the citation count of the articles published also indicated the quality of the scientific articles published. PMID:26023643
Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects
NASA Astrophysics Data System (ADS)
Ekness, Jamie Lynn
The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agricultural sales vary significantly from year to year, due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) to determine differences in storm duration, rain rate, and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm), a 56% increase in average hourly rainfall during the first 60 minutes after target selection. A seeding effect is indicated, with the lifetime of the storms increasing by 41% between seeded and non-seeded clouds for the first 60 minutes past the seeding decision. A double ratio statistic, comparing the radar-derived rain amount of the last 40 minutes of a case (seed/non-seed) to that of the first 20 minutes (seed/non-seed), is used to account for the natural variability of the cloud system and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST data set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analyses conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation.
While an additional POLCAST field project would be necessary to obtain a conventionally accepted statistically significant result (p < 0.05) for the double ratio of precipitation amount, the obtained p-value of 0.063 is close; considering the positive results from other hygroscopic seeding experiments, the North Dakota Cloud Modification Project should consider implementing hygroscopic seeding.
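A Mann-Whitney comparison of per-case growth ratios, the kind of test applied to the POLCAST double ratio, can be sketched as follows; the case values are hypothetical illustrative numbers, not the POLCAST measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical per-case rain-amount growth ratios (last 40 min / first 20 min)
seeded   = [2.1, 1.8, 2.5, 1.9, 2.2, 1.7, 2.4, 2.0]
unseeded = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.4, 1.0]

# Double ratio: median growth in seeded cases relative to unseeded cases
double_ratio = np.median(seeded) / np.median(unseeded)

# Two-sided Mann-Whitney U test on the per-case ratios
u_stat, p_value = stats.mannwhitneyu(seeded, unseeded, alternative='two-sided')
```

The rank-based test makes no normality assumption about the growth ratios, which is why it suits the highly variable per-storm rainfall data.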
Analysis of repeated measurement data in the clinical trials
Singh, Vineeta; Rana, Rakesh Kumar; Singhal, Richa
2013-01-01
Statistics is an integral part of clinical trials. Elements of statistics span clinical trial design, data monitoring, analyses, and reporting. A solid understanding of statistical concepts by clinicians improves the comprehension and resulting quality of clinical trials. In biomedical research, researchers frequently use the t-test and ANOVA to compare means between groups of interest irrespective of the nature of the data. In clinical trials, however, data are often recorded on the same patients at more than two time points. In such a situation, standard ANOVA procedures are not appropriate because they do not account for dependencies between observations within subjects. To analyze such study data, repeated-measures ANOVA should be used. In this article, the application of one-way repeated-measures ANOVA is demonstrated using SPSS (Statistical Package for the Social Sciences) Version 15.0 on data collected at four time points (day 0, 15th day, 30th day, and 45th day) of a multicentre clinical trial conducted on Pandu Roga (~Iron Deficiency Anemia) with the Ayurvedic formulation Dhatrilauha. PMID:23930038
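A one-way repeated-measures ANOVA can be computed by hand by partitioning the total sum of squares into time, subject, and residual components. The sketch below uses hypothetical haemoglobin values at four time points (illustrative data, not the Dhatrilauha trial's):

```python
import numpy as np
from scipy import stats

# Hypothetical values for 6 subjects at four time points (day 0, 15, 30, 45)
X = np.array([
    [9.1, 10.0, 10.8, 11.5],
    [8.7,  9.5, 10.2, 11.0],
    [9.4, 10.1, 11.0, 11.8],
    [8.9,  9.8, 10.5, 11.2],
    [9.0,  9.9, 10.7, 11.4],
    [9.2, 10.2, 10.9, 11.6],
])
n, k = X.shape                                          # subjects, time points

grand = X.mean()
ss_time    = n * np.sum((X.mean(axis=0) - grand) ** 2)  # between-time SS
ss_subject = k * np.sum((X.mean(axis=1) - grand) ** 2)  # between-subject SS
ss_total   = np.sum((X - grand) ** 2)
ss_error   = ss_total - ss_time - ss_subject            # residual SS

df_time, df_error = k - 1, (n - 1) * (k - 1)
F = (ss_time / df_time) / (ss_error / df_error)
p_value = stats.f.sf(F, df_time, df_error)
```

Removing the between-subject sum of squares from the error term is exactly how repeated-measures ANOVA accounts for the within-subject dependencies that ordinary ANOVA ignores.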
NASA Astrophysics Data System (ADS)
Cianciara, Aleksander
2016-09-01
The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock-mass fracturing. Because the recorded emission contains noise, it is first subjected to appropriate filtering. The study was conducted by statistically testing the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. Because the model of the cumulative distribution function is given in analytical form, its verification may be performed using the Kolmogorov-Smirnov goodness-of-fit test. Specifying a correct model of the statistical distribution of the data is essential for interpretation by probabilistic methods, because such methods do not use the measurement data directly but rather their statistical distributions, e.g., in methods based on hazard analysis or those using maximum-value statistics.
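The fit-then-test procedure can be sketched as below. The inter-event times here are simulated from a Weibull distribution purely to illustrate the workflow, and note one caveat: the standard KS p-value is optimistic when the parameters are estimated from the same data being tested:

```python
from scipy import stats

# Hypothetical inter-event times (s) of microseismic emissions, simulated from a
# Weibull distribution for illustration only.
times = stats.weibull_min.rvs(c=0.8, scale=12.0, size=500, random_state=0)

# Fit the two-parameter Weibull model (location fixed at 0)
shape, loc, scale = stats.weibull_min.fit(times, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted model
ks_stat, p_value = stats.kstest(times, 'weibull_min', args=(shape, loc, scale))
```

A large p-value fails to reject the Weibull hypothesis; for a rigorous test with estimated parameters, critical values should be recalibrated (e.g., by parametric bootstrap).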
GWAR: robust analysis and meta-analysis of genome-wide association studies.
Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G
2017-05-15
In the context of genome-wide association studies (GWAS), a variety of statistical techniques is available for conducting the analysis, but in most cases the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize power while preserving the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and had never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under recessive, additive, and dominant models of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic, and MIN2, were implemented in Stata. For MAX and MIN2, we calculated their asymptotic null distributions by numerical integration, resulting in a great gain in computational time without loss of accuracy. All the aforementioned approaches were employed in a fixed- or random-effects meta-analysis setting using summary data, with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis of GWAS in Stata. A Stata program and a web server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
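The additive-model CATT mentioned above is a score test on genotype-scored case/control counts. A minimal sketch, with hypothetical genotype counts (not from any real GWAS):

```python
import numpy as np
from scipy import stats

# Hypothetical genotype counts (aa, Aa, AA) for cases and controls
cases    = np.array([60, 90, 50])
controls = np.array([100, 80, 20])

t = np.array([0.0, 1.0, 2.0])   # additive-model scores (0, 1, 2 risk alleles)
n = cases + controls            # per-genotype totals
R = cases.sum()                 # total cases
N = n.sum()                     # total subjects

# Cochran-Armitage trend test statistic
num = N * np.sum(t * cases) - R * np.sum(t * n)
var = R * (N - R) * (N * np.sum(t**2 * n) - np.sum(t * n) ** 2) / N
z = num / np.sqrt(var)
p_value = 2 * stats.norm.sf(abs(z))
```

Changing the scores t to (0, 0, 1) or (0, 1, 1) gives the recessive and dominant variants; the robust MAX-type statistics in the abstract combine such score choices to guard against model misspecification.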
The Relationship between Zinc Levels and Autism: A Systematic Review and Meta-analysis.
Babaknejad, Nasim; Sayehmiri, Fatemeh; Sayehmiri, Kourosh; Mohamadkhani, Ashraf; Bahrami, Somaye
2016-01-01
Autism is a complex, behaviorally defined disorder. A relationship between zinc (Zn) levels in autistic patients and the development of the pathogenesis has been suggested, but the evidence is not conclusive. The present study was conducted to estimate this association using meta-analysis. Twelve articles published from 1978 to 2012 were selected by searching Google Scholar, PubMed, ISI Web of Science, and Scopus, and the extracted information was analyzed, initially under a fixed-effect model. I² statistics were calculated to examine heterogeneity. The data were analyzed using R and STATA version 12.2. There was no statistically significant difference in hair, nail, and teeth Zn levels between controls and autistic patients: -0.471 [95% confidence interval (95% CI): -1.172 to 0.231]. There was a statistically significant difference in plasma Zn concentration between autistic patients and healthy controls: -0.253 (95% CI: -0.498 to -0.007). Using a random-effects model, the overall pooled estimate across the two groups of measures was -0.414 (95% CI: -0.878 to -0.051). Based on the sensitivity analysis, zinc supplements may be useful in nutritional therapy for autistic patients.
NASA Astrophysics Data System (ADS)
Sanchez, J.
2018-06-01
In this paper, the asymptotic approximation method, whose application and analysis for single degree-of-freedom systems has recently been developed, is examined in a probabilistic context. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.
Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed
2014-01-01
Polypropylene is a plastic that is widely used in our everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation of the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure, and hydrogen percentage, were considered as the input factors for polypropylene production in the analysis performed. To examine the effects of the process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, outlier t plots, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model fit the experimental results well. At the optimum conditions, with a temperature of 75°C, a system pressure of 25 bar, and a hydrogen percentage of 2%, the highest polypropylene production obtained was 5.82% per pass. Hence, it is concluded that the developed experimental design and proposed model can be employed with over 95% confidence for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
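The quadratic RSM model referred to above is an ordinary least-squares fit of main effects, two-way interactions, and squared terms in the coded factors. A minimal sketch on synthetic data (the coefficients and noise level are illustrative assumptions, not the pilot-plant measurements):

```python
import numpy as np

rng = np.random.default_rng(3)

# 20 hypothetical runs with three coded factors (temperature, pressure, H2 %)
X = rng.uniform(-1, 1, size=(20, 3))

def features(X):
    """Design matrix of the full quadratic RSM model."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1**2, x2**2, x3**2])

# Hypothetical true surface plus small experimental noise
beta_true = np.array([5.0, 0.4, 0.3, 0.2, 0.1, 0.0, 0.0, -0.5, -0.3, -0.2])
y = features(X) @ beta_true + rng.normal(0, 0.05, 20)

# Fit the quadratic model by ordinary least squares
beta_hat, *_ = np.linalg.lstsq(features(X), y, rcond=None)
resid = y - features(X) @ beta_hat
r2 = 1 - np.sum(resid**2) / np.sum((y - y.mean()) ** 2)
```

The significance of each fitted coefficient would then be assessed with the ANOVA decomposition the abstract describes.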
Depictions and Gaps: Portrayal of U.S. Poverty in Realistic Fiction Children's Picture Books
ERIC Educational Resources Information Center
Kelley, Jane E.; Darragh, Janine J.
2011-01-01
Researchers conducted a critical multicultural analysis of 58 realistic fiction children's picture books that portray people living in poverty and compared these depictions to recent statistics from the United States Census Bureau. The picture books were examined for the following qualities: main character, geographic locale and time era, focal…
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
Richard A. Birdsey; William H. McWilliams
1986-01-01
The Forest Inventory and Analysis unit of the Southern Forest Experiment Station (Forest Survey) conducts periodic inventories, at approximately 10-year intervals, of the forest resources of the Midsouth States (fig. 1). This report contains a summary of forest acreage estimates made between 1950 and 1985. The statistics are based on published forest survey reports and...
Clinical Efficacy of Psychoeducational Interventions with Family Caregivers
ERIC Educational Resources Information Center
Limiñana-Gras, Rosa M.; Colodro-Conde, Lucía; Cuéllar-Flores, Isabel; Sánchez-López, M. Pilar
2016-01-01
The goal of this study is to investigate the efficacy of psychoeducational interventions geared to reducing psychological distress for caregivers in a sample of 90 family caregivers of elderly dependent (78 women and 12 men). We conducted an analysis of the statistical and clinical significance of the changes observed in psychological health…
Return to Our Roots: Raising Radishes to Teach Experimental Design. Methods and Techniques.
ERIC Educational Resources Information Center
Stallings, William M.
1993-01-01
Reviews research in teaching applied statistics. Concludes that students should analyze data from studies they have designed and conducted. Describes an activity in which students study germination and growth of radish seeds. Includes a table providing student instructions for both the experimental procedure and data analysis. (CFR)
Forest statistics for Arkansas' Ozark counties - 1995
James F. Rosson; Jack D. London
1997-01-01
Periodic surveys of forest resources are authorized by the Forest and Rangeland Renewable Resources Research Act of 1978. These surveys are a continuing, nationwide undertaking by the Regional Experiment Stations of the USDA Forest Service. In the Southern United States, these surveys are conducted by the two Forest Inventory and Analysis (FIA) Research Work...
ERIC Educational Resources Information Center
Al-Hattami, Abdulghani Ali Dawod; Al-Ahdal, Arif Ahmed Mohammed Hassa
2015-01-01
Scientific research plays an important role in creating growth and progress in developing countries (Greenstone, 2010). Developed countries have realized that importance and focused on conducting scientific researches to help them make valuable decisions. Many Arab countries, including Saudi Arabia, are trying to encourage faculty members at all…
Forest statistics for Northwest Florida, 1987
Mark J. Brown
1987-01-01
The Forest Inventory and Analysis (Forest Survey) Research Work Unit at the Southeastern Forest Experiment Station recently conducted a review of its data processing procedures. During this process, a computer error was discovered which led to inflated estimates of annual removals, net annual growth, and annual mortality for the 1970-1980 remeasurement period in...
2012-01-01
EMG studies). Data Management and Analysis: Descriptive statistics for subject demographics and nerve conduction study variables were calculated using...
ERIC Educational Resources Information Center
O'Connell, Ann Aileen
The relationships among types of errors observed during probability problem solving were studied. Subjects were 50 graduate students in an introductory probability and statistics course. Errors were classified as text comprehension, conceptual, procedural, and arithmetic. Canonical correlation analysis was conducted on the frequencies of specific…
The Role of Personal Values in Social Entrepreneurship
ERIC Educational Resources Information Center
Akar, Hüseyin; Dogan, Yildiz Burcu
2018-01-01
The purpose of this research is to examine to what extent pre-service teachers' personal values predict their social entrepreneurship characteristics. In this context, statistical analysis was conducted on the data obtained from 393 pre-service teachers studying at the Faculty of Muallim Rifat Education at Kilis 7 Aralik University in 2016-2017…
T.M. Barrett
2004-01-01
During the 1990s, forest inventories for California, Oregon, and Washington were conducted by different agencies using different methods. The Pacific Northwest Research Station Forest Inventory and Analysis program recently integrated these inventories into a single database. This document briefly describes potential statistical methods for estimating population totals...
Forest statistics for Southwest Mississippi counties - 1994
Joanne L. Faulkner; Andrew J. Hartsell; Jack D. London
1995-01-01
Tabulated results were derived from data obtained during a 1994 forest inventory of southwest Mississippi counties (fig. 1). These data are considered preliminary. Field work was conducted from February to August 1994. Core tables 1 through 25 are compatible among Forest Inventory and Analysis (FIA) units in the Eastern United States. Supplemental tables 26 through 44...
Methods of Learning in Statistical Education: A Randomized Trial of Public Health Graduate Students
ERIC Educational Resources Information Center
Enders, Felicity Boyd; Diener-West, Marie
2006-01-01
A randomized trial of 265 consenting students was conducted within an introductory biostatistics course: 69 received eight small group cooperative learning sessions; 97 accessed internet learning sessions; 96 received no intervention. Effect on examination score (95% CI) was assessed by intent-to-treat analysis and by incorporating reported…
A combined epidemiological-exposure panel study was conducted during the summer of 1998 in Baltimore, Maryland. The objectives of the exposure analysis component of the 28-day study were to investigate the statistical relationships between particulate matter (PM) and related co...
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... considerations affecting the design and conduct of repellent studies when human subjects are involved. Any... recommendations for the design and execution of studies to evaluate the performance of pesticide products intended... recommends appropriate study designs and methods for selecting subjects, statistical analysis, and reporting...
A Comparative Study of Student Math Skills: Perceptions, Validation, and Recommendations
ERIC Educational Resources Information Center
Jones, Thomas W.; Price, Barbara A.; Randall, Cindy H.
2011-01-01
A study was conducted at a southern university in sophomore level production classes to assess skills such as the order of arithmetic operations, decimal and percent conversion, solving of algebraic expressions, and evaluation of formulas. The study was replicated using business statistics and quantitative analysis classes at a southeastern…
Forest statistics for Arkansas counties - 1995
Jack D. London
1997-01-01
Periodic surveys of forest resources are authorized by the Forest and Rangeland Renewable Resources Research Act of 1978. These surveys are a continuing, nationwide undertaking by the Regional Experiment Stations of the USDA Forest Service. In the Southern United States, these surveys are conducted by the two Forest Inventory and Analysis (FIA) Research Work...
Forest statistics for Arkansas' Ouachita counties - 1995
James F. Rosson; Jack D. London
1997-01-01
Periodic surveys of forest resources are authorized by the Forest and Rangeland Renewable Resources Research Act of 1978. These surveys are a continuing, nationwide undertaking by the Regional Experiment Stations of the USDA Forest Service. In the Southern United States, these surveys are conducted by the two Forest Inventory and Analysis (FIA) Research Work...
Welding of AM350 and AM355 steel
NASA Technical Reports Server (NTRS)
Davis, R. J.; Wroth, R. S.
1967-01-01
A series of tests was conducted to establish optimum procedures for TIG welding and heat treating of AM350 and AM355 steel sheet in thicknesses ranging from 0.010 inch to 0.125 inch. Statistical analysis of the test data was performed to determine the anticipated minimum strength of the welded joints.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of such studies has increased exponentially, but the results are not always reproducible due to flaws in experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques have become popular tools for combining results across studies to increase statistical power and to resolve discrepancies among genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Although meta-analysis is increasingly used in GAS, the number of published meta-analyses containing errors is also increasing. Several software packages implement meta-analysis, but none is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo guides users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
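The Hardy-Weinberg step of such a workflow is simple enough to sketch. The following Python function is an illustrative sketch only, not MetaGenyo's code; the name and interface are invented here. It computes the one-degree-of-freedom chi-square test for Hardy-Weinberg equilibrium from biallelic genotype counts:

```python
import math

def hardy_weinberg_chi2(n_aa, n_ab, n_bb):
    """Chi-square test for Hardy-Weinberg equilibrium (1 df).
    n_aa, n_ab, n_bb are genotype counts for alleles A and B;
    returns (chi2, p_value)."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_aa, n_ab, n_bb]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # For 1 degree of freedom the chi-square survival function
    # reduces to erfc(sqrt(x / 2)).
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value
```

A large chi-square (small p-value) in the control group is a common signal of genotyping error and is often grounds for excluding a study from the meta-analysis.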
Pantyley, Viktoriya
2017-09-21
The primary goals of the study were a critical analysis of the concepts associated with health from the perspective of sustainable development, and an empirical analysis of health and health-related issues among the rural and urban residents of Eastern Poland in the context of the sustainable development of the region. The study was based on the following research methods: a systemic approach; selection and analysis of the literature and statistical data; development of a special questionnaire concerning socio-economic and health inequalities among the population in the studied area; and field research with an interview questionnaire conducted on randomly selected respondents (N=1,103) in randomly selected areas of the Lubelskie, Podkarpackie, Podlaskie and eastern Mazowieckie Provinces (divided into provincial capital cities, county capital cities, other cities, and rural areas). Statistical analysis of the surveys, using the chi-square test and contingency coefficients, indicated a correlation between the state of health and the following independent variables: age, life quality, social position and financial situation (Pearson's C coefficient over 0.300); a statistically significant yet weak correlation was recorded for gender, household size, place of residence and amount of free time. The analysis revealed a considerable gap between the state of health of the urban and rural populations. In order to eliminate unfavourable differences in the state of health among the residents of Eastern Poland, and to support equitable sustainable development in the urban and rural parts of the examined areas, special preventive programmes aimed at the residents of peripheral, marginalized rural areas should be implemented. These programmes should emphasize preventive measures, early diagnosis of major civilization and social diseases, and better accessibility of medical services for rural residents.
Turgeon, Ricky D; Wilby, Kyle J; Ensom, Mary H H
2015-06-01
We conducted a systematic review with meta-analysis to evaluate the efficacy of antiviral agents on complete recovery of Bell's palsy. We searched CENTRAL, Embase, MEDLINE, International Pharmaceutical Abstracts, and sources of unpublished literature to November 1, 2014. Primary and secondary outcomes were complete and satisfactory recovery, respectively. To evaluate statistical heterogeneity, we performed subgroup analysis of baseline severity of Bell's palsy and between-study sensitivity analyses based on risk of allocation and detection bias. The 10 included randomized controlled trials (2419 patients; 807 with severe Bell's palsy at onset) had variable risk of bias, with 9 trials having a high risk of bias in at least 1 domain. Complete recovery was not statistically significantly greater with antiviral use versus no antiviral use in the random-effects meta-analysis of 6 trials (relative risk, 1.06; 95% confidence interval, 0.97-1.16; I² = 65%). Conversely, random-effects meta-analysis of 9 trials showed a statistically significant difference in satisfactory recovery (relative risk, 1.10; 95% confidence interval, 1.02-1.18; I² = 63%). Response to antiviral agents did not differ visually or statistically between patients with severe symptoms at baseline and those with milder disease (test for interaction, P = .11). Sensitivity analyses did not show a clear effect of bias on outcomes. Antiviral agents are not efficacious in increasing the proportion of patients with Bell's palsy who achieved complete recovery, regardless of baseline symptom severity. Copyright © 2015 Elsevier Inc. All rights reserved.
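The random-effects pooling used in meta-analyses like this one is commonly the DerSimonian-Laird estimator. A minimal sketch in Python follows; it is illustrative only, with invented names, and a real analysis would use a vetted package:

```python
import math

def random_effects_pool(log_rrs, ses):
    """DerSimonian-Laird random-effects pooling of per-study log
    relative risks with standard errors. Returns the pooled RR,
    its 95% CI, and the I^2 heterogeneity statistic (percent)."""
    w = [1 / s ** 2 for s in ses]                          # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    k = len(log_rrs)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return rr, ci, i2
```

With identical studies the between-study variance collapses to zero and the pooled estimate reduces to the fixed-effect answer; heterogeneous studies inflate tau2, widening the confidence interval, which is why the wide intervals and I² values above matter for interpretation.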
The Canadian Precipitation Analysis (CaPA): Evaluation of the statistical interpolation scheme
NASA Astrophysics Data System (ADS)
Evans, Andrea; Rasmussen, Peter; Fortin, Vincent
2013-04-01
CaPA (Canadian Precipitation Analysis) is a data assimilation system which employs statistical interpolation to combine observed precipitation with gridded precipitation fields produced by Environment Canada's Global Environmental Multiscale (GEM) climate model into a final gridded precipitation analysis. Precipitation is important in many fields and applications, including agricultural water management projects, flood control programs, and hydroelectric power generation planning. Precipitation is a key input to hydrological models, and there is a desire to have access to the best available information about precipitation in time and space. The principal goal of CaPA is to produce this type of information. In order to perform the necessary statistical interpolation, CaPA requires the estimation of a semi-variogram. This semi-variogram is used to describe the spatial correlations between precipitation innovations, defined as the observed precipitation amounts minus the GEM forecasted amounts predicted at the observation locations. Currently, CaPA uses a single isotropic variogram across the entire analysis domain. The present project investigates the implications of this choice by first conducting a basic variographic analysis of precipitation innovation data across the Canadian prairies, with specific interest in identifying and quantifying potential anisotropy within the domain. This focus is further expanded by identifying the effect of storm type on the variogram. The ultimate goal of the variographic analysis is to develop improved semi-variograms for CaPA that better capture the spatial complexities of precipitation over the Canadian prairies. CaPA presently applies a Box-Cox data transformation to both the observations and the GEM data, prior to the calculation of the innovations. The data transformation is necessary to satisfy the normal distribution assumption, but introduces a significant bias. 
The second part of the investigation aims at devising a bias correction scheme based on a moving-window averaging technique. For both the variogram and bias correction components of this investigation, a series of trial runs are conducted to evaluate the impact of these changes on the resulting CaPA precipitation analyses.
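The classical (Matheron) estimator underlying such a variographic analysis can be sketched as follows. This is an illustrative Python sketch with invented names, not CaPA code; an anisotropy check would additionally bin pairs by direction before averaging:

```python
import math
from collections import defaultdict

def empirical_semivariogram(points, values, lag_width):
    """Classical (Matheron) semivariogram estimator.
    points: list of (x, y) station coordinates; values: the precipitation
    innovations (observed minus forecast) at those stations.
    Returns a dict mapping lag-bin centers to semivariance estimates."""
    sums, counts = defaultdict(float), defaultdict(int)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            h = math.hypot(dx, dy)             # pair separation distance
            b = int(h // lag_width)            # lag bin index
            sums[b] += 0.5 * (values[i] - values[j]) ** 2
            counts[b] += 1
    return {(b + 0.5) * lag_width: sums[b] / counts[b] for b in sums}
```

A parametric model (spherical, exponential, etc.) is then fitted to these binned estimates to supply the spatial correlation structure the statistical interpolation needs.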
Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter
2017-09-01
To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.
THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.
Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil
2016-10-01
In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
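As a rough sketch of the underlying computation (illustrative Python with invented names, not the authors' code), a GLCM for one pixel offset and the Cluster-Shade statistic derived from it can be written as:

```python
def glcm_cluster_shade(image, dx=1, dy=0, levels=4):
    """Build a Gray Level Co-occurrence Matrix for one pixel offset
    (dx, dy) and return the Cluster-Shade statistic, the second-order
    texture feature highlighted in the study.
    image: 2D list of integer gray levels in range(levels)."""
    rows, cols = len(image), len(image[0])
    glcm = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                glcm[image[r][c]][image[r2][c2]] += 1
                total += 1
    for i in range(levels):
        for j in range(levels):
            glcm[i][j] /= total                # normalize to probabilities
    mu_i = sum(i * glcm[i][j] for i in range(levels) for j in range(levels))
    mu_j = sum(j * glcm[i][j] for i in range(levels) for j in range(levels))
    # Cluster Shade: third moment about the mean of (i + j); its sign
    # captures asymmetry of the co-occurrence distribution.
    return sum((i + j - mu_i - mu_j) ** 3 * glcm[i][j]
               for i in range(levels) for j in range(levels))
```

A perfectly uniform or symmetric texture gives a Cluster-Shade of zero; skewed gray-level co-occurrence produces nonzero values, which is the property the regression against bone mechanics exploits.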
Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom; Elbekai, Reem H
2014-01-01
Conventional 2-year rodent carcinogenicity studies have been performed for at least 3 decades, whereas short-term carcinogenicity studies in transgenic mice, such as Tg.rasH2, have only been performed over the last decade. In the 2-year conventional rodent studies, interlinked problems that complicate the interpretation of findings, such as increasing trends in initial body weights, increased body weight gains, a high incidence of spontaneous tumors, and low survival, are well established. However, these end points have not been evaluated in the short-term carcinogenicity studies involving Tg.rasH2 mice. In this article, we present a retrospective analysis of data obtained from control groups in 26-week carcinogenicity studies conducted in Tg.rasH2 mice since 2004. Our analysis showed statistically significant decreasing trends in the initial body weights of both sexes. Although the terminal body weights did not show any significant trends, there was a statistically significant increasing trend in body weight gains, more so in males than in females, which correlated with increasing trends in food consumption. There were no statistically significant alterations in mortality trends. In addition, the incidence of all common spontaneous tumors remained fairly constant, with no statistically significant differences in trends. © The Author(s) 2014.
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer's disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer's Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer's Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer's disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
Wear behavior of AA 5083/SiC nano-particle metal matrix composite: Statistical analysis
NASA Astrophysics Data System (ADS)
Hussain Idrisi, Amir; Ismail Mourad, Abdel-Hamid; Thekkuden, Dinu Thomas; Christy, John Victor
2018-03-01
This paper reports a statistical analysis of the wear characteristics of AA5083/SiC nanocomposite. Aluminum matrix composites with different wt % (0%, 1% and 2%) of SiC nanoparticles were fabricated using the stir casting route. The developed composites were used to manufacture the spur gears on which the study was conducted. A specially designed test rig was used to test the wear performance of the gears. Wear was investigated under different conditions of applied load (10 N, 20 N, and 30 N) and operation time (30 min, 60 min, 90 min, and 120 min). The analysis was carried out at room temperature at a constant speed of 1450 rpm. The wear parameters were optimized using Taguchi's method; in this statistical approach, an L27 orthogonal array was selected for the analysis of the output. Furthermore, analysis of variance (ANOVA) was used to investigate the influence of applied load, operation time and SiC wt % on wear behaviour. Wear resistance was analyzed by selecting the "smaller is better" characteristic as the objective of the model. From this research, it is observed that operation time and SiC wt % have the most significant effect on wear performance, followed by the applied load.
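Taguchi's "smaller is better" criterion reduces the repeated wear measurements of each L27 treatment row to a signal-to-noise ratio, where a larger S/N means less wear. A minimal sketch (illustrative Python; the function name is invented here):

```python
import math

def sn_smaller_is_better(observations):
    """Taguchi signal-to-noise ratio for a 'smaller is better' response,
    e.g. wear measured across repeated runs of one orthogonal-array row.
    S/N = -10 * log10(mean of squared responses)."""
    mean_sq = sum(y ** 2 for y in observations) / len(observations)
    return -10 * math.log10(mean_sq)
```

For each factor (load, time, SiC wt %), the level with the highest mean S/N across its rows is taken as the optimal setting; ANOVA then apportions how much of the S/N variation each factor explains.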
Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T
2017-01-01
We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects-with confidence limits-allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.
Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs
Irvine, Kathryn M.; Rodhouse, Thomas J.
2014-01-01
As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not yet been resolved. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is likely a complex survey design, i.e., one that results from combining datasets from different sampling designs and is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing environment.
We find that, as written, using lmer or lm for trend detection in a continuous response, and clm and clmm for visually estimated cover classes, with "raw" GRTS design weights specified for the weight argument leads to substantially different results and/or computational instability. However, when only fixed effects are of interest, the survey package (svyglm and svyolr) may be suitable for a model-assisted analysis of trend. We provide possible directions for future research into combined analysis for ordinal and continuous vital sign indicators.
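For a continuous indicator with only fixed effects of interest, the design-weighted slope that a model-assisted analysis estimates can be sketched as ordinary weighted least squares. This is an illustrative Python sketch with invented names, not the R code used in the study, and it omits the design-based variance estimation that svyglm provides:

```python
def weighted_trend(times, ys, weights):
    """Design-weighted least-squares slope of y on time: each point
    contributes in proportion to its survey design weight."""
    sw = sum(weights)
    tbar = sum(w * t for w, t in zip(weights, times)) / sw
    ybar = sum(w * y for w, y in zip(weights, ys)) / sw
    num = sum(w * (t - tbar) * (y - ybar)
              for w, t, y in zip(weights, times, ys))
    den = sum(w * (t - tbar) ** 2 for w, t in zip(weights, times))
    return num / den
```

With equal weights this reduces to the ordinary least-squares slope; unequal GRTS inclusion probabilities change the estimate, which is exactly the sensitivity the simulation study probes.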
Effects of Inaccurate Identification of Interictal Epileptiform Discharges in Concurrent EEG-fMRI
NASA Astrophysics Data System (ADS)
Gkiatis, K.; Bromis, K.; Kakkos, I.; Karanasiou, I. S.; Matsopoulos, G. K.; Garganis, K.
2017-11-01
Concurrent continuous EEG-fMRI is a novel multimodal technique that is finding its way into clinical practice in epilepsy. EEG time series are used to identify the timing of interictal epileptiform discharges (IEDs), which is then included in a GLM analysis of the fMRI data to localize the epileptic onset zone. Nevertheless, there are still some concerns about its reliability concerning BOLD changes correlated with IEDs. Even though IEDs are identified by an experienced neurologist-epileptologist, the reliability and concordance of the mark-ups depend on many factors, including the level of fatigue, the amount of time spent or, in some cases, even the screen used to display the time series. This investigation aims to unravel the effect of misidentification or inaccuracy in the mark-ups of IEDs on the fMRI statistical parametric maps. Concurrent EEG-fMRI was conducted in six subjects with various types of epilepsy. IEDs were identified by an experienced neurologist-epileptologist. Analysis of the EEG was performed with EEGLAB and analysis of the fMRI was conducted in FSL. Preliminary results revealed lower statistical significance for missing events or for IED periods longer than the actual ones, and the introduction of false positives and false negatives in statistical parametric maps when random events were included in the GLM on top of the IEDs. Our results suggest that mark-ups in EEG for simultaneous EEG-fMRI should be made with caution by an experienced and well-rested neurologist, as they affect the fMRI results in various and unpredictable ways.
Low statistical power in biomedical science: a review of three human research domains.
Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R
2017-02-01
Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.
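The power calculations behind such estimates can be sketched with a normal approximation. The following Python function is illustrative (names are my own): it gives the approximate power of a two-sided two-sample test at standardized effect size d with equal group sizes:

```python
import math
from statistics import NormalDist

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a
    standardized effect size d (normal approximation to the t-test)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)        # e.g. about 1.96 for alpha = 0.05
    ncp = d * math.sqrt(n_per_group / 2)     # non-centrality parameter
    return z.cdf(ncp - z_crit) + z.cdf(-ncp - z_crit)
```

For example, a medium effect of d = 0.5 with 64 subjects per group yields power of roughly 0.8, the conventional minimum; the review's finding that about half of studies sit at 20% power or less corresponds to much smaller samples or effects.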
Mechanical properties of silicate glasses exposed to a low-Earth orbit
NASA Technical Reports Server (NTRS)
Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.
1992-01-01
The effects of a 5.8-year exposure to the low-Earth-orbit environment on the mechanical properties of commercial optical fused silica, low-iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM F-394 piston-on-3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. A Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between the control samples and the two exposed sample sets. We thus concluded that the radiation components of the Earth-orbital environment did not degrade the mechanical strength of the samples examined, within the limits of experimental error. The upper bound of strength degradation for meteorite-impacted samples, based upon statistical analysis and observation, was 50 percent.
NASA Astrophysics Data System (ADS)
Whittaker, Kara A.; McShane, Dan
2013-02-01
A large storm event in southwest Washington State triggered over 2500 landslides and provided an opportunity to assess two slope stability screening tools. The statistical analysis conducted demonstrated that both screening tools are effective at predicting where landslides were likely to take place (Whittaker and McShane, 2012). Here we reply to two discussions of this article related to the development of the slope stability screening tools and the accuracy and scale of the spatial data used. Neither of the discussions address our statistical analysis or results. We provide greater detail on our sampling criteria and also elaborate on the policy and management implications of our findings and how they complement those of a separate investigation of landslides resulting from the same storm. The conclusions made in Whittaker and McShane (2012) stand as originally published unless future analysis indicates otherwise.
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons. These can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a spreadsheet for indirect and mixed comparisons that is easy to use for clinical researchers who are interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology to extend the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
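Bucher's adjusted indirect comparison, which such a spreadsheet implements, is arithmetically simple: the A-versus-C effect is estimated as the difference of the A-versus-B and C-versus-B log effects through the common comparator B, with their variances added. A sketch in Python (illustrative only; the spreadsheet's actual formulas are not shown in the abstract):

```python
import math

def bucher_indirect(log_or_ab, se_ab, log_or_cb, se_cb):
    """Bucher adjusted indirect comparison of A vs C through the common
    comparator B, on the log odds-ratio (or log relative-risk) scale.
    Returns the indirect OR with its 95% CI."""
    log_or_ac = log_or_ab - log_or_cb                 # (A vs B) - (C vs B)
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)        # variances add
    ci = (math.exp(log_or_ac - 1.96 * se_ac),
          math.exp(log_or_ac + 1.96 * se_ac))
    return math.exp(log_or_ac), ci
```

Because the variances add, an indirect estimate is always less precise than either direct comparison alone, which is why mixed (direct plus indirect) comparisons are attractive when both are available.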
Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.
Toranjian, Amin; Marofi, Safar
2017-05-01
Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data from runoff events in an urban area during October 2014-May 2015. Sampling was conducted from the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 (LP3) distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
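The Kolmogorov-Smirnov statistic used to rank candidate distributions is the largest vertical gap between the empirical CDF of the data and a fitted CDF. A minimal sketch (illustrative Python; names invented here):

```python
def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the supremum distance
    between the empirical CDF of `sample` and a candidate fitted `cdf`.
    Smaller values indicate a better fit, so candidates can be ranked."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # Compare the fitted CDF against the ECDF just after and just
        # before the step at x.
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d
```

Ranking 45 candidate families then amounts to computing this statistic (and the analogous Anderson-Darling statistic, which weights the tails more heavily) for each fitted CDF and sorting.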
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and the software that may be used are covered, along with special cases of meta-analysis (subgroup analysis, individual patient data, and meta-regression) and a discussion of criticisms.
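The random-effects modeling the guide covers is most commonly the DerSimonian-Laird method: estimate the between-study heterogeneity variance tau² from Cochran's Q, then pool with weights that include it. A minimal numpy sketch (not code from the article):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method.

    effects: per-study effect sizes (e.g. log odds ratios); variances: their
    within-study variances. Returns (pooled effect, its SE, tau^2), where
    tau^2 is the between-study heterogeneity variance.
    """
    effects = np.asarray(effects, float)
    variances = np.asarray(variances, float)
    w = 1.0 / variances                          # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # truncated at zero
    w_star = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```

When tau² is zero the method collapses to the fixed-effect inverse-variance estimate, which is why homogeneous studies give identical pooled results under both models.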
Austin, Peter C
2007-11-01
I conducted a systematic review of the use of propensity score matching in the cardiovascular surgery literature. I examined the adequacy of reporting and whether appropriate statistical methods were used. I examined 60 articles published in the Annals of Thoracic Surgery, European Journal of Cardio-thoracic Surgery, Journal of Cardiovascular Surgery, and the Journal of Thoracic and Cardiovascular Surgery between January 1, 2004, and December 31, 2006. Thirty-one of the 60 studies did not provide adequate information on how the propensity score-matched pairs were formed. Eleven (18%) of the studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. No studies used appropriate methods to compare baseline characteristics between treated and untreated subjects in the propensity score-matched sample. Eight (13%) of the 60 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Two studies used appropriate methods for some outcomes, but not for all outcomes. Thirty-nine (65%) of the studies explicitly used statistical methods that were inappropriate for matched-pairs data when estimating the effect of treatment on outcomes. Eleven studies did not report the statistical tests that were used to assess the statistical significance of the treatment effect. Analysis of propensity score-matched samples tended to be poor in the cardiovascular surgery literature. Most statistical analyses ignored the matched nature of the sample. I provide suggestions for improving the reporting and analysis of studies that use propensity score matching.
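The matched-pairs methods the review calls for are, for example, a paired t-test for a continuous outcome and McNemar's test for a binary outcome, rather than their unpaired counterparts. A sketch with synthetic matched data (the numbers are illustrative, not from the reviewed studies):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_pairs = 100
treated = rng.normal(5.2, 1.0, n_pairs)   # continuous outcome in treated subjects
control = rng.normal(5.0, 1.0, n_pairs)   # outcome in their propensity-matched controls

# A paired t-test respects the matched structure; an unpaired t-test would not.
t_stat, p_cont = stats.ttest_rel(treated, control)

# For a binary outcome, McNemar's test uses only the discordant pairs:
# b = treated event / control no event, c = treated no event / control event.
b, c = 18, 9
chi2 = (abs(b - c) - 1) ** 2 / (b + c)    # continuity-corrected McNemar statistic
p_bin = stats.chi2.sf(chi2, df=1)
```

Because matching induces correlation within pairs, ignoring it (as 65% of the reviewed studies did) typically overstates the standard error of the treatment effect.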
Arslan, Miray; Şar, Sevgi
2017-12-11
Logistics activities play a prominent role in enabling manufacturers, distribution channels, and pharmacies to work in harmony. These activities have become increasingly prominent in the pharmaceutical industry and are seen as a development area for the sector. Additionally, green practices are becoming more attractive, particularly for decreasing costs and improving the image of pharmaceutical companies. The main objective of this study was to model the green logistics (GL) behavior of managers in the pharmaceutical sector within the theory of planned behavior (TPB) framework via structural equation modeling (SEM). A measurement tool was developed according to the TPB. Exploratory factor analysis was conducted to determine the subfactors of GL behavior. In the second step, confirmatory factor analysis (CFA) was conducted to confirm whether there is a relationship between the observed variables and their underlying latent constructs. Finally, a structural equation model was fitted to specify the relationships between latent variables. In the proposed green logistics behavior (GLB) model, the positive effects of environmental attitude towards GL, perceived behavioral control related to GL, and subjective norm about GL on intention towards GL were found statistically significant. Nevertheless, the effect of attitude towards the costs of GL on intention towards GL was not statistically significant. Intention towards GL was found to have a positive, statistically significant effect on GL behavior. Based on the results of this study, it is possible to say that the TPB is an appropriate theory for modeling the green logistics behavior of managers. This model can serve as a guide to companies in the pharmaceutical sector wishing to participate in green logistics. Copyright © 2017 Elsevier Inc. All rights reserved.
Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F
2015-01-01
Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification are substantially more resource-consuming than the field expedition itself. In such systems, an increasingly large sample size will eventually yield diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, we seek (1) to determine the minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based community ecology research, and (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases where resource availability is the limiting factor for conducting a project (e.g., a small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.
Peronard, Jean-Paul
2013-01-01
This article is a comparative analysis between workers in health care with high and low degree of readiness for living technology such as robotics. To explore the differences among workers' readiness, statistical analysis was conducted in a data set obtained from 200 respondents. The results showed important differences between high- and low-readiness types on issues such as staff security, documentation, autonomy, and future challenges.
2011-05-23
determine if the group means are significantly different. Analysis should include comparison of pretest-posttest means within-group and between-groups. A repeated measures analysis should also be conducted on subjects' pretest-posttest means within groups. Information gained from the statistical... A research design embedding a pretest/posttest with a stratified then
Hariri, Azian; Paiman, Nuur Azreen; Leman, Abdul Mutalib; Md Yusof, Mohammad Zainal
2014-08-01
This study aimed to develop an index that can rank welding workplaces in a way that associates well with the possible health risks of welders. The Welding Fumes Health Index (WFHI) was developed based on data from case studies conducted in Plant 1 and Plant 2. Personal sampling of welding fumes to assess the concentration of metal constituents, along with a series of lung function tests, was conducted. Fifteen metal constituents were investigated in each case study. Index values were derived from aggregation analysis of the metal constituent concentrations, while significant lung functions were identified through statistical analysis in each plant. The results showed that none of the metal constituent concentrations exceeded the permissible exposure limit (PEL) in either plant. However, statistical analysis showed significant mean differences in lung function between welders and non-welders. The index was then applied to a third welding plant (Plant 3) for verification purposes. The developed index showed promising ability to rank welding workplaces according to the multiple constituent concentrations of welding fumes, which associated well with the lung function of the investigated welders. Some of the metal constituents may have been below the detection limit, leading to a '0' value of the sub-index; thus the multiplicative form of the aggregation model was not suitable for the analysis. Maximum or minimum operator forms, on the other hand, suffer from compensation issues and were not considered in this study.
Levy, Jonathan I.; Diez, David; Dou, Yiping; Barr, Christopher D.; Dominici, Francesca
2012-01-01
Health risk assessments of particulate matter less than 2.5 μm in diameter (PM2.5) often assume that all constituents of PM2.5 are equally toxic. While investigators in previous epidemiologic studies have evaluated health risks from various PM2.5 constituents, few have conducted the analyses needed to directly inform risk assessments. In this study, the authors performed a literature review and conducted a multisite time-series analysis of hospital admissions and exposure to PM2.5 constituents (elemental carbon, organic carbon matter, sulfate, and nitrate) in a population of 12 million US Medicare enrollees for the period 2000–2008. The literature review illustrated a general lack of multiconstituent models or insight about probabilities of differential impacts per unit of concentration change. Consistent with previous results, the multisite time-series analysis found statistically significant associations between short-term changes in elemental carbon and cardiovascular hospital admissions. Posterior probabilities from multiconstituent models provided evidence that some individual constituents were more toxic than others, and posterior parameter estimates coupled with correlations among these estimates provided necessary information for risk assessment. Ratios of constituent toxicities, commonly used in risk assessment to describe differential toxicity, were extremely uncertain for all comparisons. These analyses emphasize the subtlety of the statistical techniques and epidemiologic studies necessary to inform risk assessments of particle constituents. PMID:22510275
Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin
We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and the porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of a LiNi0.5Mn0.3Co0.2O2 cathode, we employ instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data, provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feeds the parametric input for mechanics models.
Structural texture similarity metrics for image analysis and retrieval.
Zujovic, Jana; Pappas, Thrasyvoulos N; Neuhoff, David L
2013-07-01
We develop new metrics for texture similarity that account for human visual perception and the stochastic nature of textures. The metrics rely entirely on local image statistics and allow substantial point-by-point deviations between textures that, according to human judgment, are essentially identical. The proposed metrics extend the ideas of structural similarity and are guided by research in texture analysis-synthesis. They are implemented using a steerable filter decomposition and incorporate a concise set of subband statistics, computed globally or in sliding windows. We conduct systematic tests to investigate metric performance in the context of "known-item search," the retrieval of textures that are "identical" to the query texture. This eliminates the need for cumbersome subjective tests, thus enabling comparisons with human performance on a large database. Our experimental results indicate that the proposed metrics outperform the peak signal-to-noise ratio (PSNR), the structural similarity metric (SSIM) and its variations, as well as state-of-the-art texture classification metrics, using standard statistical measures.
Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries
Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...
2016-03-09
We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and the porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of a LiNi0.5Mn0.3Co0.2O2 cathode, we employ instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data, provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feeds the parametric input for mechanics models.
NASA Astrophysics Data System (ADS)
Kimball, Jorja; Cole, Bryan; Hobson, Margaret; Watson, Karan; Stanley, Christine
This paper reports findings on gender that were part of a larger study reviewing time to completion of course work that includes the first two semesters of calculus, chemistry, and physics, which are often considered the stumbling points or "barrier courses" to an engineering baccalaureate degree. Texas A&M University terms these courses core body of knowledge (CBK), and statistical analysis was conducted on two cohorts of first-year enrolling engineering students at the institution. Findings indicate that gender is statistically significantly related to completion of CBK with female engineering students completing required courses faster than males at the .01 level (p = 0.008). Statistical significance for gender and ethnicity was found between white male and white female students at the .01 level (p = 0.008). Descriptive analysis indicated that of the five majors studied (chemical, civil, computer, electrical, and mechanical engineering), women completed CBK faster than men, and African American and Hispanic women completed CBK faster than males of the same ethnicity.
NASA Astrophysics Data System (ADS)
Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.
2017-12-01
Physics-based modeling of karst systems remains almost impossible without sufficiently accurate information about their inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works in the past decades have used and proven the usefulness of time-series analysis and spectral techniques applied to spring flow, precipitation, or even physico-chemical parameters for interpreting karst hydrological functioning. However, identifying or interpreting the physical features of karst systems that control the statistical or spectral characteristics of spring flow variations is still challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs can be related to inner physical and hydraulic properties. To address this issue, we undertake an empirical approach based on the use of both distributed and physics-based models and on the responses of synthetic systems. The first step of the research is to conduct a sensitivity analysis of time-series/spectral methods to karst hydraulic and physical properties. For this purpose, forward modeling of flow through several simple, constrained, synthetic cases in response to precipitation is undertaken. It allows us to quantify how the statistical and spectral characteristics of flow at the outlet are sensitive to changes (i) in conduit geometries and (ii) in hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity and storativity). The flow differential equations are resolved by MARTHE, a computer code developed by the BRGM that allows modeling of karst conduits. From signal processing of the simulated spring responses, we hope to determine whether specific frequencies are consistently modified, using Fourier series and multi-resolution analysis.
We also hope to quantify which parameters are the most influential using auto-correlation analysis: first results seem to show greater variations due to conduit conductivity than due to the matrix/conduit exchange rate. Future steps will use another computer code, based on a double-continuum approach and allowing turbulent conduit flow, to model a natural system.
Howard, Richard; Finn, Peter; Jose, Paul; Gallagher, Jennifer
2011-12-16
This study tested the hypothesis that adolescent-onset alcohol abuse (AOAA) would both mediate and moderate the effect of childhood conduct disorder on antisocial behaviour in late adolescence and early adulthood. A sample comprising 504 young men and women strategically recruited from the community were grouped using the criteria of the Diagnostic and Statistical Manual (DSM-IV, American Psychiatric Association. (1994). Diagnostic and statistical manual of mental disorders (4th ed.). Washington, DC: APA), as follows: neither childhood conduct disorder (CCD) nor alcohol abuse/dependence; CCD but no alcohol abuse or dependence; alcohol abuse/dependence but no CCD; both CCD and alcohol abuse/dependence. The outcome measure was the sum of positive responses to 55 interview items capturing a variety of antisocial behaviours engaged in since age 15. Severity of lifetime alcohol-related and CCD problems served as predictor variables in regression analysis. Antisocial behaviour problems were greatest in individuals with a history of co-occurring conduct disorder (CD) and alcohol abuse/dependence. While CCD was strongly predictive of adult antisocial behaviour, this effect was both mediated and moderated (exacerbated) by AOAA.
Howard, Richard; Finn, Peter; Jose, Paul; Gallagher, Jennifer
2012-01-01
This study tested the hypothesis that adolescent-onset alcohol abuse (AOAA) would both mediate and moderate the effect of childhood conduct disorder on antisocial behaviour in late adolescence and early adulthood. A sample comprising 504 young men and women strategically recruited from the community were grouped using the criteria of the Diagnostic and Statistical Manual (DSM-IV, American Psychiatric Association. (1994). Diagnostic and statistical manual of mental disorders (4th ed.). Washington, DC: APA), as follows: neither childhood conduct disorder (CCD) nor alcohol abuse/dependence; CCD but no alcohol abuse or dependence; alcohol abuse/dependence but no CCD; both CCD and alcohol abuse/dependence. The outcome measure was the sum of positive responses to 55 interview items capturing a variety of antisocial behaviours engaged in since age 15. Severity of lifetime alcohol-related and CCD problems served as predictor variables in regression analysis. Antisocial behaviour problems were greatest in individuals with a history of co-occurring conduct disorder (CD) and alcohol abuse/dependence. While CCD was strongly predictive of adult antisocial behaviour, this effect was both mediated and moderated (exacerbated) by AOAA. PMID:23459369
NASA Technical Reports Server (NTRS)
Ellis, David L.
2007-01-01
Room temperature tensile testing of Chemically Pure (CP) Titanium Grade 2 was conducted for as-received commercially produced sheet and following thermal exposure at 550 and 650 K for times up to 5,000 h. No significant changes in microstructure or failure mechanism were observed. A statistical analysis of the data was performed. Small statistical differences were found, but all properties were well above minimum values for CP Ti Grade 2 as defined by ASTM standards and likely would fall within normal variation of the material.
Search for correlation between geomagnetic disturbances and mortality
NASA Technical Reports Server (NTRS)
Lipa, B. J.; Sturrock, P. A.; Rogot, F.
1976-01-01
A search is conducted for a possible correlation between solar activity and myocardial infarction and stroke in the United States. A statistical analysis is performed using data on geomagnetic activity and the daily U.S. mortality due to coronary heart disease and stroke for the years 1962 through 1966. None of the results are found to yield any evidence of a correlation. It is concluded that correlations claimed by Soviet workers between geomagnetic activity and the incidence of various human diseases are probably not statistically significant or probably are not due to a causal relation between geomagnetic activity and disease.
Correlation and simple linear regression.
Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G
2003-06-01
In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
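The concepts the tutorial reviews map directly onto three scipy calls: Pearson's r for linear association, Spearman's rho for monotonic (rank) association, and simple linear regression of an outcome on a predictor. The data below are made up for illustration, not the article's CT-guided intervention data set:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])                     # predictor
y = 2.0 * x + 1.0 + np.array([0.1, -0.2, 0.05, 0.0, -0.1, 0.15])  # noisy outcome

pearson_r, _ = stats.pearsonr(x, y)      # strength of the linear relationship
spearman_rho, _ = stats.spearmanr(x, y)  # strength of the monotonic relationship

# Simple linear regression of the outcome y on the predictor x.
res = stats.linregress(x, y)
# res.slope and res.intercept recover the underlying line (about 2 and 1 here).
```

For strictly monotonic data Spearman's rho is exactly 1 even when the relationship is nonlinear, which is the distinction between the two coefficients that the tutorial emphasizes.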
Bernardo, Thaís Honório Lins; Sales Santos Veríssimo, Regina Célia; Alvino, Valter; Silva Araujo, Maria Gabriella; Evangelista Pires dos Santos, Raíssa Fernanda; Maurício Viana, Max Denisson; de Assis Bastos, Maria Lysete; Alexandre-Moreira, Magna Suzana; de Araújo-Júnior, João Xavier
2015-01-01
Introduction. Surgical site infection remains a challenge for hospital infection control, especially as it relates to skin antisepsis at the surgical site. Objective. To analyze the in vivo antimicrobial activity of an antiseptic made from ethanol crude extracts of P. granatum and E. uniflora against Gram-positive and Gram-negative bacteria. Methods. Agar drilling and minimum inhibitory concentration tests were conducted for the in vitro evaluation. In the in vivo bioassay, Wistar rats and Staphylococcus aureus (ATCC 25923) and Staphylococcus epidermidis (ATCC 14990) were used. Statistical analysis was performed through analysis of variance and the Scott-Knott cluster test at the 5% probability and significance level. Results. In vitro, the ethanolic extracts of Punica granatum and Eugenia uniflora and their combination showed the best antimicrobial potential against S. epidermidis and S. aureus. In the in vivo bioassay against S. epidermidis, there was no statistically significant difference between the tested product and the standards used after five minutes of applying the product. Conclusion. The results indicate that the resulting product is an alternative antiseptic against S. epidermidis compared to chlorhexidine gluconate. It is suggested that further research be conducted with different concentrations of the test product, evaluating its effectiveness and operational costs. PMID:26146655
Objective forensic analysis of striated, quasi-striated and impressed toolmarks
NASA Astrophysics Data System (ADS)
Spotts, Ryan E.
Following the 1993 Daubert v. Merrell Dow Pharmaceuticals, Inc. court case and continuing to the 2010 National Academy of Sciences report, comparative forensic toolmark examination has received many challenges to its admissibility in court cases and its scientific foundations. Many of these challenges deal with the subjective nature of determining whether toolmarks are identifiable. This questioning of current identification methods has created a demand for objective methods of identification - "objective" implying known error rates and statistical reliability. The demand for objective methods has resulted in research that created a statistical algorithm capable of comparing toolmarks to determine their statistical similarity, and thus the ability to separate matching and nonmatching toolmarks. This was expanded to the creation of virtual toolmarking (characterization of a tool to predict the toolmark it will create). The statistical algorithm, originally designed for two-dimensional striated toolmarks, had been successfully applied to striated screwdriver and quasi-striated plier toolmarks. Following this success, a blind study was conducted to validate the virtual toolmarking capability using striated screwdriver marks created at various angles of incidence. Work was also performed to optimize the statistical algorithm by implementing means to ensure the algorithm's operations were constrained to logical comparison regions (e.g., the opposite ends of two toolmarks do not need to be compared because they do not coincide with each other). This work was performed on quasi-striated shear-cut marks made with pliers - a previously tested, more difficult application of the statistical algorithm that could demonstrate the difference in results due to optimization. The final research conducted was performed with pseudostriated impression toolmarks made with chisels.
Impression marks, which are more complex than striated marks, were analyzed using the algorithm to separate matching and nonmatching toolmarks. Results of the conducted research are presented as well as evidence of the primary assumption of forensic toolmark examination; all tools can create identifiably unique toolmarks.
Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.
Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei
2016-02-01
Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and the sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.
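The LRT machinery underlying these gene-based tests is generic for nested models: twice the gain in maximized log-likelihood from the null model (covariates only) to the full model (covariates plus genetic effects), referred to a chi-square distribution with degrees of freedom equal to the number of added parameters. A sketch with hypothetical fitted log-likelihoods (not values from the paper):

```python
from scipy import stats

def lrt_pvalue(loglik_null, loglik_full, extra_params):
    """Likelihood ratio test for nested models.

    loglik_null / loglik_full: maximized log-likelihoods of the null and full
    models; extra_params: number of additional parameters in the full model.
    Returns the LR statistic and its chi-square p-value.
    """
    lr_stat = 2.0 * (loglik_full - loglik_null)
    return lr_stat, stats.chi2.sf(lr_stat, df=extra_params)

# Hypothetical partial log-likelihoods from a null Cox model and a full Cox
# model with 3 additional genetic-effect parameters.
stat, p = lrt_pvalue(-520.4, -512.1, extra_params=3)
```

In the functional-regression setting, the "extra parameters" are the basis-expansion coefficients of the genetic effect function, which keeps the degrees of freedom small even when a region contains many variants.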
Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions
Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei
2015-01-01
Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and the sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979
Public Concepts of the Values and Costs of Higher Education, 1963-1974. A Preliminary Analysis.
ERIC Educational Resources Information Center
Minor, Michael J.; Murray, James R.
Statistical data are presented on interviews conducted through the Continuous National Survey (CNS) at the National Opinion Research Center in Chicago and based on results reprinted from "Public Concepts of the Values and Costs of Higher Education," by Angus Campbell and William C. Eckerman. The CNS results presented in this report are…
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Div. of Human Resources.
In response to Congressional requests, this report describes the extent of Hispanic American health and nutrition data available from federal sources. Oversampling of a minority group by a national survey is necessary for valid analysis of group characteristics. Among the four national health and nutrition surveys conducted by the Department of…
Faculty Salary Equity: Issues in Regression Model Selection. AIR 1992 Annual Forum Paper.
ERIC Educational Resources Information Center
Moore, Nelle
This paper discusses the determination of college faculty salary inequity and identifies the areas in which human judgment must be used in order to conduct a statistical analysis of salary equity. In addition, it provides some informed guidelines for making those judgments. The paper provides a framework for selecting salary equity models, based…
An Analysis of Questionnaire Survey on Online Evaluation of Teaching by University Undergraduates
ERIC Educational Resources Information Center
Sun, Dongyun
2013-01-01
This paper considers the problems discovered over the years in the teaching evaluation data at Changchun University of Science and Technology and, in cooperation with related departments, conducts a questionnaire survey on online evaluation of teaching, with the purpose of detecting students' perceptions of evaluation…
ERIC Educational Resources Information Center
Koutrouba, Konstantina; Karageorgou, Elissavet
2013-01-01
The present questionnaire-based study was conducted in 2010 in order to examine 677 Greek Second Chance School (SCS) students' perceptions about the cognitive and socio-affective outcomes of project-based learning. Data elaboration, statistical and factor analysis showed that the participants found that project-based learning offered a second…
ERIC Educational Resources Information Center
Roberge, Pasquale; Marchand, Andre; Reinharz, Daniel; Savard, Pierre
2008-01-01
A randomized, controlled trial was conducted to examine the cost-effectiveness of cognitive-behavioral treatment (CBT) for panic disorder with agoraphobia. A total of 100 participants were randomly assigned to standard (n = 33), group (n = 35), and brief (n = 32) treatment conditions. Results show significant clinical and statistical improvement…
A spatial database of wildfires in the United States, 1992-2011
K. C. Short
2014-01-01
The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record keeping exists. To conduct even the...
A spatial database of wildfires in the United States, 1992-2011 [Discussions]
K. C. Short
2013-01-01
The statistical analysis of wildfire activity is a critical component of national wildfire planning, operations, and research in the United States (US). However, there are multiple federal, state, and local entities with wildfire protection and reporting responsibilities in the US, and no single, unified system of wildfire record-keeping exists. To conduct even the...
Color Charts, Esthetics, and Subjective Randomness
ERIC Educational Resources Information Center
Sanderson, Yasmine B.
2012-01-01
Color charts, or grids of evenly spaced multicolored dots or squares, appear in the work of modern artists and designers. Often the artist/designer distributes the many colors in a way that could be described as "random," that is, without an obvious pattern. We conduct a statistical analysis of 125 "random-looking" art and design color charts and…
2015-09-30
oil spills (unpublished data, Kellar). The second will be to conduct a more fine-scale analysis of the areas examined during this study. For this...REFERENCES Carlin BP, Chib S (1995) Bayesian model choice via Markov chain Monte Carlo methods. Journal of the Royal Statistical Society
ERIC Educational Resources Information Center
Siphai, Sunan
2015-01-01
The objective of this study is to investigate the influences of moral, emotional and adversity quotient on the good citizenship of Rajabhat University students in the Northeastern Region of Thailand. The samples included 1,087 undergraduate students from 8 different Rajabhat universities. Data analysis was conducted using descriptive statistics and…
School-to-Work Transition and After: Do Inequalities between the Sexes Defy Diplomas?
ERIC Educational Resources Information Center
Couppie, Thomas; Epiphane, Dominique; Fournier, Christine
1997-01-01
Sex-related differences between the employment opportunities available in France to males and females with comparable levels of education were examined through an analysis of data from two types of sources: statistics derived from quantitative surveys conducted on broad samples of graduates 2-4 years after the end of their training and in-depth…
NASA Astrophysics Data System (ADS)
Park, K. W.; Dasika, V. D.; Nair, H. P.; Crook, A. M.; Bank, S. R.; Yu, E. T.
2012-06-01
We have used conductive atomic force microscopy to investigate the influence of growth temperature on local current flow in GaAs pn junctions with embedded ErAs nanoparticles grown by molecular beam epitaxy. Three sets of samples, one with 1 ML ErAs deposited at different growth temperatures and two grown at 530 °C and 575 °C with varying ErAs depositions, were characterized. Statistical analysis of local current images suggests that the structures grown at 575 °C have about 3 times thicker ErAs nanoparticles than structures grown at 530 °C, resulting in degradation of conductivity due to reduced ErAs coverage. These findings explain previous studies of macroscopic tunnel junctions.
Miyagi, Atsushi
2017-09-01
Detailed exploration of sensory perception as well as preference across gender and age for a certain food is very useful for developing a vendible food commodity related to physiological and psychological motivation for food preference. Sensory tests including color, sweetness, bitterness, fried peanut aroma, textural preference and overall liking of deep-fried peanuts with varying frying time (2, 4, 6, 9, 12 and 15 min) at 150 °C were carried out using 417 healthy Japanese consumers. To determine the influence of gender and age on sensory evaluation, systematic statistical analysis including one-way analysis of variance, polynomial regression analysis and multiple regression analysis was conducted using the collected data. The results indicated that females were more sensitive to bitterness than males. This may affect sensory preference; female subjects favored peanuts prepared with a shorter frying time more than male subjects did. With advancing age, textural preference played a more important role in overall preference. Older subjects liked deeper-fried peanuts, which are more brittle, more than younger subjects did. In the present study, systematic statistical analysis based on collected sensory evaluation data using deep-fried peanuts was conducted and the tendency of sensory perception and preference across gender and age was clarified. These results may be useful for engineering optimal strategies to target specific segments to gain greater acceptance in the market. © 2017 Society of Chemical Industry.
Notes for Brazil sampling frame evaluation trip
NASA Technical Reports Server (NTRS)
Horvath, R. (Principal Investigator); Hicks, D. R. (Compiler)
1981-01-01
Field notes describing a trip conducted in Brazil are presented. This trip was conducted to evaluate a sample frame developed using LANDSAT full-frame images by the USDA Economics and Statistics Service, with the eventual goal of cropland production estimation with LANDSAT by the Foreign Commodity Production Forecasting Project of the AgRISTARS program. Six areas were analyzed on the basis of land use, crop land in corn and soybean, field size and soil type. The analysis indicated generally successful use of LANDSAT images for purposes of remote large-area land use stratification.
Thermal conductance of and heat generation in tire-pavement interface and effect on aircraft braking
NASA Technical Reports Server (NTRS)
Miller, C. D.
1976-01-01
A finite-difference analysis was performed on temperature records obtained from a free rolling automotive tire and from pavement surface. A high thermal contact conductance between tire and asphalt was found on a statistical basis. Average slip due to squirming between tire and asphalt was about 1.5 mm. Consequent friction heat was estimated as 64 percent of total power absorbed by bias-ply, belted tire. Extrapolation of results to aircraft tire indicates potential braking improvement by even moderate increase of heat absorbing capacity of runway surface.
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters. This problem is solved in this paper. The time series of the F2-layer critical frequency, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At levels of sufficiently small probability, the distributions show arbitrarily large deviations from the model of the normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of some deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of the coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. The conducted analysis makes it possible to conclude that a model based on a Poisson random process is applicable for the statistical description of the variations {δfoF2} and for probabilistic estimates of their range during heliogeophysical disturbances.
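The goodness-of-fit step described above, comparing empirical distributions against a model distribution via the Kolmogorov criterion, can be sketched with scipy. A Student-t sample stands in for the disturbance data and a fitted Gaussian for the reference model; both are assumptions for illustration, and estimating the parameters from the same sample makes the KS p-value only approximate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# stand-in for the {delta foF2} samples: heavy-tailed disturbance data
sample = rng.standard_t(df=3, size=500)

# fit a Gaussian reference model and run a Kolmogorov-Smirnov comparison;
# heavy tails push the empirical CDF away from the fitted normal
mu, sigma = sample.mean(), sample.std(ddof=1)
ks = stats.kstest(sample, "norm", args=(mu, sigma))
print(f"D={ks.statistic:.3f}  p={ks.pvalue:.3f}")

# positive excess kurtosis signals the departure from the Gaussian law
print(f"excess kurtosis={stats.kurtosis(sample):.2f}")
```

In the paper's setting the fitted normal would be replaced by the Poisson-based model density, but the comparison of a posteriori and theoretical distributions has the same shape.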
Nemr, Kátia; Amar, Ali; Abrahão, Marcio; Leite, Grazielle Capatto de Almeida; Köhle, Juliana; Santos, Alexandra de O; Correa, Luiz Artur Costa
2005-01-01
As a result of technology evolution and development, methods of voice evaluation have changed both in medical and in speech and language pathology practice. To relate the results of perceptual evaluation, acoustic analysis and medical evaluation in the diagnosis of vocal and/or laryngeal affections in a population with vocal complaints. Clinical prospective study. 29 people who attended a vocal health protection campaign were evaluated. They were submitted to perceptual evaluation (AFPA), acoustic analysis (AA), indirect laryngoscopy (LI) and telelaryngoscopy (TL). Correlations between medical and speech language pathology evaluation methods were established, verifying possible statistical significance with the application of Fisher's exact test. There were statistically significant results in the correlations between AFPA and LI, AFPA and TL, and LI and TL. This research study, conducted in a vocal health protection campaign, presented correlations between speech language pathology evaluation and perceptual evaluation and clinical evaluation, as well as between vocal affection and/or laryngeal medical exams.
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
Alves, Darlan Daniel; Riegel, Roberta Plangg; de Quevedo, Daniela Müller; Osório, Daniela Montanari Migliavacca; da Costa, Gustavo Marques; do Nascimento, Carlos Augusto; Telöken, Franko
2018-06-08
Assessment of surface water quality is an issue of great current importance, especially in polluted rivers which provide water for treatment and distribution as drinking water, as is the case of the Sinos River, southern Brazil. Multivariate statistical techniques allow a better understanding of the seasonal variations in water quality, as well as the source identification and source apportionment of water pollution. In this study, the multivariate statistical techniques of cluster analysis (CA), principal component analysis (PCA), and positive matrix factorization (PMF) were used, along with the Kruskal-Wallis test and Spearman's correlation analysis, in order to interpret a water quality data set resulting from a monitoring program conducted over a period of almost two years (May 2013 to April 2015). The water samples were collected from the raw water inlet of the municipal water treatment plant (WTP) operated by the Water and Sewage Services of Novo Hamburgo (COMUSA). CA allowed the data to be grouped into three periods (autumn and summer (AUT-SUM); winter (WIN); spring (SPR)). Through the PCA, it was possible to identify that the most important parameters contributing to water quality variations are total coliforms (TCOLI) in SUM-AUT; water level (WL), water temperature (WT), and electrical conductivity (EC) in WIN; and color (COLOR) and turbidity (TURB) in SPR. PMF was applied to the complete data set and enabled the source apportionment of water pollution through three factors, which are related to anthropogenic sources, such as the discharge of domestic sewage (mostly represented by Escherichia coli (ECOLI)), industrial wastewaters, and agriculture runoff.
The results provided by this study demonstrate the contribution provided by the use of integrated statistical techniques in the interpretation and understanding of large data sets of water quality, showing also that this approach can be used as an efficient methodology to optimize indicators for water quality assessment.
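The PCA step used above can be sketched with a standardize-then-SVD computation. The parameter names are taken from the abstract, but the monitoring matrix below is simulated, so the loadings are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
# hypothetical monitoring matrix: rows = sampling dates, columns = parameters
params = ["TURB", "COLOR", "EC", "WT", "TCOLI", "ECOLI"]
X = rng.normal(size=(100, len(params)))
X[:, 1] += 0.8 * X[:, 0]   # make COLOR co-vary with TURB, as in the spring cluster

# PCA: standardize each column, then take the SVD of the standardized matrix
Z = (X - X.mean(0)) / X.std(0, ddof=1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)      # fraction of variance per component
loadings = Vt                        # rows are PCs, columns follow `params`

print("variance explained by PC1:", round(explained[0], 3))
print("PC1 loadings:", dict(zip(params, np.round(loadings[0], 2))))
```

In a study like this one, parameters with large loadings on a leading component are the ones flagged as driving seasonal water-quality variation.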
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
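The workflow of checking normality and homoscedasticity before choosing between ANOVA and a non-parametric alternative can be sketched with scipy. The vibration readings below are simulated under an assumed group structure; the setups and thresholds are illustrative, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# hypothetical overall-vibration readings under three process setups
groups = [rng.normal(loc=m, scale=1.0, size=30) for m in (5.0, 5.2, 6.1)]

# assumption checks: normality (Shapiro-Wilk) and equal variances (Bartlett)
normal_ok = all(stats.shapiro(g).pvalue > 0.05 for g in groups)
equal_var_ok = stats.bartlett(*groups).pvalue > 0.05

if normal_ok and equal_var_ok:
    stat, p = stats.f_oneway(*groups)    # parametric one-way ANOVA
else:
    stat, p = stats.kruskal(*groups)     # non-parametric Kruskal-Wallis fallback
print(f"normal={normal_ok} equal_var={equal_var_ok} stat={stat:.2f} p={p:.4f}")
```

This mirrors the paper's decision rule: ANOVA only for data sets meeting the normality and homoscedasticity criteria, with rank-based tests otherwise.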
Herle, Pradyumna; Shukla, Lipi; Morrison, Wayne A; Shayan, Ramin
2015-03-01
There is a general consensus among reconstructive surgeons that preoperative radiotherapy is associated with a higher risk of flap failure and complications in head and neck surgery. Opinion is also divided regarding the effects of radiation dose on free flap outcomes and the timing of preoperative radiation to minimize adverse outcomes. Our meta-analysis attempts to address these issues. A systematic review of the literature was conducted in accordance with the PRISMA protocol. Data were combined using the STATA 12 and Open Meta-Analyst software programmes. Twenty-four studies were included, comparing 2842 flaps performed in irradiated fields and 3491 flaps performed in non-irradiated fields. Meta-analysis yielded statistically significant risk ratios for flap failure (RR 1.48, P = 0.004), complications (RR 1.84, P < 0.001), reoperation (RR 2.06, P < 0.001) and fistula (RR 2.05, P < 0.001). Mean radiation dose demonstrated a trend towards increased risk of flap failure, but this was not statistically significant. On subgroup analysis, flaps with >60 Gy radiation had a non-statistically significant higher risk of flap failure (RR 1.61, P = 0.145). Preoperative radiation is associated with a statistically significant increased risk of flap complications, failure and fistula. Preoperative radiation doses in excess of 60 Gy represent a potential risk factor for increased flap loss and should be avoided where possible. © 2014 Royal Australasian College of Surgeons.
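The inverse-variance pooling of risk ratios that underlies a meta-analysis of this kind can be sketched as follows. The per-study counts are invented for illustration and are not the data of this review; a fixed-effect model is shown for brevity, where the review software would also offer random-effects pooling.

```python
import numpy as np
from scipy.stats import norm

# hypothetical per-study counts:
# (failures_irradiated, n_irradiated, failures_control, n_control)
studies = [(12, 150, 6, 160), (9, 90, 7, 140), (20, 300, 11, 320)]

log_rr, var = [], []
for a, n1, c, n2 in studies:
    log_rr.append(np.log((a / n1) / (c / n2)))
    var.append(1/a - 1/n1 + 1/c - 1/n2)   # variance of the log risk ratio
log_rr, var = np.array(log_rr), np.array(var)

w = 1.0 / var                              # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
z = pooled / se
p = 2 * norm.sf(abs(z))
print(f"pooled RR={np.exp(pooled):.2f}  95% CI=({np.exp(pooled-1.96*se):.2f}, "
      f"{np.exp(pooled+1.96*se):.2f})  p={p:.3f}")
```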
Study of subgrid-scale velocity models for reacting and nonreacting flows
NASA Astrophysics Data System (ADS)
Langella, I.; Doan, N. A. K.; Swaminathan, N.; Pope, S. B.
2018-05-01
A study is conducted to identify advantages and limitations of existing large-eddy simulation (LES) closures for the subgrid-scale (SGS) kinetic energy using a database of direct numerical simulations (DNS). The analysis is conducted for both reacting and nonreacting flows, different turbulence conditions, and various filter sizes. A model, based on dissipation and diffusion of momentum (LD-D model), is proposed in this paper based on the observed behavior of four existing models. Our model shows the best overall agreements with DNS statistics. Two main investigations are conducted for both reacting and nonreacting flows: (i) an investigation on the robustness of the model constants, showing that commonly used constants lead to a severe underestimation of the SGS kinetic energy and enlightening their dependence on Reynolds number and filter size; and (ii) an investigation on the statistical behavior of the SGS closures, which suggests that the dissipation of momentum is the key parameter to be considered in such closures and that dilatation effect is important and must be captured correctly in reacting flows. Additional properties of SGS kinetic energy modeling are identified and discussed.
Impact of work environment and work-related stress on turnover intention in physical therapists.
Lee, Byoung-Kwon; Seo, Dong-Kwon; Lee, Jang-Tae; Lee, A-Ram; Jeon, Ha-Neul; Han, Dong-Uk
2016-08-01
[Purpose] This study was conducted to provide basic data for solutions to reduce the turnover rate of physical therapists. It should help create efficient personnel and organization management by exploring the impact of the work environment and work-related stress on turnover intention and analyzing the correlation between them. [Subjects and Methods] A survey was conducted with 236 physical therapists working at medical institutions in the Daejeon and Chungcheong areas. For the analysis on the collected data, correlational and linear regression analyses were conducted using the SPSS 18.0 program and Cronbach's alpha coefficient. [Results] The results showed a statistically significant positive correlation between turnover intention and work-related stress but a statistically significant negative correlation respectively between turnover intention and work environment. Work-related stress (β=0.415) had a significant positive impact on turnover intention and work environment (β=-0.387) had a significant negative impact on turnover intention. [Conclusion] To increase satisfaction level with the profession as well as the workplace for physical therapists, improvement of the work environment was the most necessary primary improvement.
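The correlational and linear regression analyses described above can be sketched with scipy. The scores below are simulated under assumed effect directions matching the reported signs; only the sample size is taken from the abstract, so the numerical outputs are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 236  # matches the abstract's sample size; the data themselves are simulated
stress = rng.normal(3.0, 0.6, n)        # hypothetical work-related stress scores
environment = rng.normal(3.2, 0.5, n)   # hypothetical work-environment scores
turnover = 0.4 * stress - 0.35 * environment + rng.normal(0, 0.5, n)

# Pearson correlations of each predictor with turnover intention
r_stress, p_stress = stats.pearsonr(stress, turnover)
r_env, p_env = stats.pearsonr(environment, turnover)

# simple linear regression of turnover intention on stress
slope, intercept, r, p, se = stats.linregress(stress, turnover)
print(f"r(stress)={r_stress:.2f} (p={p_stress:.3g}), r(env)={r_env:.2f} (p={p_env:.3g})")
print(f"turnover ~ {intercept:.2f} + {slope:.2f} * stress")
```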
Analysis of recent climatic changes in the Arabian Peninsula region
NASA Astrophysics Data System (ADS)
Nasrallah, H. A.; Balling, R. C.
1996-12-01
Interest in the potential climatic consequences of the continued buildup of anthropo-generated greenhouse gases has led many scientists to conduct extensive climate change studies at the global, hemispheric, and regional scales. In this investigation, analyses are conducted on long-term historical climate records from the Arabian Peninsula region. Over the last 100 years, temperatures in the region increased linearly by 0.63 °C. However, virtually all of this warming occurred from 1911 to 1935, and over the most recent 50 years, the Arabian Peninsula region has cooled slightly. In addition, the satellite-based measurements of lower-tropospheric temperatures for the region do not show any statistically significant warming over the period 1979-1991. While many other areas of the world are showing a decrease in the diurnal temperature range, the Arabian Peninsula region reveals no evidence of a long-term change in this parameter. Precipitation records for the region show a slight, statistically insignificant decrease over the past 40 years. The results from this study should complement the mass of information that has resulted from similar regional climate studies conducted in the United States, Europe, and Australia.
Ensor, Joie; Riley, Richard D.
2016-01-01
Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
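The two-stage approach can be made concrete: stage one fits the same model within each study to obtain effect estimates and variances, and stage two pools them, here with the DerSimonian-Laird random-effects estimator as one common choice. All numbers below are hypothetical stage-one outputs.

```python
import numpy as np

# stage 1 output: per-study treatment-effect estimates and their variances
# (hypothetical; in a real IPD analysis these come from fitting the same
# regression model separately within each study)
theta = np.array([0.30, 0.10, 0.45, 0.22, 0.05])
v = np.array([0.02, 0.05, 0.04, 0.01, 0.03])

# stage 2: DerSimonian-Laird random-effects pooling
w = 1.0 / v
theta_fe = np.sum(w * theta) / np.sum(w)            # fixed-effect estimate
Q = np.sum(w * (theta - theta_fe) ** 2)             # Cochran's heterogeneity Q
df = len(theta) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (v + tau2)                             # weights with between-study variance
theta_re = np.sum(w_re * theta) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"Q={Q:.2f} tau^2={tau2:.4f} pooled={theta_re:.3f} (SE {se_re:.3f})")
```

A one-stage analysis would instead fit a single hierarchical model to all participant-level data at once; as the tutorial stresses, most differences between the two arise from modelling assumptions rather than the staging itself.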
DiNardo, Thomas P.; Jackson, R. Alan
1984-01-01
An analysis of land use change for an area in Boulder County, Colorado, was conducted using digital cartographic data. The authors selected data in the Geographic Information Retrieval and Analysis System (GIRAS) format which is digitized from the 1:250,000-scale land use and land cover map series. The Map Overlay and Statistical System (MOSS) was used as an analytical tool for the study. The authors describe the methodology used in converting the GIRAS file into a MOSS format and the activities associated with the conversion.
2015-09-01
problem or defect must first be present. The qualitative interview data along with reduction and analysis provided the bulk of this research. The...problem and to provide grounds for the purpose of this research. Common themes extracted from interview data will comprise the qualitative data set...communication, we used mixed-methods research to examine current statistics and to conduct in-person interviews. With this research, we found a link
General aviation air traffic pattern safety analysis
NASA Technical Reports Server (NTRS)
Parker, L. C.
1973-01-01
A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage, retrieval, and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that the pattern procedures observed can affect the ability of pilots to see and avoid each other.
Accuracy of trace element determinations in alternate fuels
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1980-01-01
A review of the techniques used at Lewis Research Center (LeRC) in trace metals analysis is presented, including the results of Atomic Absorption Spectrometry and DC Arc Emission Spectrometry of blank levels and recovery experiments for several metals. The design of an Interlaboratory Study conducted by LeRC is presented. Several factors were investigated, including: laboratory, analytical technique, fuel type, concentration, and ashing additive. Conclusions drawn from the statistical analysis will help direct research efforts toward those areas most responsible for the poor interlaboratory analytical results.
Besley, John C; Oh, Sang-Hwa
2014-05-01
This study involves the analysis of three waves of survey data about nuclear energy using a probability-based online panel of respondents in the United States. Survey waves included an initial baseline survey conducted in early 2010, a follow-up survey conducted in 2010 following the Deepwater Horizon oil spill in the Gulf of Mexico, and an additional follow-up conducted just after the 2011 Fukushima, Japan, nuclear accident. The central goal is to assess the degree to which changes in public views following an accident are contingent on individual attention and respondent predispositions. Such results would provide real-world evidence of motivated reasoning. The primary analysis focuses on the impact of Fukushima and how the impact of individual attention to energy issues is moderated by both environmental views and political ideology over time. The analysis uses both mean comparisons and multivariate statistics to test key relationships. Additional variables common in the study of emerging technologies are included in the analysis, including demographics, risk and benefit perceptions, and views about the fairness of decisionmakers in both government and the private sector. © 2013 Society for Risk Analysis.
Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S
2015-02-25
Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (univariable and multilevel logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer based data analysis. The following factors were significant predictors of having ever done computer based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.
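An unadjusted odds ratio of the kind underlying such results can be sketched from a 2x2 table. The counts below are hypothetical, and the abstract's ORs are adjusted estimates from multilevel logistic regression, which this sketch does not reproduce.

```python
import numpy as np

# hypothetical 2x2 table: computer ownership vs. ever doing computer-based analysis
#                        analysed, did not
owners     = np.array([320, 132])
non_owners = np.array([62, 86])

a, b = owners
c, d = non_owners
or_ = (a * d) / (b * c)                      # unadjusted odds ratio
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)      # Woolf SE of the log odds ratio
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se_log)
print(f"OR={or_:.2f}  95% CI=({lo:.2f}, {hi:.2f})")
```

Adjusting for covariates such as participation in research, as the study did, requires a full logistic regression model rather than this single-table calculation.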
Badenes-Ribera, Laura; Frias-Navarro, Dolores; Pascual-Soler, Marcos; Monterde-I-Bort, Héctor
2016-11-01
The statistical reform movement and the American Psychological Association (APA) defend the use of effect-size estimators and their confidence intervals, as well as the interpretation of the clinical significance of findings. A survey was conducted in which academic psychologists were asked about their behavior in designing and carrying out their studies. The sample was composed of 472 participants (45.8% men). The mean number of years as a university professor was 13.56 (SD = 9.27). The use of effect-size estimators is becoming generalized, as is the consideration of meta-analytic studies. However, several inadequate practices persist: a traditional model of methodological behavior based on statistical significance tests is maintained, marked by the predominance of Cohen's d and the unadjusted R²/η², which are not robust to outliers, departures from normality, or violations of statistical assumptions, and by the under-reporting of confidence intervals for effect-size statistics. The paper concludes with recommendations for improving statistical practice.
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and reaction efficiencies (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed under traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
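A minimal sketch of the efficiency-weighted Cq idea described above, with hypothetical Cq values and a perfect amplification efficiency of E = 2 assumed for every well; the difference of efficiency-weighted Cq differences stays in log10 units until the final back-transformation.

```python
import math

def eff_weighted_cq(E, Cq):
    """Efficiency-weighted Cq: log10(E) * Cq, kept in the log scale."""
    return math.log10(E) * Cq

# Hypothetical reactions: target and reference gene, treated vs control.
target_ctrl = eff_weighted_cq(2.0, 24.0)
target_trt  = eff_weighted_cq(2.0, 21.0)
ref_ctrl    = eff_weighted_cq(2.0, 18.0)
ref_trt     = eff_weighted_cq(2.0, 18.0)

# Difference of differences in log10 units; fold change = 10**delta.
delta = (target_ctrl - target_trt) - (ref_ctrl - ref_trt)
fold_change = 10 ** delta
```

With E = 2 and a 3-cycle shift in the target gene only, this reduces to the familiar 2^3 = 8-fold change, showing the method agrees with the classical ratio approach in the ideal case.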
NASA Astrophysics Data System (ADS)
Iswandhani, N.; Muhajir, M.
2018-03-01
This research was conducted in the Department of Statistics, Islamic University of Indonesia. The data used are primary data obtained from posts of the @explorejogja instagram account from January to December 2016. The @explorejogja account features many tourist destinations that can be visited by both domestic and foreign tourists; it is therefore useful to cluster these destinations by their number of likes from instagram users, taken here as a measure of popularity. The purpose of this research is to map the distribution of the most popular tourist spots, to form clusters of tourist destinations, and to determine the central popularity of tourist destinations based on the @explorejogja instagram account in 2016. The statistical analyses used are descriptive statistics, k-means clustering, and social network analysis. The results were the top 10 most popular destinations in Yogyakarta, an HTML-based map of the tourist destination distribution consisting of 121 destination points, three clusters (cluster 1 with 52 destinations, cluster 2 with 9 destinations, and cluster 3 with 60 destinations), and the central popularity of tourist destinations in the special region of Yogyakarta by district.
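The k-means step described above can be sketched for scalar like counts. This is a minimal Lloyd's-algorithm implementation with deterministic seeding, not the authors' code; the like counts are invented for illustration.

```python
def kmeans_1d(values, k=3, iters=50):
    """Lloyd's algorithm on scalar data; centroids are seeded from the
    sorted values so the result is deterministic."""
    srt = sorted(values)
    cents = [srt[(len(srt) * (2 * j + 1)) // (2 * k)] for j in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - cents[c]))
            clusters[nearest].append(v)
        # Recompute centroids; keep the old one if a cluster empties.
        cents = [sum(c) / len(c) if c else cents[i]
                 for i, c in enumerate(clusters)]
    return cents, clusters

# Invented like counts for nine destinations.
likes = [120, 150, 130, 900, 950, 2400, 2500, 110, 980]
cents, clusters = kmeans_1d(likes, k=3)
```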
Lee, Ellen E; Della Selva, Megan P; Liu, Anson; Himelhoch, Seth
2015-01-01
Given the significant disability, morbidity and mortality associated with depression, the promising recent trials of ketamine highlight a novel intervention. A meta-analysis was conducted to assess the efficacy of ketamine in comparison with placebo for the reduction of depressive symptoms in patients who meet criteria for a major depressive episode. Two electronic databases were searched in September 2013 for English-language studies that were randomized placebo-controlled trials of ketamine treatment for patients with major depressive disorder or bipolar depression and utilized a standardized rating scale. Studies including participants receiving electroconvulsive therapy and adolescent/child participants were excluded. Five studies were included in the quantitative meta-analysis. The quantitative meta-analysis showed that ketamine significantly reduced depressive symptoms. The overall effect size at day 1 was large and statistically significant with an overall standardized mean difference of 1.01 (95% confidence interval 0.69-1.34) (P<.001), with the effects sustained at 7 days postinfusion. The heterogeneity of the studies was low and not statistically significant, and the funnel plot showed no publication bias. The large and statistically significant effect of ketamine on depressive symptoms supports a promising, new and effective pharmacotherapy with rapid onset, high efficacy and good tolerability. Copyright © 2015. Published by Elsevier Inc.
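A standardized mean difference of the kind pooled above is computed from each arm's mean, standard deviation and sample size. The sketch below uses invented depression-scale values, not data from the included trials; the sign is negative here only because lower scores mean fewer symptoms in this invented example.

```python
import math

def standardized_mean_difference(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                   / (n1 + n2 - 2))
    return (m1 - m2) / sp

# Invented post-treatment depression scores: ketamine arm vs placebo arm.
d = standardized_mean_difference(12.0, 6.0, 24, 18.5, 6.5, 24)
```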
Muhammad, Said; Tahir Shah, M; Khan, Sardar
2010-10-01
The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams and the Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physio-chemical parameters with permissible limits set by the Pakistan Environmental Protection Agency and the World Health Organization. Most of the studied parameters were found within their respective permissible limits. However, in some samples, the iron and arsenic concentrations exceeded their permissible limits. For health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated using statistical formulas. The values of HQ were found to be >1 in the samples collected from Jabba and Dubair, while HQ values were <1 in the rest of the samples. This level of contamination should pose low chronic risk and medium cancer risk when compared with US EPA guidelines. Furthermore, the inter-dependence of physio-chemical parameters and pollution load was also evaluated using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
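The exposure calculations named above (average daily dose, HQ, cancer risk) follow the standard US EPA ingestion formulas. The sketch below assumes a water-ingestion pathway; the arsenic concentration, intake rate and body weight are illustrative, while the oral reference dose (3.0e-4 mg/kg/day) and cancer slope factor (1.5 per mg/kg/day) are the US EPA IRIS values for arsenic.

```python
def average_daily_dose(conc_mg_per_L, intake_L_per_day, body_weight_kg):
    """ADD (mg/kg/day) for water ingestion: C * IR / BW."""
    return conc_mg_per_L * intake_L_per_day / body_weight_kg

def hazard_quotient(add, rfd):
    """HQ > 1 flags potential non-carcinogenic risk."""
    return add / rfd

def cancer_risk(add, slope_factor):
    """Incremental lifetime cancer risk: ADD * CSF."""
    return add * slope_factor

# Illustrative exposure: 50 ug/L arsenic, 2 L/day intake, 70 kg adult.
add = average_daily_dose(0.05, 2.0, 70.0)
hq = hazard_quotient(add, 3.0e-4)
risk = cancer_risk(add, 1.5)
```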
An overview of meta-analysis for clinicians.
Lee, Young Ho
2018-03-01
The number of medical studies being published is increasing exponentially, and clinicians must routinely process large amounts of new information. Moreover, the results of individual studies are often insufficient to provide confident answers, as their results are not consistently reproducible. A meta-analysis is a statistical method for combining the results of different studies on the same topic and it may resolve conflicts among studies. Meta-analysis is being used increasingly and plays an important role in medical research. This review introduces the basic concepts, steps, advantages, and caveats of meta-analysis, to help clinicians understand it in clinical practice and research. A major advantage of a meta-analysis is that it produces a precise estimate of the effect size, with considerably increased statistical power, which is important when the power of the primary study is limited because of a small sample size. A meta-analysis may yield conclusive results when individual studies are inconclusive. Furthermore, meta-analyses investigate the source of variation and different effects among subgroups. In summary, a meta-analysis is an objective, quantitative method that provides less biased estimates on a specific topic. Understanding how to conduct a meta-analysis aids clinicians in the process of making clinical decisions.
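The pooling step at the heart of a meta-analysis can be sketched as a fixed-effect, inverse-variance weighted average; the study effects and standard errors below are invented for illustration.

```python
import math

def fixed_effect_pool(effects, ses):
    """Inverse-variance weighted pooled effect and its 95% CI."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Three hypothetical studies: (effect size, standard error).
effects = [0.8, 1.1, 0.9]
ses = [0.3, 0.4, 0.25]
pooled, ci_lo, ci_hi = fixed_effect_pool(effects, ses)
```

Note how the pooled confidence interval is narrower than any single study's, which is the "increased statistical power" the review refers to.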
Statistical Analysis of CFD Solutions from the Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
2002-01-01
A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-Version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination previously tested in several European wind tunnels and for which a previous N-Version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, Validation, and reporting of solutions.
A Numerical Simulation and Statistical Modeling of High Intensity Radiated Fields Experiment Data
NASA Technical Reports Server (NTRS)
Smith, Laura J.
2004-01-01
Tests are conducted on a quad-redundant fault-tolerant flight control computer to establish the upset characteristics of an avionics system in an electromagnetic field. A numerical simulation and a statistical model are described in this work to analyze the open-loop experiment data collected in the reverberation chamber at NASA LaRC as part of an effort to examine the effects of electromagnetic interference on fly-by-wire aircraft control systems. By comparing thousands of simulation and model outputs, the models that best describe the data are first identified, and a systematic statistical analysis is then performed on the data. These efforts culminate in an extrapolation of values that is in turn used to support previous efforts in evaluating the data.
Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach.
Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem
2013-01-01
This study aims to gauge the technical and soft skills of hospital security personnel so as to enable prioritization of their training needs. A cross-sectional questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging the soft skills and technical skills of the security personnel. Extensive statistical analysis, including multivariate analysis (Pillai-Bartlett trace along with multi-factorial ANOVA) and post-hoc tests (Bonferroni test), was applied. The 143 participants performed better on the soft skills front, with an average score of 6.43 and standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study revealed a need for formal hands-on training with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff.
Complexity quantification of dense array EEG using sample entropy analysis.
Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R
2004-09-01
In this paper, a time series complexity analysis of dense array electroencephalogram (EEG) signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to the purely stochastic realm. The present analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics, for EEG recorded in three conditions: a passive, eyes-closed state; a mental arithmetic task; and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited to short physiological signals such as the EEG, and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in brain state, which can find wider scope for applications in EEG-based brain studies.
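SampEn as introduced by Richman and Moorman can be sketched directly: count pairs of templates of length m (and m+1) that agree within a tolerance of r times the signal's standard deviation, excluding self-matches, and take -ln of the ratio. This is a minimal O(N²) illustration, not the authors' implementation; a strictly periodic signal should yield a value near zero.

```python
import math

def _count_matches(x, m, tol):
    """Pairs of length-m templates whose max element-wise distance <= tol."""
    templates = [x[i:i + m] for i in range(len(x) - m + 1)]
    count = 0
    for i in range(len(templates)):
        for j in range(i + 1, len(templates)):
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                count += 1
    return count

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B), where B and A are self-match-excluded template
    match counts at lengths m and m+1 (tolerance r times the signal SD)."""
    mean = sum(x) / len(x)
    sd = (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
    tol = r * sd
    b = _count_matches(x, m, tol)
    a = _count_matches(x, m + 1, tol)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A strictly periodic signal is maximally regular: SampEn near zero.
periodic = [0.0, 1.0] * 50
low = sample_entropy(periodic)
```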
Hosseini Koupaie, E; Barrantes Leiva, M; Eskicioglu, C; Dutil, C
2014-01-01
The feasibility of anaerobic co-digestion of two juice-based beverage industrial wastes, screen cake (SC) and thickened waste activated sludge (TWAS), along with municipal sludge cake (MC) was investigated. Experiments were conducted in twenty mesophilic 160 ml batch serum bottles, with no inhibition observed. The statistical analysis showed that the substrate type had a statistically significant effect on both ultimate biogas and methane yields (P=0.0003<0.05). The maximum and minimum ultimate cumulative methane yields were 890.90 and 308.34 mL/g-VS removed, obtained from the digesters containing only TWAS and only SC as substrate, respectively. A first-order reaction model described VS utilization well in all digesters. The first 2-day and 10-day specific biodegradation rate constants were statistically higher in the digesters containing SC (P=0.004<0.05) and MC (P=0.0005<0.05), respectively. The cost-benefit analysis showed that the capital, operating and total costs can be decreased by 21.5%, 29.8% and 27.6%, respectively, by using a single co-digester rather than two separate digesters. Copyright © 2013 Elsevier Ltd. All rights reserved.
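The first-order model mentioned above takes the form B(t) = B0(1 - e^(-kt)); a crude grid search for the rate constant k is sketched below on invented cumulative methane data (the B0 of 890 mL/g-VS echoes the reported maximum yield, but the time points are hypothetical).

```python
import math

def first_order_yield(t, b0, k):
    """Cumulative yield under first-order kinetics: B(t) = B0*(1 - e^(-k*t))."""
    return b0 * (1.0 - math.exp(-k * t))

# Invented cumulative methane data: (day, mL/g-VS).
data = [(2, 330.0), (5, 610.0), (10, 810.0), (20, 880.0)]
b0 = 890.0

# Crude grid search for the k that minimises the squared error;
# a real analysis would use nonlinear least squares.
best_k = min((k / 1000.0 for k in range(1, 1000)),
             key=lambda k: sum((y - first_order_yield(t, b0, k)) ** 2
                               for t, y in data))
```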
Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne
2017-03-01
The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. Such experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever-increasing number of datasets and the diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized into a common table format. Analysis outputs can be written to text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied to data derived from wound contamination experiments conducted under different conditions. By facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence, this tool will help to better understand the influence of contamination characteristics on actinide biokinetics. Our approach can aid the optimization of treatment protocols and therefore contribute to the improvement of the medical response after internal contamination with actinides.
Kim, Sung-Min; Choi, Yosoon
2017-01-01
To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required. PMID:28629168
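The Getis-Ord Gi* statistic used in the hot spot analysis can be sketched for a single feature; the function below implements the standard Gi* z-score, applied to an invented 1-D transect rather than the Busan data.

```python
import math

def getis_ord_gi_star(values, weights_row):
    """Gi* z-score for one feature; weights_row[j] weights neighbour j
    and includes the feature itself, as Gi* requires."""
    n = len(values)
    mean = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - mean ** 2)
    w_sum = sum(weights_row)
    w_sq = sum(w * w for w in weights_row)
    num = sum(w * v for w, v in zip(weights_row, values)) - mean * w_sum
    den = s * math.sqrt((n * w_sq - w_sum ** 2) / (n - 1))
    return num / den

# Invented 1-D transect of nine samples; the middle three are elevated,
# so the centre sample sits in a statistically significant hot spot.
values = [1, 1, 1, 8, 9, 8, 1, 1, 1]
# Binary weights covering the centre sample and its two neighbours.
weights = [0, 0, 0, 1, 1, 1, 0, 0, 0]
z = getis_ord_gi_star(values, weights)
```

A z above 1.96 marks a hot spot at the 95% level: the high value is surrounded by other high values, exactly the condition the abstract describes.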
The National Aquatic Resource Surveys (NARS) are a series of four statistical surveys conducted by the U.S. Environmental Protection Agency working in collaboration with states, tribal nations and other federal agencies. The surveys are conducted for lakes and reservoirs, streams...
78 FR 26611 - Notice of Intent To Seek Approval To Conduct an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... Statistics Service Notice of Intent To Seek Approval To Conduct an Information Collection AGENCY: National Agricultural Statistics Service, USDA. ACTION: Notice and request for comments. SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the intention of the National Agricultural Statistics...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.
1997-04-01
The monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiCf/Si3N4) composites has been investigated. The monotonic tensile tests were performed at room and elevated temperatures. Fatigue tests were conducted at room temperature (RT), at a stress ratio R = 0.1 and a frequency of 5 Hz. It was observed during the monotonic tests that the composite retains only 30% of its room-temperature strength at 1,600 C, suggesting substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, with a total reduction of around 45%. Fatigue data have been generated at three load levels, and the fatigue strength of the composite has been found to be considerably high, at about 75% of its ultimate room-temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests, and their relationship with the response of the composites is discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A maximum likelihood estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with a desired level of reliability. Details of the statistical analysis and a discussion of the static and fatigue behavior of the composites are presented in this paper.
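Weibull shape factors and characteristic values like those mentioned above are often estimated by median-rank regression (the study itself uses a maximum likelihood estimator with censoring, which is more involved); the sketch below fits a 2-parameter Weibull to invented fatigue lives.

```python
import math

def weibull_fit(failures):
    """Median-rank regression for a 2-parameter Weibull:
    ln(-ln(1 - F_i)) = beta*ln(N_i) - beta*ln(eta), fitted by least squares."""
    n = len(failures)
    xs, ys = [], []
    for i, life in enumerate(sorted(failures), start=1):
        f = (i - 0.3) / (n + 0.4)   # Bernard's median-rank approximation
        xs.append(math.log(life))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)   # characteristic life (63.2% quantile)
    return beta, eta

# Invented fatigue lives (cycles to failure) at a single stress level.
lives = [1.2e5, 1.9e5, 2.4e5, 3.1e5, 4.0e5]
beta, eta = weibull_fit(lives)
```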
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
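The conservativeness of the Bonferroni approach discussed above is easy to see numerically: for a hypothetical 64 x 64 pressure image, the per-pixel threshold is 0.05/4096, which even a z-statistic of 4 fails to reach. The image size is assumed for illustration.

```python
import math

# Pixel-wise testing over a hypothetical 64 x 64 pressure image.
n_pixels = 64 * 64

# Bonferroni keeps the family-wise error rate at 0.05 by demanding
# p < 0.05 / n_tests at every pixel, ignoring spatial correlation.
alpha_pixel = 0.05 / n_pixels

def z_to_p(z):
    """Two-tailed p-value of a standard-normal statistic."""
    return math.erfc(z / math.sqrt(2.0))

# Even a large pixel statistic (z = 4, p ~ 6e-5) fails this threshold,
# illustrating why RFT's smoothness-aware threshold is attractive.
p4 = z_to_p(4.0)
```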
ERIC Educational Resources Information Center
Katsioloudis, Petros J.; Jones, Mildred V.
2018-01-01
A number of studies indicate that the use of holographic displays can influence spatial visualization ability; however, research provides inconsistent results. Considering this, a quasi-experimental study was conducted to identify the existence of statistically significant effects on sectional view drawing ability due to the impacts of holographic…
Forest resource statistics for the Monongahela National Forest: 2000
Richard H. Widmann; Douglas M. Griffith
2004-01-01
During 1999-2000, the fifth inventory of West Virginia's forest resources was conducted by the Forest Inventory and Analysis unit of the USDA Forest Service's Northeastern Research Station. The survey included a subsample within the Monongahela National Forest (MNF). The results showed that the MNF contains 899,000 acres of forest land, or 7.5 percent of the State's...
ERIC Educational Resources Information Center
Chen, Xianglei
2016-01-01
Every year, millions of new college students arrive on campus lacking the necessary academic skills to perform at the college level. Postsecondary institutions address this problem with extensive remedial programs designed to strengthen students' basic skills. While much research on the effectiveness of remedial education has been conducted,…
ERIC Educational Resources Information Center
Artinian, Vrej-Armen
An extensive investigation of elementary school classrooms was conducted through the collection and statistical analysis of student and teacher responses to questions concerning the educational environment. Several aspects of the classroom are discussed, including the spatial, thermal, luminous, and aural environments. Questions were organized so…
ERIC Educational Resources Information Center
Elpus, Kenneth
2013-01-01
This study examined the college entrance examination scores of music and non-music students in the United States, drawing data from the restricted-use data set of the Education Longitudinal Study of 2002 (ELS), a nationally representative education study ("N" = 15,630) conducted by the National Center for Education Statistics. Analyses…
The Physical and the Virtual: The Relationship between Library as Place and Electronic Collections
ERIC Educational Resources Information Center
Gerke, Jennifer; Maness, Jack M.
2010-01-01
A statistical analysis of responses to a LibQUAL+™ survey at the University of Colorado at Boulder (UCB) was conducted to investigate factors related to patrons' satisfaction with electronic collections. It was found that a respondent's discipline was not related to his or her satisfaction with the Libraries' electronic collection, nor was the…
The correlation between proprioception and handwriting legibility in children
Hong, So Young; Jung, Nam-Hae; Kim, Kyeong Mi
2016-01-01
[Purpose] This study investigated the association between proprioception, including joint position sense and kinetic sense, and handwriting legibility in healthy children. [Subjects and Methods] Assessment of joint position sense, kinetic sense, and handwriting legibility was conducted for 19 healthy children. Joint position sense was assessed by asking the children to flex their right elbow between 30° and 110° while blindfolded. The range of elbow movement was analyzed with the Compact Measuring System 10 for 3D motion analysis. Kinetic sense was assessed using the Sensory Integration and Praxis Test. The children were directed to write 30 words from the Korean alphabet, and the legibility of their handwriting was scored for form, alignment, space, size, and shape. To analyze the data, descriptive statistics and Spearman correlation analysis were conducted using IBM SPSS Statistics 20.0. [Results] There was a significant negative correlation between handwriting legibility and kinetic sense. A significant correlation between handwriting legibility and joint position sense was not found. [Conclusion] This study showed that a higher kinetic sense was associated with better handwriting legibility. Further work is needed to determine the association of handwriting legibility and speed with joint position sense of the elbow, wrist, and fingers. PMID:27821948
NASA Astrophysics Data System (ADS)
M, Vasu; Shivananda Nayaka, H.
2018-06-01
In this experimental work, a dry turning process carried out on EN47 spring steel with a coated tungsten carbide tool insert of 0.8 mm nose radius is optimized using statistical techniques. Experiments were conducted at three different cutting speeds (625, 796 and 1250 rpm) with three different feed rates (0.046, 0.062 and 0.093 mm/rev) and depths of cut (0.2, 0.3 and 0.4 mm). Experiments were conducted based on a 3^3 full factorial design (FFD): three factors at three levels. Analysis of variance was used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having the least significance. The optimum machining condition for cutting force was obtained from the statistical analysis. Tool wear measurements were performed at the optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev. The minimum tool wear observed was 0.086 mm after 5 min of machining. Tool wear was analysed by confocal microscopy; it was observed that tool wear increases with increasing cutting time.
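The 3^3 full factorial design described above enumerates every combination of the three levels of each factor, giving 27 runs; a sketch using the study's stated levels:

```python
from itertools import product

# Factor levels as stated in the abstract.
speeds = [625, 796, 1250]        # spindle speed, rpm
feeds = [0.046, 0.062, 0.093]    # feed rate, mm/rev
depths = [0.2, 0.3, 0.4]         # depth of cut, mm

# The 3^3 full factorial is the Cartesian product of the level sets.
runs = list(product(speeds, feeds, depths))
```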
Akboğa, Özge; Baradan, Selim
2017-02-07
The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees experience risks that emerge during the fabrication of concrete, as well as during its delivery to the construction site. Statistics show that usage of and demand for RMC have been increasing along with the number of producers and workers. Unfortunately, adequate OSH measures to match this rapid growth are not in place even in top RMC-producing countries, such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by mining data from the Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analyses on 71 incidents in which RMC truck drivers were involved. In addition, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results of this research, problem areas were identified: for example, cleaning the truck mixer/pump is a hazardous activity during which operators are frequently injured, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods.
Kumar, Parmeshwar; Jithesh, V.; Gupta, Shakti Kumar
2016-01-01
Context: Although intensive care units (ICUs) account for only 10% of hospital beds, they consume nearly 22% of hospital resources. Few definitive costing studies have been conducted in Indian settings that would help determine appropriate resource allocation. Aim: The aim of this study was to evaluate and compare the cost of intensive care delivery between multispecialty and neurosurgery ICUs at an apex trauma care facility in India. Materials and Methods: The study was conducted in a polytrauma and neurosurgery ICU at a 203-bedded Level IV trauma care facility in New Delhi, India, from May 1, 2012 to June 30, 2012. The study was cross-sectional, retrospective, and record-based. Traditional costing was used to arrive at both direct and indirect cost estimates. The cost centers included in the study were building cost, equipment cost, human resources, materials and supplies, clinical and nonclinical support services, engineering maintenance cost, and biomedical waste management. Statistical Analysis: Statistical analysis was performed by Fisher's two-tailed t-test. Results: Total cost/bed/day was Rs. 14,976.9/- for the multispecialty ICU and Rs. 14,306.7/- for the neurosurgery ICU, with workforce constituting nearly half of the expenditure in both ICUs. The differences in cost between the ICUs, both overall and by cost center, were statistically significant. Conclusions: Quantification of the expenditure of running an ICU in a trauma center would assist health-care decision makers in better allocation of resources. Although multispecialty ICUs are more cost-effective, other factors will also play a role in defining the kind of ICU that needs to be designed. PMID:27555693
Directions for new developments on statistical design and analysis of small population group trials.
Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel
2016-06-14
Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies when the sample size is unavoidably small, a setting usually termed small populations. The specific sample-size cut-off at which standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for the design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in statistical methodology for the design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for the design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations.
The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need for involvement of the patients' perspective in the planning and conduct of small population clinical trials for a successful therapy evaluation.
The bedrock electrical conductivity map of the UK
NASA Astrophysics Data System (ADS)
Beamish, David
2013-09-01
Airborne electromagnetic (AEM) surveys, when regionally extensive, may sample a wide-range of geological formations. The majority of AEM surveys can provide estimates of apparent (half-space) conductivity and such derived data provide a mapping capability. Depth discrimination of the geophysical mapping information is controlled by the bandwidth of each particular system. The objective of this study is to assess the geological information contained in accumulated frequency-domain AEM survey data from the UK where existing geological mapping can be considered well-established. The methodology adopted involves a simple GIS-based, spatial join of AEM and geological databases. A lithology-based classification of bedrock is used to provide an inherent association with the petrophysical rock parameters controlling bulk conductivity. At a scale of 1:625k, the UK digital bedrock geological lexicon comprises just 86 lithological classifications compared with 244 standard lithostratigraphic assignments. The lowest common AEM survey frequency of 3 kHz is found to provide an 87% coverage (by area) of the UK formations. The conductivities of the unsampled classes have been assigned on the basis of inherent lithological associations between formations. The statistical analysis conducted uses over 8 M conductivity estimates and provides a new UK national scale digital map of near-surface bedrock conductivity. The new baseline map, formed from central moments of the statistical distributions, allows assessments/interpretations of data exhibiting departures from the norm. The digital conductivity map developed here is believed to be the first such UK geophysical map compilation for over 75 years. The methodology described can also be applied to many existing AEM data sets.
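The baseline map's statistics amount to grouping conductivity estimates by lithological class (the "spatial join") and computing central moments per class. A toy sketch with invented class names and values; the actual analysis used millions of estimates.

```python
# A toy version of the per-class baseline: group conductivity estimates by
# lithological class and compute central moments. Class names and values are
# invented; the actual analysis used millions of estimates.
from collections import defaultdict
from statistics import mean, stdev

samples = [("mudstone", 45.0), ("mudstone", 52.0), ("granite", 2.1),
           ("granite", 1.8), ("mudstone", 48.5), ("granite", 2.6)]  # mS/m

by_class = defaultdict(list)
for lith, cond in samples:
    by_class[lith].append(cond)

baseline = {lith: (mean(v), stdev(v)) for lith, v in by_class.items()}
```

Departures from a class's central moments are then the anomalies worth interpreting, as the abstract notes.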
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty of systematically addressing the many challenges that can arise in the course of a study. This article can help to bridge the gap between research questions and formal statistical inference by using an illustrative case study as a basis for discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and lead to optimal conduct, quality control, analysis, and interpretation of a study.
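One concrete preprocessing step from the paper's list, data review and verification, can be sketched as checking coded values against a codebook and flagging out-of-range records for editing. The field names and codes below are hypothetical.

```python
# Data review against a codebook: flag out-of-range codes for verification.
# Field names and codes are hypothetical.
codebook = {"sex": {1, 2}, "smoker": {0, 1, 9}}   # 9 = coded as missing

records = [{"id": 1, "sex": 1, "smoker": 0},
           {"id": 2, "sex": 3, "smoker": 1},      # 3 is not a valid sex code
           {"id": 3, "sex": 2, "smoker": 9}]

def invalid_fields(rec):
    """Return the fields of a record whose codes are not in the codebook."""
    return [f for f, allowed in codebook.items() if rec[f] not in allowed]

flagged = {}
for rec in records:
    bad = invalid_fields(rec)
    if bad:
        flagged[rec["id"]] = bad      # send back for data review/editing
```

Running such checks before analysis is exactly the kind of quality control the paper argues is missing from statistics curricula.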
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Lihua; Cui, Jingkun; Tang, Fengjiao
Purpose: Studies of the association between ataxia telangiectasia–mutated (ATM) gene polymorphisms and acute radiation injuries are often small in sample size, and the results are inconsistent. We conducted the first meta-analysis to provide a systematic review of published findings. Methods and Materials: Publications were identified by searching PubMed up to April 25, 2014. Primary meta-analysis was performed for all acute radiation injuries, and subgroup meta-analyses were based on clinical endpoint. The influence of sample size and radiation injury incidence on genetic effects was estimated in sensitivity analyses. Power calculations were also conducted. Results: The meta-analysis was conducted on the ATM polymorphism rs1801516, including 5 studies with 1588 participants. For all studies, the cut-off for differentiating cases from controls was grade 2 acute radiation injuries. The primary meta-analysis showed a significant association with overall acute radiation injuries (allelic model: odds ratio = 1.33, 95% confidence interval: 1.04-1.71). Subgroup analyses detected an association between the rs1801516 polymorphism and a significant increase in urinary and lower gastrointestinal injuries and an increase in skin injury that was not statistically significant. There was no between-study heterogeneity in any meta-analyses. In the sensitivity analyses, small studies did not show larger effects than large studies. In addition, studies with high incidence of acute radiation injuries showed larger effects than studies with low incidence. Power calculations revealed that the statistical power of the primary meta-analysis was borderline, whereas there was adequate power for the subgroup analysis of studies with high incidence of acute radiation injuries. Conclusions: Our meta-analysis showed a consistency of the results from the overall and subgroup analyses.
We also showed that the genetic effect of the rs1801516 polymorphism on acute radiation injuries was dependent on the incidence of the injury. These findings support the evidence of an association between the rs1801516 polymorphism and acute radiation injuries and encourage further research on this topic.
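The allelic model the meta-analysis reports compares risk-allele counts between cases and controls in a 2×2 table. A sketch of the odds ratio with a Wald 95% confidence interval; the counts are hypothetical, not the meta-analysis data.

```python
# Allelic-model odds ratio with a Wald 95% CI from a 2x2 allele-count table.
# The counts are hypothetical, not the meta-analysis data.
import math

a, b = 120, 380   # cases: risk alleles, other alleles
c, d = 90, 410    # controls: risk alleles, other alleles

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An interval excluding 1 (as in the reported 1.04-1.71) is what makes the pooled association statistically significant.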
Emotional and cognitive effects of peer tutoring among secondary school mathematics students
NASA Astrophysics Data System (ADS)
Alegre Ansuategui, Francisco José; Moliner Miravet, Lidón
2017-11-01
This paper describes an experience of same-age peer tutoring conducted with 19 eighth-grade mathematics students in a secondary school in Castellon de la Plana (Spain). Three constructs were analysed before and after launching the program: academic performance, mathematics self-concept and attitude of solidarity. Students' perceptions of the method were also analysed. The quantitative data was gathered by means of a mathematics self-concept questionnaire, an attitude of solidarity questionnaire and the students' numerical ratings. A statistical analysis was performed using Student's t-test. The qualitative information was gathered by means of discussion groups and a field diary. This information was analysed using descriptive analysis and by categorizing the information. Results show statistically significant improvements in all the variables and the positive assessment of the experience and the interactions that took place between the students.
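The pre/post comparison with Student's t-test can be sketched in pure Python. The scores below are hypothetical, and a paired design is assumed (the same students measured before and after the program).

```python
# Student's t statistic for paired (pre/post) samples; the scores are
# hypothetical, assuming a paired design.
import math

pre = [5.1, 6.0, 4.8, 5.5, 6.2, 5.0, 5.7]    # e.g. self-concept before
post = [5.9, 6.4, 5.6, 6.1, 6.5, 5.8, 6.3]   # and after the program

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))           # refer to t distribution, df = n - 1
```

The paired form tests whether the mean within-student change differs from zero, which matches the before/after structure of the study.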
A statistical data analysis and plotting program for cloud microphysics experiments
NASA Technical Reports Server (NTRS)
Jordan, A. J.
1981-01-01
The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.
Evangelou, Marina; Smyth, Deborah J; Fortune, Mary D; Burren, Oliver S; Walker, Neil M; Guo, Hui; Onengut-Gumuscu, Suna; Chen, Wei-Min; Concannon, Patrick; Rich, Stephen S; Todd, John A; Wallace, Chris
2014-01-01
Pathway analysis can complement point-wise single nucleotide polymorphism (SNP) analysis in exploring genomewide association study (GWAS) data to identify specific disease-associated genes that can be candidate causal genes. We propose a straightforward methodology that can be used for conducting a gene-based pathway analysis using summary GWAS statistics in combination with widely available reference genotype data. We used this method to perform a gene-based pathway analysis of a type 1 diabetes (T1D) meta-analysis GWAS (of 7,514 cases and 9,045 controls). An important feature of the conducted analysis is the removal of the major histocompatibility complex gene region, the major genetic risk factor for T1D. Thirty-one of the 1,583 (2%) tested pathways were identified to be enriched for association with T1D at a 5% false discovery rate. We analyzed these 31 pathways and their genes to identify SNPs in or near these pathway genes that showed potentially novel association with T1D and attempted to replicate the association of 22 SNPs in additional samples. Replication P-values were skewed () with 12 of the 22 SNPs showing . Support, including replication evidence, was obtained for nine T1D associated variants in genes ITGB7 (rs11170466, ), NRP1 (rs722988, ), BAD (rs694739, ), CTSB (rs1296023, ), FYN (rs11964650, ), UBE2G1 (rs9906760, ), MAP3K14 (rs17759555, ), ITGB1 (rs1557150, ), and IL7R (rs1445898, ). The proposed methodology can be applied to other GWAS datasets for which only summary level data are available. PMID:25371288
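The phrase "enriched at a 5% false discovery rate" typically refers to the Benjamini-Hochberg step-up procedure, sketched below with hypothetical p-values.

```python
# Benjamini-Hochberg step-up procedure at FDR level q; p-values are hypothetical.
def bh_reject(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank                   # largest rank passing its threshold
    return {order[i] for i in range(k)}

pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
enriched = bh_reject(pvals)            # all hypotheses up to the largest passing rank
```

The procedure controls the expected fraction of false positives among the rejections, a less conservative criterion than family-wise error control when testing 1,583 pathways.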
Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan
2017-12-01
Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m or 0-0.20 m below the surface, and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized.
It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
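Inverse distance weighted (IDW) interpolation, the interpolator the review found most widely used, is simple enough to sketch directly. The sample points and power parameter below are illustrative.

```python
# Inverse distance weighted (IDW) interpolation. Sample points and the power
# parameter are illustrative.
def idw(points, x, y, power=2):
    """Estimate a value at (x, y) from (px, py, value) samples."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return v                   # exact hit on a sample location
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# (x, y, concentration in mg/kg) - hypothetical soil samples on a unit grid
pts = [(0, 0, 10.0), (0, 1, 12.0), (1, 0, 30.0), (1, 1, 28.0)]
est = idw(pts, 0.5, 0.5)               # equidistant from all four samples
```

Unlike ordinary kriging, IDW needs no variogram model, which is part of why it appears so often in the reviewed studies.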
NASA Technical Reports Server (NTRS)
Parsons, Vickie S.
2009-01-01
The request to conduct an independent review of regression models, developed for determining the expected Launch Commit Criteria (LCC) External Tank (ET)-04 cycle count for the Space Shuttle ET tanking process, was submitted to the NASA Engineering and Safety Center (NESC) on September 20, 2005. The NESC team performed an independent review of the regression models documented in Prepress Regression Analysis, Tom Clark and Angela Krenn, 10/27/05. This consultation consisted of a peer review, by statistical experts, of the proposed regression models provided in the Prepress Regression Analysis. This document is the consultation's final report.
Guidelines for the Investigation of Mediating Variables in Business Research.
MacKinnon, David P; Coxe, Stefany; Baraldi, Amanda N
2012-03-01
Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized.
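The single-mediator model underlying most mediation analyses estimates path a (X → M) and path b (M → Y, adjusting for X), with the indirect effect a·b. A minimal sketch with a tiny normal-equations OLS; the data are fabricated so that Y depends on X only through M.

```python
# Single-mediator model: X -> M (path a), M -> Y adjusting for X (path b);
# indirect effect = a * b. Data are fabricated so Y depends on X only via M.
def ols(X, y):
    """Least-squares coefficients via normal equations (tiny problems only)."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(p)]
         for r in range(p)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(p)]
    for col in range(p):                       # Gaussian elimination w/ pivoting
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, p))) / A[r][r]
    return beta

x = [1, 2, 3, 4, 5, 6]                         # predictor
m = [2.0, 4.5, 5.5, 8.5, 9.5, 12.0]            # mediator, roughly 2x
y = [1 + 2 * mi for mi in m]                   # outcome, exactly 1 + 2m

a = ols([[1, xi] for xi in x], m)[1]           # path a: X -> M
b_path = ols([[1, mi, xi] for mi, xi in zip(m, x)], y)[1]  # path b: M -> Y | X
indirect = a * b_path                          # mediated effect of X on Y
```

Because Y here is constructed as an exact function of M, the direct path of X in the second regression is essentially zero: complete mediation, the cleanest case of the mechanisms the article discusses.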
Similarity of markers identified from cancer gene expression studies: observations from GEO.
Shi, Xingjie; Shen, Shihao; Liu, Jin; Huang, Jian; Zhou, Yong; Ma, Shuangge
2014-09-01
Gene expression profiling has been extensively conducted in cancer research. The analysis of multiple independent cancer gene expression datasets may provide additional information and complement single-dataset analysis. In this study, we conduct multi-dataset analysis and are interested in evaluating the similarity of cancer-associated genes identified from different datasets. The first objective of this study is to briefly review some statistical methods that can be used for such evaluation. Both marginal analysis and joint analysis methods are reviewed. The second objective is to apply those methods to 26 Gene Expression Omnibus (GEO) datasets on five types of cancers. Our analysis suggests that for the same cancer, the marker identification results may vary significantly across datasets, and different datasets share few common genes. In addition, datasets on different cancers share few common genes. The shared genetic basis of datasets on the same or different cancers, which has been suggested in the literature, is not observed in the analysis of GEO data. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
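One simple way to quantify the "few common genes" finding is the Jaccard index of the marker sets from two datasets; the gene symbols below are hypothetical.

```python
# Similarity of marker lists from two datasets via the Jaccard index;
# the gene symbols are hypothetical.
def jaccard(s1, s2):
    """|intersection| / |union| of two marker sets."""
    return len(s1 & s2) / len(s1 | s2)

genes_a = {"TP53", "BRCA1", "MYC", "EGFR"}          # markers from dataset A
genes_b = {"TP53", "KRAS", "PTEN", "EGFR", "ALK"}   # markers from dataset B
similarity = jaccard(genes_a, genes_b)              # 2 shared of 7 distinct
```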
HARIRI, Azian; PAIMAN, Nuur Azreen; LEMAN, Abdul Mutalib; MD. YUSOF, Mohammad Zainal
2014-01-01
Abstract Background This study aimed to develop an index that can rank welding workplaces in a way that associates well with the possible health risks of welders. Methods The Welding Fumes Health Index (WFHI) was developed based on data from case studies conducted in Plant 1 and Plant 2. Personal sampling of welding fumes to assess the concentrations of metal constituents, along with a series of lung function tests, was conducted. Fifteen metal constituents were investigated in each case study. Index values were derived from aggregation analysis of metal constituent concentrations, while significant lung functions were recognized through statistical analysis in each plant. Results The results showed that none of the metal constituent concentrations exceeded the permissible exposure limit (PEL) in any plant. However, statistical analysis showed significant mean differences in lung functions between welders and non-welders. The index was then applied to a third welding facility (Plant 3) for verification purposes. The developed index showed a promising ability to rank welding workplaces according to the multiple constituent concentrations of welding fumes, associating well with the lung functions of the investigated welders. Conclusion There was a possibility that some of the metal constituents were below the detection limit, leading to a '0' value of a sub-index; thus the multiplicative form of the aggregation model was not suitable for analysis. On the other hand, maximum or minimum operator forms suffer from compensation issues and were not considered in this study. PMID:25927034
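The zero-sub-index problem the conclusion describes is easy to demonstrate: a single constituent below the detection limit zeroes out a multiplicative aggregate, while a weighted arithmetic mean degrades gracefully. The sub-index values and weights below are invented.

```python
# Why multiplicative aggregation fails here: one zero sub-index (a constituent
# below the detection limit) zeroes the whole index; a weighted arithmetic
# mean does not. Sub-index values and weights are invented.
import math

sub_indices = [0.8, 0.0, 0.6, 0.9]     # one constituent below detection limit
weights = [0.25, 0.25, 0.25, 0.25]     # equal weights, summing to 1

multiplicative = math.prod(s ** w for s, w in zip(sub_indices, weights))
additive = sum(w * s for s, w in zip(sub_indices, weights))
```

The additive form has the opposite weakness, the compensation issue the abstract mentions: a high sub-index can mask a low one.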
Gender in the allocation of organs in kidney transplants: meta-analysis
Santiago, Erika Vieira Almeida e; Silveira, Micheline Rosa; de Araújo, Vânia Eloisa; Farah, Katia de Paula; Acurcio, Francisco de Assis; Ceccato, Maria das Graças Braga
2015-01-01
OBJECTIVE To analyze whether gender influences the survival of kidney transplant grafts and patients. METHODS Systematic review with meta-analysis of cohort studies available in the Medline (PubMed), LILACS, CENTRAL, and Embase databases, complemented by manual searching and searches of the grey literature. Study selection and data collection were conducted in duplicate by independent reviewers, and disagreements were settled by a third reviewer. Graft and patient survival rates were evaluated as effectiveness measurements. Meta-analysis was conducted with the Review Manager® 5.2 software, applying a random effects model. Recipient, donor, and donor-recipient gender comparisons were evaluated. RESULTS: Twenty-nine studies involving 765,753 patients were included. Regarding graft survival, grafts from male donors were observed to have longer survival than those from female donors, but only over a 10-year follow-up period. Comparison between recipient genders showed no significant differences at any evaluated follow-up period. In the evaluation of donor-recipient gender combinations, male donor-male recipient transplants were favored in a statistically significant way. No statistically significant differences in patient survival were observed for gender comparisons at any follow-up period evaluated. CONCLUSIONS The quantitative analysis of the studies suggests that donor or recipient gender, evaluated in isolation, does not influence patient or graft survival rates. However, the donor-recipient gender combination may be a determining factor for graft survival. PMID:26465666
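The random-effects model applied here is usually the DerSimonian-Laird method; a sketch of the pooling step with hypothetical per-study log ratios and variances (not the meta-analysis data).

```python
# DerSimonian-Laird random-effects pooling; the per-study log ratios and
# within-study variances are hypothetical.
effects = [0.5, -0.2, 0.6, 0.1]        # log hazard/odds ratios per study
variances = [0.04, 0.09, 0.05, 0.02]   # their within-study variances

w = [1 / v for v in variances]                       # fixed-effect weights
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Cochran's Q
df = len(effects) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                        # between-study variance
w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

Adding the between-study variance tau² to each study's variance widens the pooled interval when studies disagree, which is why a random-effects model is the cautious choice for heterogeneous cohorts.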
The Business Of Urban Animals Survey: the facts and statistics on companion animals in Canada.
Perrin, Terri
2009-01-01
At the first Banff Summit for Urban Animal Strategies (BSUAS) in 2006, delegates clearly indicated that a lack of reliable Canadian statistics hampers municipal leaders and legislators in their efforts to develop urban animal strategies that create and sustain a healthy community for pets and people. To gain a better understanding of the situation, BSUAS municipal delegates and other industry stakeholders partnered with Ipsos Reid, one of the world's leading polling firms, to conduct a national survey on the "Business of Urban Animals." The results of the survey, summarized in this article, were presented at the BSUAS meeting in October 2008. In addition, each participating community will receive a comprehensive written analysis, as well as a customized report. The online survey was conducted from September 22 to October 1, 2008. There were 7208 participants, including 3973 pet owners and 3235 non-pet owners from Ipsos Reid's proprietary Canadian online panel. The national results were weighted to reflect the true population distribution across Canada, and the panel was balanced on all major demographics to mirror Statistics Canada census information. The margin of error for the national results is +/- 1.15%.
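The reported 1.15% margin of error is consistent with the standard formula at 95% confidence under maximum variability (p = 0.5) for n = 7208, a quick check:

```python
# Margin of error at 95% confidence, maximum variability (p = 0.5), n = 7208.
import math

n = 7208
margin = 1.96 * math.sqrt(0.25 / n)    # about 0.0115, i.e. 1.15 percentage points
```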
NASA Astrophysics Data System (ADS)
Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie
2017-12-01
To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
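The reference change value (RCV) that proved decisive in the case study combines analytical and within-subject biological variation: RCV = √2 · z · √(CV_A² + CV_I²). A sketch with illustrative coefficients of variation, not those from the laboratory.

```python
# Reference change value: RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2).
# The CV values below are illustrative, not those from the study's laboratory.
import math

cv_analytical = 3.0     # analytical coefficient of variation, %
cv_biological = 5.0     # within-subject biological variation, %
z = 1.96                # two-sided 95% significance

rcv = math.sqrt(2) * z * math.sqrt(cv_analytical ** 2 + cv_biological ** 2)
# a difference between two results larger than rcv (%) is deemed a real change
```

The √2 arises because two measured results are compared, so their error variances add, exactly the kind of uncertainty-reduction reasoning the case study traces.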
Wavelet analysis in ecology and epidemiology: impact of statistical tests
Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario
2014-01-01
Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892
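The paper's central point, that the null distribution and hence the p-value depend on how surrogates are generated, can be illustrated with a much simpler statistic than wavelet power. Below, the same variance statistic is referenced against white-noise and AR(1) ("red noise") surrogates; all series are simulated.

```python
# Minimal illustration: the p-value of one statistic under two different
# surrogate-generation schemes. All series are simulated.
import random
import statistics

random.seed(1)

def ar1(n, phi, sd=1.0):
    """AR(1) series: x[t] = phi * x[t-1] + noise."""
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + random.gauss(0, sd))
    return x

series = ar1(200, phi=0.8)             # stand-in for an observed time series
stat = statistics.variance(series)

def p_value(make_surrogate, trials=200):
    exceed = sum(statistics.variance(make_surrogate()) >= stat
                 for _ in range(trials))
    return (exceed + 1) / (trials + 1)

p_white = p_value(lambda: [random.gauss(0, 1) for _ in range(200)])
p_red = p_value(lambda: ar1(200, phi=0.8))
# p_white is typically far smaller than p_red: white-noise surrogates make an
# autocorrelated series look significant; matched red-noise surrogates do not
```

This is the inadequacy the authors document for white- and red-noise resampling in wavelet applications, in miniature.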
Development of a funding, cost, and spending model for satellite projects
NASA Technical Reports Server (NTRS)
Johnson, Jesse P.
1989-01-01
The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) predict the total costs of satellite projects. An effort was conducted to extend the modeling capability from total budget analysis to analysis of total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model describing the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by life cycle cost models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. To use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
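The funding profile the abstract describes, cumulative fraction of total budget as a Weibull function of normalized project time, can be sketched directly. The shape and scale values below are illustrative, not the fitted GSFC parameters.

```python
# Cumulative budget fraction modeled as a Weibull CDF of normalized project
# time. Shape and scale are illustrative, not the fitted GSFC parameters.
import math

def spent_fraction(t, shape=2.0, scale=0.6):
    """Fraction of total budget spent by normalized time t (0 = start, 1 = end)."""
    return 1.0 - math.exp(-((t / scale) ** shape))

profile = [spent_fraction(t / 10) for t in range(11)]   # spending S-curve
```

Working in the percentage-of-budget space, as the abstract notes, makes the curve behave like a cumulative probability distribution: monotone, starting at 0 and approaching 1.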
[Evaluation of using statistical methods in selected national medical journals].
Sych, Z
1996-01-01
The paper evaluates the frequency with which statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The following journals were chosen for analysis: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny and Zdrowie Publiczne. From the respective volumes of Pol. Tyg. Lek., a number of works corresponding to the average in the remaining medical journals was randomly selected. The studies did not include works in which no statistical analysis was implemented, a criterion applied to both national and international publications. Also excluded were review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics. The number of works was determined in each volume. Next, analysis was performed to establish how a suitable sample was obtained in the respective studies, differentiating two categories: random and targeted selection. Attention was also paid to the presence of a control sample in the individual works, and to the completeness of the sample characteristics, for which three categories were set up: complete, partial and lacking. In evaluating the analyzed works, an effort was made to present the results of the studies in tables and figures (Tab. 1, 3). Analysis was performed on the rate of use of statistical methods in the relevant volumes of the six selected national medical journals for the years 1988-1992, simultaneously determining the number of works in which no statistical methods were used. Concurrently, the frequency of application of the individual statistical methods in the scrutinized works was analyzed.
Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Works that used multiple correlation, multiple regression, or more complex methods of studying the relationship between two or more variables were grouped with the works whose statistical methods consisted of correlation and regression, as well as other methods, e.g. statistical methods used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables), factor analysis conducted by the Jacobi-Hotelling method, taxonomic methods and others. On the basis of the performed studies, it was established that the frequency of use of statistical methods in the six selected national medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), generally similar to the frequency reported for English-language medical journals. On the whole, no significant differences were found in the frequency of the applied statistical methods (Tab. 4) or in the frequency of random sampling (Tab. 3) in the analyzed works appearing in the medical journals in the respective years 1988-1992. The statistical methods used most frequently in the analyzed works for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and scientific-didactic workers.
2011-01-01
Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used applications (in 46.0% and 42.6% of articles, respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. That information can be important because different software packages might produce varying results, owing to differences in the software's underlying estimation methods. PMID:21977990
Improved biliary detection and diagnosis through intelligent machine analysis.
Logeswaran, Rajasvaran
2012-09-01
This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Malvisi, Lucio; Troisi, Catherine L; Selwyn, Beatrice J
2018-06-23
The risk of malaria infection displays spatial and temporal variability that is likely due to interaction between the physical environment and the human population. In this study, we performed a spatial analysis at three different time points, corresponding to three cross-sectional surveys conducted as part of an insecticide-treated bed nets efficacy study, to reveal patterns of malaria incidence distribution in an area of Northern Guatemala characterized by low malaria endemicity. A thorough understanding of the spatial and temporal patterns of malaria distribution is essential for targeted malaria control programs. Two methods, the local Moran's I and the Getis-Ord G*(d), were used for the analysis, providing two different statistical approaches and allowing for a comparison of results. A distance band of 3.5 km was considered to be the most appropriate distance for the analysis of data based on epidemiological and entomological factors. Incidence rates were higher at the first cross-sectional survey conducted prior to the intervention compared to the following two surveys. Clusters or hot spots of malaria incidence exhibited high spatial and temporal variations. Findings from the two statistics were similar, though the G*(d) detected cold spots using a higher distance band (5.5 km). The high spatial and temporal variability in the distribution of clusters of high malaria incidence seems to be consistent with an area of unstable malaria transmission. In such a context, a strong surveillance system and the use of spatial analysis may be crucial for targeted malaria control activities.
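A minimal sketch of a local cluster statistic of the kind used above (local Moran's I with a binary distance-band weight matrix). The coordinates, incidence values, and planar-distance assumption are all hypothetical, not the study's data:

```python
import numpy as np

def local_morans_i(coords, values, band_km):
    """Local Moran's I with a binary distance-band spatial weight matrix."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    z = (values - values.mean()) / values.std()
    # Pairwise Euclidean distances (planar approximation).
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = ((d > 0) & (d <= band_km)).astype(float)  # neighbours within the band
    m2 = np.mean(z ** 2)  # equals 1 for z-scores; kept for clarity
    return (z / m2) * (w @ z)  # positive = local clustering (high-high or low-low)

# Hypothetical village coordinates (km) and malaria incidence values:
# a high-incidence cluster near the origin and a low-incidence one far away.
coords = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)]
values = [9.0, 8.0, 9.5, 1.0, 1.2]
I = local_morans_i(coords, values, band_km=3.5)
```

Positive I values flag both hot spots (high surrounded by high) and cold spots (low surrounded by low), which is why the study pairs Moran's I with G*(d), a statistic that distinguishes the two.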
Dragoman, D; Dragoman, M
2009-08-01
In this Brief Report, we present a method for the real-time detection of the bases of the deoxyribonucleic acid using their signatures in negative differential conductance measurements. The present methods of electronic detection of deoxyribonucleic acid bases are based on a statistical analysis because the electrical currents of the four bases are weak and do not differ significantly from one base to another. In contrast, we analyze a device that combines the accumulated knowledge in nanopore and scanning tunneling detection and which is able to provide very distinctive electronic signatures for the four bases.
Characterization of Inclusion Populations in Mn-Si Deoxidized Steel
NASA Astrophysics Data System (ADS)
García-Carbajal, Alfonso; Herrera-Trejo, Martín; Castro-Cedeño, Edgar-Ivan; Castro-Román, Manuel; Martinez-Enriquez, Arturo-Isaias
2017-12-01
Four plant heats of Mn-Si deoxidized steel were conducted to follow the evolution of the inclusion population through ladle furnace (LF) treatment and subsequent vacuum treatment (VT). The liquid steel was sampled, and the chemical composition and size distribution of the inclusion populations were characterized. The Gumbel, generalized extreme-value (GEV) and generalized Pareto (GP) distributions were used for the statistical analysis of the inclusion size distributions. The inclusions found at the beginning of the LF treatment were mostly fully liquid SiO2-Al2O3-MnO inclusions, which then evolved into fully liquid SiO2-Al2O3-CaO-MgO and partly liquid SiO2-CaO-MgO-(Al2O3-MgO) inclusions detected at the end of the VT. The final fully liquid inclusions had a chemical composition desirable for plastic behavior in subsequent metallurgical operations. The GP distribution was found to be unsuitable for the statistical analysis. The GEV distribution approach led to shape parameter values different from the zero value hypothesized by the Gumbel distribution. According to the GEV approach, some of the final inclusion size distributions had statistically significant differences, whereas the Gumbel approach predicted no statistically significant differences. The heats were ranked according to indicators of inclusion cleanliness, and a statistical comparison of the size distributions was carried out.
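The Gumbel-versus-GEV comparison above hinges on whether the fitted GEV shape parameter differs from zero. A minimal sketch with simulated (not the study's) inclusion sizes, using SciPy's convention in which shape c = 0 recovers the Gumbel limit:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical maximum inclusion sizes (micrometres) per inspected area,
# drawn from a GEV with a genuinely non-zero shape parameter.
sizes = stats.genextreme.rvs(c=-0.3, loc=12.0, scale=2.5,
                             size=500, random_state=rng)

# Three-parameter GEV fit; scipy's shape c corresponds to the Gumbel
# limit when c == 0 (note scipy uses c = -xi relative to some texts).
c_hat, loc_hat, scale_hat = stats.genextreme.fit(sizes)

# A Gumbel fit forces the shape to zero (two parameters only),
# which can mask differences between size distributions.
gum_loc, gum_scale = stats.gumbel_r.fit(sizes)
```

If the fitted c_hat is clearly non-zero, the two-parameter Gumbel model is misspecified, which is consistent with the abstract's finding that the two approaches disagree on statistical significance.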
Comments of statistical issue in numerical modeling for underground nuclear test monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, W.L.; Anderson, K.K.
1993-03-01
The Symposium concluded with prepared summaries by four experts in the involved disciplines. These experts made no mention of statistics and/or the statistical content of issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon these extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design/analysis in the quantification process to answer the question "what do we know about the numerical modeling of underground nuclear tests?" and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned towards agreeing with either empirical evidence or an expert's opinion of what empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear and illustrates that "simulation is easy." We also suggest that these examples of simulation are typical and that the questions concerning the legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."
NASA Technical Reports Server (NTRS)
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
Fumonisin B1 and Risk of Hepatocellular Carcinoma in Two Chinese Cohorts
Persson, E. Christina; Sewram, Vikash; Evans, Alison A.; London, W. Thomas; Volkwyn, Yvette; Shen, Yen-Ju; Van Zyl, Jacobus A.; Chen, Gang; Lin, Wenyao; Shephard, Gordon S.; Taylor, Philip R.; Fan, Jin-Hu; Dawsey, Sanford M.; Qiao, You-Lin; McGlynn, Katherine A.; Abnet, Christian C.
2011-01-01
Fumonisin B1 (FB1), a mycotoxin that contaminates corn in certain climates, has been demonstrated to cause hepatocellular carcinoma (HCC) in animal models. Whether a relationship between FB1 and HCC exists in humans is not known. To examine the hypothesis, we conducted case-control studies nested within two large cohorts in China; the Haimen City Cohort and the General Population Study of the Nutritional Intervention Trials cohort in Linxian. In the Haimen City Cohort, nail FB1 levels were determined in 271 HCC cases and 280 controls. In the General Population Nutritional Intervention Trial, nail FB1 levels were determined in 72 HCC cases and 147 controls. In each population, odds ratios and 95% confidence intervals (95%CI) from logistic regression models estimated the association between measurable FB1 and HCC, adjusting for hepatitis B virus infection and other factors. A meta-analysis that included both populations was also conducted. The analysis revealed no statistically significant association between FB1 and HCC in either Haimen City (OR=1.10, 95%CI=0.64–1.89) or in Linxian (OR=1.47, 95%CI=0.70–3.07). Similarly, the pooled meta-analysis showed no statistically significant association between FB1 exposure and HCC (OR=1.22, 95%CI=0.79–1.89). These findings, although somewhat preliminary, do not support an association between FB1 and HCC. PMID:22142693
Al-Dubai, Sar; Ganasegeran, K; Barua, A; Rizal, Am; Rampal, Kg
2014-07-01
The 10-item version of the Perceived Stress Scale (PSS-10) is a widely used tool to measure stress. The Malay version of the PSS-10 has been validated among Malaysian medical students. However, studies have not been conducted to assess its validity in occupational settings. The aim of this study is to assess the psychometric properties of the Malay version of the PSS-10 in two occupational settings in Malaysia. This study was conducted among 191 medical residents and 513 railway workers. An exploratory factor analysis was performed using the principal component method with varimax rotation. Correlation analyses, the Kaiser-Meyer-Olkin measure, Bartlett's test of sphericity and Cronbach's alpha were obtained. Statistical analysis was carried out using Statistical Package for the Social Sciences version 16 (SPSS, Chicago, IL, USA) software. Analysis yielded a two-factor structure of the Malay version of the PSS-10 in both occupational groups. The two factors accounted for 59.2% and 64.8% of the variance in the medical residents and the railway workers, respectively. Factor loadings were greater than 0.59 in both occupational groups. Cronbach's alpha coefficient was 0.70 for medical residents and 0.71 for railway workers. The Malay version of the PSS-10 had adequate psychometric properties and can be used to measure stress in occupational settings in Malaysia.
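Cronbach's alpha, the reliability coefficient reported above, is computed from the item variances and the variance of the total score. A minimal sketch on simulated Likert responses (the data and the 10-item layout are hypothetical, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated 0-4 Likert responses: 200 respondents, 10 items sharing
# a common latent stress factor plus item-level noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(200, 10))), 0, 4)
alpha = cronbach_alpha(scores)
```

Values around 0.7, as reported for both occupational groups, are conventionally taken as adequate internal consistency.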
The effect of kangaroo mother care on mental health of mothers with low birth weight infants.
Badiee, Zohreh; Faramarzi, Salar; MiriZadeh, Tahereh
2014-01-01
The mothers of premature infants are at risk of psychological stress because of separation from their infants. One of the methods influencing maternal mental health in the postpartum period is kangaroo mother care (KMC). This study was conducted to evaluate the effect of KMC of low birth weight infants on maternal mental health. The study was conducted in the Department of Pediatrics of Isfahan University of Medical Sciences, Isfahan, Iran. Premature infants were randomly allocated into two groups. The control group received standard care in the incubator. In the experimental group, three 60-min sessions of KMC daily for 1 week were practiced. Mental health scores of the mothers were evaluated using the 28-item General Health Questionnaire. Statistical analysis was performed by analysis of covariance using SPSS. In total, the scores of 50 infant-mother pairs were analyzed (25 in the KMC group and 25 in the standard care group). Results of the covariance analysis showed positive effects of KMC on maternal mental health scores. There were statistically significant differences between the mean scores of the experimental group and control subjects in the posttest period (P < 0.001). KMC for low birth weight infants is a safe way to improve maternal mental health. Therefore, it is suggested as a useful method that can be recommended for improving the mental health of mothers.
Audiological comparison between two different clips prostheses in stapes surgery.
Potena, M; Portmann, D; Guindi, S
2015-01-01
To compare audiometric results and complications of stapes surgery with two different types of piston prosthesis, the Portmann Clip Piston (Medtronic) (PCP) and the Soft Clip Piston (Kurz) (SCP). The study was conducted on 64 patients who underwent primary stapedotomy from 2008 to 2011. Each case of stapedotomy with the PCP (Medtronic Xomed Inc. Portmann Clip Piston Stainless Steel/Fluoroplastic) was matched with a case using the SCP (Heinz Kurz GmbH Medizintechnik Soft Piston Clip Titanium). Each group consisted of 32 patients, and patients in both groups were matched with respect to gender, age, bilateral or unilateral otosclerosis, otological symptoms (tinnitus, vertigo or dizziness), family history, operated side and the Portmann grading for otosclerosis. The length of the prosthesis used was recorded. Post-operative complications such as tinnitus, vertigo, hearing loss and altered taste were documented. Each patient underwent a preoperative and a postoperative audiogram (follow-up at the second month after surgery). We used Student's t-test for statistical analysis; statistical significance was set at P < 0.01. None of the patients experienced a post-operative hearing loss and none required later revision surgery. No statistically significant difference was found between the two populations regarding demographic data (age, sex, side, bilaterality, family history, stage and length of piston) or hearing level (P > 0.01) in air conduction, bone conduction and the air-bone gap (ABG). Postoperative complications did not differ significantly between the two groups. Both groups showed a significant improvement (P < 0.01) in post-operative air conduction, bone conduction and air-bone gap. There was no statistically significant difference (P > 0.01) between the post-operative hearing results (bone conduction, air conduction, air-bone gap) using the two pistons. The mean ABG improvement was 16.63 dB in the SCP group and 20.59 dB in the PCP group, respectively.
The titanium Soft Clip Piston (SCP) is a good alternative to the Portmann Clip Piston (PCP). Nevertheless, there are some differences in the surgical fixing of these two pistons in the correct position.
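The within-group pre/post comparison above is a paired Student's t-test. A minimal sketch on simulated air-bone-gap values (all numbers are hypothetical, only roughly matching the reported mean improvements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical pre- and post-operative air-bone gaps (dB) for 32 patients,
# with a simulated mean improvement of about 18 dB.
pre = rng.normal(28, 6, size=32)
post = pre - rng.normal(18, 4, size=32)

# Paired t-test: each patient serves as their own control.
t_stat, p_value = stats.ttest_rel(pre, post)
mean_improvement = (pre - post).mean()
```

The unpaired (two-sample) form of the same test would be used for the between-group comparisons of the two pistons.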
A novel measure and significance testing in data analysis of cell image segmentation.
Wu, Jin Chu; Halter, Michael; Kacker, Raghu N; Elliott, John T; Plant, Anne L
2017-03-14
Cell image segmentation (CIS) is an essential part of quantitative imaging of biological cells. Designing a performance measure and conducting significance testing are critical for evaluating and comparing the CIS algorithms for image-based cell assays in cytometry. Many measures and methods have been proposed and implemented to evaluate segmentation methods. However, computing the standard errors (SE) of the measures and their correlation coefficient has not been described, and thus the statistical significance of performance differences between CIS algorithms cannot be assessed. We propose the total error rate (TER), a novel performance measure for segmenting all cells in the supervised evaluation. The TER statistically aggregates all misclassification error rates (MER) by taking cell sizes as weights. The MERs are for segmenting each single cell in the population. The TER is fully supported by the pairwise comparisons of MERs using 106 manually segmented ground-truth cells with different sizes and seven CIS algorithms taken from ImageJ. Further, the SE and 95% confidence interval (CI) of TER are computed based on the SE of MER, which is calculated using the bootstrap method. An algorithm for computing the correlation coefficient of TERs between two CIS algorithms is also provided. Hence, the 95% CI error bars can be used to classify CIS algorithms. The SEs of TERs and their correlation coefficient can be employed to conduct hypothesis testing, when the CIs overlap, to determine the statistical significance of the performance differences between CIS algorithms. A novel measure, TER, of CIS is proposed. The TER's SEs and correlation coefficient are computed. Thereafter, CIS algorithms can be evaluated and compared statistically by conducting the significance testing.
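A minimal sketch of the TER construction described above: a size-weighted aggregate of per-cell MERs, with a bootstrap SE obtained by resampling cells. The MERs and cell sizes here are simulated, not the paper's 106 ground-truth cells:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-cell misclassification error rates (MER) and
# cell sizes (pixel counts) for a set of ground-truth cells.
sizes = rng.integers(200, 2000, size=106)
mers = rng.beta(2, 30, size=106)  # small per-cell error rates

def ter(sizes, mers):
    """Total error rate: size-weighted aggregate of per-cell MERs."""
    return np.average(mers, weights=sizes)

def bootstrap_se(sizes, mers, n_boot=2000):
    """SE of TER by resampling cells with replacement."""
    n = len(sizes)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        reps[b] = ter(sizes[idx], mers[idx])
    return reps.std(ddof=1)

point = ter(sizes, mers)
se = bootstrap_se(sizes, mers)
ci = (point - 1.96 * se, point + 1.96 * se)  # normal-approximation 95% CI
```

Comparing two algorithms additionally needs the correlation between their TERs (the cells are shared), which is why the paper supplies a correlation-coefficient algorithm alongside the SEs.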
Self-esteem and obsessive compulsive disorder.
Husain, Nusrat; Chaudhry, Imran; Raza-ur-Rehman; Ahmed, Ghazal Riaz
2014-01-01
To explore the association between self-esteem and obsessive compulsive disorder in a low-income country, and to conduct an in-depth analysis of this relationship by identifying any confounding variables that might exist. The cross-sectional study was conducted at the psychiatry out-patient clinic of Civil Hospital, Karachi, from January to March 2008, and comprised 65 patients diagnosed with obsessive compulsive disorder and 30 healthy controls. The participants completed the Janis and Field Social Adequacy scale and the Rosenberg Self-esteem scale. SPSS 15 was used for statistical analysis. Significantly different scores were reported on both measures of self-esteem between the patients and the controls (p<0.001 each), indicating reduced levels of self-esteem in the patients compared to the controls. Data replicated earlier findings from populations in high-income countries.
Statistical analysis in MSW collection performance assessment.
Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel
2014-09-01
The increase of Municipal Solid Waste (MSW) generated over the last years forces waste managers pursuing more effective collection schemes, technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role for improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core-set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in mixed collection system of Oporto Municipality, Portugal, during one year, a week per month. This analysis provides collection circuits' operational assessment and supports effective short-term municipality collection strategies at the level of, e.g., collection frequency and timetables, and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.
A study of tensile test on open-cell aluminum foam sandwich
NASA Astrophysics Data System (ADS)
Ibrahim, N. A.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Abdullah Sidek, Atiah Bt.; Endut, N. A.
2018-01-01
Aluminum foam sandwich (AFS) panels are among the growing materials in various industries because of their lightweight behavior. AFS is also known for an excellent stiffness-to-weight ratio and high energy absorption. Owing to these advantages, many researchers have shown an interest in aluminum foam materials, expanding the use of foam structures. However, a gap remains to be filled in developing reliable data on the mechanical behavior of AFS with different parameters and analysis-method approaches; few researchers have focused on open-cell aluminum foam and statistical analysis. Thus, this research was conducted using an open-cell aluminum foam core of grade 6101 with aluminum sheet skins, tested under tension. The data were analyzed using a full factorial design in the JMP statistical analysis software (version 11). The ANOVA result shows a significant model, with a value less than 0.500. Scatter diagrams and 3D surface profiler plots show that skin thickness has a significant impact on the stress/strain values compared to core thickness.
NASA Astrophysics Data System (ADS)
Rani Rana, Sandhya; Pattnaik, A. B.; Patnaik, S. C.
2018-03-01
In the present work, the wear behavior and mechanical properties of as-cast Al6082 and Al6082-T6 were compared and analyzed using statistical analysis. The as-cast Al6082 alloy was solutionized at 550°C, quenched and artificially aged at 170°C for 8 h. Metallographic examination and XRD analysis revealed the presence of the intermetallic compound Al6Mn. The hardness of the heat-treated Al6082 was found to be higher than that of the as-cast sample. Wear tests were carried out using a pin-on-disc wear testing machine according to a Taguchi L9 orthogonal array. Experiments were conducted under normal loads of 10-30 N, sliding speeds of 1-3 m/s, and sliding distances of 400, 800 and 1200 m, respectively. Sliding speed was found to be the dominant factor for wear in both the as-cast and aged Al6082 alloys. Increasing the sliding distance increases the wear rate up to 800 m, after which it decreases.
NASA Astrophysics Data System (ADS)
Noik, V. James; Mohd Tuah, P.
2015-04-01
Plastic fragments and particles, an emerging environmental contaminant and pollutant, are gaining scientific attention in recent decades due to the potential threats to biota. This study aims to elucidate the presence, abundance and temporal change of plastic fragments and particles at two selected beaches, namely Santubong and Trombol in Kuching, at two sampling times. Morphological and polymer identification assessment of the recovered plastics was also conducted. Overall comparison statistical analysis revealed that the abundance of plastic fragments/debris at the two sampling stations was not significantly different (p > 0.05). Likewise, statistical analysis of the temporal changes in abundance yielded no significant difference for most of the sampling sites at each respective station, except STB-S2. Morphological studies revealed that the physical features of the plastic fragments and debris were diverse in shape, size, color and surface fatigue. FTIR fingerprinting analysis showed that polypropylene and polyethylene were the dominant polymer types among the debris on both beaches.
Catalá-López, Ferrán; Corrales, Inmaculada; de la Fuente-Honrubia, César; González-Bermejo, Diana; Martín-Serrano, Gloria; Montero, Dolores; Saint-Gerons, Diego Macías
2015-12-21
Romiplostim and eltrombopag are thrombopoietin receptor (TPOr) agonists that promote megakaryocyte differentiation, proliferation and platelet production. In 2012, a systematic review and meta-analysis reported a non-statistically significant increased risk of thromboembolic events for these drugs, but the analyses were limited by lack of statistical power. Our objective was to update the 2012 meta-analysis examining whether TPOr agonists affect thromboembolism occurrence in adult thrombocytopenic patients. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs). Updated searches were conducted in PubMed, Cochrane Central, and publicly available registries (up to December 2014). RCTs using romiplostim or eltrombopag in at least one group were included. Relative risks (RR), absolute risk ratios (ARR) and number needed to harm (NNH) were estimated. Heterogeneity was analyzed using Cochran's Q test and the I(2) statistic. Fifteen studies with 3026 adult thrombocytopenic patients were included. The estimated frequency of thromboembolism was 3.69% (95% CI: 2.95-4.61%) for TPOr agonists and 1.46% (95% CI: 0.89-2.40%) for controls. TPOr agonists were associated with a RR of thromboembolism of 1.81 (95% CI: 1.04-3.14) and an ARR of 2.10% (95% CI: 0.03-3.90%), meaning a NNH of 48. Overall, we did not find evidence of statistical heterogeneity (p=0.43; I(2)=1.60%). Our updated meta-analysis suggested that TPOr agonists are associated with a higher risk of thromboembolic events compared with controls, and supports the current recommendations included in the European product information in this respect. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
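The abstract's NNH of 48 follows directly from the ARR of 2.10% (1/0.021 ≈ 48). A minimal sketch of the RR/ARR/NNH arithmetic with entirely hypothetical arm counts (the log-RR confidence interval shown is a standard large-sample approximation, not the meta-analytic pooling used in the study):

```python
import math

# Hypothetical two-arm event counts (not the study's data).
events_trt, n_trt = 60, 1800   # TPOr-agonist arm
events_ctl, n_ctl = 18, 1200   # control arm

r1 = events_trt / n_trt        # risk in treated arm
r0 = events_ctl / n_ctl        # risk in control arm
rr = r1 / r0                   # relative risk
arr = r1 - r0                  # absolute risk difference
nnh = math.ceil(1.0 / arr)     # number needed to harm

# Large-sample 95% CI on the log-RR scale.
se_log_rr = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
ci = (math.exp(math.log(rr) - 1.96 * se_log_rr),
      math.exp(math.log(rr) + 1.96 * se_log_rr))
```

With these illustrative counts, r1 = 3.33%, r0 = 1.5%, so the RR is about 2.22 and the NNH is 55; the study's pooled estimates (RR 1.81, NNH 48) come from combining fifteen such 2x2 tables.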
Abdulqadir, Ibrahim; Ahmed, Sagir Gumel; Kuliya, Aisha Gwarzo; Tukur, Jamilu; Yusuf, Aminu Abba; Musa, Abubakar Umar
2018-01-01
The human immunodeficiency virus (HIV) scourge continues to affect young women within the reproductive age group, and pregnancy is a recognized indication for the use of antiretroviral (ARV) drugs among HIV-positive women. The aim was to determine the combined effect of pregnancy, HIV and ARV drugs on the hematological parameters of pregnant women. This was a comparative cross-sectional study conducted among 70 each of HIV-positive and HIV-negative pregnant women. Bio-demographic and clinical data were extracted from the client folder, and 4 ml of blood sample was obtained from each participant. Full blood counts were generated using a Swelab automatic hematology analyzer, while reticulocyte counts and erythrocyte sedimentation rate (ESR) were determined manually. Data analysis was performed using SPSS software version 16, and P < 0.05 was considered statistically significant. Pregnant women with HIV had statistically significantly lower hematocrit and white blood cell (WBC) counts and higher ESR than pregnant women without HIV (P < 0.001). There was no statistically significant difference between the two groups in terms of platelet and reticulocyte counts (P > 0.05). However, among HIV-positive pregnant women, those with CD4 count <350/μL had statistically significantly lower WBC and lymphocyte counts than those with CD4 count ≥350/μL (P < 0.05), whereas those on zidovudine (AZT)-containing treatment had statistically significantly lower hematocrit and higher mean cell volume than those on non-AZT-containing treatment (P < 0.05); there was no statistically significant difference in any of the hematological parameters (P > 0.05) between women on first- and second-line ARV regimens. There is a significant difference in hematological parameters between HIV-positive and HIV-negative pregnant women in this environment.
Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems
NASA Astrophysics Data System (ADS)
Reddy, Lauren M.
In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted with stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholder groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed.
To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.
Volunteering, income and health.
Detollenaere, Jens; Willems, Sara; Baert, Stijn
2017-01-01
Separate literatures have related volunteering to health gains and income gains. We study the association between volunteering, income and health within one statistical framework. A state-of-the-art mediation analysis is conducted on data concerning the health, volunteering and sociodemographic characteristics of 42926 individuals within 29 European countries. We find that volunteering is positively associated to self-rated health. This association is partially mediated by household income.
Brett J. Butler; Charles J. Barnett; Susan J. Crocker; Grant M. Domke; Dale Gormanson; William N. Hill; Cassandra M. Kurtz; Tonya Lister; Christopher Martin; Patrick D. Miles; Randall Morin; W. Keith Moser; Mark D. Nelson; Barbara O' Connell; Bruce Payton; Charles H. Perry; Ronald J. Piva; Rachel Riemann; Christopher W. Woodall
2011-01-01
This report summarizes the results of the fifth forest inventory of the forests of Southern New England, defined as Connecticut, Massachusetts, and Rhode Island, conducted by the U.S. Forest Service Forest Inventory and Analysis program. Information on forest attributes, ownership, land use change, carbon, timber products, forest health, and statistics and quality...
ERIC Educational Resources Information Center
Kirshstein, Rita J.; Matheson, Nancy; Jing, Zhongren; Zimbler, Linda J.
This report compares findings from faculty surveys conducted as part of the 1987-88 National Survey of Postsecondary Faculty, which is limited to faculty and staff with instructional responsibilities, and the 1992-93 National Study of Postsecondary Faculty, which includes instructional as well as noninstructional faculty. In particular, the report…
Proceedings: USACERL/ASCE First Joint Conference on Expert Systems, 29-30 June 1988
1989-01-01
KNOWLEDGE-BASED GRAPHIC DIALOGUES...ABSTRACTS ACCEPTED FOR PUBLICATION...a methodology of inductive shallow modeling was developed. Inductive systems may become powerful shallow modeling tools applicable to a large class of...analysis was conducted using a statistical package, Trajectories. Four different types of relationships were analyzed: linear, logarithmic, power, and...
A Tutorial for SPSS/PC+ Studentware. Study Guide for the Doctor of Arts in Computer-Based Learning.
ERIC Educational Resources Information Center
MacFarland, Thomas W.; Hou, Cheng-I
The purpose of this tutorial is to provide the basic information needed for success with SPSS/PC+ Studentware, a student version of the statistical analysis software offered by SPSS, Inc., for the IBM PC+ and compatible computers. It is intended as a convenient summary of how to organize and conduct the most common computer-based statistical…
ERIC Educational Resources Information Center
Roth, Kathleen J.; Druker, Stephen L.; Garnier, Helen E.; Lemmens, Meike; Chen, Catherine; Kawanaka, Takako; Rasmussen, Dave; Trubacova, Svetlana; Warvi, Dagmar; Okamoto, Yukari; Stigler, James; Gallimore, Ronald
2006-01-01
This report presents the results of a study of eighth-grade science teaching, conducted as part of the Third International Mathematics and Science Study (TIMSS) 1999 Video Study. The Video Study is a supplement to the TIMSS 1999 student assessment, a successor to the TIMSS 1995 student assessment. The TIMSS 1999 Video Study had the broad purpose…
Study of Personnel Attrition and Revocation within U.S. Marine Corps Air Traffic Control Specialties
2012-03-01
Entrance Processing Stations (MEPS) and recruit depots, to include non-cognitive testing, such as Navy Computer Adaptive Personality Scales (NCAPS)...Revocation, Selection, MOS, Regression, Probit, dProbit, STATA, Statistics, Marginal Effects, ASVAB, AFQT, Composite Scores, Screening, NCAPS...Navy Computer Adaptive Personality Scales (NCAPS), during recruitment. It is also recommended that an economic analysis be conducted comparing the
An assessment of training needs for the lumber manufacturing industry in the eastern United States
Joseph Denig; Scott Page; Yuhua Su; Karen Martinson
2008-01-01
A training needs assessment of the primary forest products industry was conducted for 33 eastern states. This publication presents in detail the statistical analysis of the study. Of the 2,570 lumber manufacturing companies, consisting of firms with more than six employees under U.S. Department of Labor Standard Industrial Classification Code 2421, the response rate...
Statistical primer: how to deal with missing data in scientific research?
Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M
2018-05-10
Missing data are a common challenge encountered in research and can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, list and compare some of the most popular approaches for handling missing data in practice, and provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice; however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative that is appropriate for data missing at random and overcomes the disadvantages of the simpler approaches, but it should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.
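The multiple imputation idea described in this abstract can be sketched minimally: fill each gap several times from an imputation model, analyze each completed dataset, and pool the estimates. This is a crude hot-deck sketch under assumed data (full Rubin's rules would also add the within-imputation variance W to obtain the total variance T = W + (1 + 1/m)B, which is omitted here).

```python
import random
import statistics

def multiply_impute_mean(data, m=20, seed=1):
    """Estimate a mean from a list with None gaps via multiple
    imputation: draw each missing value from the observed values
    (a crude hot-deck model), estimate the mean in each of the m
    completed datasets, then pool the estimates (Rubin's rules:
    pooled point estimate = mean of the m estimates; B below is
    the between-imputation variance component)."""
    rng = random.Random(seed)
    observed = [x for x in data if x is not None]
    estimates = []
    for _ in range(m):
        completed = [x if x is not None else rng.choice(observed) for x in data]
        estimates.append(statistics.mean(completed))
    q_bar = statistics.mean(estimates)   # pooled point estimate
    b = statistics.variance(estimates)   # between-imputation variance
    return q_bar, b

est, between_var = multiply_impute_mean([1.0, 2.0, None, 4.0, None, 3.0])
```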
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fangyan; Zhang, Song; Chung Wong, Pak
Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
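Of the three sampling categories this abstract names, node sampling is the simplest to sketch: keep each node with some probability and retain only the edges whose endpoints both survive. A minimal illustration (the graph, rate, and seed are assumptions for the example, not from the study):

```python
import random

def node_sample(graph, rate, seed=0):
    """Uniform node sampling on an adjacency dict {node: set(neighbours)}:
    each node survives independently with probability `rate`; only edges
    with both endpoints surviving are kept in the induced subgraph."""
    rng = random.Random(seed)
    kept = {n for n in graph if rng.random() < rate}
    return {n: {v for v in nbrs if v in kept}
            for n, nbrs in graph.items() if n in kept}

g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
sub = node_sample(g, rate=0.75)
```

Edge sampling and traversal-based sampling (e.g. random walks) follow the same pattern but select edges or visited nodes instead, which is why their statistical fidelity differs across graph models.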
Meta-analysis of randomized clinical trials in the era of individual patient data sharing.
Kawahara, Takuya; Fukuda, Musashi; Oba, Koji; Sakamoto, Junichi; Buyse, Marc
2018-06-01
Individual patient data (IPD) meta-analysis is considered to be a gold standard when the results of several randomized trials are combined. Recent initiatives on sharing IPD from clinical trials offer unprecedented opportunities for using such data in IPD meta-analyses. First, we discuss the evidence generated and the benefits obtained by a long-established prospective IPD meta-analysis in early breast cancer. Next, we discuss a data-sharing system that has been adopted by several pharmaceutical sponsors. We review a number of retrospective IPD meta-analyses that have already been proposed using this data-sharing system. Finally, we discuss the role of data sharing in IPD meta-analysis in the future. Treatment effects can be more reliably estimated in both types of IPD meta-analyses than with summary statistics extracted from published papers. Specifically, with rich covariate information available on each patient, prognostic and predictive factors can be identified or confirmed. Also, when several endpoints are available, surrogate endpoints can be assessed statistically. Although there are difficulties in conducting, analyzing, and interpreting retrospective IPD meta-analysis utilizing the currently available data-sharing systems, data sharing will play an important role in IPD meta-analysis in the future.
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses.
The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
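The M/A transformation at the heart of this abstract is simple to state. A minimal sketch (illustrative intensities only; the paper's full pipeline adds normalization and empirical Bayes shrinkage not shown here):

```python
import math

def ma_values(red, green):
    """Transform two-channel spot intensities (R, G) into
    M = log2(R/G), the log-ratio used in the traditional analysis,
    and A = (log2 R + log2 G) / 2, the average log-expression that
    the log-ratio analysis discards."""
    m = math.log2(red / green)
    a = (math.log2(red) + math.log2(green)) / 2.0
    return m, a

m, a = ma_values(1024.0, 256.0)   # M = 2.0, A = 9.0
```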
Geographic information system as country-level development and monitoring tool, Senegal example
Moore, Donald G.; Howard, Stephen M.; ,
1990-01-01
Geographic information systems (GIS) allow an investigator the capability to merge and analyze numerous types of country-level resource data. Hypothetical resource analysis applications in Senegal were conducted to illustrate the utility of a GIS for development planning and resource monitoring. Map and attribute data for soils, vegetation, population, infrastructure, and administrative units were merged to form a database within a GIS. Several models were implemented using a GIS to: analyze development potential for sustainable dryland agriculture; prioritize where agricultural development should occur based upon a regional food budget; and monitor dynamic events with remote sensing. The steps for implementing a GIS analysis are described and illustrated, and the use of a GIS for conducting an economic analysis is outlined. Using a GIS for analysis and display of results opens new methods of communication between resource scientists and decision makers. Analyses yielding country-wide map output and detailed statistical data for each level of administration provide the advantage of a single system that can serve a variety of users.
Preparing for the first meeting with a statistician.
De Muth, James E
2008-12-15
Practical statistical issues that should be considered when performing data collection and analysis are reviewed. The meeting with a statistician should take place early in the research development before any study data are collected. The process of statistical analysis involves establishing the research question, formulating a hypothesis, selecting an appropriate test, sampling correctly, collecting data, performing tests, and making decisions. Once the objectives are established, the researcher can determine the characteristics or demographics of the individuals required for the study, how to recruit volunteers, what type of data are needed to answer the research question(s), and the best methods for collecting the required information. There are two general types of statistics: descriptive and inferential. Presenting data in a more palatable format for the reader is called descriptive statistics. Inferential statistics involve making an inference or decision about a population based on results obtained from a sample of that population. In order for the results of a statistical test to be valid, the sample should be representative of the population from which it is drawn. When collecting information about volunteers, researchers should only collect information that is directly related to the study objectives. Important information that a statistician will require first is an understanding of the type of variables involved in the study and which variables can be controlled by researchers and which are beyond their control. Data can be presented in one of four different measurement scales: nominal, ordinal, interval, or ratio. Hypothesis testing involves two mutually exclusive and exhaustive statements related to the research question. Statisticians should not be replaced by computer software, and they should be consulted before any research data are collected. 
When preparing to meet with a statistician, the pharmacist researcher should be familiar with the steps of statistical analysis and consider several questions related to the study to be conducted.
Guiho-Bailly, Marie-Pierre; Roquelaure, Yves
2013-01-01
The population-based survey "Santé et itinéraire professionnel" (SIP) aims to investigate the relationship between health and career. A qualitative study was conducted to identify potential biases in the design of the questionnaire. The Laboratoire d'ergonomie et d'épidémiologie en santé au travail (research center on "Ergonomics and Epidemiology in Occupational Health" based at the University of Angers, France) recently conducted a study entitled "Rapport subjectif au travail, sens des trajets professionnels et construction de la santé" ("The subjective perception of work, career paths and the construction of health"). Individual interviews were conducted with thirty survey respondents (irrespective of whether they had reported any health problems or established a link between a health event and their career path) by two experts in the psychodynamics of work. The analysis of the clinical and statistical data involved four stages: a study of an initial test case, a comparison of monographs and reports drawn up by the DREES/DARES, an analysis of questionnaire responses, and an analysis of thirty monographs. MAIN RESULTS AND DISCUSSION: After an examination of the results in relation to sample composition and the method used, the study shows that the relationship between health and career is not overestimated, but also indicates that psychological and musculoskeletal disorders and "minor" work accidents tend to be underreported. The study also found a loss of information about professional mobility as a way of maintaining health. Based on a qualitative approach to validation, the proposed method provides a basis for assessing the design of the questionnaire and provides reference points for data interpretation and the direction of future research.
Spectral Discrete Probability Density Function of Measured Wind Turbine Noise in the Far Field
Ashtiani, Payam; Denison, Adelaide
2015-01-01
Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper, a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for one-third octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low-frequency noise sources. PMID:25905097
Impact of Uncertainty on the Porous Media Description in the Subsurface Transport Analysis
NASA Astrophysics Data System (ADS)
Darvini, G.; Salandin, P.
2008-12-01
In the modelling of flow and transport phenomena in naturally heterogeneous media, the spatial variability of hydraulic properties, typically the hydraulic conductivity, is generally described by a variogram of constant sill and spatial correlation. While some analyses reported in the literature discuss spatial inhomogeneity related to a trend in the mean hydraulic conductivity, the effect on flow and transport of an inexact definition of the spatial statistical properties of the media has, as far as we know, never been taken into account. The relevance of this topic is manifest: it is related to the uncertainty in the definition of the spatial moments of hydraulic log-conductivity from a (usually) small number of data, as well as to the modelling of flow and transport processes by the Monte Carlo technique, whose numerical fields have poor ergodic properties and are not strictly statistically homogeneous. In this work we investigate the effects of mean log-conductivity (logK) field behaviours that differ from the constant one due to different sources of inhomogeneity: i) a deterministic trend; ii) a deterministic sinusoidal pattern; iii) a random behaviour deriving from the hierarchical sedimentary architecture of porous formations; and iv) a conditioning procedure on available measurements of the hydraulic conductivity. These mean log-conductivity behaviours are superimposed on a correlated, weakly fluctuating logK field. The time evolution of the spatial moments of the plume driven by a statistically inhomogeneous steady-state random velocity field is analyzed in a 2-D finite domain, taking into account different sizes of injection area. The problem is approached both by a classical Monte Carlo procedure and by the SFEM (stochastic finite element method). In the latter, the moments are obtained by space-time integration of the velocity field covariance structure derived according to the first-order Taylor series expansion.
Two different goals are foreseen: 1) from the results it will be possible to distinguish the contribute in the plume dispersion of the uncertainty in the statistics of the medium hydraulic properties in all the cases considered, and 2) we will try to highlight the loss of performances that seems to affect the first-order approaches in the transport phenomena that take place in hierarchical architecture of porous formations.
Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G
2016-05-09
The aim of this study was to evaluate the suitability of statistics as experimental precision degree measures for trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated with a randomized block design with four replications. Ten statistics that were estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy and F-test values for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. Using these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy and the F-test values for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) to evaluate experimental precision in trials with cowpea genotypes.
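One of the precision statistics this abstract recommends, selective accuracy, is commonly computed from the genotype F-test value. A hedged sketch using the formulation SA = sqrt(1 - 1/F) (an assumption here; the paper's exact estimator may differ):

```python
import math

def selective_accuracy(f_value):
    """Selective accuracy from the genotype F-test value, in the
    common formulation SA = sqrt(1 - 1/F); defined as 0 when F <= 1,
    since precision is then indistinguishable from noise."""
    return math.sqrt(1.0 - 1.0 / f_value) if f_value > 1.0 else 0.0

sa = selective_accuracy(5.0)
```

Large F-values for genotype thus map directly to selective accuracy near 1, which is why the two statistics ranked together in the path analysis.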
The new statistics: why and how.
Cumming, Geoff
2014-01-01
We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
Iocca, Oreste; Farcomeni, Alessio; Pardiñas Lopez, Simon; Talib, Huzefa S
2017-01-01
To conduct a traditional meta-analysis and a Bayesian network meta-analysis to synthesize the information coming from randomized controlled trials on different socket grafting materials, and to combine the resulting indirect evidence in order to make inferences on treatments that have not been compared directly. RCTs were identified for inclusion in the systematic review and subsequent statistical analysis. Bone height and width remodelling were selected as the summary measures for comparison. First, a series of pairwise meta-analyses were performed, and the overall mean difference (MD) in mm with 95% CI was calculated between grafted and non-grafted sockets. Then, a Bayesian network meta-analysis was performed to draw indirect conclusions on which grafting materials are most likely the best compared to the others. From the six included studies, seven comparisons were obtained. Traditional meta-analysis showed statistically significant results in favour of grafting the socket compared to no graft, both for height (MD 1.02, 95% CI 0.44-1.59, p value < 0.001) and for width (MD 1.52, 95% CI 1.18-1.86, p value < 0.000001) remodelling. The Bayesian network meta-analysis allowed a ranking of intervention efficacy to be obtained. On the basis of the results of the present analysis, socket grafting seems to be more favourable than unassisted socket healing. Moreover, the Bayesian network meta-analysis indicates that freeze-dried bone graft plus membrane is most likely to be effective in reducing bone height remodelling, while autologous bone marrow was most likely to be effective when width remodelling was considered. Studies with larger samples and lower risk of bias should be conducted in the future in order to further strengthen the results of this analysis. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
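The pairwise step described here, pooling mean differences across trials, can be sketched with standard fixed-effect inverse-variance weighting. The input MDs and standard errors below are illustrative, not the study's data, and the study's Bayesian network model is more elaborate than this sketch:

```python
import math

def pool_fixed_effect(mds, ses):
    """Fixed-effect inverse-variance pooling of mean differences:
    each trial is weighted by 1/SE^2, and the pooled MD gets a
    normal-approximation 95% CI."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

md, ci = pool_fixed_effect([1.1, 0.9, 1.3], [0.3, 0.4, 0.5])
```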
Dadaser-Celik, Filiz; Azgin, Sukru Taner; Yildiz, Yalcin Sevki
2016-12-01
Biogas production from food waste has been used as an efficient waste treatment option for years. The methane yields from decomposition of waste are, however, highly variable under different operating conditions. In this study, a statistical experimental design method (Taguchi OA9) was implemented to investigate the effects of simultaneous variations of three parameters on methane production. The parameters investigated were solid content (SC), carbon/nitrogen ratio (C/N) and food/inoculum ratio (F/I). Two sets of experiments were conducted with nine anaerobic reactors operating under different conditions. Optimum conditions were determined using statistical analysis, such as analysis of variance (ANOVA). A confirmation experiment was carried out at optimum conditions to investigate the validity of the results. Statistical analysis showed that SC was the most important parameter for methane production with a 45% contribution, followed by the F/I ratio with a 35% contribution. The optimum methane yield of 151 l/kg volatile solids (VS) was achieved after 24 days of digestion when SC was 4%, C/N was 28 and F/I was 0.3. The confirmation experiment provided a methane yield of 167 l/kg VS after 24 days. The analysis showed that biogas production from food waste may be increased by optimization of operating conditions. © The Author(s) 2016.
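The percent contributions quoted in this abstract (45% for SC, 35% for F/I) are the standard Taguchi ANOVA quantities: each factor's sum of squares as a share of the total. A minimal sketch with illustrative sums of squares chosen to reproduce the reported shares (not the study's raw data):

```python
def percent_contribution(ss):
    """Percent contribution of each factor in a Taguchi-style ANOVA
    table: contribution_i = SS_i / SS_total * 100."""
    total = sum(ss.values())
    return {factor: 100.0 * value / total for factor, value in ss.items()}

# Illustrative sums of squares consistent with the reported contributions
contrib = percent_contribution({"SC": 45.0, "F/I": 35.0, "C/N": 12.0, "error": 8.0})
```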
Effect of open rhinoplasty on the smile line.
Tabrizi, Reza; Mirmohamadsadeghi, Hoori; Daneshjoo, Danadokht; Zare, Samira
2012-05-01
Open rhinoplasty is an esthetic surgical technique that is becoming increasingly popular and can affect the nose and upper lip compartments. The aim of this study was to evaluate the effect of open rhinoplasty on tooth show and the smile line. The study participants were 61 patients with a mean age of 24.3 years (range, 17.2 to 39.6 years). The surgical procedure consisted of an esthetic open rhinoplasty without alar resection. Analysis of tooth show was limited to pre- and postoperative (at 12 months) tooth show measurements, taken with a ruler at rest and at maximum smile while participants held their heads in a natural position. Statistical analyses were performed with SPSS 13.0, and paired-sample t tests were used to compare mean tooth show before and after the operation. Analysis of the rest position showed no statistically significant change in tooth show (P = .15), but analysis of participants' maximum smile data showed a statistically significant increase in tooth show after surgery (P < .05). Moreover, Pearson correlation analysis showed a positive relation between rhinoplasty and tooth show increases at maximum smile, especially in subjects with high smile lines. This study shows that the nasolabial compartment is a single unit and any change in one part may influence the other parts. Further studies should be conducted to investigate these interactions. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
41 CFR 105-50.202-2 - Preparation of or assistance in the conduct of statistical or other studies.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Preparation of or... Available From General Services Administration § 105-50.202-2 Preparation of or assistance in the conduct of statistical or other studies. (a) This service includes preparation of statistical or other studies and...
41 CFR 105-50.202-2 - Preparation of or assistance in the conduct of statistical or other studies.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Preparation of or... Available From General Services Administration § 105-50.202-2 Preparation of or assistance in the conduct of statistical or other studies. (a) This service includes preparation of statistical or other studies and...
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, Hun C.; Fang, Ho T.
1987-01-01
The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with a high Weibull slope and greater high-temperature strength, and to conduct an initial net-shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5), statistically designed matrix experiments were conducted. These experiments identified Denka 9FW Si3N4 as an alternative raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room-temperature MOR (100 percent of goal) and a Weibull slope of 13.2 (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room-temperature strength with a Weibull slope of 20 (125 percent of goal).
Current trends in treatment of hypertension in Karachi and cost minimization possibilities.
Hussain, Izhar M; Naqvi, Baqir S; Qasim, Rao M; Ali, Nasir
2015-01-01
This study identifies drug usage trends for stage I hypertensive patients without compelling indications in Karachi, deviations of current practice from evidence-based antihypertensive therapeutic guidelines, and opportunities for cost minimization. In the present study, conducted from June 2012 to August 2012, two data sets were used. Randomized stratified independent surveys were conducted among doctors and the general population, including patients, using pretested questionnaires. Sample sizes for doctors and the general population were 100 and 400, respectively. Statistical analysis was conducted with the Statistical Package for Social Science (SPSS). Financial impact was also analyzed. On the basis of patients' and doctors' feedback, beta blockers and angiotensin-converting enzyme inhibitors were used more frequently than other drugs. Thiazides and low-priced generics were hardly prescribed. Beta blockers were prescribed widely and considered cost effective. This trend increases cost by two to ten times. Feedback showed that therapeutic guidelines were not followed by doctors practicing in the community and hospitals in Karachi. Thiazide diuretics were hardly used. Beta blockers were widely prescribed. High-priced market leaders or expensive branded generics were commonly prescribed. Therefore, there are great opportunities for cost minimization by using evidence-based, clinically effective, and safe medicines.
Almeida, Diogo; Skov, Ida; Lund, Jesper; Mohammadnejad, Afsaneh; Silva, Artur; Vandin, Fabio; Tan, Qihua; Baumbach, Jan; Röttger, Richard
2016-10-01
Measuring differential methylation of DNA is nowadays the most common approach to linking epigenetic modifications to diseases (so-called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency, and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. However, the lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integral part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions covering reading and preprocessing of the raw data, down to statistical assessment, permutation tests, and identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html.
Effect of Eutectic Concentration on Conductivity in PEO:LiX Based Solid Polymer Electrolytes
NASA Astrophysics Data System (ADS)
Zhan, Pengfei; Ganapatibhotla, Lalitha; Maranas, Janna
Polyethylene oxide (PEO) and lithium salt based solid polymer electrolytes (SPEs) have been widely proposed as a substitute for the liquid electrolyte in Li-ion batteries. As salt concentration varies, these systems demonstrate rich phase behavior. Conductivity as a function of salt concentration has been measured for decades, and various concentration dependences have been observed. A PEO:LiX mixture can have one or two conductivity maxima, while some mixtures with salts of high ionic strength show higher conductivity as the salt concentration decreases. The factors that affect the conductivity are specific to each sample, and a universal factor remains unclear. In this work, we measured the conductivity of a series of PEO:LiX mixtures, and statistical analysis shows that conductivity is affected by the difference between the salt concentration and the eutectic concentration (Δc). The correlation with Δc is stronger than the correlation with glass transition temperature. We believe that at the eutectic concentration, unique structures that aid conduction can form during the solidification process.
NASA Astrophysics Data System (ADS)
Getnet Tadesse, Melkie; Loghin, Carmen; Chen, Yan; Wang, Lichuan; Catalin, Dumitras; Nierstrasz, Vincent
2017-06-01
Coating of textile fabrics with poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) is one of the methods used for obtaining functional or smart applications. In this work, we applied PEDOT:PSS with additives such as polyethylene glycol, methanol (MeOH), and ethylene glycol to polyester fabric substrates by a simple immersion process. Surface resistance was measured and analyzed with analysis of variance to determine the coating parameters at the 95% confidence level. Fourier transform infrared (FTIR) analysis and scanning electron microscopy (SEM) study of the samples were performed. Contact angle and washing fastness measurements were conducted to observe the wettability and washing fastness of the samples, respectively. Surface resistance values decreased by a factor of 100 due to the conductive enhancers. As immersion time and temperature varied, surface resistance showed no statistically significant difference. FTIR analysis supports the idea that the mechanism responsible for the conductivity enhancement is the partial replacement of PSS on the PEDOT chain through hydrogen bonding with the hydroxyl (OH) groups of the conductive enhancers. SEM images showed that PEDOT:PSS is well distributed over the surface of the fabrics. Contact angle measurements showed morphology change in the samples. The conductivity was reasonably stable after 10 washing cycles. Altogether, a simple and effective immersion coating of polyester fabric is presented to achieve functional textiles that offer a broad range of possible applications.
[Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].
Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina
2012-09-01
The validation of a measurement tool in mental health is a complex process that usually starts with estimating reliability and later addresses validity. Factor analysis is a way to determine the number of dimensions, domains, or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory and helps in the selection of the items with the best performance. For an acceptable factor analysis, it is necessary to follow certain steps and recommendations, conduct appropriate statistical tests, and rely on an adequate sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
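One of the decisions the abstract alludes to is how many factors to retain. A minimal sketch of the Kaiser criterion (retain factors whose correlation-matrix eigenvalue exceeds 1) on simulated two-factor scale data; the items and loadings are invented for illustration, and NumPy is assumed to be available:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate item responses: two latent factors, three items loading on each
# (a hypothetical six-item scale; not a real instrument).
n = 500
f1, f2 = rng.normal(size=n), rng.normal(size=n)
items = np.column_stack([
    f1 + 0.5 * rng.normal(size=n),
    f1 + 0.5 * rng.normal(size=n),
    f1 + 0.5 * rng.normal(size=n),
    f2 + 0.5 * rng.normal(size=n),
    f2 + 0.5 * rng.normal(size=n),
    f2 + 0.5 * rng.normal(size=n),
])

# Kaiser criterion: count correlation-matrix eigenvalues greater than 1.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
n_factors = int((eigvals > 1).sum())
print(n_factors)
```

In practice the Kaiser criterion is only one input among several (scree plots, parallel analysis, interpretability), but it illustrates how the dimensionality question is operationalized.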
Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies
Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.
2016-01-01
The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analyses of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis. PMID:27274911
Gray, Alastair
2017-01-01
Increasing numbers of economic evaluations are conducted alongside randomised controlled trials. Such studies include factorial trials, which randomise patients to different levels of two or more factors and can therefore evaluate the effect of multiple treatments alone and in combination. Factorial trials can provide increased statistical power or assess interactions between treatments, but raise additional challenges for trial‐based economic evaluations: interactions may occur more commonly for costs and quality‐adjusted life‐years (QALYs) than for clinical endpoints; economic endpoints raise challenges for transformation and regression analysis; and both factors must be considered simultaneously to assess which treatment combination represents best value for money. This article aims to examine issues associated with factorial trials that include assessment of costs and/or cost‐effectiveness, describe the methods that can be used to analyse such studies and make recommendations for health economists, statisticians and trialists. A hypothetical worked example is used to illustrate the challenges and demonstrate ways in which economic evaluations of factorial trials may be conducted, and how these methods affect the results and conclusions. Ignoring interactions introduces bias that could result in adopting a treatment that does not make best use of healthcare resources, while considering all interactions avoids bias but reduces statistical power. We also introduce the concept of the opportunity cost of ignoring interactions as a measure of the bias introduced by not taking account of all interactions. We conclude by offering recommendations for planning, analysing and reporting economic evaluations based on factorial trials, taking increased analysis costs into account. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28470760
Lederer, David J; Bradford, Williamson Z; Fagan, Elizabeth A; Glaspole, Ian; Glassberg, Marilyn K; Glasscock, Kenneth F; Kardatzke, David; King, Talmadge E; Lancaster, Lisa H; Nathan, Steven D; Pereira, Carlos A; Sahn, Steven A; Swigris, Jeffrey J; Noble, Paul W
2015-07-01
FVC outcomes in clinical trials on idiopathic pulmonary fibrosis (IPF) can be substantially influenced by the analytic methodology and the handling of missing data. We conducted a series of sensitivity analyses to assess the robustness of the statistical finding and the stability of the estimate of the magnitude of treatment effect on the primary end point of FVC change in a phase 3 trial evaluating pirfenidone in adults with IPF. Source data included all 555 study participants randomized to treatment with pirfenidone or placebo in the Assessment of Pirfenidone to Confirm Efficacy and Safety in Idiopathic Pulmonary Fibrosis (ASCEND) study. Sensitivity analyses were conducted to assess whether alternative statistical tests and methods for handling missing data influenced the observed magnitude of treatment effect on the primary end point of change from baseline to week 52 in FVC. The distribution of FVC change at week 52 was systematically different between the two treatment groups and favored pirfenidone in each analysis. The method used to impute missing data due to death had a marked effect on the magnitude of change in FVC in both treatment groups; however, the magnitude of treatment benefit was generally consistent on a relative basis, with an approximate 50% reduction in FVC decline observed in the pirfenidone group in each analysis. Our results confirm the robustness of the statistical finding on the primary end point of change in FVC in the ASCEND trial and corroborate the estimated magnitude of the pirfenidone treatment effect in patients with IPF. ClinicalTrials.gov; No.: NCT01366209; URL: www.clinicaltrials.gov.
Medical history and epidemiology: their contribution to the development of public health nursing.
Earl, Catherine E
2009-01-01
The nursing profession has historically been involved in data collection in research efforts, notably from the time of the Framingham Tuberculosis Project (1914-1923). Over the past century, nurses have become more sophisticated in their abilities to design and conduct studies and to analyze data. This article discusses the contributions of medicine and epidemiology to the development of public health nursing and the use of statistical methods by nurses in the United States in the 19th and 20th centuries. Knowledge acquired from this article will inform educators and researchers about the importance of using quantitative analysis, evidence-based knowledge, and statistical methods when teaching students in all health professions.
NASA Technical Reports Server (NTRS)
Slutz, R. J.; Gray, T. B.; West, M. L.; Stewart, F. G.; Leftin, M.
1971-01-01
A statistical study of formulas for predicting the sunspot number several years in advance is reported. By using a data lineup with cycle maxima coinciding, and by using multiple and nonlinear predictors, a new formula which gives better error estimates than former formulas derived from the work of McNish and Lincoln is obtained. A statistical analysis is conducted to determine which of several mathematical expressions best describes the relationship between 10.7 cm solar flux and Zurich sunspot numbers. Attention is given to the autocorrelation of the observations, and confidence intervals for the derived relationships are presented. The accuracy of predicting a value of 10.7 cm solar flux from a predicted sunspot number is discussed.
Guidelines for the Investigation of Mediating Variables in Business Research
Coxe, Stefany; Baraldi, Amanda N.
2013-01-01
Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized. PMID:25237213
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marekova, Elisaveta
Series of relatively large earthquakes in different regions of the Earth are studied. The regions chosen have high seismic activity and good contemporary networks for recording seismic events. The main purpose of this investigation is to attempt to describe the seismic process analytically in space and time. We consider the statistical distributions of the distances and times between consecutive earthquakes (so-called pair analysis). Studies conducted on approximating the statistical distributions of the parameters of consecutive seismic events indicate the existence of characteristic functions that describe them best. Such a mathematical description allows the distributions of the examined parameters to be compared to other model distributions.
STRengthening analytical thinking for observational studies: the STRATOS initiative.
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-12-30
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
Kottner, Jan; Halfens, Ruud
2010-05-01
Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportion of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted on annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of the chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
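The P charts referred to above place 3-sigma control limits around the pooled proportion, with limits that widen for smaller subgroups; points inside the limits are treated as common-cause variation. A minimal sketch on invented yearly counts (not the Dutch survey data):

```python
import math

def p_chart_limits(counts, sizes):
    """3-sigma control limits for a proportion (P) chart.
    The center line is the pooled proportion; limits vary with subgroup size."""
    p_bar = sum(counts) / sum(sizes)
    limits = []
    for n in sizes:
        half = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - half), min(1.0, p_bar + half)))
    return p_bar, limits

# Hypothetical yearly nosocomial pressure ulcer counts / patients at risk.
ulcers   = [22, 19, 25, 17, 21]
patients = [180, 170, 190, 175, 185]
p_bar, limits = p_chart_limits(ulcers, patients)
rates = [c / n for c, n in zip(ulcers, patients)]
in_control = all(lo <= r <= hi for r, (lo, hi) in zip(rates, limits))
print(round(p_bar, 3), in_control)
```

This is exactly the contrast the abstract draws: a chi-square trend test asks whether rates drift over time, while the P chart asks whether any year falls outside the range expected from binomial sampling variation alone.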
Thorlund, Kristian; Druyts, Eric; Toor, Kabirraaj; Mills, Edward J
2015-05-01
To conduct a network meta-analysis (NMA) to establish the comparative efficacy of infliximab, adalimumab and golimumab for the treatment of moderately to severely active ulcerative colitis (UC). A systematic literature search identified five randomized controlled trials for inclusion in the NMA. One trial assessed golimumab, two assessed infliximab and two assessed adalimumab. Outcomes included clinical response, clinical remission, mucosal healing, sustained clinical response and sustained clinical remission. Innovative methods were used to allow inclusion of the golimumab trial data given the alternative design of this trial (i.e., two-stage re-randomization). After induction, no statistically significant differences were found between golimumab and adalimumab or between golimumab and infliximab. Infliximab was statistically superior to adalimumab after induction for all outcomes and treatment ranking suggested infliximab as the superior treatment for induction. Golimumab and infliximab were associated with similar efficacy for achieving maintained clinical remission and sustained clinical remission, whereas adalimumab was not significantly better than placebo for sustained clinical remission. Golimumab and infliximab were also associated with similar efficacy for achieving maintained clinical response, sustained clinical response and mucosal healing. Finally, golimumab 50 and 100 mg was statistically superior to adalimumab for clinical response and sustained clinical response, and golimumab 100 mg was also statistically superior to adalimumab for mucosal healing. The results of our NMA suggest that infliximab was statistically superior to adalimumab after induction, and that golimumab was statistically superior to adalimumab for sustained outcomes. Golimumab and infliximab appeared comparable in efficacy.
Gillespie, Paddy; O'Shea, Eamon; Smith, Susan M; Cupples, Margaret E; Murphy, Andrew W
2016-12-01
Data on health care utilization may be collected using a variety of mechanisms within research studies, each of which may have implications for cost and cost effectiveness. The aim of this observational study is to compare data collected from medical records searches and self-report questionnaires for the cost analysis of a cardiac secondary prevention intervention. Secondary data analysis of the Secondary Prevention of Heart Disease in General Practice (SPHERE) randomized controlled trial (RCT). Resource use data for a range of health care services were collected by research nurse searches of medical records and self-report questionnaires and costs of care estimated for each data collection mechanism. A series of statistical analyses were conducted to compare the mean costs for medical records data versus questionnaire data and to conduct incremental analyses for the intervention and control arms in the trial. Data were available to estimate costs for 95% of patients in the intervention and 96% of patients in the control using the medical records data compared to 65% and 66%, respectively, using the questionnaire data. The incremental analysis revealed a statistically significant difference in mean cost of -€796 (95% CI: -1447, -144; P-value: 0.017) for the intervention relative to the control. This compared to no significant difference in mean cost (95% CI: -1446, 860; P-value: 0.619) for the questionnaire analysis. Our findings illustrate the importance of the choice of health care utilization data collection mechanism for the conduct of economic evaluation alongside randomized trials in primary care. This choice will have implications for the costing methodology employed and potentially, for the cost and cost effectiveness outcomes generated. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Functional data analysis on ground reaction force of military load carriage increment
NASA Astrophysics Data System (ADS)
Din, Wan Rozita Wan; Rambely, Azmin Sham
2014-06-01
Analysis of ground reaction force (GRF) during military load carriage is performed using the functional data analysis (FDA) statistical technique. The main objective of the research is to investigate the effect of 10% load increments and to find the maximum suitable load for Malaysian military personnel. Ten soldiers, aged 31 ± 6.2 years, weighing 71.6 ± 10.4 kg, and with height 166.3 ± 5.9 cm, carried military loads ranging from 0% body weight (BW) up to 40% BW in an experiment that gathered GRF and kinematic data using a Vicon Motion Analysis System, Kistler force plates, and thirty-nine body markers. The analysis is conducted in the sagittal, medial-lateral, and anterior-posterior planes. The results show that a 10% BW load increment has an effect at heel strike and toe-off in all three planes analyzed, with P-values less than 0.001 at the 0.05 significance level. FDA proves to be one of the best statistical techniques for analyzing functional data, with the ability to handle filtering, smoothing, and curve alignment according to curve features and points of interest.
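A common preprocessing step before FDA of gait curves is time-normalizing each trial onto 0-100% of stance, so that curves of different durations can be compared point by point before smoothing and registration. A hedged stdlib sketch on invented vertical-GRF traces (not the study's data):

```python
def time_normalize(curve, n_points=101):
    """Linearly resample a gait curve onto 0-100% of stance so that trials
    of different durations can be compared pointwise (a precursor to FDA)."""
    m = len(curve)
    out = []
    for k in range(n_points):
        t = k * (m - 1) / (n_points - 1)
        i = int(t)
        frac = t - i
        if i + 1 < m:
            out.append(curve[i] * (1 - frac) + curve[i + 1] * frac)
        else:
            out.append(curve[-1])
    return out

# Two hypothetical vertical-GRF traces (in body weights), sampled at
# different rates, so their raw lengths differ.
trial_a = [0.0, 0.6, 1.2, 1.1, 1.0, 1.15, 0.7, 0.0]
trial_b = [0.0, 0.5, 1.1, 1.25, 1.05, 1.0, 1.2, 0.8, 0.1, 0.0]
na, nb = time_normalize(trial_a), time_normalize(trial_b)
mean_curve = [(a + b) / 2 for a, b in zip(na, nb)]
print(len(mean_curve), round(max(mean_curve), 2))
```

Full FDA would go further, representing each normalized curve with a basis expansion (e.g. B-splines) and aligning landmarks such as heel strike and toe-off before pointwise testing; this sketch shows only the resampling step.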
Role of metabolomics in TBI research
Wolahan, Stephanie M.; Hirt, Daniel; Braas, Daniel; Glenn, Thomas C.
2016-01-01
Metabolomics is an important member of the omics community in that it defines which small molecules may be responsible for disease states. This article reviews the essential principles of metabolomics, from specimen preparation and chemical analysis to advanced statistical methods. Metabolomics has so far been underutilized in TBI research. Future metabolomics-based studies focused on diagnosis, prognosis, and treatment effects need to be conducted across all types of TBI. PMID:27637396
Enhancing Research in Networking & System Security, and Forensics, in Puerto Rico
2015-03-03
The researcher's work revolves around using Cognitive Systems, which are machines that can think, listen, and see in order to help the disabled ...Subsequence. The implementation is being conducted using the R language because of its statistical and analysis abilities. Because it works using a command line...Technology. 14-AUG-13. Eduardo Melendez. FROM RANDOM EMBEDDING TECHNIQUES TO ENTROPY USING IMAGEPOINT ADJACENT SHADE VALUES, 12th Annual
ERIC Educational Resources Information Center
Current Population Reports, 1986
1986-01-01
Analysis of information gained from the March 1986 Current Population Survey (CPS) conducted by the Bureau of the Census shows the following results for the year 1985: (1) median family money income continued to move ahead of inflation; (2) the median earnings of men showed no statistically significant change from 1984, but the earnings of women…
ERIC Educational Resources Information Center
Stancavage, Frances B.; Mitchell, Julia H.; de Mello, Victor Bandeira; Gaertner, Freya E.; Spain, Angeline K.; Rahal, Michelle L.
2006-01-01
This report presents results from a national survey, conducted in 2005, that examined the educational experiences of American Indian/Alaska Native (AI/AN) students in grades 4 and 8, with particular emphasis on the integration of native language and culture into school and classroom activities. Students, teachers, and school principals all…
Intercomparison between ozone profiles measured above Spitsbergen by lidar and sonde techniques
NASA Technical Reports Server (NTRS)
Fabian, Rolf; Vondergathen, Peter; Ehlers, J.; Krueger, Bernd C.; Neuber, Roland; Beyerle, Georg
1994-01-01
This paper compares coincident ozone profile measurements by electrochemical sondes and lidar performed at Ny-Alesund/Spitsbergen. A detailed height dependent statistical analysis of the differences between these complementary methods was performed for the overlapping altitude region (13-35 km). The data set comprises ozone profile measurements conducted between Jan. 1989 and Jan. 1991. Differences of up to 25 percent were found above 30 km altitude.
Evaluation program for secondary spacecraft cells
NASA Technical Reports Server (NTRS)
Christy, D. E.; Harkness, J. D.
1973-01-01
A life cycle test of secondary electric batteries for spacecraft applications was conducted. A sample of nickel-cadmium batteries was subjected to general performance tests to determine the limits of their actual capabilities. Weaknesses discovered in cell design are reported to aid research and development efforts toward improving the reliability of spacecraft batteries. A statistical analysis of the life cycle prediction and cause of failure versus test conditions is provided.
Comparative Statistical Analysis of Auroral Models
2012-03-22
was willing to add this project to her extremely busy schedule. Lastly, I must also express my sincere appreciation for the rest of the faculty and...models have been extensively used for estimating GPS and other communication satellite disturbances (Newell et al., 2010a). The auroral oval...models predict changes in the auroral oval in response to various geomagnetic conditions. In 2010, Newell et al. conducted a comparative study of
AKBOĞA, Özge; BARADAN, Selim
2016-01-01
The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees experience risks that emerge during the fabrication of concrete, as well as during its delivery to the construction site. Statistics show that usage of and demand for RMC have been increasing along with the number of producers and workers. Unfortunately, adequate OSH measures to meet this rapid growth are not in place even in top RMC-producing countries, such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by mining data from the Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analysis on 71 incidents in which RMC truck drivers were involved. In addition, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results of this research, problem areas were identified: cleaning the truck mixer/pump is a hazardous activity in which operators are injured frequently, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods. PMID:27524105
An Elementary Algorithm for Autonomous Air Terminal Merging and Interval Management
NASA Technical Reports Server (NTRS)
White, Allan L.
2017-01-01
A central element of air traffic management is the safe merging and spacing of aircraft during the terminal-area flight phase. This paper derives and examines an algorithm for the merging and interval management problem for Standard Terminal Arrival Routes. It describes a factor analysis of performance based on the distribution of arrivals, the operating period of the terminal, and the topology of the arrival routes; it then presents results from a performance analysis and a safety analysis for a realistic topology based on typical routes for a runway at Phoenix International Airport. The heart of the safety analysis is a statistical derivation of how to conduct a safety analysis for a local simulation when the safety requirement is given for the entire airspace.
Hsu, Benson S; Meyer, Benjamin D; Lakhani, Saquib A
2017-08-01
With the changing healthcare landscape in the United States, teaching hospitals face increasing pressure to provide medical education as well as cost-effective care. Our study investigated the financial, resource utilization and mortality impact of teaching hospital status on pediatric patients admitted with sepsis. We conducted a retrospective, weighted statistical analysis of hospitalized children with the diagnosis of sepsis. The Agency for Healthcare Research and Quality 2009 Kids' Inpatient Database provided the data for analysis. Diagnosis of sepsis and severity of illness levels were based on All Patient Refined Diagnosis-Related Group 720: Septicemia and Disseminated Infections. Teaching hospital status was based on the presence of training programs. Statistical analysis was conducted using STATA 12.1 (Stata Corporation, College Station, TX). Weighted analysis revealed 17,461 patients with sepsis: 9982 in teaching and 7479 in nonteaching hospitals. When comparing all patients, length of stay (8.2 vs. 4.8, P < 0.001), number of procedures received (2.03 vs. 0.87, P < 0.001), mortality (4.7% vs. 1.6%, P < 0.001), costs per day ($2326 vs. $1736, P < 0.001) and total costs ($20,428 vs. $7960, P < 0.001) were higher in teaching hospitals. Even when stratified by severity class, length of stay, number of procedures received and total costs were higher in teaching hospitals, with no difference in mortality. Our study suggests that teaching hospitals provide pediatric inpatient care for sepsis at greater cost and resource utilization without a clear improvement in overall mortality rates in comparison with nonteaching hospitals.
Fighting the Whole System: Dissociative Identity Disorder, Labeling Theory, and Iatrogenic Doubting.
Floris, Jessica; McPherson, Susan
2015-01-01
This research examines how individuals diagnosed with dissociative identity disorder construe their experiences of being labeled with a contested diagnosis. Semistructured interviews were conducted in the United Kingdom with 5 women and 2 men diagnosed with dissociative identity disorder. A framework analysis was conducted. The analysis identified 2 overarching themes: diagnosis cross-examined and navigating care systems. The diagnosis appeared to be continually assessed by participants for its fit with symptoms, and the doubt among professionals seemed to be unhelpfully reflected in participants' attempts to understand and come to terms with their experiences. The findings are considered in light of labeling theory, the iatrogenic effects of professional doubt, and current debates concerning the reliability and validity of psychiatric diagnostic systems that have been reinvigorated by the publication of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition.
Heat balance statistics derived from four-dimensional assimilations with a global circulation model
NASA Technical Reports Server (NTRS)
Schubert, S. D.; Herman, G. F.
1981-01-01
The reported investigation was conducted to develop a reliable procedure for obtaining the diabatic and vertical terms required for atmospheric heat balance studies. The method developed employs a four-dimensional assimilation mode in connection with the general circulation model of NASA's Goddard Laboratory for Atmospheric Sciences. The initial analysis was conducted with data obtained in connection with the 1976 Data Systems Test. On the basis of the results of the investigation, it appears possible to use the model's observationally constrained diagnostics to provide estimates of the global distribution of virtually all of the quantities which are needed to compute the atmosphere's heat and energy balance.
Integrating teaching and authentic research in the field and laboratory settings
NASA Astrophysics Data System (ADS)
Daryanto, S.; Wang, L.; Kaseke, K. F.; Ravi, S.
2016-12-01
Typically, authentic research activities are separated from rigorous classroom teaching. Here we assessed the potential of integrating teaching and research activities both in the field and in the laboratory. We worked with students from both the US and abroad, without strong science backgrounds, to utilize advanced environmental sensors and statistical tools to conduct innovative projects. The students included one from Namibia and two local high school students in Indianapolis (through Project SEED, Summer Experience for the Economically Disadvantaged). They conducted leaf potential measurements, isotope measurements and meta-analysis. The experience showed us the great potential of integrating teaching and research in both field and laboratory settings.
Shelton, Deborah; Kesten, Karen; Zhang, Wanli; Trestman, Robert
2011-01-01
Purpose This article reports the findings of a Dialectical Behavioral Therapy-Corrections Modified (DBT-CM) intervention for difficult-to-manage, impulsive and/or aggressive incarcerated male adolescents. Methods A secondary analysis of a sub-sample of 38 male adolescents who participated in the study was conducted. A one-group pretest-posttest design was used; descriptive statistics and t-tests were conducted. Results Significant changes were found in physical aggression, distancing coping methods and the number of disciplinary tickets for behavior. Conclusion The study supports the value of DBT-CM for the management of incarcerated male adolescents with difficult-to-manage aggressive behaviors. PMID:21501287
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of the data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness-of-fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model, and (3) perform goodness-of-fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continued development of research on processing air quality data.
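The daily measures of location and simple correlation coefficients described above can be sketched as follows; this is a minimal standard-library illustration with hypothetical readings, not the original software.

```python
import statistics

def daily_summary(values):
    """Simple measures of location for one day's air quality readings."""
    return {
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "min": min(values),
        "max": max(values),
    }

def pearson_r(x, y):
    """Simple (Pearson) correlation coefficient between two parameter series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

A scatter diagram of two parameters with `pearson_r` near +1 or -1 would show the points falling close to a line, which is the visual check the plotting routines above provided.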
Kessel, K A; Habermehl, D; Bohn, C; Jäger, A; Floca, R O; Zhang, L; Bougatf, N; Bendl, R; Debus, J; Combs, S E
2012-12-01
Especially in the field of radiation oncology, efficiently handling a large variety of voluminous datasets from various information systems in different documentation styles is crucial for patient care and research. To date, conducting retrospective clinical analyses is rather difficult and time consuming. Using the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation with an analysis system connected to a documentation system. A total of 783 patients were documented in a professional, database-backed documentation system. Information about radiation therapy, diagnostic images and dose distributions was imported into the web-based system. For 36 patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes, the DVH (dose-volume histogram) statistic is calculated, followed by determination of the dose applied to the region of recurrence. All results are saved in the database and included in statistical calculations. The main goal of using an automatic analysis tool is to reduce the time and effort of conducting clinical analyses, especially with large patient groups. We showed a first approach using some existing tools; however, manual interaction is still necessary, and further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and the reusability of results. We therefore intend to adapt the analysis system to other tumor types in radiation oncology.
Alper, Züleyha; Ercan, İlker; Uncu, Yeşim
2018-01-01
Objective Obesity in childhood and adolescence is one of the most serious public health problems due to a remarkable increase in prevalence in recent years and its close relationship with non-communicable diseases, such as diabetes and hypertension, resulting in increased adult morbidity and mortality. This study aims to quantify the secular trend in different regions of Turkey from 1990 to 2015 by performing a meta-analysis of the childhood and adolescent obesity prevalence studies conducted there. Methods The Uludag University Library Database was searched for relevant articles published prior to March 2017. The heterogeneity of the studies in the meta-analysis was tested by the I2 statistic and Cochran’s Q test. The obesity trends were examined by chi-square trend analysis with respect to five-year periods. The statistical significance level was taken as α=0.05. Results A total of 76 papers addressing childhood and adolescent obesity in Turkey were initially identified. Fifty-eight papers were selected for analysis. The prevalence of obesity increased from 0.6% to 7.3%, an 11.6-fold increase, between the periods 1990-1995 and 2011-2015. The prevalence of obesity increased in both genders; however, boys were more likely to be obese than girls. Conclusion Studies on obesity prevalence in the 5-19 age group in Turkey have gained importance, especially in the 2000s. While a remarkable number of prevalence studies, mostly regional, were conducted between 2005-2011, a gradual decline was observed thereafter. Further national and population-based surveys on the prevalence of obesity in children and adolescents are definitely needed in Turkey. PMID:28901943
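The heterogeneity checks named above, Cochran's Q and the I2 statistic, have a simple closed form under inverse-variance (fixed-effect) weighting; a sketch with hypothetical effect sizes and variances, not values from the study:

```python
def cochran_q_i2(effects, variances):
    """Cochran's Q and the I2 heterogeneity statistic (in percent)
    for a fixed-effect meta-analysis with inverse-variance weights."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I2 = (Q - df) / Q, truncated at zero, expressed as a percentage.
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2
```

Identical study effects give Q = 0 and I2 = 0%; widely scattered effects push I2 toward 100%, signaling that a fixed-effect summary may be inappropriate.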
STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-01-01
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES: The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing bio-statistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have: 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software, such as SAS, R and S-Plus; sound knowledge and demonstrated experience of theoretical and applied statistics; the ability to write program code to analyze data using statistical analysis software; and the ability to contribute to the interpretation and publication of research results.
Sangamesh, N C; Vidya, K C; Pathi, Jugajyoti; Singh, Arpita
2017-01-01
The aim was to assess the awareness, attitude, and knowledge about basic life support (BLS) among medical, dental, and nursing students and faculty, and to propose the inclusion of BLS skills in the academic curriculum of the undergraduate (UG) course. Recognition, prevention, and effective management of life-threatening emergencies are the responsibility of health-care professionals. These situations can be successfully managed with proper knowledge and training in BLS skills. These life-saving maneuvers can be taught through structured resuscitation programs, which are lacking in the academic curriculum. A questionnaire study consisting of 20 questions was conducted among 659 participants at the Kalinga Institute of Dental Sciences and Kalinga Institute of Medical Sciences, KIIT University. Medical junior residents, BDS faculty, interns, nursing faculty, and 3rd-year and final-year UG students from both medical and dental colleges were chosen. The statistical analysis was carried out using SPSS software version 20.0 (IBM Corp., Armonk, NY). After collecting the data, the values were statistically analyzed and tabulated. Statistical analysis was performed using the Mann-Whitney U-test; results with P < 0.05 were considered statistically significant. Our participants were aware of BLS and showed a positive attitude toward it, whereas knowledge about BLS was lacking, with a statistically significant P value. By introducing BLS regularly into the academic curriculum and through routine hands-on workshops, all health-care providers should become well versed in the BLS skills needed to effectively manage life-threatening emergencies.
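As an illustration of the Mann-Whitney U-test used above, here is a minimal standard-library sketch using the pairwise-count definition of U and a normal approximation for the z score; the scores are hypothetical and the variance carries no tie correction.

```python
import math

def mann_whitney_u(x, y):
    """U statistic for sample x: count of pairs (xi, yj) with xi > yj,
    counting ties as 1/2 (equivalent to the rank-sum definition)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def mann_whitney_z(x, y):
    """Normal-approximation z for U (large samples, no tie correction)."""
    n1, n2 = len(x), len(y)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (mann_whitney_u(x, y) - mu) / sigma
```

A |z| above about 1.96 corresponds to P < 0.05, the significance threshold used in the study; for small samples, exact tables rather than the normal approximation would be used.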
Cundell, A M; Bean, R; Massimore, L; Maier, C
1998-01-01
To determine the relationship between the sampling time of environmental monitoring (i.e., viable counts) in aseptic filling areas and the microbial count and frequency of alerts for air, surface and personnel microbial monitoring, statistical analyses were conducted on 1) the frequency of alerts versus the time of day for routine environmental sampling conducted in calendar year 1994, and 2) environmental monitoring data collected at 30-minute intervals during routine aseptic filling operations over two separate days in four different clean rooms with multiple shifts and equipment set-ups at a parenteral manufacturing facility. The statistical analyses showed that, except for one floor location that had a significantly higher number of counts but no alert- or action-level samplings in the first two hours of operation, there was no relationship between the number of counts and the time of sampling. Further studies over a 30-day period at that floor location showed no relationship between time of sampling and microbial counts. The conclusion reached in the study was that there is no worst-case time for environmental monitoring at that facility and that sampling at any time during the aseptic filling operation will give a satisfactory measure of the microbial cleanliness in the clean room during set-up and aseptic filling.
Parametric study of the swimming performance of a fish robot propelled by a flexible caudal fin.
Low, K H; Chong, C W
2010-12-01
In this paper, we aim to study the swimming performance of fish robots using a statistical approach. A fish robot employing a carangiform swimming mode was used as the experimental platform for the performance study. The experiments investigated the effect of various design parameters on the thrust capability of the fish robot with a flexible caudal fin. The controllable parameters associated with the fin include the frequency and amplitude of oscillation, the aspect ratio and the rigidity of the caudal fin. The significance of these parameters was determined in a first set of experiments using a statistical approach. A more detailed parametric experimental study was then conducted with only the significant parameters. As a result, the parametric study could be completed with a reduced number of experiments and less time spent. With the experimental results obtained, we were able to understand the relationship between the various parameters and possible adjustments of parameters to obtain higher thrust. The proposed statistical method for experimentation provides an objective and thorough analysis of the effects of individual parameters, or combinations of parameters, on swimming performance. Such an efficient experimental design helps to optimize the process and determine the factors that influence variability.
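The screening step, deciding which fin parameters significantly affect thrust before running the detailed study, resembles main-effect estimation in a two-level factorial design; a hedged sketch with hypothetical coded levels and thrust responses, not the robot's actual data:

```python
def main_effects(response, n_factors):
    """Estimate main effects from a two-level factorial experiment.
    `response` maps a tuple of coded factor levels (-1 or +1) to the
    measured output; each effect is mean(high) - mean(low)."""
    effects = {}
    for k in range(n_factors):
        hi = [y for levels, y in response.items() if levels[k] == +1]
        lo = [y for levels, y in response.items() if levels[k] == -1]
        effects[k] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects
```

Factors whose effect magnitude is large relative to the experimental noise are kept for the detailed parametric study; the rest can be fixed, which is what reduces the number of experiments.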
The bedrock electrical conductivity structure of Northern Ireland
NASA Astrophysics Data System (ADS)
Beamish, David
2013-08-01
An airborne geophysical survey of the whole of Northern Ireland has provided over 4.8 M estimates of the bedrock conductivity over the wide range of geological formations present. This study investigates how such data can be used to provide additional knowledge in relation to existing digital geological map information. A by-product of the analysis is a simplification of the spatially aggregated information obtained in such surveys. The methodology used is a GIS-based attribution of the conductivity estimates using a lithological classification of the bedrock formations. A 1:250k geological classification of the data is performed, leading to a 56-unit lithological and geostatistical analysis of the conductivity information. The central moments (medians) of the classified data are used to provide a new digital bedrock conductivity map of Northern Ireland with values ranging from 0.32 to 41.36 mS m-1. This baseline map of conductivities displays a strong correspondence with an existing four-quadrant, chrono-geological description of Northern Ireland. Once defined, the baseline conductivity map allows departures from the norm to be assessed across each specific lithological unit. Bulk electrical conductivity is controlled by a number of petrophysical parameters, and it is their variation that is assessed by the procedures employed. The igneous rocks are found to display the largest variability in conductivity values, and many of the statistical distributions are multi-modal. A sequence of low-value modes in these data is associated with intrusives within volcanic complexes. These and much older Neoproterozoic rocks appear to represent very low porosity formations that may be the product of rapid cooling during emplacement. By way of contrast, extensive flood basalts (the Antrim lavas) record a well-defined and much higher median value (12.24 mS m-1), although they display complex spatial behaviour in detail.
Sedimentary rocks appear to follow the broad behaviours anticipated by standard theoretical descriptions of rock electrical properties that allow for a term due to grain surface conduction (e.g. the presence of clay). Single lithology sedimentary rocks are represented by an increasing set of conductivities through the sequence sandstone (4.91 mS m-1), limestone (8.41 mS m-1) and mudstone (17.85 mS m-1) with argillaceous rocks providing a conductivity of 41.1 mS m-1. In the case of both sandstone and limestone, the single lithology conductivities are significantly less than their mixed lithology counterparts. Mudrocks display a bimodal statistical distribution and an extended analysis of these rocks is carried out across a Carboniferous basin. The results clearly indicate that non-shale mudstones are distinctly less conductive than their shale counterparts. Shale formations display rapid and large movements in conductivity and it is suggested that the observed sensitivity may be due to competing surface conduction effects due to clay and organic material. A study of the variation of conductivity with geological period is also performed. Both a decreasing trend with age and a modulation that peaks in the Triassic period are observed.
The effect of kangaroo mother care on mental health of mothers with low birth weight infants
Badiee, Zohreh; Faramarzi, Salar; MiriZadeh, Tahereh
2014-01-01
Background: The mothers of premature infants are at risk of psychological stress because of separation from their infants. One of the methods influencing maternal mental health in the postpartum period is kangaroo mother care (KMC). This study was conducted to evaluate the effect of KMC of low birth weight infants on maternal mental health. Materials and Methods: The study was conducted in the Department of Pediatrics of Isfahan University of Medical Sciences, Isfahan, Iran. Premature infants were randomly allocated into two groups. The control group received standard care in the incubator. In the experimental group, care included three 60-min KMC sessions daily for 1 week. Mental health scores of the mothers were evaluated using the 28-item General Health Questionnaire. Statistical analysis was performed by analysis of covariance using SPSS. Results: The scores of 50 infant-mother pairs were analyzed in total (25 in the KMC group and 25 in the standard care group). Results of the covariance analysis showed positive effects of KMC on maternal mental health scores. There were statistically significant differences between the mean scores of the experimental group and the control subjects in the posttest period (P < 0.001). Conclusion: KMC for low birth weight infants is a safe way to improve maternal mental health and is suggested as a useful method for improving the mental health of mothers. PMID:25371871
Wu, Yingcheng; Xu, Ran; Jia, Keren; Shi, Hui
2017-01-01
Most recently, an emerging theme has come to predominate in the field of tumor immunology: chimeric antigen receptor (CAR) therapy for treating solid tumors. The number of related preclinical trials has been surging; however, an evaluation of the effects of these preclinical studies has remained absent. Hence, a meta-analysis was conducted on the efficacy of CAR therapy in animal models of solid tumors. The authors searched PubMed/Medline, Embase, and Google Scholar up to April 2017. The HR for survival was extracted from the survival curves. The authors used fixed-effect models to combine the results of all the trials. Heterogeneity was assessed by the I-square statistic. Quality assessment was conducted following the Stroke Therapy Academic Industry Roundtable standard. Publication bias was assessed using Egger's test. Eleven trials were included, comprising 54 experiments with a total of 362 animals. CAR immunotherapy significantly improved the survival of the animals (HR: 0.25, 95% CI: 0.13-0.37, P < 0.001). The quality assessment revealed that no study reported whether allocation concealment and blinded outcome assessment were conducted, and only five studies implemented randomization. This meta-analysis indicates that CAR therapy may be a potential clinical strategy for treating solid tumors.
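Fixed-effect (inverse-variance) pooling of hazard ratios, as used in the meta-analysis above, works on the log scale; a sketch with hypothetical per-study HRs and standard errors of log(HR), not the data from the included trials:

```python
import math

def pool_log_hr(hrs, ses):
    """Fixed-effect (inverse-variance) pooling of hazard ratios.
    `hrs` are per-study hazard ratios; `ses` are standard errors of
    log(HR). Returns the pooled HR and its 95% confidence interval."""
    w = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    logs = [math.log(h) for h in hrs]          # pool on the log scale
    pooled = sum(wi * l for wi, l in zip(w, logs)) / sum(w)
    se_pooled = 1.0 / math.sqrt(sum(w))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))
```

A pooled HR below 1 with a confidence interval excluding 1, as in the reported HR of 0.25 (95% CI 0.13-0.37), indicates a survival benefit for the treated animals.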
NASA Astrophysics Data System (ADS)
Pandiaraj, P.; Gnanavelbabu, A.; Saravanan, P.
Metal-oxide nanofluids such as CuO, Al2O3, ZnO, SiO2 and TiO2 have been widely used as working fluids in flat plate heat pipes, with the exception of magnesium oxide (MgO). We therefore propose the use of MgO nanofluids as a working fluid material in flat plate heat pipes. MgO nanopowders were synthesized by a wet chemical method. Solid-state characterization of the synthesized nanopowders was carried out by Ultraviolet Spectroscopy (UV), Fourier Transform Infrared Spectroscopy (FTIR), Scanning Electron Microscopy (SEM) and X-ray Diffraction (XRD) techniques. The synthesized nanopowders were prepared as nanofluids by dispersing them in water as well as in a water/ethylene glycol binary mixture. Thermal conductivity measurements of the prepared nanofluids were made using a transient hot-wire apparatus. Response surface methodology based on the Box-Behnken design was implemented to investigate the influence of temperature (30-60 °C), particle fraction (1.5-4.5 vol.%), and solution pH (4-12) of the nanofluids as the independent variables. A total of 17 experiments were conducted for the construction of second-order polynomial equations for the target output. All the influential factors, their mutual effects and their quadratic terms were statistically validated by analysis of variance (ANOVA). The optimum stability and thermal conductivity of MgO nanofluids at various temperatures, volume fractions and solution pH values were predicted and compared with experimental results. The results revealed that increasing the particle fraction and pH of MgO nanofluids up to certain points increases thermal conductivity, which becomes stable at nominal temperature.
Shi, Xiangnan; Cao, Libo; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen
2014-07-18
In this study, we developed a statistical rib cage geometry model accounting for variations by age, sex, stature and body mass index (BMI). Thorax CT scans were obtained from 89 subjects approximately evenly distributed among 8 age groups and both sexes. Threshold-based CT image segmentation was performed to extract the rib geometries, and a total of 464 landmarks on the left side of each subject's rib cage were collected to describe the size and shape of the rib cage as well as the cross-sectional geometry of each rib. Principal component analysis and multivariate regression analysis were conducted to predict rib cage geometry as a function of age, sex, stature, and BMI, all of which showed strong effects on rib cage geometry. Except for BMI, all parameters also showed significant effects on rib cross-sectional area using a linear mixed model. This statistical rib cage geometry model can serve as a geometric basis for developing a parametric human thorax finite element model for quantifying the effects of different human attributes on thoracic injury risks. Copyright © 2014 Elsevier Ltd. All rights reserved.
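Principal component analysis of landmark coordinates, as used in the rib cage model above, reduces to an eigendecomposition of the landmark covariance matrix; a minimal 2-D sketch with hypothetical points (the real model works on hundreds of 3-D landmarks):

```python
import math

def pca_2d(points):
    """First principal axis (unit vector) and its share of total variance
    for 2-D landmark data, via the 2x2 covariance eigendecomposition."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Larger eigenvalue of [[sxx, sxy], [sxy, syy]].
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Its eigenvector (handle the axis-aligned case where sxy ~ 0).
    if abs(sxy) > 1e-12:
        v = (sxy, l1 - sxx)
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm), l1 / tr
```

In the full model, the subject scores along the leading components are then regressed on age, sex, stature and BMI, which is what makes the geometry predictable from those attributes.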
Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.
Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William
2018-04-09
Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios has been lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
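The kind of diversity statistics such software reports, and the Monte Carlo comparison of an observed statistic against a simulated null distribution, can be sketched generically as follows. These simplified functions are illustrative only and are not DnaSAM's API (DnaSAM itself delegates the simulations to ms):

```python
import numpy as np

def pairwise_diversity(haplotypes):
    """Nucleotide diversity pi: mean pairwise differences across sequences.
    Input: rows = sequences, one 0/1 entry per biallelic site."""
    h = np.asarray(haplotypes)
    n = h.shape[0]
    freq = h.mean(axis=0)
    # Per-site expected heterozygosity, summed, with sample-size correction.
    return (n / (n - 1)) * np.sum(2 * freq * (1 - freq))

def watterson_theta(haplotypes):
    """Watterson's estimator: segregating sites / harmonic number a_n."""
    h = np.asarray(haplotypes)
    n = h.shape[0]
    S = np.sum(h.var(axis=0) > 0)          # count of segregating sites
    a_n = np.sum(1.0 / np.arange(1, n))    # 1 + 1/2 + ... + 1/(n-1)
    return S / a_n

def monte_carlo_p(observed, simulated):
    """Two-tailed Monte Carlo p-value of an observed statistic against
    a null distribution of values from coalescent simulations."""
    simulated = np.asarray(simulated)
    lower = np.mean(simulated <= observed)
    upper = np.mean(simulated >= observed)
    return min(1.0, 2 * min(lower, upper))
```

In the workflow the abstract describes, `simulated` would hold the statistic recomputed on each ms replicate under the user-specified demographic null model.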
Suzuki, Mizue; Kurata, Sadami; Yamamoto, Emiko; Makino, Kumiko; Kanamori, Masao
2012-09-01
The purpose of this study was to identify fall-related behaviors as risk factors that may predict falls among elderly patients with dementia at a geriatric facility in Japan. The study was conducted from April 2008 to May 2009. A baseline assessment in April 2008 evaluated the Mini-Mental State Examination, the Physical Self-Maintenance Scale, fall-related behaviors, and other factors. For statistical analysis, the paired t test and logistic analysis were used to compare each item between fallers and nonfallers. A total of 135 participants were followed up for 1 year; 50 participants (37.04%) fell during that period. Multiple logistic regression analysis showed that the total score for fall-related behaviors was significantly related to falls, suggesting that the 11 fall-related behaviors may be effective indicators for predicting falls among elderly patients with dementia.
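The logistic modelling step described above can be sketched as follows. The cohort size matches the abstract, but the behavior scores, outcomes, and fitted coefficients are synthetic, and the fitting routine is a generic sketch rather than the authors' software:

```python
import numpy as np

def fit_logistic(X, y, iters=200, lr=0.1):
    """Plain gradient-ascent logistic regression (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)   # average log-likelihood gradient
    return w

def predict_prob(w, X):
    """Predicted fall probability for each row of X."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(2)
n = 135                                # cohort size from the abstract
behavior_score = rng.normal(5, 2, n)   # synthetic total fall-related-behavior score
true_logit = -2.0 + 0.4 * behavior_score
fell = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)
w = fit_logistic(behavior_score[:, None], fell)
```

A positive fitted slope on the total behavior score is what the abstract's "significantly related to falls" finding corresponds to in this formulation.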
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilhelm, M.S.
1982-01-01
The research was conducted as a secondary analysis of data collected during the evaluation of a statewide household energy audit conducted at Michigan State University. Energy-consumption data from utility and oil companies served as the measure of direct conservation. Indirect conservation was investigated through the analysis of self-reported participation in a variety of behaviors collectively defined as voluntary simplicity. The household served as the unit of analysis for the statistical procedures used to test the hypotheses. A 1.8 percent reduction in direct household energy consumption was found between the years 1977-78 and 1979-80. Nearly three-fourths of the households were found to have practiced at least some voluntary simplicity behaviors. The relative cost of fuel used by the household was the only significant motivator for direct conservation (p = .016). Availability of human resources did not influence direct conservation, nor did direct conservation contribute to a sense of personal control over energy problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whicker, Jeffrey Jay; Gillis, Jessica Mcdonnel; Ruedig, Elizabeth
This report summarizes the sampling design used, the associated statistical assumptions, and general guidelines for conducting post-sampling data analysis. Sampling plan components presented here include how many sampling locations to choose and where within the sampling area to collect those samples. The type of medium to sample (i.e., soil, groundwater, etc.) and how to analyze the samples (in situ, fixed laboratory, etc.) are addressed in other sections of the sampling plan.
Analysis of Runway Incursion Data
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
A statistical analysis of runway incursion (RI) events was conducted to ascertain their relevance to the top ten challenges of the National Aeronautics and Space Administration Aviation Safety Program (AvSP). The RI database was found to contain data that may be relevant to several of the AvSP top ten challenges. When combined with FAA data documenting air traffic volume from calendar year 2000 through 2011, the structure of a predictive model emerges that can be used to forecast the frequency of RI events at various airports for various classes of aircraft and under various environmental conditions.
NASA Technical Reports Server (NTRS)
Baxa, E. G., Jr.
1974-01-01
A theoretical formulation of differential and composite OMEGA error is presented to establish hypotheses about the functional relationships between various parameters and OMEGA navigational errors. Computer software developed to provide for extensive statistical analysis of the phase data is described. Results from the regression analysis used to conduct parameter sensitivity studies on differential OMEGA error tend to validate the theoretically based hypothesis concerning the relationship between uncorrected differential OMEGA error and receiver separation range and azimuth. Limited results of measurement of receiver repeatability error and line of position measurement error are also presented.
Analysis and discussion on the experimental data of electrolyte analyzer
NASA Astrophysics Data System (ADS)
Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei
2018-06-01
In the subsequent verification of electrolyte analyzers, we found that the instruments achieve good repeatability and stability over repeated measurements within a short period of time, in line with the verification regulation's requirements for linearity error and cross-contamination rate. However, large indication errors are very common, and measurement results differ greatly between manufacturers. To identify and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the resulting data statistically.
[Personality traits of perpetrators of various types of crimes].
Skoczek, Adrianna; Gancarczyk, Urszula; Prochownik, Paweł; Sobień, Bartosz; Podolec, Piotr; Komar, Monika
2018-01-01
This study was conducted in Nowy Wiśnicz with prisoners sentenced for murders, sex crimes, theft and robbery, maintenance, and bullying. A Polish adaptation of the PAI test, prepared by the author of the study, was used. The results and their statistical analysis revealed personality features characteristic of the particular criminal groups; these can be used in the rehabilitation of disturbed people and addicts and can become the basis for measures reducing the frequency of crime.
ERIC Educational Resources Information Center
Schoenborn, Charlotte A.
This report is based on data from the 1988 National Health Interview Survey on Alcohol (NHIS-Alcohol), part of the ongoing National Health Interview Survey conducted by the National Center for Health Statistics. Interviews for the NHIS are conducted in person by staff of the United States Bureau of the Census. Information is collected on each…
Kordi, Masoumeh; Riyazi, Sahar; Lotfalizade, Marziyeh; Shakeri, Mohammad Taghi; Suny, Hoseyn Jafari
2018-01-01
Screening for fetal anomalies is considered a necessary component of antenatal care, and screening programs aim to empower individuals to make an informed choice. This study was conducted to compare the effects of group and face-to-face education on informed choice and decisional conflict among pregnant women regarding screening for fetal abnormalities. This clinical trial was carried out on 240 pregnant women at less than 10 weeks' gestation in health care centers in Mashhad city in 2014. An individual-midwifery information form, an informed choice questionnaire, and the decisional conflict scale were used for data collection. The face-to-face and group education courses were held in two weekly sessions over two consecutive weeks for the intervention groups, while usual care was provided for the control group. Informed choice and decisional conflict were measured in all three groups before education and again at weeks 20-22 of pregnancy. Data analysis was performed with SPSS statistical software (version 16) using the Chi-square test, Kruskal-Wallis test, Wilcoxon test, Mann-Whitney U-test, one-way analysis of variance, and Tukey's range test; P < 0.05 was considered significant. The results showed a statistically significant difference between the three groups in the frequency of informed choice regarding screening for fetal abnormalities (P = 0.001): after the intervention, 62 participants (77.5%) in the face-to-face education group, 64 (80%) in the group education class, and 20 (25%) in the control group made an informed choice regarding screening tests, while there was no statistically significant difference between the individual and group education classes.
Similarly, during the postintervention phase, there was a statistically significant difference in the mean decisional conflict scale score regarding screening tests among the three groups (P = 0.001). Given the effectiveness of both group and face-to-face education in increasing informed choice and reducing decisional conflict in pregnant women regarding screening tests, either education method may be employed, according to the clinical environment and requirements, to encourage women to undergo screening.
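The between-group comparison of informed-choice frequencies can be reproduced in outline from the counts reported above (62/80, 64/80 and 20/80). The chi-square computation below is a generic sketch, not the authors' SPSS output:

```python
import numpy as np

def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    return np.sum((table - expected) ** 2 / expected)

# Informed choice (yes, no) in the three groups of 80 reported in the abstract.
counts = [[62, 18],   # face-to-face education
          [64, 16],   # group education
          [20, 60]]   # control
stat = chi_square_stat(counts)
df = (3 - 1) * (2 - 1)  # degrees of freedom = 2
```

With 2 degrees of freedom, a statistic far above the 0.001 critical value (13.82) is consistent with the reported P = 0.001.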
Okumu, Clarice; Oyugi, Boniface
2018-01-01
This study intended to compare clients' satisfaction with the quality of childbirth services in a private and a public facility among mothers who had delivered within the previous twenty-four to seventy hours. This was a cross-sectional comparative study with both quantitative and qualitative data collection and analysis. Data were collected through a focus group discussion guide and a structured questionnaire on clients' satisfaction with the quality of childbirth services. The study was conducted among women of reproductive age (WRA), 15-49 years, in Tigoni District hospital (public) and Limuru Nursing home (private). Quantitative data were analyzed descriptively and with the Mann-Whitney test using SPSS version 20.0, while qualitative data were analyzed manually using thematic analysis. A higher proportion of clients from the private facility (98.1%) were attended to within 0-30 minutes of arrival, compared with 87% from the public facility. The overall mean satisfaction score given by respondents in the public facility was 4.46 out of a maximum of 5.00, versus 4.60 in the private facility. Satisfaction among respondents in the public facility was statistically significantly higher than among those in the private facility for pain relief after delivery (U = 8132.50, p < 0.001) and for functional equipment (U = 9206.50, p = 0.001). Satisfaction with the way staff responded to questions and concerns during labour and delivery was also statistically significantly higher in the public facility (U = 9964.50, p = 0.022).
Overall, the majority of clients from both the public and the private facility expressed satisfaction with the quality of services from admission to discharge and were willing to recommend others to deliver at the respective facilities.
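The Mann-Whitney comparisons reported above rest on the U statistic; a minimal implementation (using midranks for ties) might look like the sketch below, which does not reproduce the study's U values since the raw satisfaction scores are not available:

```python
import numpy as np

def mann_whitney_u(x, y):
    """U statistic for sample x versus sample y, with midranks for ties.
    Equals the number of (x_i, y_j) pairs with x_i > y_j (ties count 0.5)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    combined = np.concatenate([x, y])
    order = combined.argsort()
    ranks = np.empty(len(combined))
    ranks[order] = np.arange(1, len(combined) + 1)
    # Average ranks within each tied group.
    for v in np.unique(combined):
        mask = combined == v
        ranks[mask] = ranks[mask].mean()
    r_x = ranks[: len(x)].sum()
    return r_x - len(x) * (len(x) + 1) / 2
```

For larger samples, U is converted to a z-score against its null mean and variance to obtain the p-values quoted in the abstract.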
Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; Sharma, Isha; Kuruganti, Teja
In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
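The core computation, the fraction of spectral power below a cutoff frequency, can be sketched as follows. The signal here is a synthetic one-day, one-second-resolution stand-in (diurnal envelope plus small fast fluctuations), not the year of PV data analyzed in the paper:

```python
import numpy as np

def low_frequency_fraction(signal, dt, f_cut):
    """Fraction of non-DC spectral power at frequencies below f_cut [Hz]."""
    spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    total = spec[1:].sum()                       # exclude the DC bin
    return spec[1:][freqs[1:] < f_cut].sum() / total

# Synthetic "PV" day: slow diurnal component plus small cloud-like noise.
rng = np.random.default_rng(3)
t = np.arange(86400.0)                                   # one day at 1 s resolution
slow = np.clip(np.sin(2 * np.pi * t / 86400), 0, None)   # diurnal envelope
fast = 0.02 * rng.normal(size=t.size)
frac = low_frequency_fraction(slow + fast, dt=1.0, f_cut=1e-3)
```

Because the diurnal component dominates, nearly all power lands below 1 mHz, which is the qualitative structure behind the paper's ~98% finding.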
NASA Astrophysics Data System (ADS)
Shi, Bibo; Grimm, Lars J.; Mazurowski, Maciej A.; Marks, Jeffrey R.; King, Lorraine M.; Maley, Carlo C.; Hwang, E. Shelley; Lo, Joseph Y.
2017-03-01
Reducing the overdiagnosis and overtreatment associated with ductal carcinoma in situ (DCIS) requires accurate prediction of the invasive potential at cancer screening. In this work, we investigated the utility of pre-operative histologic and mammographic features to predict upstaging of DCIS. The goal was to provide intentionally conservative baseline performance using readily available data from radiologists and pathologists and only linear models. We conducted a retrospective analysis on 99 patients with DCIS. Of those, 25 were upstaged to invasive cancer at the time of definitive surgery. Pre-operative factors, including both the histologic features extracted from stereotactic core needle biopsy (SCNB) reports and the mammographic features annotated by an expert breast radiologist, were investigated with statistical analysis. Furthermore, we built classification models based on those features in an attempt to predict the presence of an occult invasive component in DCIS, with generalization performance assessed by receiver operating characteristic (ROC) curve analysis. Histologic features including nuclear grade and DCIS subtype did not show statistically significant differences between cases with pure DCIS and with DCIS plus invasive disease. However, three mammographic features, i.e., the major axis length of the DCIS lesion, the BI-RADS level of suspicion, and the radiologist's assessment, did achieve statistical significance. Using those three statistically significant features as input, a linear discriminant model was able to distinguish patients with DCIS plus invasive disease from those with pure DCIS, with AUC-ROC equal to 0.62. Overall, mammograms used for breast screening contain useful information that can be perceived by radiologists and help predict occult invasive components in DCIS.
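A linear discriminant plus ROC analysis of the sort described can be sketched as follows. The class sizes match the abstract (74 pure DCIS vs. 25 upstaged), but the feature values are synthetic, so the reported AUC of 0.62 is not reproduced here:

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Fisher discriminant direction for two classes (pooled covariance)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    pooled = S / (len(X0) + len(X1) - 2)
    return np.linalg.solve(pooled, m1 - m0)

def auc(scores0, scores1):
    """AUC-ROC: probability a positive score exceeds a negative one
    (ties count 0.5), computed directly from the two score samples."""
    s0, s1 = np.asarray(scores0), np.asarray(scores1)
    wins = (s1[:, None] > s0[None, :]).sum() \
         + 0.5 * (s1[:, None] == s0[None, :]).sum()
    return wins / (len(s0) * len(s1))

rng = np.random.default_rng(4)
# Synthetic stand-ins for the three mammographic features, modest separation.
pure = rng.normal(0.0, 1.0, (74, 3))       # pure DCIS
upstaged = rng.normal(0.5, 1.0, (25, 3))   # DCIS plus invasive disease
w = fisher_lda_direction(pure, upstaged)
a = auc(pure @ w, upstaged @ w)
```

Projecting each case onto the discriminant direction and ranking the scores is exactly the operation the ROC curve summarizes.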
Sherrill, Joel T; Sommers, David I; Nierenberg, Andrew A; Leon, Andrew C; Arndt, Stephan; Bandeen-Roche, Karen; Greenhouse, Joel; Guthrie, Donald; Normand, Sharon-Lise; Phillips, Katharine A; Shear, M Katherine; Woolson, Robert
2009-01-01
The authors summarize points for consideration generated in a National Institute of Mental Health (NIMH) workshop convened to provide an opportunity for reviewers from different disciplines-specifically clinical researchers and statisticians-to discuss how their differing and complementary expertise can be well integrated in the review of intervention-related grant applications. A 1-day workshop was convened in October, 2004. The workshop featured panel presentations on key topics followed by interactive discussion. This article summarizes the workshop and subsequent discussions, which centered on topics including weighting the statistics/data analysis elements of an application in the assessment of the application's overall merit; the level of statistical sophistication appropriate to different stages of research and for different funding mechanisms; some key considerations in the design and analysis portions of applications; appropriate statistical methods for addressing essential questions posed by an application; and the role of the statistician in the application's development, study conduct, and interpretation and dissemination of results. A number of key elements crucial to the construction and review of grant applications were identified. It was acknowledged that intervention-related studies unavoidably involve trade-offs. Reviewers are helped when applications acknowledge such trade-offs and provide good rationale for their choices. Clear linkage among the design, aims, hypotheses, and data analysis plan and avoidance of disconnections among these elements also strengthens applications. The authors identify multiple points to consider when constructing intervention-related grant applications. The points are presented here as questions and do not reflect institute policy or comprise a list of best practices, but rather represent points for consideration.
Mei, Lin; He, Lin; Song, Yuhua; Lv, Yang; Zhang, Lijiu; Hao, Fengxi; Xu, Mengmeng
2018-05-01
To investigate the relationship between obesity and disease-free survival (DFS) and overall survival (OS) in triple-negative breast cancer, citations were searched in PubMed, the Cochrane Library, and Web of Science. A random-effects meta-analysis was conducted using RevMan software version 5.0, and publication bias was evaluated with Egger's regression using Stata software version 12. Nine studies (4412 patients) were included in the DFS meta-analysis and 8 studies (4392 patients) in the OS meta-analysis. No statistically significant associations were found between obesity and DFS (P = .60) or OS (P = .71) in triple-negative breast cancer (TNBC) patients. Obesity has no impact on DFS and OS in patients with TNBC.
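The random-effects pooling step of the kind performed in RevMan can be sketched with the DerSimonian-Laird estimator. The effect sizes below are illustrative log hazard ratios, not the included studies' data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2.
    Returns (pooled effect, standard error, between-study variance)."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)          # fixed-effect estimate
    Q = np.sum(w * (y - fixed) ** 2)           # Cochran's Q heterogeneity
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)              # between-study variance
    w_star = 1.0 / (v + tau2)                  # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Illustrative log hazard ratios and within-study variances.
log_hr = [0.10, -0.05, 0.02, 0.08, -0.01]
var = [0.04, 0.05, 0.03, 0.06, 0.04]
pooled, se, tau2 = dersimonian_laird(log_hr, var)
z = pooled / se   # |z| < 1.96 corresponds to P > .05, i.e. no significant association
```

A pooled z-score inside the ±1.96 band mirrors the abstract's non-significant P values for DFS and OS.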