Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of the spatial pattern of radionuclides and other statistical analyses at NSs 201, 219, and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed before the projected termination date of NAEG studies in March 1986 are given.
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates that recorded GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) a traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses attributed predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared with men. Statistically significant differences were observed for the first GRF peak, whereas values for the second GRF peak were similar between sexes. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that the traditional approach can lead to arbitrary choices of metrics and arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveform and sex-specificity when analysing GRF data.
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Gadbury, Gary L.; Allison, David B.
2012-01-01
Much has been written regarding p-values below certain thresholds (most notably 0.05) denoting statistical significance and the tendency of such p-values to be more readily publishable in peer-reviewed journals. Intuition suggests that there may be a tendency to manipulate statistical analyses to push a “near significant p-value” to a level that is considered significant. This article presents a method for detecting the presence of such manipulation (herein called “fiddling”) in a distribution of p-values from independent studies. Simulations are used to illustrate the properties of the method. The results suggest that the method has low type I error and that power approaches acceptable levels as the number of p-values being studied approaches 1000. PMID:23056287
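The bump-detection idea can be sketched with a short simulation: if "fiddling" nudges near-miss p-values from just above 0.05 to just below it, the counts in two narrow windows straddling the threshold become asymmetric. This is only an illustrative sketch, not the article's actual method; the window width and the simple binomial comparison are assumptions made here.

```python
import math
import random

def binom_sf(k, n, p=0.5):
    # Exact upper tail P(X >= k) for X ~ Binomial(n, p)
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def bump_test(pvals, alpha=0.05, width=0.005):
    # Count p-values in narrow windows just below and just above alpha.
    # Absent fiddling, a locally smooth p-value density puts roughly equal
    # mass in both windows, so "below" ~ Binomial(n, 0.5).
    below = sum(1 for p in pvals if alpha - width <= p < alpha)
    above = sum(1 for p in pvals if alpha <= p < alpha + width)
    n = below + above
    return binom_sf(below, n) if n else 1.0

random.seed(1)
honest = [random.random() for _ in range(1000)]   # 1000 independent null studies
# 30 "fiddled" results nudged from a near miss to just under 0.05
fiddled = honest + [random.uniform(0.045, 0.05) for _ in range(30)]

p_honest = bump_test(honest)
p_fiddled = bump_test(fiddled)
```

With the fiddled results added, the excess of p-values just below 0.05 makes the binomial tail probability collapse, while the honest collection stays unremarkable.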
Measuring the Impacts of ICT Using Official Statistics. OECD Digital Economy Papers, No. 136
ERIC Educational Resources Information Center
Roberts, Sheridan
2008-01-01
This paper describes the findings of an OECD project examining ICT impact measurement and analyses based on official statistics. Both economic and social impacts are covered and some results are presented. It attempts to place ICT impacts measurement into an Information Society conceptual framework, provides some suggestions for standardising…
Using Data Mining to Teach Applied Statistics and Correlation
ERIC Educational Resources Information Center
Hartnett, Jessica L.
2016-01-01
This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…
Mathematical background and attitudes toward statistics in a sample of Spanish college students.
Carmona, José; Martínez, Rafael J; Sánchez, Manuel
2005-08-01
To examine the relation of mathematical background to initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. Analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for further research are discussed.
ERIC Educational Resources Information Center
Camerer, Rudi
2014-01-01
The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…
Statistical analysis of iron geochemical data suggests limited late Proterozoic oxygenation
NASA Astrophysics Data System (ADS)
Sperling, Erik A.; Wolock, Charles J.; Morgan, Alex S.; Gill, Benjamin C.; Kunzmann, Marcus; Halverson, Galen P.; MacDonald, Francis A.; Knoll, Andrew H.; Johnston, David T.
2015-07-01
Sedimentary rocks deposited across the Proterozoic-Phanerozoic transition record extreme climate fluctuations, a potential rise in atmospheric oxygen or re-organization of the seafloor redox landscape, and the initial diversification of animals. It is widely assumed that the inferred redox change facilitated the observed trends in biodiversity. Establishing this palaeoenvironmental context, however, requires that changes in marine redox structure be tracked by means of geochemical proxies and translated into estimates of atmospheric oxygen. Iron-based proxies are among the most effective tools for tracking the redox chemistry of ancient oceans. These proxies are inherently local, but have global implications when analysed collectively and statistically. Here we analyse about 4,700 iron-speciation measurements from shales 2,300 to 360 million years old. Our statistical analyses suggest that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic (depleted in dissolved oxygen) and ferruginous (iron-bearing), but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era. Analyses further indicate that early animals did not experience appreciable benthic sulfide stress. Finally, unlike proxies based on redox-sensitive trace-metal abundances, iron geochemical data do not show a statistically significant change in oxygen content through the Ediacaran and Cambrian periods, sharply constraining the magnitude of the end-Proterozoic oxygen increase. Indeed, this re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era. Therefore, if changing redox conditions facilitated animal diversification, it did so through a limited rise in oxygen past critical functional and ecological thresholds, as is seen in modern oxygen minimum zone benthic animal communities.
Plant Taxonomy as a Field Study
ERIC Educational Resources Information Center
Dalby, D. H.
1970-01-01
Suggests methods of teaching plant identification and taxonomic theory using keys, statistical analyses, and biometrics. Population variation, genotype-environment interaction and experimental taxonomy are used in laboratory and field. (AL)
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
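The consequence of ignoring complex sample design features can be illustrated with a small simulation: when respondents come from sampled clusters, a naive standard error that pretends the data are a simple random sample understates the cluster-aware one. This sketch is not drawn from the article's SESTAT analyses; the cluster counts and variance components are invented for illustration.

```python
import random
import statistics

random.seed(7)

# Simulate a cluster sample: 20 sampled clusters of 25 respondents each,
# with a shared cluster-level effect inducing intra-cluster correlation.
clusters = []
for _ in range(20):
    effect = random.gauss(0, 1.0)                       # cluster-level shift
    clusters.append([effect + random.gauss(0, 1.0) for _ in range(25)])

everyone = [y for c in clusters for y in c]
n = len(everyone)

# Naive SE: pretends the 500 respondents are an SRS.
naive_se = statistics.stdev(everyone) / n ** 0.5

# Cluster-aware SE: treats cluster means as the independent units.
means = [statistics.mean(c) for c in clusters]
cluster_se = statistics.stdev(means) / len(means) ** 0.5

deff = (cluster_se / naive_se) ** 2   # approximate design effect
```

The design effect here is well above 1, which is exactly the situation in which analytic error (using the naive SE) produces confidence intervals that are far too narrow.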
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
The extent and consequences of p-hacking in science.
Head, Megan L; Holman, Luke; Lanfear, Rob; Kahn, Andrew T; Jennions, Michael D
2015-03-01
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as "p-hacking," occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.
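The reasoning behind testing a collection of p-values can be sketched with a simulation: studies of a genuine effect produce a right-skewed p-curve (many very small p-values), whereas null studies produce a flat one, and p-hacking would pile mass just under 0.05. This is a simplified illustration of that logic, not the paper's text-mining pipeline or its exact binomial test.

```python
import math
import random

random.seed(3)

def p_from_z(z):
    # Two-sided p-value for a z statistic, via the complementary error function
    return math.erfc(abs(z) / math.sqrt(2))

def simulate_pvals(effect, n_studies=2000):
    # Each study observes z ~ Normal(effect, 1)
    return [p_from_z(random.gauss(effect, 1)) for _ in range(n_studies)]

def curve_shape(pvals):
    # Among small p-values, compare the lowest bin with the bin just under 0.05
    low = sum(1 for p in pvals if p < 0.01)
    high = sum(1 for p in pvals if 0.04 <= p < 0.05)
    return low, high

low_true, high_true = curve_shape(simulate_pvals(effect=3.0))
low_null, high_null = curve_shape(simulate_pvals(effect=0.0))
```

Under a true effect the p-curve is steeply right-skewed (far more p < 0.01 than p near 0.05), which is why skew among significant p-values can separate real effects from selective reporting.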
Statistics provide guidance for indigenous organic carbon detection on Mars missions.
Sephton, Mark A; Carter, Jonathan N
2014-08-01
Data from the Viking and Mars Science Laboratory missions indicate the presence of organic compounds that are not definitively martian in origin. Both contamination and confounding mineralogies have been suggested as alternatives to indigenous organic carbon. Intuitive thought suggests that we are repeatedly obtaining data that confirms the same level of uncertainty. Bayesian statistics may suggest otherwise. If an organic detection method has a true positive to false positive ratio greater than one, then repeated organic matter detection progressively increases the probability of indigeneity. Bayesian statistics also reveal that methods with higher ratios of true positives to false positives give higher overall probabilities and that detection of organic matter in a sample with a higher prior probability of indigenous organic carbon produces greater confidence. Bayesian statistics, therefore, provide guidance for the planning and operation of organic carbon detection activities on Mars. Suggestions for future organic carbon detection missions and instruments are as follows: (i) On Earth, instruments should be tested with analog samples of known organic content to determine their true positive to false positive ratios. (ii) On the mission, for an instrument with a true positive to false positive ratio above one, it should be recognized that each positive detection of organic carbon will result in a progressive increase in the probability of indigenous organic carbon being present; repeated measurements, therefore, can overcome some of the deficiencies of a less-than-definitive test. (iii) For a fixed number of analyses, the highest true positive to false positive ratio method or instrument will provide the greatest probability that indigenous organic carbon is present. (iv) On Mars, analyses should concentrate on samples with highest prior probability of indigenous organic carbon; intuitive desires to contrast samples of high prior probability and low prior probability of indigenous organic carbon should be resisted.
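The repeated-detection argument is a direct application of Bayes' rule in odds form: each positive result multiplies the prior odds by the instrument's likelihood ratio (true-positive rate over false-positive rate). A minimal sketch, with illustrative rates rather than any real instrument's performance:

```python
def bayes_update(prior, tpr, fpr, detections):
    """Posterior probability of indigenous organic carbon after
    `detections` successive positive results, for an instrument with
    true-positive rate `tpr` and false-positive rate `fpr`."""
    odds = prior / (1 - prior)
    lr = tpr / fpr                    # likelihood ratio per positive detection
    post_odds = odds * lr ** detections
    return post_odds / (1 + post_odds)

# Instrument with a true:false positive ratio of 3, applied to a sample
# with a modest prior of indigeneity (all numbers are hypothetical).
p1 = bayes_update(prior=0.1, tpr=0.9, fpr=0.3, detections=1)
p3 = bayes_update(prior=0.1, tpr=0.9, fpr=0.3, detections=3)
# p1 = 0.25, p3 = 0.75: repetition compounds a less-than-definitive test
```

Because the likelihood ratio exceeds one, each repeated positive detection raises the posterior, which is the paper's point (ii) in quantitative form.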
Rainfall Results of the Florida Area Cumulus Experiment, 1970-76.
NASA Astrophysics Data System (ADS)
Woodley, William L.; Jordan, Jill; Barnston, Anthony; Simpson, Joanne; Biondini, Ron; Flueck, John
1982-02-01
The Florida Area Cumulus Experiment of 1970-76 (FACE-1) is a single-area, randomized, exploratory experiment to determine whether seeding cumuli for dynamic effects (dynamic seeding) can be used to augment convective rainfall over a substantial target area (1.3 × 10⁴ km²) in south Florida. Rainfall is estimated using S-band radar observations after adjustment by raingages. The two primary response variables are rain volumes in the total target (TT) and in the floating target (FT), the most intensely treated portion of the target. The experimental unit is the day and the main observational period is the 6 h after initiation of treatment (silver iodide flares on seed days and either no flares or placebos on control days). Analyses without predictors suggest apparent increases in both the location (means and medians) and the dispersion (standard deviation and interquartile range) characteristics of rainfall due to seeding in the FT and TT variables with substantial statistical support for the FT results and lesser statistical support for the TT results. Analyses of covariance using meteorologically meaningful predictor variables suggest a somewhat larger effect of seeding with stronger statistical support. These results are interpreted in terms of the FACE conceptual model.
Considerations in the statistical analysis of clinical trials in periodontitis.
Imrey, P B
1986-05-01
Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, cause some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple site data from the same patient for group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are offered.
Rational-Emotive Therapy versus Systematic Desensitization: A Comment on Moleski and Tosi.
ERIC Educational Resources Information Center
Atkinson, Leslie
1983-01-01
Questioned the statistical analyses of the Moleski and Tosi investigation of rational-emotive therapy versus systematic desensitization. Suggested means for lowering the error rate through a more efficient experimental design. Recommended a reanalysis of the original data. (LLL)
Results of the Intelligence Test for Visually Impaired Children (ITVIC).
ERIC Educational Resources Information Center
Dekker, R.; And Others
1991-01-01
Statistical analyses of scores on subtests of the Intelligence Test for Visually Impaired Children were done for two groups of children, either with or without usable vision. Results suggest that the battery has differential factorial and predictive validity. (Author/DB)
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Franke, Molly F; Jerome, J Gregory; Matias, Wilfredo R; Ternier, Ralph; Hilaire, Isabelle J; Harris, Jason B; Ivers, Louise C
2017-10-13
Case-control studies to quantify oral cholera vaccine (OCV) effectiveness (VE) often rely on neighbors without diarrhea as community controls. Test-negative controls can be easily recruited and may minimize bias due to differential health-seeking behavior and recall. We compared VE estimates derived from community and test-negative controls and conducted bias-indicator analyses to assess potential bias with community controls. From October 2012 through November 2016, patients with acute watery diarrhea were recruited from cholera treatment centers in rural Haiti. Cholera cases had a positive stool culture. Non-cholera diarrhea cases (test-negative controls and non-cholera diarrhea cases for bias-indicator analyses) had a negative culture and rapid test. Up to four community controls were matched to diarrhea cases by age group, time, and neighborhood. Primary analyses included 181 cholera cases, 157 non-cholera diarrhea cases, 716 VE community controls and 625 bias-indicator community controls. VE for self-reported vaccination with two doses was consistent across the two control groups, with statistically significant VE estimates ranging from 72 to 74%. Sensitivity analyses revealed similar, though somewhat attenuated estimates for self-reported two-dose VE. Bias-indicator estimates were consistently less than one, with VE estimates ranging from 19 to 43%, some of which were statistically significant. OCV estimates from case-control analyses using community and test-negative controls were similar. While bias-indicator analyses suggested possible over-estimation of VE estimates using community controls, test-negative analyses suggested this bias, if present, was minimal. Test-negative controls can be a valid low-cost and time-efficient alternative to community controls for OCV effectiveness estimation and may be especially relevant in emergency situations.
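In a case-control design, VE is conventionally estimated as VE = (1 − OR) × 100, where OR is the odds ratio of vaccination among cases versus controls. A minimal sketch with hypothetical counts (not the study's data) chosen to land near the reported range:

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """VE (%) = (1 - OR) * 100, with OR from a 2x2 case-control table."""
    odds_ratio = (vacc_cases * unvacc_controls) / (unvacc_cases * vacc_controls)
    return (1 - odds_ratio) * 100

# Hypothetical table: 27 of 127 cases vaccinated vs. 100 of 200 controls.
ve = vaccine_effectiveness(27, 100, 100, 100)   # OR = 0.27, so VE = 73%
```

The same formula applies whichever control group supplies the denominator, which is why comparing community-control and test-negative estimates is a direct check on control-selection bias.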
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) are proposed. It is based on an active cell composed of a field-effect transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is investigated using a stochastic method. Furthermore, with the proposed topology, the device gain can be controlled easily by varying a resistance. This provides a useful tool to analyse the statistical sensitivity of the system in an uncertain environment.
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random-effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely.
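Cluster-wise random-effects pooling of standardised effects can be sketched with the standard DerSimonian-Laird estimator; ClusterZ's censoring-aware implementation is more involved, so treat this as a generic illustration with invented effect sizes and variances:

```python
def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effects."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # heterogeneity
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = (1 / sum(w_re)) ** 0.5
    return pooled, se, tau2

# Four hypothetical studies reporting effects at one cluster
pooled, se, tau2 = random_effects_meta(
    effects=[0.10, 0.60, 0.15, 0.70], variances=[0.02, 0.03, 0.025, 0.04])
```

When the effects are heterogeneous, tau² exceeds zero and the pooled standard error widens accordingly, which is the behaviour that coordinate-density methods cannot provide.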
Terides, Matthew D; Dear, Blake F; Fogliati, Vincent J; Gandy, Milena; Karin, Eyal; Jones, Michael P; Titov, Nickolai
2018-01-01
Cognitive-behavioural therapy (CBT) is an effective treatment for clinical and subclinical symptoms of depression and general anxiety, and increases life satisfaction. Patients' usage of CBT skills is a core aspect of treatment but there is insufficient empirical evidence suggesting that skills usage behaviours are a mechanism of clinical change. This study investigated if an internet-delivered CBT (iCBT) intervention increased the frequency of CBT skills usage behaviours and if this statistically mediated reductions in symptoms and increased life satisfaction. A two-group randomised controlled trial was conducted comparing internet-delivered CBT (n = 65) with a waitlist control group (n = 75). Participants were individuals experiencing clinically significant symptoms of depression or general anxiety. Mixed linear model analyses revealed that the treatment group reported a significantly higher frequency of skills usage, lower symptoms, and higher life satisfaction by the end of treatment compared with the control group. Results from bootstrapping mediation analyses revealed that the increased skills usage behaviours statistically mediated symptom reductions and increased life satisfaction. Although skills usage and symptom outcomes were assessed concurrently, these findings support the notion that iCBT increases the frequency of skills usage behaviours and suggest that this may be an important mechanism of change.
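A bootstrapped mediation analysis can be sketched as resampling participants and recomputing the indirect effect (path a times path b) each time; a percentile interval that excludes zero indicates statistical mediation. This simplified sketch uses unadjusted simple slopes and simulated data, unlike the study's mixed-model analyses, so every number below is an illustrative assumption.

```python
import random

random.seed(11)

def slope(xs, ys):
    # Ordinary least-squares slope of ys on xs
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Simulated trial: treatment -> skills usage (mediator) -> symptoms (outcome)
n = 200
treat = [i % 2 for i in range(n)]
skills = [0.8 * t + random.gauss(0, 1) for t in treat]
symptoms = [-0.6 * m + random.gauss(0, 1) for m in skills]

a = slope(treat, skills)        # path a: treatment -> mediator
b = slope(skills, symptoms)     # path b: mediator -> outcome (unadjusted here)
indirect = a * b

# Percentile bootstrap for the indirect effect
boots = []
for _ in range(1000):
    idx = [random.randrange(n) for _ in range(n)]
    t_s = [treat[i] for i in idx]
    m_s = [skills[i] for i in idx]
    y_s = [symptoms[i] for i in idx]
    boots.append(slope(t_s, m_s) * slope(m_s, y_s))
boots.sort()
ci = (boots[24], boots[975])    # 95% percentile interval
```

Here the interval lies entirely below zero, mirroring the paper's conclusion that increased skills usage mediates symptom reduction.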
Statistical Exposé of a Multiple-Compartment Anaerobic Reactor Treating Domestic Wastewater.
Pfluger, Andrew R; Hahn, Martha J; Hering, Amanda S; Munakata-Marr, Junko; Figueroa, Linda
2018-06-01
Mainstream anaerobic treatment of domestic wastewater is a promising energy-generating treatment strategy; however, such reactors operated in colder regions are not well characterized. Performance data from a pilot-scale, multiple-compartment anaerobic reactor taken over 786 days were subjected to comprehensive statistical analyses. Results suggest that chemical oxygen demand (COD) was a poor proxy for organics in anaerobic systems, as oxygen demand from dissolved inorganic material, dissolved methane, and colloidal material influences dissolved and particulate COD measurements. Additionally, univariate and functional boxplots were useful in visualizing variability in contaminant concentrations and identifying statistical outliers. Further, significantly different dissolved organic removal and methane production was observed between operational years, suggesting that anaerobic reactor systems may not achieve steady-state performance within one year. Last, modeling multiple-compartment reactor systems will require data collected over at least two years to capture seasonal variations of the major anaerobic microbial functions occurring within each reactor compartment.
Association of ED with chronic periodontal disease.
Matsumoto, S; Matsuda, M; Takekawa, M; Okada, M; Hashizume, K; Wada, N; Hori, J; Tamaki, G; Kita, M; Iwata, T; Kakizaki, H
2014-01-01
To examine the relationship between chronic periodontal disease (CPD) and ED, an interview sheet including the CPD self-checklist (CPD score) and the five-item version of the International Index of Erectile Function (IIEF-5) was distributed to 300 adult men who received a comprehensive dental examination. Statistical analyses were performed using Spearman's rank correlation coefficient and other methods, with statistical significance accepted at P<0.05. Interview sheets were collected from 88 men (response rate 29.3%; age 50.9±16.6 years). There was a statistically significant correlation between the CPD score and the presence of ED (P=0.0415). The results of the present study suggest that ED is related to the damage caused by endothelial dysfunction and the systemic inflammatory changes associated with CPD. The present study also suggests that dental health is important as preventive medicine for ED.
Su, Junhu; Ji, Weihong; Wei, Yanming; Zhang, Yanping; Gleeson, Dianne M; Lou, Zhongyu; Ren, Jing
2014-08-01
The endangered schizothoracine fish Gymnodiptychus pachycheilus is endemic to the Qinghai-Tibetan Plateau (QTP), but very little genetic information is available for this species. Here, we assessed the current genetic divergence of G. pachycheilus populations to evaluate how their distributions have been shaped by contemporary and historical processes. Population structure and demographic history were assessed by analyzing 1,811 base pairs of mitochondrial DNA from 61 individuals across a large proportion of the species' geographic range. Our results revealed low nucleotide diversity, suggesting severe historical bottleneck events. Analyses of molecular variance and the conventional population statistic FST (0.0435, P = 0.0215) confirmed weak genetic structure. The monophyly of G. pachycheilus was statistically well supported, while two divergent evolutionary clusters were identified by phylogenetic analyses, suggesting a microgeographic population structure. A consistent scenario of recent population expansion of the two clusters (0.096 Ma and 0.15 Ma) was identified based on several complementary analyses of demographic history. This genetic divergence and evolutionary process are likely to have resulted from a series of drainage rearrangements triggered by the historical tectonic events of the region. The results obtained here provide the first insights into the evolutionary history and genetic status of this little-known fish.
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling
Wood, John
2017-01-01
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
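The mixture-modeling idea above can be sketched with a tiny two-component EM fit in one dimension. The "power" sample below is synthetic (a low-powered cluster near 0.2 and a well-powered one near 0.8, numbers invented), and the fitter is a generic EM routine, not the authors' code; it simply illustrates why a single median can mask distinct subpopulations.

```python
# Sketch: EM for a two-component 1-D Gaussian mixture over study power values.
import math
import random

def em_gmm_1d(data, iters=200):
    """EM for a two-component 1-D Gaussian mixture; returns (weights, means, sds)."""
    w = [0.5, 0.5]
    mu = [min(data), max(data)]   # spread the initial means apart
    sd = [0.2, 0.2]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            dens = [w[k] / (sd[k] * math.sqrt(2 * math.pi))
                    * math.exp(-((x - mu[k]) ** 2) / (2 * sd[k] ** 2))
                    for k in range(2)]
            total = dens[0] + dens[1]
            resp.append([d / total for d in dens])
        # M-step: re-estimate weights, means, and standard deviations.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sd[k] = max(math.sqrt(var), 1e-3)
    return w, mu, sd

# Synthetic bimodal "statistical power" sample (made-up, clipped to (0, 1)).
random.seed(1)
power = ([min(max(random.gauss(0.2, 0.05), 0.01), 0.99) for _ in range(60)]
         + [min(max(random.gauss(0.8, 0.05), 0.01), 0.99) for _ in range(40)])
w, mu, sd = em_gmm_1d(power)
print("component means:", [round(m, 2) for m in sorted(mu)])
```

A single summary (median power here would sit near 0.2) says little; the fitted component means and weights expose the subcomponents directly.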
Kelley, George A.; Kelley, Kristi S.
2013-01-01
Purpose. Conduct a systematic review of previous meta-analyses addressing the effects of exercise in the treatment of overweight and obese children and adolescents. Methods. Previous meta-analyses of randomized controlled exercise trials that assessed adiposity in overweight and obese children and adolescents were included by searching nine electronic databases and cross-referencing from retrieved studies. Methodological quality was assessed using the Assessment of Multiple Systematic Reviews (AMSTAR) Instrument. The alpha level for statistical significance was set at P ≤ 0.05. Results. Of the 308 studies reviewed, two aggregate data meta-analyses representing 14 and 17 studies and 481 and 701 boys and girls met all eligibility criteria. Methodological quality was 64% and 73%. For both studies, statistically significant reductions in percent body fat were observed (P = 0.006 and P < 0.00001). The number-needed-to-treat (NNT) was 4 and 3, with an estimated 24.5 and 31.5 million overweight and obese children in the world potentially benefitting (2.8 and 3.6 million in the US). No other measures of adiposity (BMI-related measures, body weight, and central obesity) were statistically significant. Conclusions. Exercise is efficacious for reducing percent body fat in overweight and obese children and adolescents. Insufficient evidence exists to suggest that exercise reduces other measures of adiposity. PMID:24455215
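The number-needed-to-treat figure reported above follows from simple arithmetic: NNT is the reciprocal of the absolute risk reduction. The event rates below are invented for illustration, not taken from the meta-analyses.

```python
# Sketch: NNT = 1 / absolute risk reduction, rounded up to a whole person.
import math

def nnt(control_event_rate, treated_event_rate):
    """Number needed to treat for one additional person to benefit."""
    arr = control_event_rate - treated_event_rate  # absolute risk reduction
    return math.ceil(1 / arr)

# E.g. if 60% of controls fail to improve but only 30% of exercisers do,
# the absolute risk reduction is 0.30 and NNT = ceil(1/0.30) = 4.
print(nnt(0.60, 0.30))  # → 4
```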
ERIC Educational Resources Information Center
Burger, H. Robert
1983-01-01
Part 1 (SE 533 635) presented programs for use in mineralogy, petrology, and geochemistry. This part presents an annotated list of 64 additional programs, focusing on introductory geology, mapping, and statistical packages for geological analyses. A brief description, source, suggested use(s), programing language, and other information are…
ERIC Educational Resources Information Center
Bitler, Marianne; Domina, Thurston; Penner, Emily; Hoynes, Hilary
2015-01-01
We use quantile treatment effects estimation to examine the consequences of the random-assignment New York City School Choice Scholarship Program across the distribution of student achievement. Our analyses suggest that the program had negligible and statistically insignificant effects across the skill distribution. In addition to contributing to…
Grade Trend Analysis for a Credit-Bearing Library Instruction Course
ERIC Educational Resources Information Center
Guo, Shu
2015-01-01
Statistics suggest the prevalence of grade inflation nationwide, and researchers perform many analyses on student grades at both university and college levels. This analysis focuses on a one-credit library instruction course for undergraduate students at a large public university. The studies examine thirty semester GPAs and the percentages of As…
On "Rhyme, Language, and Children's Reading."
ERIC Educational Resources Information Center
Bowey, Judith A.
1990-01-01
Argues that the results detailed in Bryant, MacLean, and Bradley's 1990 article do not differ greatly from those reported earlier by Bowey and Patel, and suggests that discrepancies between the two statistical analyses reflect the relative size of simple correlations and are attributable to differences in the designs of the two studies. (30…
Health Benefits of Dietary Whole Grains: An Umbrella Review of Meta-analyses.
McRae, Marc P
2017-03-01
The purpose of this study is to review the effectiveness of the role of whole grain as a therapeutic agent in type 2 diabetes, cardiovascular disease, cancer, and obesity. An umbrella review of all published meta-analyses was performed. A PubMed search from January 1, 1980, to May 31, 2016, was conducted using the following search strategy: (whole grain OR whole grains) AND (meta-analysis OR systematic review). Only English language publications that provided quantitative statistical analysis on type 2 diabetes, cardiovascular disease, cancer, and weight loss were retrieved. Twenty-one meta-analyses were retrieved for inclusion in this umbrella review, and all the meta-analyses reported statistically significant positive benefits for reducing the incidence of type 2 diabetes (relative risk [RR] = 0.68-0.80), cardiovascular disease (RR = 0.63-0.79), and colorectal, pancreatic, and gastric cancers (RR = 0.57-0.94) and a modest effect on body weight, waist circumference, and body fat mass. Significant reductions in cardiovascular and cancer mortality were also observed (RR = 0.82 and 0.89, respectively). Some problems of heterogeneity, publication bias, and quality assessment were found among the studies. This review suggests that there is some evidence for dietary whole grain intake to be beneficial in the prevention of type 2 diabetes, cardiovascular disease, and colorectal, pancreatic, and gastric cancers. The potential benefits of these findings suggest that the consumption of 2 to 3 servings per day (~45 g) of whole grains may be a justifiable public health goal.
Metz, Anneke M
2008-01-01
There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9%, improvement p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.
Atmospheric Convective Organization: Self-Organized Criticality or Homeostasis?
NASA Astrophysics Data System (ADS)
Yano, Jun-Ichi
2015-04-01
Atmospheric convection tends to organize on a hierarchy of scales ranging from the mesoscale to the planetary scales, the latter manifested especially by the Madden-Julian oscillation. The present talk examines two major possible mechanisms of self-organization identified in the wider literature from a phenomenological thermodynamic point of view by analysing a planetary-scale cloud-resolving model simulation. The first mechanism is self-organized criticality. A saturation tendency of precipitation rate with increasing column-integrated water, reminiscent of critical phenomena, indicates self-organized criticality. The second is a self-regulation mechanism known as homeostasis in biology. A thermodynamic argument suggests that such self-regulation maintains the column-integrated water below a threshold by increasing the precipitation rate. Previous analyses of both observational data and cloud-resolving model (CRM) experiments give mixed results. A satellite data analysis suggests self-organized criticality. Some observational data as well as CRM experiments support homeostasis. Other analyses point to a combination of these two interpretations. In this study, a CRM experiment over a planetary-scale domain with a constant sea-surface temperature is analyzed. This analysis shows that the relation between the column-integrated total water and precipitation suggests self-organized criticality, whereas the one between the column-integrated water vapor and precipitation suggests homeostasis. The concurrent presence of these two mechanisms is further elaborated by detailed statistical and budget analyses. These statistics are scale invariant, reflecting a spatial scaling of precipitation processes. These self-organization mechanisms are most likely best understood theoretically through the energy cycle of the convective systems, consisting of the kinetic energy and the cloud-work function. 
The author has already investigated the behavior of this cycle system under a zero-dimensional configuration. Preliminary simulations of this cycle system over a two-dimensional domain will be presented.
The effect of noise-induced variance on parameter recovery from reaction times.
Vadillo, Miguel A; Garaizar, Pablo
2016-03-31
Technical noise can compromise the precision and accuracy of the reaction times collected in psychological experiments, especially in the case of Internet-based studies. Although this noise seems to have only a small impact on traditional statistical analyses, its effects on model fit to reaction-time distributions remain unexplored. Across four simulations we study the impact of technical noise on parameter recovery from data generated from an ex-Gaussian distribution and from a Ratcliff Diffusion Model. Our results suggest that the impact of noise-induced variance tends to be limited to specific parameters and conditions. Although we encourage researchers to adopt all measures to reduce the impact of noise on reaction-time experiments, we conclude that the typical amount of noise-induced variance found in these experiments does not pose substantial problems for statistical analyses based on model fitting.
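The ex-Gaussian setup can be sketched as follows: reaction times are a Gaussian component plus an exponential tail, and we check how added uniform "technical noise" shifts the recovered parameters. This is our simplification, not the authors' simulation code, and it uses simple moment-based estimators (third central moment = 2·tau³ for an ex-Gaussian) rather than the maximum-likelihood fitting a real study would use.

```python
# Sketch: ex-Gaussian reaction times with and without uniform technical noise.
import random
import statistics

random.seed(42)

def ex_gaussian(mu, sigma, tau, n):
    """Draw n ex-Gaussian reaction times (ms): Gaussian + exponential tail."""
    return [random.gauss(mu, sigma) + random.expovariate(1 / tau) for _ in range(n)]

def moment_estimates(rts):
    """Moment-based ex-Gaussian estimates: tau from the third moment, then mu, sigma."""
    n = len(rts)
    mean = statistics.fmean(rts)
    var = statistics.pvariance(rts)
    m3 = sum((x - mean) ** 3 for x in rts) / n
    tau = max(m3 / 2, 0.0) ** (1 / 3)        # E[(X-mean)^3] = 2 * tau^3
    mu = mean - tau                           # E[X] = mu + tau
    sigma = max(var - tau ** 2, 0.0) ** 0.5   # Var[X] = sigma^2 + tau^2
    return mu, sigma, tau

clean = ex_gaussian(mu=400, sigma=40, tau=100, n=20000)
noisy = [rt + random.uniform(0, 30) for rt in clean]  # e.g. display/polling lag

for label, sample in [("clean", clean), ("noisy", noisy)]:
    mu, sigma, tau = moment_estimates(sample)
    print(label, round(mu), round(sigma), round(tau))
```

With this kind of bounded uniform noise, mainly the Gaussian location shifts (by roughly the noise mean) while the exponential tail parameter is barely disturbed, which is consistent in spirit with the paper's conclusion that noise effects concentrate in specific parameters.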
Tonelli, Adriano R.; Zein, Joe; Adams, Jacob; Ioannidis, John P.A.
2014-01-01
Purpose Multiple interventions have been tested in acute respiratory distress syndrome (ARDS). We examined the entire agenda of published randomized controlled trials (RCTs) in ARDS that reported on mortality and of respective meta-analyses. Methods We searched PubMed, the Cochrane Library and Web of Knowledge until July 2013. We included RCTs in ARDS published in English. We excluded trials of newborns and children; and those on short-term interventions, ARDS prevention or post-traumatic lung injury. We also reviewed all meta-analyses of RCTs in this field that addressed mortality. Treatment modalities were grouped in five categories: mechanical ventilation strategies and respiratory care, enteral or parenteral therapies, inhaled/intratracheal medications, nutritional support and hemodynamic monitoring. Results We identified 159 published RCTs of which 93 had overall mortality reported (n=20,671 patients) - 44 trials (14,426 patients) reported mortality as a primary outcome. A statistically significant survival benefit was observed in 8 trials (7 interventions) and two trials reported an adverse effect on survival. Among RCTs with >50 deaths in at least 1 treatment arm (n=21), 2 showed a statistically significant mortality benefit of the intervention (lower tidal volumes and prone positioning), 1 showed a statistically significant mortality benefit only in adjusted analyses (cisatracurium) and 1 (high-frequency oscillatory ventilation) showed a significant detrimental effect. Across 29 meta-analyses, the most consistent evidence was seen for low tidal volumes and prone positioning in severe ARDS. Conclusions There is limited supportive evidence that specific interventions can decrease mortality in ARDS. While low tidal volumes and prone positioning in severe ARDS seem effective, most sporadic findings of interventions suggesting reduced mortality are not corroborated consistently in large-scale evidence including meta-analyses. PMID:24667919
Statistical learning of novel graphotactic constraints in children and adults.
Samara, Anna; Caravolas, Markéta
2014-05-01
The current study explored statistical learning processes in the acquisition of orthographic knowledge in school-aged children and skilled adults. Learning of novel graphotactic constraints on the position and context of letter distributions was induced by means of a two-phase learning task adapted from Onishi, Chambers, and Fisher (Cognition, 83 (2002) B13-B23). Following incidental exposure to pattern-embedding stimuli in Phase 1, participants' learning generalization was tested in Phase 2 with legality judgments about novel conforming/nonconforming word-like strings. Test phase performance was above chance, suggesting that both types of constraints were reliably learned even after relatively brief exposure. As hypothesized, signal detection theory d' analyses confirmed that learning permissible letter positions (d'=0.97) was easier than permissible neighboring letter contexts (d'=0.19). Adults were more accurate than children in all but a strict analysis of the contextual constraints condition. Consistent with the statistical learning perspective in literacy, our results suggest that statistical learning mechanisms contribute to children's and adults' acquisition of knowledge about graphotactic constraints similar to those existing in their orthography. Copyright © 2013 Elsevier Inc. All rights reserved.
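The d' values reported above come from signal detection theory: "hits" are correct endorsements of conforming strings and "false alarms" are endorsements of nonconforming ones. The sketch below uses the standard d' formula with a log-linear correction; the counts are invented, not the study's data.

```python
# Sketch: d' = z(hit rate) - z(false-alarm rate) for a legality-judgment task.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d', with a log-linear (add 0.5/add 1) correction so
    rates of exactly 0 or 1 do not produce infinite z-scores."""
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far)

# Made-up counts: 70/100 conforming strings endorsed vs 35/100
# nonconforming strings endorsed.
print(round(d_prime(70, 30, 35, 65), 2))
```

A d' near 0 means endorsements do not discriminate conforming from nonconforming strings (as in the children's strict contextual condition above, d' = 0.19), while larger values indicate reliable learning.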
Cluster detection methods applied to the Upper Cape Cod cancer data.
Ozonoff, Al; Webster, Thomas; Vieira, Veronica; Weinberg, Janice; Ozonoff, David; Aschengrau, Ann
2005-09-15
A variety of statistical methods have been suggested to assess the degree and/or the location of spatial clustering of disease cases. However, there is relatively little in the literature devoted to comparison and critique of different methods. Most of the available comparative studies rely on simulated data rather than real data sets. We have chosen three methods currently used for examining spatial disease patterns: the M-statistic of Bonetti and Pagano; the Generalized Additive Model (GAM) method as applied by Webster; and Kulldorff's spatial scan statistic. We apply these statistics to analyze breast cancer data from the Upper Cape Cancer Incidence Study using three different latency assumptions. The three different latency assumptions produced three different spatial patterns of cases and controls. For 20 year latency, all three methods generally concur. However, for 15 year latency and no latency assumptions, the methods produce different results when testing for global clustering. Comparative analyses of real data sets by different statistical methods provide insight into directions for further research. We suggest a research program designed around examining real data sets to guide focused investigation of relevant features using simulated data, for the purpose of understanding how to interpret statistical methods applied to epidemiological data with a spatial component.
Shriver, K A
1986-01-01
Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that rates of decline in economic depreciation may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
Bridging the Gap between Theory and Model: A Reflection on the Balance Scale Task.
ERIC Educational Resources Information Center
Turner, Geoffrey F. W.; Thomas, Hoben
2002-01-01
Focuses on individual strengths of articles by Jensen and van der Maas, and Halford et al., and the power of their combined perspectives. Suggests a performance model that can both evaluate specific theoretical claims and reveal important data features that had been previously obscured using conventional statistical analyses. Maintains that the…
A social network approach to understanding science communication among fire professionals
Vita Wright
2012-01-01
Studies of science communication and use in the fire management community suggest that managers access research via informal information networks and that these networks vary by both agency and position. We used a phone survey followed by traditional statistical analyses to understand the informal social networks of fire professionals in two western regions of the...
A social network approach to understanding science communication among fire professionals (Abstract)
Vita Wright; Andrea Thode; Anne Mottek-Lucas; Jacklynn Fallon; Megan Matonis
2012-01-01
Studies of science communication and use in the fire management community suggest that managers access research via informal information networks and that these networks vary by both agency and position. We used a phone survey followed by traditional statistical analyses to understand the informal social networks of fire professionals in two western regions of the...
The intervals method: a new approach to analyse finite element outputs using multivariate statistics
De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep
2017-01-01
Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107
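The intervals' method described above reduces to a simple computation per specimen: bin the element-wise stress values into intervals and record the percentage of total area falling in each bin. The sketch below is a schematic reconstruction with toy numbers, not the authors' code or data.

```python
# Sketch: intervals' method variables from finite-element output.
def interval_variables(stresses, areas, breaks):
    """Percentage of total area per right-open stress interval
    [breaks[i], breaks[i+1]); breaks[-1] bounds the last bin."""
    total = sum(areas)
    percents = [0.0] * (len(breaks) - 1)
    for s, a in zip(stresses, areas):
        for i in range(len(breaks) - 1):
            if breaks[i] <= s < breaks[i + 1]:
                percents[i] += 100.0 * a / total
                break
    return percents

# Toy finite-element output: stress per element (MPa) and element area (mm^2).
stress = [0.5, 1.2, 2.8, 3.1, 0.9, 4.5]
area = [2.0, 1.0, 1.0, 2.0, 2.0, 2.0]
print(interval_variables(stress, area, breaks=[0, 1, 2, 3, 5]))
# → [40.0, 10.0, 10.0, 40.0]
```

One such row of percentages per mandible yields a specimen-by-interval matrix that can then feed standard multivariate methods (e.g. PCA or discriminant analysis), which is what makes the approach mesh-independent: only areas and stress values enter, not element counts.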
Varma, Rajesh; Gupta, Janesh K
2006-01-01
There is considerable evidence to show an association between genital tract infections, such as bacterial vaginosis (BV), and preterm delivery (PTD). Meta-analyses to date have shown screening and treating BV in pregnancy does not prevent PTD. This casts doubt on a cause and effect relationship between BV and PTD. However, the meta-analyses reported significant clinical, methodological and statistical heterogeneity of the included studies. We therefore undertook a repeat meta-analysis, included recently published trials, and applied strict criteria on data extraction. We meta-analysed low and high-risk pregnancies separately. We found that screening and treating BV in low-risk pregnancies produced a statistically significant reduction in spontaneous PTD (RR 0.73; 95% CI 0.55-0.98). This beneficial effect was not observed in high-risk or combined risk groups. The differences in antibiotic sensitivity between high and low risk groups may suggest differing causal contributions of the infectious process to PTD. The evidence, along with prior knowledge of differing predisposing factors and prognosis between these risk groups, supports the hypothesis that PTD in high and low risk pregnant women are different entities and not linear extremes of the same syndrome.
Changing response of the North Atlantic/European winter climate to the 11 year solar cycle
NASA Astrophysics Data System (ADS)
Ma, Hedi; Chen, Haishan; Gray, Lesley; Zhou, Liming; Li, Xing; Wang, Ruili; Zhu, Siguang
2018-03-01
Recent studies have presented conflicting results regarding the 11 year solar cycle (SC) influences on winter climate over the North Atlantic/European region. Analyses of only the most recent decades suggest a synchronized North Atlantic Oscillation (NAO)-like response pattern to the SC. Analyses of long-term climate data sets dating back to the late 19th century, however, suggest a mean sea level pressure (mslp) response that lags the SC by 2-4 years in the southern node of the NAO (i.e. Azores region). To understand the conflicting nature and cause of these time dependencies in the SC surface response, the present study employs a lead/lag multi-linear regression technique with a sliding window of 44 years over the period 1751-2016. Results confirm previous analyses, in which the average response for the whole time period features a statistically significant 2-4 year lagged mslp response centered over the Azores region. Overall, the lagged nature of Azores mslp response is generally consistent in time. Stronger and statistically significant SC signals tend to appear in the periods when the SC forcing amplitudes are relatively larger. Individual month analysis indicates the consistent lagged response in December-January-February average arises primarily from early winter months (i.e. December and January), which has been associated with ocean feedback processes that involve reinforcement by anomalies from the previous winter. Additional analysis suggests that the synchronous NAO-like response in recent decades arises primarily from late winter (February), possibly reflecting a result of strong internal noise.
Personal use of hair dyes and the risk of bladder cancer: results of a meta-analysis.
Huncharek, Michael; Kupelnick, Bruce
2005-01-01
OBJECTIVE: This study examined the methodology of observational studies that explored an association between personal use of hair dye products and the risk of bladder cancer. METHODS: Data were pooled from epidemiological studies using a general variance-based meta-analytic method that employed confidence intervals. The outcome of interest was a summary relative risk (RR) reflecting the risk of bladder cancer development associated with use of hair dye products vs. non-use. Sensitivity analyses were performed to explain any observed statistical heterogeneity and to explore the influence of specific study characteristics on the summary estimate of effect. RESULTS: Initially combining homogenous data from six case-control and one cohort study yielded a non-significant RR of 1.01 (0.92, 1.11), suggesting no association between hair dye use and bladder cancer development. Sensitivity analyses examining the influence of hair dye type, color, and study design on this suspected association showed that uncontrolled confounding and design limitations contributed to a spurious non-significant summary RR. The sensitivity analyses yielded statistically significant RRs ranging from 1.22 (1.11, 1.51) to 1.50 (1.30, 1.98), indicating that personal use of hair dye products increases bladder cancer risk by 22% to 50% vs. non-use. CONCLUSION: The available epidemiological data suggest an association between personal use of hair dye products and increased risk of bladder cancer. PMID:15736329
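A hedged sketch of the variance-based pooling such meta-analyses use: study relative risks are combined on the log scale with inverse-variance weights recovered from each study's 95% confidence interval. The study numbers below are invented for illustration, not those of the hair-dye analysis.

```python
# Sketch: fixed-effect inverse-variance pooling of relative risks.
import math

def pooled_rr(studies):
    """studies: list of (RR, lower95, upper95). Returns (pooled RR, lo95, hi95)."""
    num = den = 0.0
    for rr, lo, hi in studies:
        # SE of log RR recovered from the CI width: (ln hi - ln lo) / (2 * 1.96)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pooled),
            math.exp(log_rr + 1.96 * se_pooled))

# Invented example studies: (RR, 95% CI lower, 95% CI upper).
studies = [(1.10, 0.90, 1.35), (0.95, 0.80, 1.13), (1.30, 1.05, 1.61)]
rr, lo, hi = pooled_rr(studies)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A pooled CI that straddles 1.0, as here, is exactly the "non-significant summary RR" situation the abstract describes; sensitivity analysis then asks whether subsets of studies (by dye type, design, etc.) pool to a CI that excludes 1.0.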
Predation and fragmentation portrayed in the statistical structure of prey time series
Hendrichsen, Ditte K; Topping, Chris J; Forchhammer, Mads C
2009-01-01
Background Statistical autoregressive analyses of direct and delayed density dependence are widespread in ecological research. The models suggest that changes in ecological factors affecting density dependence, like predation and landscape heterogeneity, are directly portrayed in the first and second order autoregressive parameters, and the models are therefore used to decipher complex biological patterns. However, independent tests of model predictions are complicated by the inherent variability of natural populations, where differences in landscape structure, climate or species composition prevent controlled repeated analyses. To circumvent this problem, we applied second-order autoregressive time series analyses to data generated by a realistic agent-based computer model. The model simulated life history decisions of individual field voles under controlled variations in predator pressure and landscape fragmentation. Analyses were made on three levels: comparisons between predated and non-predated populations, between populations exposed to different types of predators and between populations experiencing different degrees of habitat fragmentation. Results The results are unambiguous: Changes in landscape fragmentation and the numerical response of predators are clearly portrayed in the statistical time series structure as predicted by the autoregressive model. Populations without predators displayed significantly stronger negative direct density dependence than did those exposed to predators, where direct density dependence was only moderately negative. The effects of predation versus no predation had an even stronger effect on the delayed density dependence of the simulated prey populations. In non-predated prey populations, the coefficients of delayed density dependence were distinctly positive, whereas they were negative in predated populations. 
Similarly, increasing the degree of fragmentation of optimal habitat available to the prey was accompanied with a shift in the delayed density dependence, from strongly negative to gradually becoming less negative. Conclusion We conclude that statistical second-order autoregressive time series analyses are capable of deciphering interactions within and across trophic levels and their effect on direct and delayed density dependence. PMID:19419539
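The second-order autoregressive analysis above can be sketched as follows: fit x_t = a1·x_{t-1} + a2·x_{t-2} + e_t by ordinary least squares and read direct (a1) and delayed (a2) density dependence off the coefficients. This is our simplification on a simulated series with known coefficients, not the paper's agent-based model.

```python
# Sketch: OLS fit of an AR(2) model to a simulated log-abundance series.
import random

def fit_ar2(x):
    """OLS estimates of (a1, a2) for a zero-mean AR(2) series."""
    # Normal equations for regressing x_t on (x_{t-1}, x_{t-2}).
    s11 = s22 = s12 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t - 1] * x[t - 1]
        s22 += x[t - 2] * x[t - 2]
        s12 += x[t - 1] * x[t - 2]
        b1 += x[t] * x[t - 1]
        b2 += x[t] * x[t - 2]
    det = s11 * s22 - s12 * s12
    return ((b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det)

# Simulate a stationary AR(2) series with known coefficients.
random.seed(3)
a1_true, a2_true = 0.6, -0.3
x = [0.0, 0.0]
for _ in range(5000):
    x.append(a1_true * x[-1] + a2_true * x[-2] + random.gauss(0, 1))

a1, a2 = fit_ar2(x)
print(round(a1, 2), round(a2, 2))
```

The sign and magnitude of the recovered a2 is the quantity the study tracks: shifts from negative toward positive delayed density dependence as predation is removed or fragmentation changes.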
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable impact has been made in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
Dynamic systems approaches and levels of analysis in the nervous system
Parker, David; Srivastava, Vipin
2013-01-01
Various analyses are applied to physiological signals. While epistemological diversity is necessary to address effects at different levels, there is often a sense of competition between analyses rather than integration. This is evidenced by the differences in the criteria needed to claim understanding in different approaches. In the nervous system, neuronal analyses that attempt to explain network outputs in cellular and synaptic terms are rightly criticized as being insufficient to explain global effects, emergent or otherwise, while higher-level statistical and mathematical analyses can provide quantitative descriptions of outputs but can only hypothesize on their underlying mechanisms. The major gap in neuroscience is arguably our inability to translate what should be seen as complementary effects between levels. We thus ultimately need approaches that allow us to bridge between different spatial and temporal levels. Analytical approaches derived from critical phenomena in the physical sciences are increasingly being applied to physiological systems, including the nervous system, and claim to provide novel insight into physiological mechanisms and opportunities for their control. Analyses of criticality have suggested several important insights that should be considered in cellular analyses. However, there is a mismatch between lower-level neurophysiological approaches and statistical phenomenological analyses that assume that lower-level effects can be abstracted away, an assumption that leaves these effects unknown or inaccessible to experimentalists. As a result, experimental designs often generate data that are insufficient for analyses of criticality.
This review considers the relevance of insights from analyses of criticality to neuronal network analyses, and highlights that to move the analyses forward and close the gap between the theoretical and neurobiological levels, it is necessary to consider that effects at each level are complementary rather than in competition. PMID:23386835
Association between sleep difficulties as well as duration and hypertension: is BMI a mediator?
Carrillo-Larco, R M; Bernabe-Ortiz, A; Sacksteder, K A; Diez-Canseco, F; Cárdenas, M K; Gilman, R H; Miranda, J J
2017-01-01
Sleep difficulties and short sleep duration have been associated with hypertension. Though body mass index (BMI) may be a mediator variable, the mediation effect has not been defined. We aimed to assess the association between sleep duration and sleep difficulties with hypertension, to determine if BMI is a mediator variable, and to quantify the mediation effect. We conducted a mediation analysis and calculated prevalence ratios with 95% confidence intervals. The exposure variables were sleep duration and sleep difficulties, and the outcome was hypertension. Sleep difficulties were statistically significantly associated with a 43% higher prevalence of hypertension in multivariable analyses; results were not statistically significant for sleep duration. In these analyses, and in sex-specific subgroup analyses, we found no strong evidence that BMI mediated the association between sleep indices and risk of hypertension. Our findings suggest that BMI does not appear to mediate the association between sleep patterns and hypertension. These results highlight the need to further study the mechanisms underlying the relationship between sleep patterns and cardiovascular risk factors.
The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.
Christensen, G B; Knight, S; Camp, N J
2009-11-01
We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
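As described above, the sumLINK and sumLOD statistics reduce to simple filtered sums over per-pedigree multipoint LOD scores. A minimal sketch follows; the nominal-significance threshold of LOD ≥ 0.588 (pointwise p ≈ 0.05) is an assumed convention, and the toy scores are invented:

```python
def sumlink(lods, threshold=0.588):
    """Sum of LOD scores over the subset of pedigrees showing nominally
    significant linkage at the locus (LOD >= threshold)."""
    return sum(l for l in lods if l >= threshold)

def sumlod(lods):
    """Companion statistic: sum of all positive LOD scores."""
    return sum(l for l in lods if l > 0)

# Hypothetical per-pedigree multipoint LODs at one locus
pedigree_lods = [1.2, -0.4, 0.3, 0.9, -1.1, 0.6]
print(sumlink(pedigree_lods))  # ≈ 2.7 (1.2 + 0.9 + 0.6)
print(sumlod(pedigree_lods))   # ≈ 3.0 (1.2 + 0.3 + 0.9 + 0.6)
```

Note how unlinked pedigrees (negative LODs) contribute nothing, which is the point of the statistic: consistency across linked pedigrees is rewarded without penalty from negative evidence. Significance in the paper comes from the genome-shuffling permutation procedure, not from the raw sums.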
Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.
Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P
2017-08-23
Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered-some very seriously so-but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.
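The mixture-modeling idea above can be sketched with a minimal 1-D expectation-maximization fit of a two-component Gaussian mixture to a sample of study-level power estimates. This is an illustrative reimplementation under assumed toy data, not the authors' analysis:

```python
import numpy as np

def fit_two_gaussians(x, iters=200):
    """Minimal 1-D EM for a two-component Gaussian mixture.
    Returns (weights, means, standard deviations)."""
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])          # spread-out initial means
    sd = np.array([x.std(), x.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means and standard deviations
        n_k = resp.sum(axis=0)
        w = n_k / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k) + 1e-6
    return w, mu, sd

# Toy "power" sample: one seriously underpowered and one well-powered subgroup
rng = np.random.default_rng(0)
power = np.concatenate([rng.normal(0.12, 0.04, 300), rng.normal(0.75, 0.08, 150)])
w, mu, sd = fit_two_gaussians(np.clip(power, 0.01, 0.99))
print(sorted(mu))
```

The recovered component means and weights convey what a single median cannot: the sample splits into a large low-power subcomponent and a smaller high-power one.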
Bulgari, Daniela; Casati, Paola; Crepaldi, Paola; Daffonchio, Daniele; Quaglino, Fabio; Brusetti, Lorenzo; Bianco, Piero Attilio
2011-01-01
Length heterogeneity-PCR assays, combined with statistical analyses, highlighted that the endophytic bacterial community associated with healthy grapevines was characterized by a greater diversity than that present in diseased and recovered plants. The findings suggest that phytoplasmas can restructure the bacterial community by selecting endophytic strains that could elicit a plant defense response. PMID:21622794
Environmental Studies: Mathematical, Computational and Statistical Analyses
1993-03-03
mathematical analysis addresses the seasonally and longitudinally averaged circulation which is under the influence of a steady forcing located asymmetrically...employed, as has been suggested for some situations. A general discussion of how interfacial phenomena influence both the original contamination process...describing the large-scale advective and dispersive behaviour of contaminants transported by groundwater and the uncertainty associated with field-scale
R as a Lingua Franca: Advantages of Using R for Quantitative Research in Applied Linguistics
ERIC Educational Resources Information Center
Mizumoto, Atsushi; Plonsky, Luke
2016-01-01
In this article, we suggest that using R, a statistical software environment, is advantageous for quantitative researchers in applied linguistics. We first provide a brief overview of the reasons why R is popular among researchers in other fields and why we recommend its use for analyses in applied linguistics. In order to illustrate these…
Hagell, Peter; Westergren, Albert
Sample size is a major factor in statistical null hypothesis testing, which is the basis for many approaches to testing Rasch model fit. Few sample size recommendations for testing fit to the Rasch model concern the Rasch Unidimensional Measurement Models (RUMM) software, which features chi-square and ANOVA/F-ratio based fit statistics, including Bonferroni and algebraic sample size adjustments. This paper explores the occurrence of Type I errors with RUMM fit statistics, and the effects of algebraic sample size adjustments. Data simulated to fit the Rasch model, for 25-item dichotomous scales with sample sizes ranging from N = 50 to N = 2500, were analysed with and without algebraically adjusted sample sizes. Results suggest the occurrence of Type I errors with N ≤ 500, and that Bonferroni correction as well as downward algebraic sample size adjustment are useful to avoid such errors, whereas upward adjustment of smaller samples falsely signals misfit. Our observations suggest that sample sizes around N = 250 to N = 500 may provide a good balance for the statistical interpretation of the RUMM fit statistics studied here with respect to Type I errors and under the assumption of Rasch model fit within the examined frame of reference (i.e., about 25 item parameters well targeted to the sample).
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Applying Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as example the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215
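The power calculations this abstract relies on can be approximated with a short normal-approximation formula for a two-sided, two-sample comparison. This is a generic sketch (fixed α = 0.05), not the authors' exact procedure:

```python
from math import sqrt, erf

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def two_sample_power(d, n_per_group):
    """Approximate power of a two-sided, two-sample test at alpha = 0.05
    (normal approximation) for standardized effect size d."""
    z_crit = 1.959963984540054            # two-sided critical value at alpha = 0.05
    ncp = d * sqrt(n_per_group / 2)       # noncentrality under the alternative
    return 1 - norm_cdf(z_crit - ncp) + norm_cdf(-z_crit - ncp)

# Classic benchmark: d = 0.5 needs roughly 64 participants per group for 80% power
print(round(two_sample_power(0.5, 64), 2))  # ≈ 0.81
print(round(two_sample_power(0.2, 20), 2))  # small effect, small n: very low power
```

Running such numbers against the typical ES and sample size of a literature is exactly the kind of audit the paper performs: when the typical study's power is, say, 0.10, repeated "failed replications" are the expected outcome even if the effect is real.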
Maggi, Federico; Bosco, Domenico; Galetto, Luciana; Palmano, Sabrina; Marzachì, Cristina
2017-01-01
Analyses of space-time statistical features of a flavescence dorée (FD) epidemic in Vitis vinifera plants are presented. FD spread was surveyed from 2011 to 2015 in a vineyard of 17,500 m2 surface area in the Piemonte region, Italy; count and position of symptomatic plants were used to test the hypothesis of epidemic Complete Spatial Randomness and isotropicity in the space-time static (year-by-year) point pattern measure. Space-time dynamic (year-to-year) point pattern analyses were applied to newly infected and recovered plants to highlight statistics of FD progression and regression over time. Results highlighted point patterns ranging from disperse (at small scales) to aggregated (at large scales) over the years, suggesting that the FD epidemic is characterized by multiscale properties that may depend on infection incidence, vector population, and flight behavior. Dynamic analyses showed moderate preferential progression and regression along rows. Nearly uniform distributions of direction and negative exponential distributions of distance of newly symptomatic and recovered plants relative to existing symptomatic plants highlighted features of vector mobility similar to Brownian motion. This evidence indicates that space-time epidemics modeling should include environmental setting (e.g., vineyard geometry and topography) to capture anisotropicity as well as statistical features of vector flight behavior, plant recovery and susceptibility, and plant mortality. PMID:28111581
Hinman, Sarah E; Blackburn, Jason K; Curtis, Andrew
2006-01-01
Background To better understand the distribution of typhoid outbreaks in Washington, D.C., the U.S. Public Health Service (PHS) conducted four investigations of typhoid fever. These studies included maps of cases reported between 1 May – 31 October 1906 – 1909. These data were entered into a GIS database and analyzed using Ripley's K-function followed by the Gi* statistic in yearly intervals to evaluate spatial clustering, the scale of clustering, and the temporal stability of these clusters. Results The Ripley's K-function indicated no global spatial autocorrelation. The Gi* statistic indicated clustering of typhoid at multiple scales across the four year time period, refuting the conclusions drawn in all four PHS reports concerning the distribution of cases. While the PHS reports suggested an even distribution of the disease, this study quantified both areas of localized disease clustering, as well as mobile larger regions of clustering, indicating both highly localized and periodic generalized sources of infection within the city. Conclusion The methodology applied in this study was useful for evaluating the spatial distribution and annual-level temporal patterns of typhoid outbreaks in Washington, D.C. from 1906 to 1909. While advanced spatial analyses of historical data sets must be interpreted with caution, this study does suggest that there is utility in these types of analyses and that they provide new insights into the urban patterns of typhoid outbreaks during the early part of the twentieth century. PMID:16566830
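The clustering screen described above can be illustrated with a naive Ripley's K estimate (no edge correction): under Complete Spatial Randomness K(r) ≈ πr², and values well above that indicate clustering at scale r. The toy coordinates are invented for demonstration:

```python
from math import hypot, pi

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction): area-scaled count
    of ordered point pairs within distance r, divided by n^2."""
    n = len(points)
    pairs = sum(
        1
        for i, (xi, yi) in enumerate(points)
        for j, (xj, yj) in enumerate(points)
        if i != j and hypot(xi - xj, yi - yj) <= r
    )
    return area * pairs / (n * n)

# Two tight clumps of cases in a unit-square study area
cluster_a = [(0.10 + dx, 0.10 + dy) for dx in (0, 0.01) for dy in (0, 0.01, 0.02)]
cluster_b = [(0.90 + dx, 0.90 + dy) for dx in (0, 0.01) for dy in (0, 0.01, 0.02)]
pts = cluster_a + cluster_b
print(ripley_k(pts, 0.05, 1.0), pi * 0.05 ** 2)  # observed K far exceeds the CSR value
```

K summarizes clustering globally across scales; the Gi* statistic used in the study complements it by locating *where* the hot spots are, which is why the two are applied in sequence.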
Observational Word Learning: Beyond Propose-But-Verify and Associative Bean Counting.
Roembke, Tanja; McMurray, Bob
2016-04-01
Learning new words is difficult. In any naming situation, there are multiple possible interpretations of a novel word. Recent approaches suggest that learners may solve this problem by tracking co-occurrence statistics between words and referents across multiple naming situations (e.g. Yu & Smith, 2007), overcoming the ambiguity in any one situation. Yet, there remains debate around the underlying mechanisms. We conducted two experiments in which learners acquired eight word-object mappings using cross-situational statistics while eye-movements were tracked. These addressed four unresolved questions regarding the learning mechanism. First, eye-movements during learning showed evidence that listeners maintain multiple hypotheses for a given word and bring them all to bear in the moment of naming. Second, trial-by-trial analyses of accuracy suggested that listeners accumulate continuous statistics about word/object mappings, over and above prior hypotheses they have about a word. Third, consistent, probabilistic context can impede learning, as false associations between words and highly co-occurring referents are formed. Finally, a number of factors not considered in prior analyses impact observational word learning: knowledge of the foils, spatial consistency of the target object, and the number of trials between presentations of the same word. This evidence suggests that observational word learning may derive from a combination of gradual statistical or associative learning mechanisms and more rapid real-time processes such as competition, mutual exclusivity and even inference or hypothesis testing.
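The associative side of cross-situational learning described above can be sketched as simple co-occurrence counting across ambiguous naming trials: each word ends up mapped to the object it co-occurred with most often. The words, objects and trials below are invented for illustration:

```python
from collections import defaultdict

def learn_cooccurrence(trials):
    """Associative sketch of cross-situational word learning: accumulate
    word-object co-occurrence counts over ambiguous naming trials, then
    map each word to its most frequently co-occurring object."""
    counts = defaultdict(lambda: defaultdict(int))
    for words, objects in trials:
        for w in words:
            for o in objects:
                counts[w][o] += 1
    return {w: max(objs, key=objs.get) for w, objs in counts.items()}

# Each trial: a spoken novel word plus several candidate referents
trials = [
    (["blick"], ["ball", "cup"]),
    (["blick"], ["ball", "dog"]),
    (["dax"],   ["cup", "dog"]),
    (["dax"],   ["cup", "ball"]),
]
print(learn_cooccurrence(trials))  # {'blick': 'ball', 'dax': 'cup'}
```

No single trial disambiguates either word, yet the aggregate statistics do; the paper's point is that human learners appear to combine this kind of gradual accumulation with faster processes such as competition and mutual exclusivity.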
2013-01-01
Background The publication of protocols by medical journals is increasingly becoming an accepted means for promoting good quality research and maximising transparency. Recently, Finfer and Bellomo have suggested the publication of statistical analysis plans (SAPs). The aim of this paper is to make public and to report in detail the planned analyses that were approved by the Trial Steering Committee in May 2010 for the principal papers of the PACE (Pacing, graded Activity, and Cognitive behaviour therapy: a randomised Evaluation) trial, a treatment trial for chronic fatigue syndrome. It illustrates planned analyses of a complex intervention trial that allows for the impact of clustering by care providers, where multiple care-providers are present for each patient in some but not all arms of the trial. Results The trial design, objectives and data collection are reported. Considerations relating to blinding, samples, adherence to the protocol, stratification, centre and other clustering effects, missing data, multiplicity and compliance are described. Descriptive, interim and final analyses of the primary and secondary outcomes are then outlined. Conclusions This SAP maximises transparency, providing a record of all planned analyses, and it may be a resource for those who are developing SAPs, acting as an illustrative example for teaching and methodological research. It is not the sum of the statistical analysis sections of the principal papers, being completed well before individual papers were drafted. Trial registration ISRCTN54285094 assigned 22 May 2003; First participant was randomised on 18 March 2005. PMID:24225069
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Improving surveillance for injuries associated with potential motor vehicle safety defects
Whitfield, R; Whitfield, A
2004-01-01
Objective: To improve surveillance for deaths and injuries associated with potential motor vehicle safety defects. Design: Vehicles in fatal crashes can be studied for indications of potential defects using an "early warning" surveillance statistic previously suggested for screening reports of adverse drug reactions. This statistic is illustrated with time series data for fatal, tire related and fire related crashes. Geographic analyses are used to augment the tire related statistics. Results: A statistical criterion based on the Poisson distribution that tests the likelihood of an expected number of events, given the number of events that actually occurred, is a promising method that can be readily adapted for use in injury surveillance. Conclusions: Use of the demonstrated techniques could have helped to avert a well known injury surveillance failure. This method is adaptable to aid in the direction of engineering and statistical reviews to prevent deaths and injuries associated with potential motor vehicle safety defects using available databases. PMID:15066972
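The Poisson screening criterion described above asks how likely it is to see at least the observed number of events given the expected count. A self-contained sketch (the flagging threshold and toy counts are assumptions, not values from the paper):

```python
from math import exp

def poisson_upper_tail(k, mu):
    """P(X >= k) for X ~ Poisson(mu), computed exactly by summing the
    pmf of the complementary event 0..k-1 (no external libraries)."""
    if k <= 0:
        return 1.0
    pmf, cdf = exp(-mu), 0.0
    for i in range(k):
        cdf += pmf
        pmf *= mu / (i + 1)
    return 1.0 - cdf

def flag(observed, expected, alpha=0.01):
    """Early-warning screen: flag a vehicle/component combination whose
    observed event count is improbably high under its expectation."""
    return poisson_upper_tail(observed, expected) < alpha

# 15 tire-related fatal crashes where ~3 were expected: flagged.
# 4 where ~3 were expected: consistent with chance.
print(flag(15, 3.0), flag(4, 3.0))  # True False
```

The appeal of this criterion for surveillance is that it needs only an observed count and an expected count per cell, so it can be run routinely over existing crash databases.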
Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko
2014-07-01
Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. 
The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
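For disk-approximated colonies, the minimum (edge-to-edge) distance the authors advocate differs from the centre-to-centre distance by exactly the two radii. A minimal sketch with invented coordinates:

```python
from math import hypot

def min_distance(c1, r1, c2, r2):
    """Minimum edge-to-edge distance between two disk-approximated,
    non-overlapping colonies; 0 if they touch or overlap."""
    centre_gap = hypot(c1[0] - c2[0], c1[1] - c2[1])
    return max(0.0, centre_gap - r1 - r2)

# Two colonies: centres 10 cm apart, radii 2 cm and 3 cm.
# Centre-to-centre statistics would report 10; the interobject spacing is 5.
print(min_distance((0, 0), 2, (10, 0), 3))  # 5.0
```

This is why centre-based point process statistics inflate apparent aggregation scales for large colonies: the colony radii are folded into every pairwise distance.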
Stewart, Gavin B.; Altman, Douglas G.; Askie, Lisa M.; Duley, Lelia; Simmonds, Mark C.; Stewart, Lesley A.
2012-01-01
Background Individual participant data (IPD) meta-analyses that obtain “raw” data from studies rather than summary data typically adopt a “two-stage” approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of “one-stage” approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare “two-stage” and “one-stage” models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way. Methods and Findings We included data from 24 randomised controlled trials, evaluating antiplatelet agents, for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using anti-platelets (Relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of women benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model. Conclusions For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure and are useful where across study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights provided by their usage may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled trials. 
Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials. PMID:23056232
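Stage two of the "two-stage" approach discussed above is standard inverse-variance pooling of per-trial summary measures. A fixed-effect sketch on hypothetical per-trial log relative risks (the numbers are illustrative, not the trial data):

```python
from math import exp, sqrt

def pool_fixed_effect(log_rrs, variances):
    """Stage two of a two-stage IPD meta-analysis: fixed-effect
    inverse-variance pooling of per-trial log relative risks.
    Returns (pooled RR, 95% CI)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, log_rrs)) / sum(weights)
    se = sqrt(1.0 / sum(weights))
    ci = (exp(pooled - 1.96 * se), exp(pooled + 1.96 * se))
    return exp(pooled), ci

# Three hypothetical trials: log relative risks and their variances
rr, ci = pool_fixed_effect([-0.15, -0.05, -0.12], [0.004, 0.002, 0.008])
print(round(rr, 2))  # ≈ 0.92
```

The "one-stage" alternative instead fits a single (typically mixed-effects) model to all participant-level records at once, which is what requires the extra statistical support the abstract weighs against the modest gains observed.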
Wallace, Lorraine S; Chisolm, Deena J; Abdel-Rasoul, Mahmoud; DeVoe, Jennifer E
2013-08-01
This study examined adults' self-reported understanding and formatting preferences of medical statistics, confidence in self-care and ability to obtain health advice or information, and perceptions of patient-health-care provider communication measured through dual survey modes (random digital dial and mail). Even while controlling for sociodemographic characteristics, significant differences in regard to adults' responses to survey variables emerged as a function of survey mode. While the analyses do not allow us to pinpoint the underlying causes of the differences observed, they do suggest that mode of administration should be carefully adjusted for and considered.
NASA Astrophysics Data System (ADS)
Amorese, D.; Grasso, J.-R.; Garambois, S.; Font, M.
2018-05-01
The rank-sum multiple change-point method is a robust statistical procedure designed to search for the optimal number and the location of change points in an arbitrary continuous or discrete sequence of values. As such, this procedure can be used to analyse time-series data. Twelve years of robust data sets for the Séchilienne (French Alps) rockslide show a continuous increase in average displacement rate from 50 to 280 mm per month, in the 2004-2014 period, followed by a strong decrease back to 50 mm per month in the 2014-2015 period. Where previous studies tentatively suggest possible kinematic phases, they rely solely on empirical threshold values. In this paper, we analyse how the use of a statistical algorithm for change-point detection helps to better understand time phases in landslide kinematics. First, we test the efficiency of the statistical algorithm on geophysical benchmark data, these data sets (stream flows and Northern Hemisphere temperatures) being already analysed by independent statistical tools. Second, we apply the method to 12-yr daily time-series of the Séchilienne landslide, for rainfall and displacement data, from 2003 December to 2015 December, in order to quantitatively extract changes in landslide kinematics. We find two strong significant discontinuities in the weekly cumulated rainfall values: an average rainfall rate increase is resolved in 2012 April and a decrease in 2014 August. Four robust changes are highlighted in the displacement time-series (2008 May, 2009 November-December-2010 January, 2012 September and 2014 March), the 2010 one being preceded by a significant but weak rainfall rate increase (in 2009 November). Accordingly, we are able to quantitatively define five kinematic stages for the Séchilienne rock avalanche during this period.
The synchronization between the rainfall and displacement rate, only resolved at the end of 2009 and beginning of 2010, corresponds to a remarkable change (fourfold increase in mean displacement rate) in the landslide kinematics. This suggests that an increase of the rainfall is able to drive an increase of the landslide displacement rate, but that most of the kinematics of the landslide is not directly attributable to rainfall amount. The detailed exploration of the characteristics of the five kinematic stages suggests that the weekly averaged displacement rates are more tied to the frequency of rainy days than to the rainfall rate values. These results suggest the pattern of the Séchilienne rock avalanche is consistent with the previous findings that landslide kinematics is dependent upon not only rainfall but also soil moisture conditions (which are known to be more strongly related to precipitation frequency than to precipitation amount). Finally, our analysis of the displacement rate time-series pinpoints a change in the susceptibility of the slope response to rainfall, which was slower before the end of 2009 than after. The kinematic history as depicted by statistical tools opens new routes to understand the apparent complexity of the Séchilienne landslide kinematics.
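The rank-based change-point idea can be illustrated for the single-change-point case with a standardized rank-sum (Wilcoxon/Pettitt-style) scan: for each candidate split, compare the rank sum of the left segment with its expectation under no change. This is a simplified sketch, not the multiple-change-point algorithm of the paper, and the toy displacement rates are invented:

```python
def rank_change_point(x):
    """Single change-point search: for each split k, standardize the
    rank sum of x[:k] against its no-change expectation and variance,
    and return (best split index, its z score). No tie handling."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    ranks = [0.0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    best_k, best_z = 1, 0.0
    for k in range(1, n):
        w = sum(ranks[:k])
        mean = k * (n + 1) / 2.0
        var = k * (n - k) * (n + 1) / 12.0
        z = (w - mean) / var ** 0.5
        if abs(z) > abs(best_z):
            best_k, best_z = k, z
    return best_k, best_z

# Toy monthly displacement rates (mm/month): quiet phase, then acceleration
rates = [50, 55, 48, 52, 51, 250, 270, 260, 255, 265]
print(rank_change_point(rates)[0])  # 5: split after the fifth value
```

Because the statistic uses ranks rather than raw values, it is robust to outliers and to the skewed distributions typical of rainfall and displacement series, which is the property that motivates rank-based detection here.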
Contextual analysis of fluid intelligence.
Salthouse, Timothy A; Pink, Jeffrey E; Tucker-Drob, Elliot M
2008-01-01
The nature of fluid intelligence was investigated by identifying variables that were, and were not, significantly related to this construct. Relevant information was obtained from three sources: re-analyses of data from previous studies, a study in which 791 adults performed storage-plus-processing working memory tasks, and a study in which 236 adults performed a variety of working memory, updating, and cognitive control tasks. The results suggest that fluid intelligence represents a broad individual difference dimension contributing to diverse types of controlled or effortful processing. The analyses also revealed that very few of the age-related effects on the target variables were statistically independent of effects on established cognitive abilities, which suggests most of the age-related influences on a wide variety of cognitive control variables overlap with age-related influences on cognitive abilities such as fluid intelligence, episodic memory, and perceptual speed.
Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake
NASA Astrophysics Data System (ADS)
Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo
2018-02-01
In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 of the 53 analysed stations. Two of these stations are located on the northeast coast of Madagascar and the other five in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity at stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering; however, it also exists near the East African Rift System structures, where no triggering was observed. Our results suggest that factors other than tectonic regime and geothermalism alone are needed to explain the mechanisms that underlie earthquake triggering.
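The β-statistic used to assess rate changes like those above compares the observed post-arrival event count with the count expected under a time-invariant rate. A minimal sketch, assuming the common Matthews-Reasenberg normalization; the event counts and window lengths below are invented for illustration, not values from the study:

```python
import math

def beta_statistic(n_after, n_total, t_after, t_total):
    """Matthews-Reasenberg beta statistic for a seismicity-rate change.

    n_after: events observed in the window after the teleseism arrival
    n_total: events in the whole catalogue window
    t_after: length of the post-arrival window
    t_total: length of the whole window (same units as t_after)
    """
    p = t_after / t_total               # fraction of time after arrival
    expected = n_total * p              # events expected under a constant rate
    variance = n_total * p * (1.0 - p)  # binomial variance of the count
    return (n_after - expected) / math.sqrt(variance)

# Hypothetical station: 60 catalogued events in 30 days, 12 of them
# in the 1-day window after the surface waves arrive.
b = beta_statistic(n_after=12, n_total=60, t_after=1.0, t_total=30.0)
print(round(b, 2))  # → 7.19
```

Values of β above roughly 2 are commonly read as statistically significant rate increases, so this hypothetical station would count as showing dynamic triggering.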
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data, and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. The results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis.
Analysis of additional datasets is needed in order to validate and refine the application for general use.
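AutoVAR itself is a Web application, but its core idea, exhaustively fitting every VAR model in a search space and keeping the one with the best information criterion, can be sketched with plain least squares. This is a minimal illustration under stated assumptions, not AutoVAR's code: the simulated "activity" and "mood" diary variables and the lag range 1-5 are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated EMA diary data for one person: two daily variables where
# "activity" at day t-1 feeds into "mood" at day t (illustrative names).
n = 200
activity = rng.normal(size=n)
mood = np.empty(n)
mood[0] = rng.normal()
for t in range(1, n):
    mood[t] = 0.6 * activity[t - 1] + 0.3 * mood[t - 1] + rng.normal(scale=0.5)
data = np.column_stack([activity, mood])

def var_aic(data, p):
    """Fit a VAR(p) by least squares and return its Akaike information criterion."""
    y = data[p:]
    # Stack the p lagged copies of the series as regressors, plus an intercept.
    x = np.hstack([data[p - k:len(data) - k] for k in range(1, p + 1)])
    x = np.hstack([np.ones((len(y), 1)), x])
    beta, *_ = np.linalg.lstsq(x, y, rcond=None)
    resid = y - x @ beta
    sigma = resid.T @ resid / len(y)           # residual covariance matrix
    k_params = data.shape[1] * x.shape[1]      # number of estimated coefficients
    return np.log(np.linalg.det(sigma)) + 2.0 * k_params / len(y)

# Evaluate every model in a small search space and keep the best,
# mimicking AutoVAR's exhaustive model comparison.
best_p = min(range(1, 6), key=lambda p: var_aic(data, p))
print("selected lag order:", best_p)
```

A full reimplementation would also vary which variables and trend terms enter each model and would add the Granger causality test on the winning model; the exhaustive search-and-score loop shown here is the part that replaces the researcher's manual model selection.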
Chang, Lin-Chau; Mahmood, Riaz; Qureshi, Samina; Breder, Christopher D
2017-01-01
Standardised MedDRA Queries (SMQs) have been developed since the early 2000s and are used by the academic, industry, public health, and government sectors for detecting safety signals in adverse event safety databases. The purpose of the present study is to characterize how SMQs are used, and their impact on safety analyses, in New Drug Application (NDA) and Biologics License Application (BLA) submissions to the United States Food and Drug Administration (USFDA). We used the PharmaPendium database to capture SMQ use in Summary Basis of Approvals (SBoAs) of drugs and biologics approved by the USFDA. Characteristics of the drugs and of the SMQ use were employed to evaluate the role of SMQ safety analyses in regulatory decisions and the veracity of the signals they revealed. A comprehensive search of the SBoAs yielded 184 regulatory submissions approved from 2006 to 2015. Search strategies more frequently utilized restrictive searches with "narrow terms" to enhance specificity than strategies using "broad terms" to increase sensitivity, while some involved modification of search terms. A majority (59%) of 1290 searches used descriptive statistics; inferential statistics were utilized in 35% of them. Commentary from reviewers and supervisory staff suggested that a small yet notable percentage (18%) of the 1290 searches supported regulatory decisions. Searches with regulatory impact were found in 73 submissions (40% of the submissions investigated). Most searches with regulatory implications (75% of 227 searches) described how the searches were confirmed, indicating prudence in the decision-making process. SMQs have an increasing role in the presentation and review of safety analyses for NDAs/BLAs and their regulatory reviews. This study suggests that SMQs are best used as a screening tool, with descriptive statistics, description of SMQ modifications, and systematic verification of cases, which is crucial for drawing regulatory conclusions.
Hariri, Azian; Mohamad Noor, Noraishah; Paiman, Nuur Azreen; Ahmad Zaidi, Ahmad Mujahid; Zainal Bakri, Siti Farhana
2017-09-22
Welding operations are rarely conducted in an air-conditioned room. However, a company may set its welding operations in an air-conditioned room to maintain the humidity level needed to reduce hydrogen cracking in the welded specimen. This study assessed exposure to metal elements in the welders' breathing zone and in toenail samples. Heavy metal concentrations were analysed using inductively coupled plasma mass spectrometry. A lung function test was also conducted and analysed using statistical approaches. Chromium and manganese concentrations in the breathing zone exceeded the permissible exposure limits stipulated by Malaysian regulations. A similar trend was observed between the heavy metal concentrations in the breathing-zone air samples and those in the welders' toenails. Although there was no statistically significant decrease in the lung function of welders, it is suggested that exposure control through engineering and administrative approaches be considered for workplace safety and health improvement.
A marked correlation function for constraining modified gravity models
NASA Astrophysics Data System (ADS)
White, Martin
2016-11-01
Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a `generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.
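A density-marked correlation function of the kind proposed above is indeed easy to compute: weight each pair of points by the product of density-dependent marks and normalize by the unmarked expectation. The sketch below uses a toy 2D Poisson point set and one illustrative mark function, m = 1/(1 + δ); neither the mark form nor the scales are the paper's specific choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2D point set in a 100 x 100 box, with a density-dependent mark
# attached to each point; m = 1/(1 + delta) down-weights dense regions.
pts = rng.uniform(0.0, 100.0, size=(500, 2))

def local_density(pts, radius=5.0):
    """Count neighbours within `radius` as a crude density estimate."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return (d < radius).sum(axis=1) - 1  # subtract the point itself

marks = 1.0 / (1.0 + local_density(pts))

def marked_correlation(pts, marks, r_lo, r_hi):
    """Mean mark product over pairs separated by [r_lo, r_hi), divided by
    the squared mean mark; M(r) = 1 means marks carry no scale-dependent
    information at that separation."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    pair = (d >= r_lo) & (d < r_hi) & (d > 0)
    mm = np.outer(marks, marks)
    return mm[pair].mean() / marks.mean() ** 2

M = marked_correlation(pts, marks, 5.0, 10.0)
```

For an unclustered point set the statistic hovers near 1; a screening mechanism that imprints density dependence on clustering would pull M(r) away from 1 on the relevant scales, which is what makes the statistic a candidate survey deliverable.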
Mindful attention and awareness: relationships with psychopathology and emotion regulation.
Gregório, Sónia; Pinto-Gouveia, José
2013-01-01
The growing interest in mindfulness from the scientific community has given rise to several self-report measures of this psychological construct. The Mindful Attention and Awareness Scale (MAAS) is a self-report measure of trait-level mindfulness. This paper aims to explore the psychometric characteristics of the MAAS and to validate it for the Portuguese population. The first two studies replicate some of the original authors' statistical procedures, in particular confirmatory factor analyses, in two different samples from the Portuguese general community population. Results from both analyses confirmed the scale's single-factor structure and indicated very good reliability. Moreover, cross-validation statistics showed that this single-factor structure is valid for different respondents from the general community population. In the third study, the Portuguese version of the MAAS was found to have good convergent and discriminant validity. Overall, the findings support the psychometric validity of the Portuguese version of the MAAS and suggest it is a reliable self-report measure of trait mindfulness, a central construct in clinical psychology research and intervention.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
NASA Astrophysics Data System (ADS)
Trabelsi, M.; Maamouri, F.; Quignard, J.-P.; Boussaïd, M.; Faure, E.
2004-12-01
The current distribution of Atherina lagunae poses an interesting biogeographical problem, as this species inhabits widely separated circum-Mediterranean lagoons. Statistical analyses of 87 biometric parameters and genetic variation in a portion of the cytochrome b gene were examined in four populations of A. lagunae from Tunisian and French lagoons. The results suggested a subdivision into two distinct Atherinid groups: one comprising the French lagoonal sand smelts and the other the Tunisian ones. Tunisian lagoonal sand smelts were distinguished from the French ones by lower numbers of lateral line scales, vertebrae, pectorals and first dorsal fin rays, and by higher numbers of lower and total gillrakers. In addition, A. lagunae from Tunisian lagoons are characterised by a short preorbital length, developed operculum, broad interorbital space, larger head, robust body and a relatively small first dorsal fin positioned backwards. Intraspecific sequence variation in a portion of the cytochrome b gene was also examined in 87 individuals from Tunisia and France. The high correlation between the molecular phylogenetic tree and the biometric statistical analysis suggested that two different sibling species, or at least sub-species or semi-species, have colonised the lagoons. Finally, our analyses suggested that the evolution of A. lagunae probably occurred in two steps: marine sympatric speciation within the large Atherina boyeri complex, followed by a post-Pleistocene colonisation of the lagoons.
Testosterone replacement therapy and the heart: friend, foe or bystander?
Lopez, David S; Canfield, Steven; Wang, Run
2016-01-01
The role of testosterone therapy (TTh) in cardiovascular disease (CVD) outcomes is still controversial, and it seems it will remain inconclusive for the moment. An extensive body of literature, including several meta-analyses, has investigated the association of endogenous testosterone and use of TTh with CVD events. In some instances studies reported beneficial effects of TTh on CVD events, and in others the literature reported detrimental effects or no effects at all. Yet no review article has scrutinized this body of literature using the magnitude of association and statistical significance reported for this relationship. We critically reviewed the previous and emerging literature investigating the association of endogenous testosterone and use of TTh with CVD events (fatal and nonfatal only). These studies were divided into three groups, “beneficial (friendly use)”, “detrimental (foe)” and “no effects at all (bystander)”, based on the magnitude of association and statistical significance reported in original research studies and in meta-analyses of epidemiological studies and of randomized controlled trials (RCTs). In this review, the studies reporting a significant association of high levels of testosterone with a reduced risk of CVD events in original prospective studies and in meta-analyses of cross-sectional and prospective studies appear to be the most consistent. However, the meta-analyses of RCTs do not provide a clear picture after we divided them into the beneficial, detrimental or no-effects-at-all groups using their magnitudes of association and statistical significance. From this review, we suggest that a study, or number of studies, with adequate power and epidemiological and clinical data is needed to provide a definitive conclusion on whether the effect of TTh on the natural history of CVD is real. PMID:28078222
Webster, R J; Williams, A; Marchetti, F; Yauk, C L
2018-07-01
Mutations in germ cells pose potential genetic risks to offspring. However, de novo mutations are rare events that are spread across the genome and are difficult to detect. Thus, studies in this area have generally been under-powered, and no human germ cell mutagen has been identified. Whole Genome Sequencing (WGS) of human pedigrees has been proposed as an approach to overcome these technical and statistical challenges, as WGS enables analysis of a much wider breadth of the genome than traditional approaches. Here, we performed power analyses to determine the feasibility of using WGS in human families to identify germ cell mutagens. Different statistical models were compared in the power analyses (ANOVA and multiple regression for one-child families, and mixed effect models sampling two to four siblings per family). Assumptions were based on parameters from the existing literature, such as the mutation-by-paternal-age effect. We explored two scenarios: a constant effect due to an exposure that occurred in the past, and an accumulating effect where the exposure is continuing. Our analysis revealed the importance of modeling inter-family variability of the mutation-by-paternal-age effect; statistical power was improved by models accounting for family-to-family variability. Our power analyses suggest that sufficient statistical power can be attained with 4-28 four-sibling families per treatment group when the increase in mutations ranges from 40% down to 10%, respectively. Modeling family variability using mixed effect models reduced the required sample size compared to a multiple regression approach. Much larger sample sizes were required to detect an interaction effect between environmental exposures and paternal age. These findings inform study design and statistical modeling approaches to improve power and reduce sequencing costs for future studies in this area. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
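The flavor of such a power analysis can be reproduced by Monte Carlo simulation. The sketch below is a deliberate simplification under stated assumptions: it uses a two-sample z test on Poisson mutation counts rather than the paper's mixed effect models, and the baseline of 70 de novo mutations per child and the group sizes are illustrative, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulated_power(n_per_group, baseline=70.0, increase=0.4,
                    n_sim=2000, z_crit=1.96):
    """Monte Carlo power of a one-sided two-sample z test for a relative
    `increase` in Poisson-distributed de novo mutation counts."""
    detected = 0
    for _ in range(n_sim):
        control = rng.poisson(baseline, size=n_per_group)
        exposed = rng.poisson(baseline * (1.0 + increase), size=n_per_group)
        diff = exposed.mean() - control.mean()
        se = np.sqrt(exposed.var(ddof=1) / n_per_group +
                     control.var(ddof=1) / n_per_group)
        if diff / se > z_crit:
            detected += 1
    return detected / n_sim

# A 40% increase over a baseline of 70 mutations is a large effect,
# so even a handful of children per group gives high power.
power = simulated_power(n_per_group=8)
```

The same loop, rerun with a 10% increase or with family random effects added to the simulated counts, shows why the paper needed far more families for subtle or interaction effects.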
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or ... indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for ...
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named the result "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR provides point-and-click access to statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process can be overseen by a supervisor.
von Cramon-Taubadel, Noreen; Schroeder, Lauren
2016-10-01
Estimation of the variance-covariance (V/CV) structure of fragmentary bioarchaeological populations requires the use of proxy extant V/CV parameters. However, it is currently unclear whether extant human populations exhibit equivalent V/CV structures. Random skewers (RS) and hierarchical analyses of common principal components (CPC) were applied to a modern human cranial dataset. Cranial V/CV similarity was assessed globally for samples of individual populations (jackknifed method) and for pairwise population sample contrasts. The results were examined in light of potential explanatory factors for covariance difference, such as geographic region, among-group distance, and sample size. RS analyses showed that population samples exhibited highly correlated multivariate responses to selection, and that differences in RS results were primarily a consequence of differences in sample size. The CPC method yielded mixed results, depending upon the statistical criterion used to evaluate the hierarchy. The hypothesis-testing (step-up) approach was deemed problematic due to sensitivity to low statistical power and elevated Type I errors. In contrast, the model-fitting (lowest AIC) approach suggested that V/CV matrices were proportional and/or shared a large number of CPCs. Pairwise population sample CPC results were correlated with cranial distance, suggesting that population history explains some of the variability in V/CV structure among groups. The results indicate that patterns of covariance in human craniometric samples are broadly similar but not identical. These findings have important implications for choosing extant covariance matrices to use as proxy V/CV parameters in evolutionary analyses of past populations. © 2016 Wiley Periodicals, Inc.
Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi
2014-09-18
Proteins that share high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the albumin-binding GA and IgG-binding GB domains have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely a 3α bundle versus a 4β + α structure. Discriminating between their 3D structures based on their amino acid sequences is a very difficult problem. In the present work, in addition to bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST searches and conservation analyses alone could not distinguish which structure a given sequence would adopt. However, by supplementing these analyses with the inter-residue average distance statistics and our sequence tendency analysis, we could infer which parts of a sequence play an important role in its structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.
Golder, Su; Loke, Yoon K.; Bland, Martin
2011-01-01
Background There is considerable debate as to the relative merits of using randomised controlled trial (RCT) data as opposed to observational data in systematic reviews of adverse effects. This meta-analysis of meta-analyses aimed to assess the level of agreement or disagreement in the estimates of harm derived from meta-analysis of RCTs as compared to meta-analysis of observational studies. Methods and Findings Searches were carried out in ten databases in addition to reference checking, contacting experts, citation searches, and hand-searching key journals, conference proceedings, and Web sites. Studies were included where a pooled relative measure of an adverse effect (odds ratio or risk ratio) from RCTs could be directly compared, using the ratio of odds ratios, with the pooled estimate for the same adverse effect arising from observational studies. Nineteen studies, yielding 58 meta-analyses, were identified for inclusion. The pooled ratio of odds ratios of RCTs compared to observational studies was estimated to be 1.03 (95% confidence interval 0.93–1.15). There was less discrepancy with larger studies. The symmetric funnel plot suggests that there is no consistent difference between risk estimates from meta-analysis of RCT data and those from meta-analysis of observational studies. In almost all instances, the estimates of harm from meta-analyses of the different study designs had 95% confidence intervals that overlapped (54/58, 93%). In terms of statistical significance, in nearly two-thirds (37/58, 64%), the results agreed (both studies showing a significant increase or significant decrease or both showing no significant difference). In only one meta-analysis about one adverse effect was there opposing statistical significance. 
Conclusions Empirical evidence from this overview indicates that there is no difference on average in the risk estimate of adverse effects of an intervention derived from meta-analyses of RCTs and meta-analyses of observational studies. This suggests that systematic reviews of adverse effects should not be restricted to specific study types. PMID:21559325
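The comparison statistic used in this overview, the ratio of odds ratios between study designs, combines two pooled estimates on the log scale. A small sketch with invented ORs and standard errors (the numbers are not from any of the 58 meta-analyses):

```python
import math

def ratio_of_odds_ratios(or_rct, se_log_rct, or_obs, se_log_obs):
    """Return the RCT-vs-observational ratio of odds ratios and its
    95% confidence interval, computed on the log-odds scale."""
    log_ror = math.log(or_rct) - math.log(or_obs)
    se = math.sqrt(se_log_rct**2 + se_log_obs**2)  # variances add on the log scale
    lo = math.exp(log_ror - 1.96 * se)
    hi = math.exp(log_ror + 1.96 * se)
    return math.exp(log_ror), (lo, hi)

# Hypothetical adverse effect: pooled OR 1.20 from RCTs, 1.15 from
# observational studies, with made-up standard errors of the log ORs.
ror, (lo, hi) = ratio_of_odds_ratios(1.20, 0.10, 1.15, 0.08)
```

A ROR near 1 with a confidence interval spanning 1, as here, is the pattern the overview found on average across designs.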
Agriculture, population growth, and statistical analysis of the radiocarbon record.
Zahid, H Jabran; Robinson, Erick; Kelly, Robert L
2016-01-26
The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
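The reported long-term rate of 0.04% per year implies, under continuous exponential growth, a doubling time of about 1,700 years and roughly a hundredfold increase over the Holocene:

```python
import math

rate = 0.0004                        # 0.04% annual growth, as reported above
doubling_time = math.log(2) / rate   # continuous-growth doubling time, years
growth_over_holocene = math.exp(rate * 12000)  # growth factor over ~12,000 y

print(round(doubling_time))          # → 1733
print(round(growth_over_holocene))   # → 122
```

This back-of-the-envelope arithmetic only illustrates the scale of the reported rate; the paper's estimates come from statistical analysis of the radiocarbon record, not from assuming a single smooth exponential.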
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for the statistical approaches used to analyse the primary ocular measure. We compared our findings to the results of a previous paper that reviewed BJO papers from 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at the individual level because of the nature of the observation, 16 (14%) analysed data from one eye only, 36 (32%) analysed data from both eyes at the ocular level, one (1%) analysed an overall summary of the ocular finding per individual, and three (3%) used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available, and the practice of statistical analysis has not improved in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
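The cost of ignoring intereye correlation can be shown in a few lines: treating two correlated eyes as independent doubles the apparent sample size and understates the standard error. The correlation of 0.6 and the per-person summary approach below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an ocular measurement for both eyes of each person,
# with an assumed intereye correlation of 0.6.
n_people, rho, sigma = 500, 0.6, 1.0
cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
eyes = rng.multivariate_normal([0.0, 0.0], cov, size=n_people)

# Naive analysis: pool all eyes as if they were independent observations.
naive_se = eyes.std(ddof=1) / np.sqrt(2 * n_people)

# One valid alternative: summarise per person, then analyse individuals.
person_means = eyes.mean(axis=1)
correct_se = person_means.std(ddof=1) / np.sqrt(n_people)
# correct_se > naive_se: ignoring the correlation overstates precision.
```

Mixed effect models or generalized estimating equations achieve the same end while keeping eye-level covariates; the per-person summary is just the simplest fix.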
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Myers, E A; Rodríguez-Robles, J A; Denardo, D F; Staub, R E; Stropoli, A; Ruane, S; Burbrink, F T
2013-11-01
Phylogeographic inference can determine the timing of population divergence, historical demographic processes, patterns of migration, and when extended to multiple species, the history of communities. Single-locus analyses can mislead interpretations of the evolutionary history of taxa and comparative analyses. It is therefore important to revisit previous single-locus phylogeographic studies, particularly those that have been used to propose general patterns for regional biotas and the processes responsible for generating inferred patterns. Here, we employ a multilocus statistical approach to re-examine the phylogeography of Lampropeltis zonata. Using nonparametric and Bayesian species delimitation, we determined that there are two well-supported species within L. zonata. Ecological niche modelling supports the delimitation of these taxa, suggesting that the two species inhabit distinct climatic environments. Gene flow between the two taxa is low and appears to occur unidirectionally. Further, our data suggest that gene flow was mediated by females, a rare pattern in snakes. In contrast to previous analyses, we determined that the divergence between the two lineages occurred in the late Pliocene (c. 2.07 Ma). Spatially and temporally, the divergence of these lineages is associated with the inundation of central California by the Monterey Bay. The effective population sizes of the two species appear to have been unaffected by Pleistocene glaciation. Our increased sampling of loci for L. zonata, combined with previously published multilocus analyses of other sympatric species, suggests that previous conclusions reached by comparative phylogeographic studies conducted within the California Floristic Province should be reassessed. © 2013 John Wiley & Sons Ltd.
Austin, Peter C
2007-11-01
I conducted a systematic review of the use of propensity score matching in the cardiovascular surgery literature. I examined the adequacy of reporting and whether appropriate statistical methods were used. I examined 60 articles published in the Annals of Thoracic Surgery, European Journal of Cardio-thoracic Surgery, Journal of Cardiovascular Surgery, and the Journal of Thoracic and Cardiovascular Surgery between January 1, 2004, and December 31, 2006. Thirty-one of the 60 studies did not provide adequate information on how the propensity score-matched pairs were formed. Eleven (18%) of the studies did not report on whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. No studies used appropriate methods to compare baseline characteristics between treated and untreated subjects in the propensity score-matched sample. Eight (13%) of the 60 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Two studies used appropriate methods for some outcomes, but not for all outcomes. Thirty-nine (65%) studies explicitly used statistical methods that were inappropriate for matched-pairs data when estimating the effect of treatment on outcomes. Eleven studies did not report the statistical tests that were used to assess the statistical significance of the treatment effect. Analysis of propensity score-matched samples tended to be poor in the cardiovascular surgery literature. Most statistical analyses ignored the matched nature of the sample. I provide suggestions for improving the reporting and analysis of studies that use propensity score matching.
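A minimal sketch, with hypothetical outcome values, of the kind of matched-pairs analysis the review calls for: the test statistic is computed from within-pair differences, which respects the correlation induced by matching, rather than from the pooled samples.

```python
import math

def paired_t(treated, control):
    """Paired t statistic for outcomes in propensity-score-matched pairs.

    Analysing within-pair differences accounts for the correlation induced
    by matching, unlike a two-sample test on the pooled data.
    """
    diffs = [t - c for t, c in zip(treated, control)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical outcomes for 6 matched pairs (e.g. length of stay in days).
treated = [7.0, 6.5, 8.0, 5.5, 7.5, 6.0]
control = [8.0, 7.0, 9.5, 6.0, 8.5, 7.5]
print(round(paired_t(treated, control), 2))  # -5.48
```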
Cuomo, Raphael E; Mackey, Tim K
2014-12-02
To explore healthcare policy and system improvements that would more proactively respond to future penetration of counterfeit cancer medications into the US drug supply chain, using geospatial analysis. A statistical and geospatial analysis of areas that received notices from the Food and Drug Administration (FDA) about the possibility of counterfeit Avastin penetrating the US drug supply chain. Data from FDA warning notices were compared to data from 44 demographic variables available from the US Census Bureau via correlation, means testing and geospatial visualisation. Results were interpreted in light of existing literature in order to recommend improvements to surveillance of counterfeit medicines. This study analysed 791 distinct healthcare provider addresses that received FDA warning notices across 30,431 zip codes in the USA. Statistical outputs were Pearson's correlation coefficients and t values. Geospatial outputs were cartographic visualisations. These data were used to generate the overarching study outcome, which was a recommendation for a strategy for drug safety surveillance congruent with existing literature on counterfeit medication. Zip codes with greater numbers of individuals age 65+ and greater numbers of ethnic white individuals were most correlated with receipt of a counterfeit Avastin notice. Geospatial visualisations designed in conjunction with statistical analysis of demographic variables appeared more capable of suggesting areas and populations that may be at risk for undetected counterfeit Avastin penetration. This study suggests that dual incorporation of statistical and geospatial analysis in surveillance of counterfeit medicine may be helpful in guiding efforts to prevent, detect and visualise counterfeit medicine penetration in the US drug supply chain and other settings. Importantly, the information generated by these analyses could be utilised to identify at-risk populations associated with demographic characteristics.
Stakeholders should explore these results as another tool to improve on counterfeit medicine surveillance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Lachowiec, Jennifer; Shen, Xia; Queitsch, Christine; Carlborg, Örjan
2015-01-01
Efforts to identify loci underlying complex traits generally assume that most genetic variance is additive. Here, we examined the genetics of Arabidopsis thaliana root length and found that the genomic narrow-sense heritability for this trait in the examined population was statistically zero. The low amount of additive genetic variance that could be captured by the genome-wide genotypes likely explains why no associations to root length could be found using standard additive-model-based genome-wide association (GWA) approaches. However, as the broad-sense heritability for root length was significantly larger, and primarily due to epistasis, we also performed an epistatic GWA analysis to map loci contributing to the epistatic genetic variance. Four interacting pairs of loci were revealed, involving seven chromosomal loci that passed a standard multiple-testing corrected significance threshold. The genotype-phenotype maps for these pairs revealed epistasis that cancelled out the additive genetic variance, explaining why these loci were not detected in the additive GWA analysis. Small population sizes, such as in our experiment, increase the risk of identifying false epistatic interactions due to testing for associations with very large numbers of multi-marker genotypes in few phenotyped individuals. Therefore, we estimated the false-positive risk using a new statistical approach that suggested half of the associated pairs to be true positive associations. Our experimental evaluation of candidate genes within the seven associated loci suggests that this estimate is conservative; we identified functional candidate genes that affected root development in four loci that were part of three of the pairs. The statistical epistatic analyses were thus indispensable for confirming known, and identifying new, candidate genes for root length in this population of wild-collected A. thaliana accessions. 
We also illustrate how epistatic cancellation of the additive genetic variance explains the insignificant narrow-sense and significant broad-sense heritability by using a combination of careful statistical epistatic analyses and functional genetic experiments.
NASA Astrophysics Data System (ADS)
Funk, C. C.; Shukla, S.; Hoerling, M. P.; Robertson, F. R.; Hoell, A.; Liebmann, B.
2013-12-01
During boreal spring, eastern portions of Kenya and Somalia have experienced more frequent droughts since 1999. Given the region's high levels of food insecurity, better predictions of these droughts could provide substantial humanitarian benefits. We show that dynamical-statistical seasonal climate forecasts, based on the latest generation of coupled atmosphere-ocean and uncoupled atmospheric models, effectively predict boreal spring rainfall in this area. Skill sources are assessed by comparing ensembles driven with full-ocean forcing with ensembles driven with ENSO-only sea surface temperatures (SSTs). Our analysis suggests that both ENSO and non-ENSO Indo-Pacific SST forcing have played an important role in the increase in drought frequencies. Over the past 30 years, La Niña drought teleconnections have strengthened, while non-ENSO Indo-Pacific convection patterns have also supported increased (decreased) Western Pacific (East African) rainfall. To further examine the relative contributions of ENSO, low-frequency warming and the Pacific Decadal Oscillation, we present decompositions of ECHAM5, GFS, CAM4 and GMAO AMIP simulations. These decompositions suggest that rapid warming in the western Pacific and steeper western-to-central Pacific SST gradients have likely played an important role in the recent intensification of the Walker circulation, and the associated increase in East African aridity. A linear combination of time series describing the Pacific Decadal Oscillation and the strength of Indo-Pacific warming is shown to track East African rainfall reasonably well. The talk concludes with a few thoughts on the potentially important interplay of attribution and prediction. At least for recent East African droughts, it appears that a characteristic Indo-Pacific SST and precipitation anomaly pattern can be linked statistically to support forecasts and attribution analyses.
The combination of traditional AGCM attribution analyses with simple yet physically plausible statistical estimation procedures may help us better untangle some climate mysteries.
Statistical modeling implicates neuroanatomical circuit mediating stress relief by ‘comfort’ food
Ulrich-Lai, Yvonne M.; Christiansen, Anne M.; Wang, Xia; Song, Seongho; Herman, James P.
2015-01-01
A history of eating highly-palatable foods reduces physiological and emotional responses to stress. For instance, we have previously shown that limited sucrose intake (4 ml of 30% sucrose twice daily for 14 days) reduces hypothalamic-pituitary-adrenocortical (HPA) axis responses to stress. However, the neural mechanisms underlying stress relief by such ‘comfort’ foods are unclear, and could reveal an endogenous brain pathway for stress mitigation. As such, the present work assessed the expression of several proteins related to neuronal activation and/or plasticity in multiple stress- and reward-regulatory brain regions of rats after limited sucrose (vs. water control) intake. These data were then subjected to a series of statistical analyses, including Bayesian modeling, to identify the most likely neurocircuit mediating stress relief by sucrose. The analyses suggest that sucrose reduces HPA activation by dampening an excitatory basolateral amygdala - medial amygdala circuit, while also potentiating an inhibitory bed nucleus of the stria terminalis principle subdivision-mediated circuit, resulting in reduced HPA activation after stress. Collectively, the results support the hypothesis that sucrose limits stress responses via plastic changes to the structure and function of stress-regulatory neural circuits. The work also illustrates that advanced statistical methods are useful approaches to identify potentially novel and important underlying relationships in biological data sets. PMID:26246177
Tipping points in the arctic: eyeballing or statistical significance?
Carstensen, Jacob; Weydmann, Agata
2012-02-01
Arctic ecosystems have experienced and are projected to experience continued large increases in temperature and declines in sea ice cover. It has been hypothesized that small changes in ecosystem drivers can fundamentally alter ecosystem functioning, and that this might be particularly pronounced for Arctic ecosystems. We present a suite of simple statistical analyses to identify changes in the statistical properties of data, emphasizing that changes in the standard error should be considered in addition to changes in mean properties. The methods are exemplified using sea ice extent, and suggest that the loss rate of sea ice accelerated by a factor of ~5 in 1996, as reported in other studies, but that increases in random fluctuations, an early warning signal, were observed as early as 1990. We recommend employing the proposed methods more systematically in analyses of tipping points to document effects of climate change in the Arctic.
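A minimal sketch of the variance-based early warning idea described above, using a toy series rather than the sea ice data: the standard deviation in a sliding window rises before the mean shifts.

```python
import statistics

def rolling_sd(series, window):
    """Standard deviation in a sliding window -- rising values can act as an
    early warning signal before a tipping point shifts the mean."""
    return [statistics.stdev(series[i:i + window])
            for i in range(len(series) - window + 1)]

# Toy series: a stable level with small noise, then growing fluctuations
# before an abrupt drop (loosely mimicking the sea-ice example).
series = [10.0, 10.1, 9.9, 10.0, 10.1, 9.9, 10.4, 9.5, 10.6, 9.3, 7.0, 6.8, 7.1]
print([round(s, 2) for s in rolling_sd(series, 4)])
```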
OTD Observations of Continental US Ground and Cloud Flashes
NASA Technical Reports Server (NTRS)
Koshak, William
2007-01-01
Lightning optical flash parameters (e.g., radiance, area, duration, number of optical groups, and number of optical events) derived from almost five years of Optical Transient Detector (OTD) data are analyzed. Hundreds of thousands of OTD flashes occurring over the continental US are categorized according to flash type (ground or cloud flash) using US National Lightning Detection Network™ (NLDN) data. The statistics of the optical characteristics of the ground and cloud flashes are inter-compared on an overall basis, and as a function of ground flash polarity. A standard two-distribution hypothesis test is used to inter-compare the population means of a given lightning parameter for the two flash types. Given the differences in the statistics of the optical characteristics, it is suggested that statistical analyses (e.g., Bayesian inference) of the space-based optical measurements might make it possible to successfully discriminate ground and cloud flashes a reasonable percentage of the time.
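The two-distribution test of population means can be sketched as a Welch's t statistic (the abstract does not specify the exact test variant, and the sample values below are hypothetical):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing the population means of a
    lightning parameter (e.g. flash radiance) between ground and cloud flashes."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical radiance samples (arbitrary units) for the two flash types.
ground = [4.1, 3.8, 4.5, 4.0, 4.2]
cloud = [3.2, 3.5, 3.1, 3.6, 3.0]
print(round(welch_t(ground, cloud), 2))  # 5.13
```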
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
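One simple diagnostic for the problem described above is the lag-1 autocorrelation of model residuals; a minimal sketch with hypothetical residuals (the paper advocates mixed effects models, which this diagnostic motivates but does not implement):

```python
def lag1_autocorrelation(residuals):
    """Lag-1 autocorrelation of model residuals; values far from zero suggest
    that a simple model's independence assumption is violated and a repeated-
    measures approach (e.g. a mixed effects model) is needed."""
    n = len(residuals)
    mean = sum(residuals) / n
    num = sum((residuals[i] - mean) * (residuals[i + 1] - mean)
              for i in range(n - 1))
    den = sum((r - mean) ** 2 for r in residuals)
    return num / den

# Hypothetical residuals from daily parasitaemia measurements in one host:
# runs of same-signed values are the signature of temporal autocorrelation.
resid = [1.2, 1.0, 0.8, 0.5, -0.3, -0.6, -0.9, -1.1, -0.6, 0.2, 0.8, 1.0]
print(round(lag1_autocorrelation(resid), 2))
```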
Intelligence, birth order, and family size.
Kanazawa, Satoshi
2012-09-01
The analysis of the National Child Development Study in the United Kingdom (n = 17,419) replicates some earlier findings and shows that genuine within-family data are not necessary to make the apparent birth-order effect on intelligence disappear. Birth order is not associated with intelligence in between-family data once the number of siblings is statistically controlled. The analyses support the admixture hypothesis, which avers that the apparent birth-order effect on intelligence is an artifact of family size, and cast doubt on the confluence and resource dilution models, both of which claim that birth order has a causal influence on children's cognitive development. The analyses suggest that birth order has no genuine causal effect on general intelligence.
Estévez López, Estefanía; Murgui Pérez, Sergio; Moreno Ruiz, David; Musitu Ochoa, Gonzalo
2007-02-01
The purpose of the present study is to analyse the relationships among certain family and school factors, adolescents' attitudes towards institutional authority, and violent behaviour at school. The sample comprised 1049 adolescents of both sexes, aged 11 to 16 years. Statistical analyses were carried out using structural equation modelling. Results indicate a close association between negative communication with the father and violent behaviour in adolescence. Moreover, the data suggest that teachers' expectations affect students' attitudes towards institutional authority, which in turn are closely related to school violence. Finally, the findings show an indirect influence of fathers, mothers and teachers on adolescents' violent behaviour, mainly through their effect on family and school self-concept.
The space of ultrametric phylogenetic trees.
Gavryushkin, Alex; Drummond, Alexei J
2016-08-21
The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space and formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce and that the choice between metric spaces requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees in the sample might differ between parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Scudder, Rachel P.; Murray, Richard W.; Schindlbeck, Julie C.; Kutterolf, Steffen; Hauff, Folkmar; McKinley, Claire C.
2014-11-01
We have geochemically and statistically characterized bulk marine sediment and ash layers at Ocean Drilling Program Site 1149 (Izu-Bonin Arc) and Deep Sea Drilling Project Site 52 (Mariana Arc), and have quantified that multiple dispersed ash sources collectively comprise ~30-35% of the hemipelagic sediment mass entering the Izu-Bonin-Mariana subduction system. Multivariate statistical analyses indicate that the bulk sediment at Site 1149 is a mixture of Chinese Loess, a second compositionally distinct eolian source, a dispersed mafic ash, and a dispersed felsic ash. We interpret the sources of these ashes to be, respectively, basalt from the Izu-Bonin Front Arc (IBFA) and rhyolite from the Honshu Arc. Sr-, Nd-, and Pb isotopic analyses of the bulk sediment are consistent with the chemical/statistical-based interpretations. Comparison of the mass accumulation rate of the dispersed ash component to discrete ash layer parameters (thickness, sedimentation rate, and number of layers) suggests that eruption frequency, rather than eruption size, drives the dispersed ash record. At Site 52, the geochemistry and statistical modeling indicate that Chinese Loess, IBFA, dispersed BNN (boninite from Izu-Bonin), and a dispersed felsic ash of unknown origin are the sources. At Site 1149, the ash layers and the dispersed ash are compositionally coupled, whereas at Site 52 they are decoupled in that there are no boninite layers, yet boninite is dispersed within the sediment. Changes in the volcanic and eolian inputs through time indicate strong arc-related and climate-related controls.
Gandy, M; Karin, E; Jones, M P; McDonald, S; Sharpe, L; Titov, N; Dear, B F
2018-05-13
The evidence for Internet-delivered pain management programs for chronic pain is growing, but there is little empirical understanding of how they effect change. Understanding mechanisms of clinical response to these programs could inform their effective development and delivery. A large sample (n = 396) from a previous randomized controlled trial of a validated internet-delivered psychological pain management program, the Pain Course, was used to examine the influence of three potential psychological mechanisms (pain acceptance, pain self-efficacy, fear of movement/re-injury) on treatment-related change in disability, depression, anxiety and average pain. Analyses involved generalized estimating equation models for clinical outcomes that adjusted for co-occurring change in psychological variables. This was paired with cross-lagged analysis to assess for evidence of causality. Analyses involved two time points, pre-treatment and post-treatment. Changes in pain-acceptance were strongly associated with changes in three (depression, anxiety and average pain) of the four clinical outcomes. Changes in self-efficacy were also strongly associated with two (anxiety and average pain) clinical outcomes. These findings suggest that participants were unlikely to improve in these clinical outcomes without also experiencing increases in their pain self-efficacy and pain acceptance. However, there was no clear evidence from cross-lagged analyses to currently support these psychological variables as direct mechanisms of clinical improvements. There was only statistical evidence to suggest higher levels of self-efficacy moderated improvements in depression. The findings suggest that, while clinical improvements are closely associated with improvements in pain acceptance and self-efficacy, these psychological variables may not drive the treatment effects observed. 
This study employed robust statistical techniques to assess the psychological mechanisms of an established internet-delivered pain management program. While clinical improvements (e.g. depression, anxiety, pain) were closely associated with improvements in psychological variables (e.g. pain self-efficacy and pain acceptance), these variables do not appear to be treatment mechanisms. © 2018 European Pain Federation - EFIC®.
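A cross-lagged comparison of the kind referred to above can be sketched as follows (hypothetical scores; the study's actual analysis used generalized estimating equation models and is not reproduced here):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-participant scores at pre- and post-treatment.
accept_pre = [2.1, 3.4, 1.8, 2.9, 3.1, 2.5]
accept_post = [3.0, 4.1, 2.2, 3.8, 3.9, 3.2]
depress_pre = [14.0, 9.0, 16.0, 11.0, 10.0, 12.0]
depress_post = [11.0, 7.0, 15.0, 8.0, 8.0, 10.0]

# Cross-lagged comparison: does earlier acceptance relate to later depression
# more strongly than earlier depression relates to later acceptance?
r_accept_to_dep = pearson(accept_pre, depress_post)
r_dep_to_accept = pearson(depress_pre, accept_post)
print(round(r_accept_to_dep, 2), round(r_dep_to_accept, 2))
```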
Zhang, Jian; Jiang, Hong; Sun, Min; Chen, Jianghua
2017-08-16
Periodontal disease is relatively prevalent in people with chronic kidney disease (CKD), but it remains unclear whether periodontal disease is an independent risk factor for premature death in this population. Interventions to reduce mortality in the CKD population have consistently yielded unsatisfactory results, and new targets are needed. This meta-analysis therefore aimed to evaluate the association between periodontal disease and mortality in the CKD population. Pubmed, Embase, Web of Science, Scopus and abstracts from recent relevant meetings were searched by two authors independently. Relative risks (RRs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was explored by chi-square test and quantified by the I² statistic. Eight cohort studies comprising 5477 individuals with CKD were included. The overall pooled data demonstrated that periodontal disease was associated with all-cause death in the CKD population (RR, 1.254; 95% CI 1.046-1.503; P = 0.005), with moderate heterogeneity, I² = 52.2%. However, no evident association was observed between periodontal disease and cardiovascular mortality (RR, 1.30, 95% CI, 0.82-2.06; P = 0.259), and statistical heterogeneity was substantial (I² = 72.5%; P = 0.012). Associations with mortality were similar between subgroups, such as different stages of CKD and adjustment for confounding factors. For all-cause death, sensitivity and cumulative analyses both suggested that our results were robust. For cardiovascular mortality, the association with periodontal disease needs further confirmation. We demonstrated that periodontal disease was associated with an increased risk of all-cause death in people with CKD, but there was not yet adequate evidence that periodontal disease also elevates the risk of cardiovascular death.
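The random-effects pooling of study-level relative risks reported above can be sketched with the DerSimonian-Laird estimator on the log-RR scale (the study values below are hypothetical, not the eight cohorts analysed):

```python
import math

def dersimonian_laird(log_rrs, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) on the log-RR scale.

    Returns the pooled relative risk and the between-study variance tau^2.
    """
    k = len(log_rrs)
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                   # between-study variance
    w_star = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    return math.exp(pooled), tau2

# Hypothetical study-level relative risks and variances of their logs.
log_rrs = [math.log(1.4), math.log(1.1), math.log(1.6), math.log(0.9)]
variances = [0.04, 0.02, 0.09, 0.05]
rr, tau2 = dersimonian_laird(log_rrs, variances)
print(round(rr, 2), round(tau2, 3))
```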
Quantitative approaches in climate change ecology
Brown, Christopher J; Schoeman, David S; Sydeman, William J; Brander, Keith; Buckley, Lauren B; Burrows, Michael; Duarte, Carlos M; Moore, Pippa J; Pandolfi, John M; Poloczanska, Elvira; Venables, William; Richardson, Anthony J
2011-01-01
Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non-climate drivers of change, ignoring temporal and spatial autocorrelation, averaging across spatial patterns and not reporting key metrics. We provide a list of issues that need to be addressed to make inferences more defensible, including the consideration of (i) data limitations and the comparability of data sets; (ii) alternative mechanisms for change; (iii) appropriate response variables; (iv) a suitable model for the process under study; (v) temporal autocorrelation; (vi) spatial autocorrelation and patterns; and (vii) the reporting of rates of change. While the focus of our review was marine studies, these suggestions are equally applicable to terrestrial studies. Consideration of these suggestions will help advance global knowledge of climate impacts and understanding of the processes driving ecological change.
Temporal trends in the acidity of precipitation and surface waters of New York
Peters, Norman E.; Schroeder, Roy A.; Troutman, David E.
1982-01-01
Statistical analyses of precipitation data from a nine-station monitoring network indicate little change in pH from 1965-78 within New York State as a whole but suggest that pH of bulk precipitation has decreased in the western part of the State by approximately 0.2 pH units since 1965 and increased in the eastern part by a similar amount. This trend is equivalent to an annual change in hydrogen-ion concentration of 0.2 microequivalents per liter. An average annual increase in precipitation quantity of 2 to 3 percent since 1965 has resulted in an increased acid load in the western and central parts of the State. During 1965-78, sulfate concentration in precipitation decreased an average of 1-4 percent annually. In general, no trend in nitrate was detected. Calculated trends in hydrogen-ion concentration do not correlate with measured trends of sulfate and nitrate, which suggests variable neutralization of hydrogen ion, possibly by particles from dry deposition. Neutralization has produced an increase of about 0.3 pH units in nonurban areas and 0.7 pH units in urban areas. Statistical analyses of chemical data from several streams throughout New York suggest that sulfate concentrations decreased an average of 1 to 4 percent per year. This decrease is comparable to the sulfate decrease in precipitation during the same period. In most areas of the State, chemical contributions from urbanization and farming, as well as the neutralizing effect of carbonate soils, conceal whatever effects acid precipitation may have on pH of streams.
1992-10-01
[Report front matter (list of tables), partially recovered] Table A-2: Summary Statistics (N=8) and Results of Statistical Analyses for Impact Test Performed on Forefoot of Unworn Footwear. Table B-2: Summary Statistics (N=4) and Results of Statistical Analyses for Impact Test Performed on Forefoot of Worn Footwear. From the body text: early protocols used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was…
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well-known Q and I² statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I² statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
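The standard Q and I² statistics referenced above can be computed directly from study-level estimates and their variances; a minimal sketch with hypothetical effects:

```python
def q_and_i2(estimates, variances):
    """Cochran's Q and the I^2 statistic for between-study heterogeneity."""
    k = len(estimates)
    w = [1 / v for v in variances]  # inverse-variance weights
    pooled = sum(wi * y for wi, y in zip(w, estimates)) / sum(w)
    q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, estimates))
    # I^2: percentage of total variability attributable to heterogeneity.
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical treatment effects (log hazard ratios) and their variances.
q, i2 = q_and_i2([-0.3, -0.1, -0.5, 0.1], [0.02, 0.03, 0.05, 0.04])
print(round(q, 2), round(i2, 1))
```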
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. 
The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
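The power calculations reviewed above can be sketched with a normal approximation (an illustrative sketch only; published power surveys use exact t-based routines, e.g. in statsmodels):

```python
from statistics import NormalDist

def approx_power_two_sample(n_per_group, d, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison for a
    standardized effect size d, via the normal approximation."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    noncentrality = d * (n_per_group / 2) ** 0.5
    return NormalDist().cdf(noncentrality - z_crit)

# Medium effect (d = 0.5) with 64 participants per group: power ~ 0.81
p = approx_power_two_sample(64, 0.5)
```

This reproduces the familiar rule of thumb that roughly 64 per group gives about 80% power for a medium standardized effect.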
Cox, Tony; Popken, Douglas; Ricci, Paolo F
2013-01-01
Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing, show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662
Akiki, Teddy J; Averill, Christopher L; Wrocklage, Kristen M; Scott, J Cobb; Averill, Lynnette A; Schweinsburg, Brian; Alexander-Bloch, Aaron; Martini, Brenda; Southwick, Steven M; Krystal, John H; Abdallah, Chadi G
2018-08-01
Disruption in the default mode network (DMN) has been implicated in numerous neuropsychiatric disorders, including posttraumatic stress disorder (PTSD). However, studies have largely been limited to seed-based methods and involved inconsistent definitions of the DMN. Recent advances in neuroimaging and graph theory now permit the systematic exploration of intrinsic brain networks. In this study, we used resting-state functional magnetic resonance imaging (fMRI), diffusion MRI, and graph theoretical analyses to systematically examine the DMN connectivity and its relationship with PTSD symptom severity in a cohort of 65 combat-exposed US Veterans. We employed metrics that index overall connectivity strength, network integration (global efficiency), and network segregation (clustering coefficient). Then, we conducted a modularity and network-based statistical analysis to identify DMN regions of particular importance in PTSD. Finally, structural connectivity analyses were used to probe whether white matter abnormalities are associated with the identified functional DMN changes. We found decreased DMN functional connectivity strength to be associated with increased PTSD symptom severity. Further topological characterization suggests decreased functional integration and increased segregation in subjects with severe PTSD. Modularity analyses suggest a spared connectivity in the posterior DMN community (posterior cingulate, precuneus, angular gyrus) despite overall DMN weakened connections with increasing PTSD severity. Edge-wise network-based statistical analyses revealed a prefrontal dysconnectivity. Analysis of the diffusion networks revealed no alterations in overall strength or prefrontal structural connectivity. DMN abnormalities in patients with severe PTSD symptoms are characterized by decreased overall interconnections. 
On a finer scale, we found a pattern of prefrontal dysconnectivity, but increased cohesiveness in the posterior DMN community and relative sparing of connectivity in this region. The DMN measures established in this study may serve as a biomarker of disease severity and could have potential utility in developing circuit-based therapeutics. Published by Elsevier Inc.
Rubio-Aparicio, María; Sánchez-Meca, Julio; López-López, José Antonio; Botella, Juan; Marín-Martínez, Fulgencio
2017-11-01
Subgroup analyses allow us to examine the influence of a categorical moderator on the effect size in meta-analysis. We conducted a simulation study using a dichotomous moderator, and compared the impact of pooled versus separate estimates of the residual between-studies variance on the statistical performance of the Q B (P) and Q B (S) tests for subgroup analyses assuming a mixed-effects model. Our results suggested that similar performance can be expected as long as there are at least 20 studies and these are approximately balanced across categories. Conversely, when subgroups were unbalanced, the practical consequences of having heterogeneous residual between-studies variances were more evident, with both tests leading to the wrong statistical conclusion more often than in the conditions with balanced subgroups. A pooled estimate should be preferred for most scenarios, unless the residual between-studies variances are clearly different and there are enough studies in each category to obtain precise separate estimates. © 2017 The British Psychological Society.
Perkins, Thomas John; Stokes, Mark Andrew; McGillivray, Jane Anne; Mussap, Alexander Julien; Cox, Ivanna Anne; Maller, Jerome Joseph; Bittar, Richard Garth
2014-11-30
There is evidence emerging from Diffusion Tensor Imaging (DTI) research that autism spectrum disorders (ASD) are associated with greater impairment in the left hemisphere. Although this has been quantified with volumetric region of interest analyses, it has yet to be tested with white matter integrity analysis. In the present study, tract based spatial statistics was used to contrast white matter integrity of 12 participants with high-functioning autism or Aspergers syndrome (HFA/AS) with 12 typically developing individuals. Fractional Anisotropy (FA) was examined, in addition to axial, radial and mean diffusivity (AD, RD and MD). In the left hemisphere, participants with HFA/AS demonstrated significantly reduced FA in predominantly thalamic and fronto-parietal pathways and increased RD. Symmetry analyses confirmed that in the HFA/AS group, WM disturbance was significantly greater in the left compared to right hemisphere. These findings contribute to a growing body of literature suggestive of reduced FA in ASD, and provide preliminary evidence for RD impairments in the left hemisphere. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Trull, Timothy J.; Vergés, Alvaro; Wood, Phillip K.; Jahng, Seungmin; Sher, Kenneth J.
2013-01-01
We examined the latent structure underlying the criteria for DSM–IV–TR (American Psychiatric Association, 2000, Diagnostic and statistical manual of mental disorders (4th ed., text revision). Washington, DC: Author.) personality disorders in a large nationally representative sample of U.S. adults. Personality disorder symptom data were collected using a structured diagnostic interview from approximately 35,000 adults assessed over two waves of data collection in the National Epidemiologic Survey on Alcohol and Related Conditions. Our analyses suggested that a seven-factor solution provided the best fit for the data, and these factors were marked primarily by one or at most two personality disorder criteria sets. A series of regression analyses that used external validators tapping Axis I psychopathology, treatment for mental health problems, functioning scores, interpersonal conflict, and suicidal ideation and behavior provided support for the seven-factor solution. We discuss these findings in the context of previous studies that have examined the structure underlying the personality disorder criteria as well as the current proposals for DSM-5 personality disorders. PMID:22506626
Subsurface microbial diversity in deep-granitic-fracture water in Colorado
Sahl, J.W.; Schmidt, R.; Swanner, E.D.; Mandernack, K.W.; Templeton, A.S.; Kieft, Thomas L.; Smith, R.L.; Sanford, W.E.; Callaghan, R.L.; Mitton, J.B.; Spear, J.R.
2008-01-01
A microbial community analysis using 16S rRNA gene sequencing was performed on borehole water and a granite rock core from Henderson Mine, a >1,000-meter-deep molybdenum mine near Empire, CO. Chemical analysis of borehole water at two separate depths (1,044 m and 1,004 m below the mine entrance) suggests that a sharp chemical gradient exists, likely from the mixing of two distinct subsurface fluids, one metal rich and one relatively dilute; this has created unique niches for microorganisms. The microbial community analyzed from filtered, oxic borehole water indicated an abundance of sequences from iron-oxidizing bacteria (Gallionella spp.) and was compared to the community from the same borehole after 2 weeks of being plugged with an expandable packer. Statistical analyses with UniFrac revealed a significant shift in community structure following the addition of the packer. Phospholipid fatty acid (PLFA) analysis suggested that Nitrosomonadales dominated the oxic borehole, while PLFAs indicative of anaerobic bacteria were most abundant in the samples from the plugged borehole. Microbial sequences were represented primarily by Firmicutes, Proteobacteria, and a lineage of sequences which did not group with any identified bacterial division; phylogenetic analyses confirmed the presence of a novel candidate division. This "Henderson candidate division" dominated the clone libraries from the dilute anoxic fluids. Sequences obtained from the granitic rock core (1,740 m below the surface) were represented by the divisions Proteobacteria (primarily the family Ralstoniaceae) and Firmicutes. Sequences grouping within Ralstoniaceae were also found in the clone libraries from metal-rich fluids yet were absent in more dilute fluids. Lineage-specific comparisons, combined with phylogenetic statistical analyses, show that geochemical variance has an important effect on microbial community structure in deep, subsurface systems. Copyright © 2008, American Society for Microbiology. All Rights Reserved.
Prevalence and risk factors of mucous retention cysts in a Brazilian population.
Rodrigues, C D; Freire, G F; Silva, L B; Fonseca da Silveira, M M; Estrela, C
2009-10-01
The aim of this study was to estimate the prevalence and analyse the risk factors of mucous retention cysts (MRCs) of the maxillary sinus. From November 2002 to May 2007, 6293 panoramic radiographs were taken and retrospectively reviewed to estimate the prevalence of MRCs and to analyse candidate risk factors (month, relative air humidity and mean temperature). The months in which MRCs occurred were recorded and analysed. The Spearman rank correlation coefficient was used to correlate MRCs with relative air humidity, environmental temperature and month (significance level R²>0.85). Of the 6293 radiographs analysed, 201 (3.19%) showed images suggestive of MRCs. No significant correlation was found between MRCs and relative air humidity (R² = 0.15) or mean temperature (R² = 0.40). The months with the highest numbers of MRC cases were September, October and November. The prevalence of MRCs was low, and no statistical correlation was found between MRCs and relative air humidity, mean temperature or month.
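The Spearman rank correlation used in the study above can be sketched in pure Python for untied data (the no-ties shortcut formula; the function and example values are illustrative, not the study's data):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the no-ties shortcut:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])
```

With ties present, the rank assignment would need mid-ranks; library routines such as scipy.stats.spearmanr handle that case.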
Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses
Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.
2014-01-01
Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294
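The unadjusted conditional power (CP) quantity discussed above can be sketched under the common "current trend" assumption, i.e. that the drift estimated at the interim continues to the final analysis (a hedged illustration; the paper's own CP calculations may differ in detail):

```python
from statistics import NormalDist

def conditional_power(z_interim, info_frac, alpha=0.05):
    """Conditional power at an interim look with information fraction
    info_frac, assuming the currently estimated drift continues."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    theta = z_interim / info_frac ** 0.5          # estimated drift
    numerator = (info_frac ** 0.5 * z_interim
                 + theta * (1 - info_frac) - z_crit)
    return nd.cdf(numerator / (1 - info_frac) ** 0.5)

# Interim Z = 1.5 at half the planned information: CP ~ 0.59
cp = conditional_power(z_interim=1.5, info_frac=0.5)
```

Covariate imbalance that shifts the interim Z statistic feeds directly into this formula, which is why the paper cautions that unadjusted CP values near a futility bound need careful interpretation.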
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical researches conducted in China for the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly ( = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased ( = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.40% (669/1,578). In 2008, design with randomized clinical trials remained low in single digit (3.8%, 60/1,578) with two-third showed poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation ( = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309) and interpretation ( = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), some serious ones persisted. 
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
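An iterative nonparametric screen of the kind described above can be sketched with median/MAD robust z-scores, re-estimated after each pass (an illustrative sketch, not the NASA algorithm, which also uses the spatial distribution of gauges):

```python
from statistics import median

def flag_outliers(readings, threshold=3.5):
    """Iteratively flag readings whose robust z-score, based on the
    median and the median absolute deviation (MAD), exceeds the
    threshold; statistics are recomputed after each pass."""
    kept = list(readings)
    flagged = []
    while True:
        med = median(kept)
        mad = median(abs(x - med) for x in kept)
        if mad == 0:
            break
        bad = [x for x in kept if 0.6745 * abs(x - med) / mad > threshold]
        if not bad:
            break
        flagged.extend(bad)
        # Note: removes all copies of a flagged value (fine for a sketch)
        kept = [x for x in kept if x not in bad]
    return kept, flagged

kept, flagged = flag_outliers([10, 12, 11, 9, 10, 95, 11])
```

The 0.6745 factor rescales the MAD so the robust z-score is comparable to a standard normal z-score.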
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. 
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Seismology: tectonic strain in plate interiors?
Calais, E; Mattioli, G; DeMets, C; Nocquet, J-M; Stein, S; Newman, A; Rydelek, P
2005-12-15
It is not fully understood how or why the inner areas of tectonic plates deform, leading to large, although infrequent, earthquakes. Smalley et al. offer a potential breakthrough by suggesting that surface deformation in the central United States accumulates at rates comparable to those across plate boundaries. However, we find no statistically significant deformation in three independent analyses of the data set used by Smalley et al., and conclude therefore that only the upper bounds of magnitude and repeat time for large earthquakes can be inferred at present.
Which comes first: employee attitudes or organizational financial and market performance?
Schneider, Benjamin; Hanges, Paul J; Smith, D Brent; Salvaggio, Amy Nicole
2003-10-01
Employee attitude data from 35 companies over 8 years were analyzed at the organizational level of analysis against financial (return on assets; ROA) and market performance (earnings per share; EPS) data using lagged analyses permitting exploration of priority in likely causal ordering. Analyses revealed statistically significant and stable relationships across various time lags for 3 of 7 scales. Overall Job Satisfaction and Satisfaction With Security were predicted by ROA and EPS more strongly than the reverse (although some of the reverse relationships were also significant); Satisfaction With Pay suggested a more reciprocal relationship with ROA and EPS. The discussion of results provides a preliminary framework for understanding issues surrounding employee attitudes, high-performance work practices, and organizational financial and market performance.
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
In the proposed architecture there are four tiers: Client (Web application clients), Presentation (Web server), Processing (Application server), and Data (Database). Data will be collected from each organization in each period for analysis. Analyses and validation: statistical tests and Pareto analyses will be applied to these data, with confirmation against delivery notes, outstanding deliveries, and inventory.
Statistical power analysis in wildlife research
Steidl, R.J.; Hayes, J.P.
1997-01-01
Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be ≤0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true. 
We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the α-level, effect sizes, and sample sizes used in calculations must also be reported.
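The prospective (a priori) power analysis recommended above amounts to solving the power equation for the sample size, which can be sketched with a normal approximation (illustrative only; exact t-based solvers give slightly larger n):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, power=0.80, alpha=0.05):
    """Approximate per-group sample size for a two-sided two-sample
    comparison at standardized effect size d (normal approximation):
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    z_b = nd.inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / d) ** 2)

n_medium = n_per_group(0.5)   # medium effect: ~63 per group
n_small = n_per_group(0.2)    # small effect: hundreds per group
```

The contrast between the two results shows why studies powered only for large effects routinely miss biologically meaningful small ones.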
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
…Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. Parallel to the NDT measurements, Statistical Energy Analysis (SEA) was used as a vibration-diagnostic tool; noises were analysed with a dual-channel real-time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement, a small
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs, covering the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time, again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered, leading to a high risk of false-positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
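The type I error adjustment this review finds missing when many sub-dimensions are tested separately can be illustrated with the Holm-Bonferroni step-down procedure. The five p-values below are invented stand-ins for separate HRQoL sub-dimension tests, not data from any of the reviewed trials.

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Reject/accept decision per p-value under the Holm-Bonferroni
    step-down procedure, which controls the family-wise error rate."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    reject = [False] * len(pvals)
    for rank, i in enumerate(order):
        # the smallest p-value faces the strictest threshold, alpha/m
        if pvals[i] <= alpha / (len(pvals) - rank):
            reject[i] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return reject

# Hypothetical p-values for five HRQoL sub-dimensions tested separately.
print(holm_bonferroni([0.001, 0.02, 0.04, 0.30, 0.01]))
# -> [True, False, False, False, True]
```

Unadjusted, three of these five p-values fall below 0.05; after adjustment only two survive, which is exactly the inflation of false positives the review warns about.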
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), all of which purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, each analysis made arbitrary selections or assumptions, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus of this particular analysis was the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls: inattention to reproducibility and robustness assessment is a widespread problem in the sciences that is, unfortunately, noted too rarely in review.
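An un-binned permutation test of the kind the paper favours over binned analyses can be sketched as follows. The "sunspot series" and "pandemic years" below are invented placeholders, not the paper's data; the idea is simply to compare the series values at the event times against random draws of the same size.

```python
import math
import random

def permutation_pvalue(values_at_events, all_values, n_perm=5000, seed=1):
    """Two-sided permutation test: is the mean of the series at the
    event times unusual relative to random draws of the same size?"""
    rng = random.Random(seed)
    k = len(values_at_events)
    grand = sum(all_values) / len(all_values)
    observed = abs(sum(values_at_events) / k - grand)
    extreme = 0
    for _ in range(n_perm):
        draw = rng.sample(all_values, k)
        if abs(sum(draw) / k - grand) >= observed:
            extreme += 1
    return extreme / n_perm

# Invented stand-in for a yearly sunspot series and four "event years".
series = [abs(50 * math.sin(0.3 * t)) + 10 for t in range(120)]
event_values = [series[i] for i in (11, 40, 67, 95)]
print(permutation_pvalue(event_values, series))
```

Because no binning or phase window is chosen by hand, there is no arbitrary selection whose variation could flip the result, which is the robustness property the paper argues for.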
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-01-01
Abstract To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTO's). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTO's. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTO's are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role GLMs should be used. Generic advice is offered that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in this testing. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and to assess whether such analyses were correctly applied. 
We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing. PMID:24567836
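The review's advice that count data call for GLMs rather than ANOVA can be illustrated with a two-group Poisson model, where the likelihood-ratio test has a closed form because the group means are the Poisson MLEs. The counts below are fabricated "nontarget arthropod" tallies for illustration only.

```python
import math
from scipy.stats import chi2

def poisson_lrt(counts_a, counts_b):
    """Likelihood-ratio test for equal Poisson means in two groups,
    equivalent to a Poisson GLM with a single group indicator."""
    def loglik(counts, lam):
        # log-likelihood up to the constant term log(x!)
        return sum(x * math.log(lam) - lam for x in counts)
    la = sum(counts_a) / len(counts_a)
    lb = sum(counts_b) / len(counts_b)
    pooled = (sum(counts_a) + sum(counts_b)) / (len(counts_a) + len(counts_b))
    lrt = 2 * (loglik(counts_a, la) + loglik(counts_b, lb)
               - loglik(counts_a + counts_b, pooled))
    return lrt, chi2.sf(lrt, df=1)

# Fabricated arthropod counts on GM versus conventional plots.
stat, p = poisson_lrt([3, 5, 2, 4, 6], [9, 11, 8, 12, 10])
print(p < 0.05)  # -> True
```

Running ANOVA on such counts instead would assume normal, equal-variance errors, which counts violate (their variance grows with the mean); the GLM builds the correct mean-variance relationship in.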
NASA Astrophysics Data System (ADS)
Estrada, Francisco; Perron, Pierre; Martínez-López, Benjamín
2013-12-01
The warming of the climate system is unequivocal as evidenced by an increase in global temperatures by 0.8°C over the past century. However, the attribution of the observed warming to human activities remains less clear, particularly because of the apparent slow-down in warming since the late 1990s. Here we analyse radiative forcing and temperature time series with state-of-the-art statistical methods to address this question without climate model simulations. We show that long-term trends in total radiative forcing and temperatures have largely been determined by atmospheric greenhouse gas concentrations, and modulated by other radiative factors. We identify a pronounced increase in the growth rates of both temperatures and radiative forcing around 1960, which marks the onset of sustained global warming. Our analyses also reveal a contribution of human interventions to two periods when global warming slowed down. Our statistical analysis suggests that the reduction in the emissions of ozone-depleting substances under the Montreal Protocol, as well as a reduction in methane emissions, contributed to the lower rate of warming since the 1990s. Furthermore, we identify a contribution from the two world wars and the Great Depression to the documented cooling in the mid-twentieth century, through lower carbon dioxide emissions. We conclude that reductions in greenhouse gas emissions are effective in slowing the rate of warming in the short term.
diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.
Lun, Aaron T L; Smyth, Gordon K
2015-08-19
Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.
D'Addabbo, Annarita; Palmieri, Orazio; Maglietta, Rosalia; Latiano, Anna; Mukherjee, Sayan; Annese, Vito; Ancona, Nicola
2011-08-01
A meta-analysis has re-analysed previous genome-wide association scans, definitively confirming eleven genes and further identifying 21 new loci. However, the identified genes/loci still explain only a minority of the genetic predisposition to Crohn's disease. We aimed to identify genes weakly involved in disease predisposition by analysing chromosomal regions enriched in single nucleotide polymorphisms with modest statistical association. We utilized the WTCCC data set evaluating 1748 CD cases and 2938 controls. The identification of candidate genes/loci was performed by a two-step procedure: first, chromosomal regions enriched in weak association signals were localized; subsequently, weak signals clustered in gene regions were identified. The statistical significance was assessed by non-parametric permutation tests. The cytoband enrichment analysis highlighted 44 regions (P≤0.05) enriched in single nucleotide polymorphisms significantly associated with the trait, including 23 out of 31 previously confirmed and replicated genes. Importantly, we highlight a further 20 novel chromosomal regions carrying approximately one hundred genes/loci with modest association. Amongst these we find compelling functional candidate genes such as MAPT, GRB2, CREM, LCT, and IL12RB2. Our study suggests a different statistical perspective to discover genes weakly associated with a given trait, although further confirmatory functional studies are needed. Copyright © 2011 Editrice Gastroenterologica Italiana S.r.l. All rights reserved.
Anderson, Craig A; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L; Bushman, Brad J; Sakamoto, Akira; Rothstein, Hannah R; Saleem, Muniba
2010-03-01
Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past meta-analyses; (b) cross-cultural comparisons; (c) longitudinal studies for all outcomes except physiological arousal; (d) conservative statistical controls; (e) multiple moderator analyses; and (f) sensitivity analyses. Social-cognitive models and cultural differences between Japan and Western countries were used to generate theory-based predictions. Meta-analyses yielded significant effects for all 6 outcome variables. The pattern of results for different outcomes and research designs (experimental, cross-sectional, longitudinal) fit theoretical predictions well. The evidence strongly suggests that exposure to violent video games is a causal risk factor for increased aggressive behavior, aggressive cognition, and aggressive affect and for decreased empathy and prosocial behavior. Moderator analyses revealed significant research design effects, weak evidence of cultural differences in susceptibility and type of measurement effects, and no evidence of sex differences in susceptibility. Results of various sensitivity analyses revealed these effects to be robust, with little evidence of selection (publication) bias.
Tavares, M; de Lima, C; Fernandes, W; Martinelli, V; de Lucena, M; Lima, F; Telles, A; Brandão, L; de Melo Júnior, M
2016-12-01
Inflammatory bowel diseases are multifactorial diseases whose common manifestation is inflammation of the gastrointestinal tract; their pathogenesis remains unknown. This study aimed to analyse the gene polymorphisms in Brazilian patients with inflammatory bowel disease. A total of 101 patients diagnosed with inflammatory bowel disease were analysed for the tumour necrosis factor-alpha (-308 G/A; rs1800629) and interleukin-10 (-1082 G/A; rs1800896) gene polymorphisms. Genotyping was performed through polymerase chain reaction-sequence-specific primer, then fractionated on 2% agarose gel and visualized after staining by ethidium bromide. The predominant anatomic-clinical form of Crohn's disease (CD) was inflammatory (32.75%), followed by fistulizing (29.31%) and stricturing (27.58%). As a control group, a total of 136 healthy subjects from the same geographical region were enrolled. The statistical analyses were performed using the R program. The frequency of the A allele at tumour necrosis factor-alpha was higher in ulcerative colitis (UC) patients (51%) than in controls (22%; P > 0.01). No statistical difference was found in the genotypic and allelic frequencies of CD patients compared to controls (P = 0.54). The polymorphism -1082G/A of interleukin-10 was not statistically different between the diseases and controls. Tumour necrosis factor-alpha (TNF-α) (-308G/A) is associated with UC onset, suggesting that the presence of the -308A allele could confer a relative risk of 3.62 more to develop UC in the general population. Further studies, increasing the number of individuals, should be performed to ratify the role of TNF-α in the inflammatory bowel disease pathogenesis. © 2016 John Wiley & Sons Ltd.
Mason, W Alex; Fleming, Charles B; Gross, Thomas J; Thompson, Ronald W; Parra, Gilbert R; Haggerty, Kevin P; Snyder, James J
2016-12-01
This randomized controlled trial tested a widely used general parent training program, Common Sense Parenting (CSP), with low-income 8th graders and their families to support a positive transition to high school. The program was tested in its original 6-session format and in a modified format (CSP-Plus), which added 2 sessions that included adolescents. Over 2 annual cohorts, 321 families were enrolled and randomly assigned to either the CSP, CSP-Plus, or minimal-contact control condition. Pretest, posttest, 1-year follow-up, and 2-year follow-up survey data on parenting as well as youth school bonding, social skills, and problem behaviors were collected from parents and youth (94% retention). Extending prior examinations of posttest outcomes, intent-to-treat regression analyses tested for intervention effects at the 2 follow-up assessments, and growth curve analyses examined experimental condition differences in yearly change across time. Separate exploratory tests of moderation by youth gender, youth conduct problems, and family economic hardship also were conducted. Out of 52 regression models predicting 1- and 2-year follow-up outcomes, only 2 out of 104 possible intervention effects were statistically significant. No statistically significant intervention effects were found in the growth curve analyses. Tests of moderation also showed few statistically significant effects. Because CSP already is in widespread use, findings have direct implications for practice. Specifically, findings suggest that the program may not be efficacious with parents of adolescents in a selective prevention context and may reveal the limits of brief, general parent training for achieving outcomes with parents of adolescents. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Kelechi, Teresa J; Mueller, Martina; Zapka, Jane G; King, Dana E
2011-11-01
The aim of this randomized clinical trial was to investigate a cryotherapy (cooling) gel wrap applied to lower leg skin affected by chronic venous disorders to determine whether therapeutic cooling improves skin microcirculation. Chronic venous disorders are under-recognized vascular health problems that result in severe skin damage and ulcerations of the lower legs. Impaired skin microcirculation contributes to venous leg ulcer development, thus new prevention therapies should address the microcirculation to prevent venous leg ulcers. Sixty participants (n = 30 per group) were randomized to receive one of two daily 30-minute interventions for four weeks. The treatment group applied the cryotherapy gel wrap around the affected lower leg skin, or compression and elevated the legs on a special pillow each evening at bedtime. The standard care group wore compression and elevated the legs only. Laboratory pre- and post-measures included microcirculation measures of skin temperature with a thermistor, blood flow with a laser Doppler flowmeter, and venous refill time with a photoplethysmograph. Data were collected between 2008 and 2009 and analysed using descriptive statistics, paired t-tests or Wilcoxon signed ranks tests, logistic regression analyses, and mixed model analyses. Fifty-seven participants (treatment = 28; standard care = 29) completed the study. The mean age was 62 years; 70% were female and 50% African American. In the final adjusted model, there was a statistically significant decrease in blood flow between the two groups (-6.2[-11.8; -0.6], P = 0.03). No statistically significant differences were noted in temperature or venous refill time. Study findings suggest that cryotherapy improves blood flow by slowing movement within the microcirculation and thus might potentially provide a therapeutic benefit to prevent leg ulcers. © 2011 Blackwell Publishing Ltd.
Lamb survival analysis from birth to weaning in Iranian Kermani sheep.
Barazandeh, Arsalan; Moghbeli, Sadrollah Molaei; Vatankhah, Mahmood; Hossein-Zadeh, Navid Ghavi
2012-04-01
Survival records from 1,763 Kermani lambs born between 1996 and 2004 from 294 ewes and 81 rams were used to determine genetic and non-genetic factors affecting lamb survival. Traits included were lamb survival across five periods from birth to 7, 14, 56, 70, and 90 days of age. Traits were analyzed under Weibull proportional hazard sire models. Several binary analyses were also conducted using animal models. Statistical models included the fixed class effects of sex of lamb, month and year of birth, a covariate effect of birth weight, and random genetic effects of both sire (in survival analyses) and animal (in binary analyses). The average survival to 90 days of age was 94.8%. Hazard rates ranged from 1.00 (birth to 90 days of age) to 1.73 (birth to 7 days of age) between the two sexes indicating that male lambs were at higher risk of mortality than females (P < 0.01). This study also revealed a curvilinear relationship between lamb survival and lamb birth weight, suggesting that viability and birth weight could be considered simultaneously in the selection programs to obtain optimal birth weight in Kermani lambs. Estimates of heritabilities from survival analyses were medium and ranged from 0.23 to 0.29. In addition, heritability estimates obtained from binary analyses were low and varied from 0.04 to 0.09. The results of this study suggest that progress in survival traits could be possible through managerial strategies and genetic selection.
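A heavily simplified sketch of the Weibull survival modelling used in this study can be written with scipy. This version ignores the censoring, fixed effects, and random sire effects the study's proportional-hazard models account for, and fits synthetic "time-to-death" data rather than the Kermani records.

```python
from scipy.stats import weibull_min

# Simulate synthetic time-to-death data (days) from a Weibull law.
# Shape < 1 means the hazard falls with age (high early mortality,
# as in young lambs); shape > 1 means the hazard rises with age.
true_shape, true_scale = 1.5, 40.0
times = weibull_min.rvs(true_shape, scale=true_scale,
                        size=2000, random_state=0)

# Maximum-likelihood fit with the location parameter pinned at zero.
shape, loc, scale = weibull_min.fit(times, floc=0)
print(shape, scale)
```

With 2,000 uncensored observations the fitted shape and scale land close to the simulating values; real survival data would additionally require handling of right-censored records, as the sire models in the study do.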
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
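One common single-mechanism diagnostic that this paper generalises beyond is Egger's regression for funnel-plot asymmetry: regress the standardized effect on precision and test whether the intercept differs from zero. A minimal version, with fabricated study effects and standard errors, might look like:

```python
import numpy as np
from scipy.stats import t as t_dist

def egger_test(effects, ses):
    """Egger's regression: regress effect/SE on 1/SE; an intercept
    far from zero signals funnel-plot asymmetry."""
    y = np.asarray(effects) / np.asarray(ses)
    x = 1.0 / np.asarray(ses)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    df = len(y) - 2
    cov = (resid @ resid / df) * np.linalg.inv(X.T @ X)
    t_stat = beta[0] / np.sqrt(cov[0, 0])
    return beta[0], 2 * t_dist.sf(abs(t_stat), df)

# Fabricated meta-analysis where small studies (large SE) report
# inflated effects -- the classic small-study-bias pattern.
effects = [0.9, 0.7, 0.6, 0.45, 0.4, 0.35, 0.32, 0.30]
ses = [0.50, 0.40, 0.30, 0.20, 0.15, 0.12, 0.10, 0.08]
intercept, p = egger_test(effects, ses)
print(intercept > 0)  # -> True: asymmetry toward larger small-study effects
```

Such a test assumes one selection mechanism; the paper's point is that author-level and journal-level selection can operate jointly, so a single regression diagnostic should be complemented by sensitivity analyses over multiple assumed mechanisms.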
Effect of the menstrual cycle on voice quality.
Silverman, E M; Zimmer, C H
1978-01-01
The question addressed was whether most young women with no vocal training exhibit premenstrual hoarseness. Spectral (acoustical) analyses of the sustained productions of three vowels produced by 20 undergraduates at ovulation and at premenstruation were rated for degree of hoarseness. Statistical analysis of the data indicated that the typical subject was no more hoarse at premenstruation than at ovulation. To determine whether this finding represented a genuine characteristic of women's voices or a type II statistical error, a systematic replication was undertaken with another sample of 27 undergraduates. The finding replicated that of the original investigation, suggesting that premenstrual hoarseness is a rarely occurring condition among young women with no vocal training. The apparent differential effect of the menstrual cycle on trained as opposed to untrained voices deserves systematic investigation.
Key indicators of obstetric and neonatal care in the Republic of Sakha (Yakutia).
Burtseva, Tatyana E; Odland, Jon Øyvind; Douglas, Natalya I; Grigoreva, Antonina N; Pavlova, Tatyana Y; Chichahov, Dgulustan A; Afanasieva, Lena N; Baisheva, Nurguyana S; Rad, Yana G; Tomsky, Mikhail I; Postoev, Vitaly A
2016-01-01
In the absence of a medical birth registry, the official statistics are the only sources of information about pregnancy outcomes in the Republic of Sakha (Yakutia) (RS). We analysed the official statistical data about birth rate, fertility, infant and maternal mortality in the RS in the period 2003-2014. Compared with all-Russian data, the RS had a higher birth rate, especially in rural districts. Maternal and infant mortality were also higher compared with all-Russian data, but had a decreasing trend. The majority of deaths occurred in the small level 1 units. We suggest that establishment of good predelivery transportation of pregnant women with high risk of complications from remote areas and centralization of risk deliveries with improved prenatal and neonatal care could improve the pregnancy outcome in Yakutia.
A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic
Qi, Jin-Peng; Qi, Jie; Zhang, Qing
2016-01-01
Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change from large-scale bioelectric signals. Currently, most of the existing methods, like Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed as BSTcA and BSTcD, are constructed by multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to leaf nodes of two BSTs. The studies on both the synthetic time series samples and the real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than KS, t-statistic (t), and Singular-Spectrum Analyses (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy out of four methods. This study suggests that the proposed BSTKS is very helpful for useful information inspection on all kinds of bioelectric time series signals. PMID:27413364
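The modified-KS idea can be illustrated, without the wavelet/BST speed-up that is the paper's actual contribution, by a brute-force scan that applies the two-sample KS statistic at every candidate split of a synthetic signal:

```python
import numpy as np
from scipy.stats import ks_2samp

def ks_changepoint(signal, min_seg=20):
    """Brute-force change-point estimate: return the split index that
    maximizes the two-sample Kolmogorov-Smirnov statistic between the
    left and right segments."""
    best_k, best_stat = None, -1.0
    for k in range(min_seg, len(signal) - min_seg):
        stat = ks_2samp(signal[:k], signal[k:]).statistic
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Synthetic signal with an abrupt mean shift at index 150.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 150), rng.normal(2, 1, 100)])
k, stat = ks_changepoint(signal)
print(abs(k - 150) < 10)
```

This naive scan is O(n) KS evaluations per series, which is exactly the cost the paper's BST search over Haar-transform coefficients is designed to avoid on large-scale signals.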
Which statistics should tropical biologists learn?
Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián
2011-09-01
Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good-quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.
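Most of the twelve tests listed are one-liners in modern free software, which supports the authors' point that course time is better spent on selection and interpretation than on mathematical derivations. A hypothetical example with made-up leaf-damage scores from three plots:

```python
from scipy import stats

# Made-up leaf-damage scores from three hypothetical tropical plots.
plot_a = [4.1, 3.8, 5.0, 4.6, 4.2]
plot_b = [5.9, 6.3, 5.5, 6.1, 5.8]
plot_c = [4.0, 4.4, 3.9, 4.7, 4.1]

f, p_anova = stats.f_oneway(plot_a, plot_b, plot_c)   # ANOVA
u, p_mw = stats.mannwhitneyu(plot_a, plot_b)          # Mann-Whitney U
h, p_kw = stats.kruskal(plot_a, plot_b, plot_c)       # Kruskal-Wallis
r, p_r = stats.pearsonr(plot_a, plot_b)               # Pearson correlation

print(p_anova < 0.05, p_kw < 0.05)  # -> True True
```

The hard part, as the article argues, is not running these calls but knowing which one matches the design and the data's distribution, and what its p-value does and does not mean biologically.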
Crema, Enrico R; Habu, Junko; Kobayashi, Kenichi; Madella, Marco
2016-01-01
Recent advances in the use of summed probability distributions (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlations with climatic changes.
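A label-shuffling permutation test of the kind described can be sketched on synthetic dates. This simplified version compares mean dates between two regions rather than full SPD curves, and all data and names are hypothetical.

```python
import numpy as np

def perm_test_mean_diff(dates_a, dates_b, n_perm=2000, seed=1):
    """Permutation test for a regional difference: shuffle region
    labels to build the null distribution of the observed statistic
    (here a mean difference, standing in for a full SPD comparison)."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([dates_a, dates_b])
    n_a = len(dates_a)
    observed = float(dates_a.mean() - dates_b.mean())
    null = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)
        null[i] = pooled[:n_a].mean() - pooled[n_a:].mean()
    p_value = float(np.mean(np.abs(null) >= abs(observed)))
    return observed, p_value

rng = np.random.default_rng(0)
region_a = rng.normal(5000, 400, 200)  # synthetic cal BP dates
region_b = rng.normal(5300, 400, 200)  # a systematically older region
obs, p = perm_test_mean_diff(region_a, region_b)
```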
Effect of crowd size on patient volume at a large, multipurpose, indoor stadium.
De Lorenzo, R A; Gray, B C; Bennett, P C; Lamparella, V J
1989-01-01
A prediction of the patient volume expected at "mass gatherings" is desirable in order to provide optimal on-site emergency medical care. While several methods of predicting patient loads have been suggested, a reliable technique has not been established. This study examines the frequency of medical emergencies at the Syracuse University Carrier Dome, a 50,500-seat indoor stadium. Patient volume and level of care at collegiate basketball and football games, as well as rock concerts, over a 7-year period were examined and tabulated. This information was analyzed using simple regression and nonparametric statistical methods to determine the level of correlation between crowd size and patient volume. These analyses demonstrated no statistically significant increase in patient volume with increasing crowd size for basketball and football events. There was a small but statistically significant increase in patient volume with increasing crowd size for concerts. A comparison of similar crowd sizes for each of the three events showed that patient frequency is greatest for concerts and smallest for basketball. The study suggests that crowd size alone has only a minor influence on patient volume at any given event. Structuring medical services based solely on expected crowd size, without considering other influences such as event type and duration, may give poor results.
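The simple-regression portion of such an analysis can be sketched with hypothetical event data; the numbers below are illustrative, not the Carrier Dome records.

```python
import numpy as np

# Hypothetical (crowd size, patient count) pairs for one event type.
crowd = np.array([20000., 25000., 30000., 35000., 40000., 45000., 50000.])
patients = np.array([4., 6., 5., 7., 6., 8., 7.])

# Least-squares line: patients ~ slope * crowd + intercept.
slope, intercept = np.polyfit(crowd, patients, 1)
r = np.corrcoef(crowd, patients)[0, 1]   # Pearson correlation
extra_per_10k = slope * 10000            # added patients per 10,000 attendees
```

A weak positive slope with a modest correlation, as in this toy example, is consistent with the paper's conclusion that crowd size alone is a poor predictor of patient volume.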
Clinical correlates of augmentation/combination treatment strategies in major depressive disorder.
Dold, M; Bartova, L; Mendlewicz, J; Souery, D; Serretti, A; Porcelli, S; Zohar, J; Montgomery, S; Kasper, S
2018-05-01
This multicenter, multinational, cross-sectional study aimed to investigate clinical characteristics and treatment outcomes associated with augmentation/combination treatment strategies in major depressive disorder (MDD). Sociodemographic, clinical, and treatment features of 1410 adult MDD patients were compared between patients treated with monotherapy and those receiving augmentation/combination medication, using descriptive statistics, analyses of covariance (ANCOVA), and Spearman's correlation analyses. 60.64% of all participants received augmentation and/or combination strategies, with a mean of 2.18 ± 1.22 simultaneously prescribed psychiatric drugs. We found male gender, older age, Caucasian descent, higher weight, low educational status, absence of occupation, psychotic symptoms, melancholic and atypical features, suicide risk, in-patient treatment, longer duration of hospitalization, some psychiatric comorbidities (panic disorder, agoraphobia, obsessive-compulsive disorder, and bulimia nervosa), somatic comorbidity in general, and concurrent hypertension, thyroid dysfunction, diabetes, and heart disease in particular, higher current and retrospective Montgomery and Åsberg Depression Rating Scale total scores, treatment resistance, and higher antidepressant dosing to be significantly associated with augmentation/combination treatment. These findings were corroborated when the number of concurrently administered psychiatric drugs was examined in the statistical analyses. Our findings suggest a clear association between augmentation/combination strategies and treatment-resistant/difficult-to-treat MDD characterized by severe symptomatology and a high burden of psychiatric and somatic comorbidities. © 2018 The Authors Acta Psychiatrica Scandinavica Published by John Wiley & Sons Ltd.
Bao, Zhihua; Ikunaga, Yoko; Matsushita, Yuko; Morimoto, Sho; Takada-Hoshino, Yuko; Okada, Hiroaki; Oba, Hirosuke; Takemoto, Shuhei; Niwa, Shigeru; Ohigashi, Kentaro; Suzuki, Chika; Nagaoka, Kazunari; Takenaka, Makoto; Urashima, Yasufumi; Sekiguchi, Hiroyuki; Kushida, Atsuhiko; Toyota, Koki; Saito, Masanori; Tsushima, Seiya
2012-01-01
We simultaneously examined the bacteria, fungi and nematode communities in Andosols from four agro-geographical sites in Japan using polymerase chain reaction-denaturing gradient gel electrophoresis (PCR-DGGE) and statistical analyses to test the effects of environmental factors including soil properties on these communities depending on geographical sites. Statistical analyses such as Principal component analysis (PCA) and Redundancy analysis (RDA) revealed that the compositions of the three soil biota communities were strongly affected by geographical sites, which were in turn strongly associated with soil characteristics such as total C (TC), total N (TN), C/N ratio and annual mean soil temperature (ST). In particular, the TC, TN and C/N ratio had stronger effects on bacterial and fungal communities than on the nematode community. Additionally, two-way cluster analysis using the combined DGGE profile also indicated that all soil samples were classified into four clusters corresponding to the four sites, showing high site specificity of soil samples, and all DNA bands were classified into four clusters, showing the coexistence of specific DGGE bands of bacteria, fungi and nematodes in Andosol fields. The results of this study suggest that geography relative to soil properties has a simultaneous impact on soil microbial and nematode community compositions. This is the first combined profile analysis of bacteria, fungi and nematodes at different sites with agricultural Andosols. PMID:22223474
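A minimal PCA of the sort applied to such community profiles can be sketched with a hypothetical site-by-band abundance matrix, using an SVD of the centered data; this is illustrative only, not the authors' DGGE pipeline.

```python
import numpy as np

# Hypothetical site-by-band abundance matrix (rows: soil samples from
# two contrasting sites; columns: DGGE-like band intensities).
X = np.array([
    [5., 3., 0., 1.],
    [4., 3., 0., 1.],
    [1., 0., 4., 5.],
    [0., 1., 5., 4.],
])

# PCA via SVD of the column-centered matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                      # sample coordinates on the PCs
explained = S**2 / np.sum(S**2)     # fraction of variance per PC
```

When samples cluster by site, as in this toy matrix, the first component captures most of the variance, which is the pattern the PCA and RDA ordinations in the study exhibit across the four geographical sites.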
Amino acid pair- and triplet-wise groupings in the interior of α-helical segments in proteins.
de Sousa, Miguel M; Munteanu, Cristian R; Pazos, Alejandro; Fonseca, Nuno A; Camacho, Rui; Magalhães, A L
2011-02-21
A statistical approach has been applied to analyse primary structure patterns at inner positions of α-helices in proteins. A systematic survey was carried out in a recent sample of non-redundant proteins selected from the Protein Data Bank, which were used to analyse α-helix structures for amino acid pairing patterns. Only residues more than three positions apart from both termini of the α-helix were considered inner. Amino acid pairings i, i+k (k=1, 2, 3, 4, 5) were analysed and the corresponding 20×20 matrices of relative global propensities were constructed. An analysis of (i, i+4, i+8) and (i, i+3, i+4) triplet patterns was also performed. These analyses yielded information on a series of amino acid patterns (pairings and triplets) showing either high or low preference for α-helical motifs and suggested a novel approach to protein alphabet reduction. In addition, it has been shown that individual amino acid propensities are not enough to define the statistical distribution of these patterns: global pair propensities also depend on the type of pattern, its composition, and its orientation in the protein sequence. The data presented should prove useful for obtaining and refining predictive rules which can further the development and fine-tuning of protein structure prediction algorithms and tools. Copyright © 2010 Elsevier Ltd. All rights reserved.
Statistical technique for analysing functional connectivity of multiple spike trains.
Masud, Mohammad Shahed; Borisyuk, Roman
2011-03-15
A new statistical technique, the Cox method, for analysing the functional connectivity of simultaneously recorded multiple spike trains is presented. The method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to a selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity, an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing the functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.
Pichon, Christophe; du Merle, Laurence; Caliot, Marie Elise; Trieu-Cuot, Patrick; Le Bouguénec, Chantal
2012-04-01
Characterization of small non-coding ribonucleic acids (sRNA) among the large volume of data generated by high-throughput RNA-seq or tiling microarray analyses remains a challenge. Thus, there is still a need for accurate in silico prediction methods to identify sRNAs within a given bacterial species. After years of effort, dedicated software tools were developed based on comparative genomic analyses or mathematical/statistical models. Although these genomic analyses enabled sRNAs in intergenic regions to be efficiently identified, they all failed to predict antisense sRNA genes (asRNA), i.e. RNA genes located on the DNA strand complementary to that which encodes the protein. The statistical models, in principle, enabled any genomic region to be analyzed, but not efficiently. We present a new model for in silico identification of sRNA and asRNA candidates within an entire bacterial genome. This model was successfully used to analyze the Gram-negative Escherichia coli and the Gram-positive Streptococcus agalactiae. In both bacteria, numerous asRNAs are transcribed from the complementary strand of genes located in pathogenicity islands, strongly suggesting that these asRNAs are regulators of virulence expression. In particular, we characterized an asRNA that acted as an enhancer-like regulator of the type 1 fimbriae production involved in the virulence of extra-intestinal pathogenic E. coli.
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background: High-quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings: Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: the error/defect proportion in statistical analyses decreased significantly (χ2 = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ2 = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ2 = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ2 = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted.
Conclusions/Significance: Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
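The year-to-year proportion comparisons reported above rest on the Pearson chi-square statistic. A minimal sketch follows, with illustrative counts chosen to approximate the reported percentages rather than the study's raw data (the record's own counts are internally inconsistent).

```python
import numpy as np

def chi2_two_proportions(k1, n1, k2, n2):
    """Pearson chi-square statistic (1 df, no continuity correction)
    comparing proportions k1/n1 and k2/n2 via a 2x2 table."""
    table = np.array([[k1, n1 - k1], [k2, n2 - k2]], dtype=float)
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    return float(((table - expected) ** 2 / expected).sum())

# Illustrative counts approximating the reported rates (59.8% of 1,335
# articles defective vs 52.2% of 1,578); hypothetical, not the raw data.
chi2 = chi2_two_proportions(798, 1335, 824, 1578)
# chi2 exceeds 3.84, the 1-df critical value at alpha = 0.05
```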
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in the retina and glaucoma subspecialties showed a tendency toward more complex analyses when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature.
The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
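The accumulation logic described, in which a reader comprehends an article only if every statistical method it uses is in their repertoire, can be sketched as follows; the articles and method names are hypothetical.

```python
from collections import Counter

# Each article is modeled as the set of statistical methods it uses;
# a reader comprehends an article only when every method is known.
articles = [
    {"t-test"}, {"t-test", "anova"}, {"chi-square"},
    {"anova", "regression"}, {"descriptive"}, {"descriptive"},
]

freq = Counter(m for a in articles for m in a)
order = [m for m, _ in freq.most_common()]  # learn most frequent first

known, coverage = set(), []
for method in order:
    known.add(method)
    readable = sum(a <= known for a in articles)  # subset test
    coverage.append(readable / len(articles))
```

Because articles combine methods, the coverage curve climbs more slowly than the raw method frequencies would suggest, which is why the study finds that 15 or more methods are needed to interpret even half the literature.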
Using conventional F-statistics to study unconventional sex-chromosome differentiation.
Rodrigues, Nicolas; Dufresnes, Christophe
2017-01-01
Species with undifferentiated sex chromosomes emerge as key organisms to understand the astonishing diversity of sex-determination systems. Whereas new genomic methods are widening opportunities to study these systems, the difficulty of separately characterizing their X and Y homologous chromosomes poses limitations. Here we demonstrate that two simple F-statistics calculated from sex-linked genotypes, namely the genetic distance (Fst) between sexes and the inbreeding coefficient (Fis) in the heterogametic sex, can be used as reliable proxies to compare sex-chromosome differentiation between populations. We correlated these metrics using published microsatellite data from two frog species (Hyla arborea and Rana temporaria), and show that they relate closely to the overall amount of X-Y differentiation in populations. However, the fits for individual loci appear highly variable, suggesting that dense genetic coverage will be needed to infer fine-scale patterns of differentiation along sex chromosomes. The application of these F-statistics, which imply minimal sampling requirements, significantly facilitates population analyses of sex chromosomes.
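The two proxies can be illustrated at a single biallelic sex-linked locus. This is a minimal sketch assuming Wright's variance-based Fst between the two sexes and Hardy-Weinberg expected heterozygosity for Fis, not the authors' microsatellite workflow.

```python
def fst_between_sexes(p_m, p_f):
    """Wright's Fst between two groups (males, females) at a
    biallelic locus: variance of allele frequency over p(1-p)."""
    p_bar = (p_m + p_f) / 2
    var_p = ((p_m - p_bar) ** 2 + (p_f - p_bar) ** 2) / 2
    return var_p / (p_bar * (1 - p_bar))

def fis(h_obs, p):
    """Inbreeding coefficient: 1 - observed/expected heterozygosity
    under Hardy-Weinberg."""
    return 1 - h_obs / (2 * p * (1 - p))

# Fully differentiated sex chromosomes: a Y-specific allele is carried
# by every male (one of his two copies) and is absent in females.
f_st = fst_between_sexes(p_m=0.5, p_f=0.0)  # allele freq among male copies
f_is = fis(h_obs=1.0, p=0.5)                # all males are heterozygous
```

At full X-Y differentiation this gives a strongly negative Fis in males (an excess of heterozygotes) and an elevated Fst between sexes; both shrink toward zero as sex chromosomes become undifferentiated, which is what makes them useful comparative proxies.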
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and it is often unclear which method should be applied to a particular dataset. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
Quantifying variation in speciation and extinction rates with clade data.
Paradis, Emmanuel; Tedesco, Pablo A; Hugueny, Bernard
2013-12-01
High-level phylogenies are very common in evolutionary analyses, although they are often treated as incomplete data. Here, we provide statistical tools to analyze what we name "clade data," which are the ages of clades together with their numbers of species. We develop a general approach for the statistical modeling of variation in speciation and extinction rates, including temporal variation, unknown variation, and linear and nonlinear modeling. We show how this approach can be generalized to a wide range of situations, including testing the effects of life-history traits and environmental variables on diversification rates. We report the results of an extensive simulation study to assess the performance of some statistical tests presented here, as well as of the estimators of speciation and extinction rates. These latter results suggest that extinction rates can be estimated correctly even in the absence of fossils. An example with data on fish is presented. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
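A much simpler point estimator in the same spirit, assuming a Yule (pure-birth) model rather than the authors' full likelihood framework, recovers a net diversification rate from clade age and species richness alone:

```python
import math

def net_diversification(n_species, age, crown=False):
    """Yule-model point estimate of net diversification (speciation
    minus extinction) from clade richness and age:
      stem age:   r = ln(N) / t
      crown age:  r = ln(N / 2) / t   (two lineages at the crown)"""
    n_eff = n_species / 2 if crown else n_species
    return math.log(n_eff) / age

# A hypothetical clade of 150 extant species with a 30-Myr stem age:
r_stem = net_diversification(150, 30.0)
r_crown = net_diversification(150, 30.0, crown=True)
```

This back-of-the-envelope estimator ignores extinction entirely; the point of the clade-data framework in the paper is to go beyond it and model speciation and extinction rates, and their variation, explicitly.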
Narita, Saki; Inoue, Manami; Saito, Eiko; Abe, Sarah K; Sawada, Norie; Ishihara, Junko; Iwasaki, Motoki; Yamaji, Taiki; Shimazu, Taichi; Sasazuki, Shizuka; Shibuya, Kenji; Tsugane, Shoichiro
2017-06-01
Epidemiological studies have suggested a protective effect of dietary fiber intake on breast cancer risk, although the results have been inconsistent. Our study aimed to investigate the association between dietary fiber intake and breast cancer risk, and to explore whether this association is modified by reproductive factors and the hormone receptor status of the tumor. A total of 44,444 women aged 45 to 74 years from the Japan Public Health Center-based Prospective Study were included in the analyses. Dietary intake assessment was performed using a validated 138-item food frequency questionnaire (FFQ). Hazard ratios (HRs) and 95% confidence intervals (CIs) for breast cancer incidence were calculated by multivariate Cox proportional hazards regression models. During 624,423 person-years of follow-up, 681 breast cancer cases were identified. After adjusting for major confounders for breast cancer risk, inverse trends were observed but were statistically non-significant. Extremely high intake of fiber was associated with a decreased risk of breast cancer, but this should be interpreted with caution due to limited statistical power. In stratified analyses by menopausal and hormone receptor status, null associations were observed except for ER-PR- status. Our findings suggest that extremely high fiber intake may be associated with a decreased risk of breast cancer, but the level of dietary fiber intake in the Japanese population might not be sufficient to examine the association between dietary fiber intake and breast cancer risk.
2011-01-01
Background: While there is extensive literature evaluating the impact of phytoestrogen consumption on breast cancer risk, its role in ovarian cancer has received little attention. Methods: We conducted a population-based case-control study to evaluate phytoestrogen intake from foods and supplements and epithelial ovarian cancer risk. Cases were identified in six counties in New Jersey through the New Jersey State Cancer Registry. Controls were identified by random digit dialing, CMS (Centers for Medicare and Medicaid Services) lists, and area sampling. A total of 205 cases and 390 controls were included in analyses. Unconditional logistic regression analyses were conducted to examine associations with total phytoestrogens, as well as isoflavones (daidzein, genistein, formononetin, and glycitein), lignans (matairesinol, lariciresinol, pinoresinol, secoisolariciresinol), and coumestrol. Results: No statistically significant associations were found with any of the phytoestrogens under evaluation. However, there was a suggestion of an inverse association with total phytoestrogen consumption (from foods and supplements), with an odds ratio (OR) of 0.62 (95% CI: 0.38-1.00; p for trend: 0.04) for the highest vs. lowest tertile of consumption, after adjusting for reproductive covariates, age, race, education, BMI, and total energy. Further adjustment for smoking and physical activity attenuated risk estimates (OR: 0.66; 95% CI: 0.41-1.08). There was little evidence of an inverse association for isoflavones, lignans, or coumestrol. Conclusions: This study provided some suggestion that phytoestrogen consumption may decrease ovarian cancer risk, although results did not reach statistical significance. PMID:21943063
Bull, Marta E; Heath, Laura M; McKernan-Mullin, Jennifer L; Kraft, Kelli M; Acevedo, Luis; Hitti, Jane E; Cohn, Susan E; Tapia, Kenneth A; Holte, Sarah E; Dragavon, Joan A; Coombs, Robert W; Mullins, James I; Frenkel, Lisa M
2013-04-15
Whether unique human immunodeficiency virus type 1 (HIV) genotypes occur in the genital tract is important for vaccine development and management of drug-resistant viruses. Multiple cross-sectional studies suggest HIV is compartmentalized within the female genital tract. We hypothesize that bursts of HIV replication and/or proliferation of infected cells captured in cross-sectional analyses drive compartmentalization, but that over time genital-specific viral lineages do not form; rather, viruses mix between the genital tract and blood. Eight women with ongoing HIV replication were studied over a period of 1.5 to 4.5 years. Multiple viral sequences were derived by single-genome amplification of the HIV C2-V5 region of env from genital secretions and blood plasma. Maximum likelihood phylogenies were evaluated for compartmentalization using four statistical tests. In cross-sectional analyses, compartmentalization of genital from blood viruses was detected in three of eight women by all tests; this was associated with tissue-specific clades containing multiple monotypic sequences. In longitudinal analysis, the tissue-specific clades did not persist to form viral lineages. Rather, across women, HIV lineages comprised both genital tract and blood sequences. The observation of genital-specific HIV clades only in cross-sectional analysis, and the absence of genital-specific lineages in longitudinal analyses, suggest a dynamic interchange of HIV variants between the female genital tract and blood.
DNA viewed as an out-of-equilibrium structure
NASA Astrophysics Data System (ADS)
Provata, A.; Nicolis, C.; Nicolis, G.
2014-05-01
The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ2 tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
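The block entropy diagnostic can be sketched as follows. For an i.i.d. uniform four-letter sequence H(n) grows linearly at about 2 bits per symbol; sublinear growth of the kind reported for genomic data signals long-range correlations. The sequence below is synthetic and illustrative only.

```python
import math
import random
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (bits) of the distribution of overlapping
    length-n blocks in a symbolic sequence."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(100_000))
h1 = block_entropy(seq, 1)  # close to 2 bits: uniform i.i.d. bases
h2 = block_entropy(seq, 2)  # close to 4 bits: linear growth, no memory
```

For real genomic sequences the analogous curve bends below the linear i.i.d. reference as n grows, which is the departure from linearity the abstract describes.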
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
Latent transition analysis of pre-service teachers' efficacy in mathematics and science
NASA Astrophysics Data System (ADS)
Ward, Elizabeth Kennedy
This study modeled changes in pre-service teacher efficacy in mathematics and science over the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions: latent class analysis (LCA) and auto-regressive modeling. Data were collected using the STEBI-B, MTEBI-r, and ABNTMS instruments. The findings suggest that LTA is a viable technique for use in teacher efficacy research. Teacher efficacy is modeled as a construct with two dimensions: personal teaching efficacy (PTE) and outcome expectancy (OE). Findings suggest that the personal teaching efficacy (PTE) of pre-service teachers in mathematics and science is a multi-class phenomenon. The analyses revealed a four-class model of PTE at both the beginning and the end of the final year of teacher training. Results indicate that when pre-service teachers transition between classes, they tend to move from a lower-efficacy class into a higher-efficacy class. In addition, the findings suggest that time-varying variables (attitudes and beliefs) and time-invariant variables (previous coursework, previous experiences, and teacher perceptions) are statistically significant predictors of efficacy class membership. Further, analyses suggest that the measures used to assess outcome expectancy are not suitable for LCA and LTA procedures.
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
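For readers unfamiliar with the one-parameter (Rasch) IRT model named above, its item response function is a logistic curve in the gap between student ability and item difficulty. The sketch below uses invented parameter values purely for illustration; it is not fitted to SRBCI data.

```python
import math

def rasch_p(theta, b):
    """Probability that a student of ability theta answers an item
    of difficulty b correctly under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# An easy item (b = -1) vs a hard item (b = +2) for an average student:
print(round(rasch_p(0.0, -1.0), 3))  # → 0.731
print(round(rasch_p(0.0, 2.0), 3))   # → 0.119
```

Items of widely varying difficulty b, as the abstract reports for SRBCI, spread these curves across the ability scale, which is what makes the inventory informative over a broad range of student ability.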
Hopewell, Sally; Witt, Claudia M; Linde, Klaus; Icke, Katja; Adedire, Olubusola; Kirtley, Shona; Altman, Douglas G
2018-01-11
Selective reporting of outcomes in clinical trials is a serious problem. We aimed to investigate the influence of the peer review process within biomedical journals on the reporting of primary outcome(s) and statistical analyses within reports of randomised trials. Each month, PubMed (May 2014 to April 2015) was searched to identify primary reports of randomised trials published in six high-impact general and 12 high-impact specialty journals. The corresponding author of each trial was invited to complete an online survey asking about changes made to the manuscript as part of the peer review process. Our main outcomes were to assess: (1) the nature and extent of changes made as part of the peer review process in relation to the reporting of the primary outcome(s) and/or primary statistical analysis; (2) how often authors followed these requests; and (3) whether this was related to specific journal or trial characteristics. Of the 893 corresponding authors invited to take part in the online survey, 258 (29%) responded. The majority of trials were multicentre (n = 191; 74%); the median sample size was 325 (IQR 138 to 1010). The primary outcome was clearly defined in 92% (n = 238), of which the direction of treatment effect was statistically significant in 49%. On a 1-10 Likert scale, most authors were satisfied with the overall handling (mean 8.6, SD 1.5) and quality of peer review (mean 8.5, SD 1.5) of their manuscript. Only 3% (n = 8) said that the editor or peer reviewers had asked them to change or clarify the trial's primary outcome. However, 27% (n = 69) reported that they were asked to change or clarify the statistical analysis of the primary outcome; most had fulfilled the request, the main motivation being to improve the statistical methods (n = 38; 55%) or to avoid rejection (n = 30; 44%).
Overall, there was little association between authors being asked to make this change and the type of journal, intervention, significance of the primary outcome, or funding source. Thirty-six percent (n = 94) of authors had been asked to include additional analyses that had not been included in the original manuscript; in 77% (n = 72) these were not pre-specified in the protocol. Twenty-three percent (n = 60) had been asked to modify their overall conclusion, usually (n = 53; 88%) to provide a more cautious conclusion. Overall, most changes, as a result of the peer review process, resulted in improvements to the published manuscript; there was little evidence of a negative impact in terms of post hoc changes of the primary outcome. However, some suggested changes might be considered inappropriate, such as unplanned additional analyses, and should be discouraged.
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
Huncharek, M; Kupelnick, B
2001-01-01
The etiology of epithelial ovarian cancer is unknown. Prior work suggests that high dietary fat intake is associated with an increased risk of this tumor, although this association remains speculative. A meta-analysis was performed to evaluate this suspected relationship. Using previously described methods, a protocol was developed for a meta-analysis examining the association between high vs. low dietary fat intake and the risk of epithelial ovarian cancer. Literature search techniques, study inclusion criteria, and statistical procedures were prospectively defined. Data from observational studies were pooled using a general variance-based meta-analytic method employing confidence intervals (CI) previously described by Greenland. The outcome of interest was a summary relative risk (RRs) reflecting the risk of ovarian cancer associated with high vs. low dietary fat intake. Sensitivity analyses were performed when necessary to evaluate any observed statistical heterogeneity. The literature search yielded 8 observational studies enrolling 6,689 subjects. Data were stratified into three dietary fat intake categories: total fat, animal fat, and saturated fat. Initial tests for statistical homogeneity demonstrated that hospital-based studies accounted for observed heterogeneity possibly because of selection bias. Accounting for this, an RRs was calculated for high vs. low total fat intake, yielding a value of 1.24 (95% CI = 1.07-1.43), a statistically significant result. That is, high total fat intake is associated with a 24% increased risk of ovarian cancer development. The RRs for high saturated fat intake was 1.20 (95% CI = 1.04-1.39), suggesting a 20% increased risk of ovarian cancer among subjects with these dietary habits. High vs. low animal fat diet gave an RRs of 1.70 (95% CI = 1.43-2.03), consistent with a statistically significant 70% increased ovarian cancer risk. 
High dietary fat intake appears to represent a significant risk factor for the development of ovarian cancer. The magnitude of this risk associated with total fat and saturated fat is rather modest. Ovarian cancer risk associated with high animal fat intake appears significantly greater than that associated with the other types of fat intake studied, although this requires confirmation via larger analyses. Further work is needed to clarify factors that may modify the effects of dietary fat in vivo.
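The general variance-based pooling used in the meta-analysis above can be sketched as follows: each study's relative risk and 95% CI are converted to a log-RR and its standard error (SE = (ln upper − ln lower) / (2 × 1.96)), then combined with inverse-variance weights. The study numbers below are invented for illustration; they are not the review's data.

```python
import math

def pooled_rr(studies):
    """Inverse-variance pooling of relative risks.
    studies: list of (rr, lower95, upper95) tuples."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log-RR from the CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(log_rr - 1.96 * se_pooled), math.exp(log_rr + 1.96 * se_pooled))
    return math.exp(log_rr), ci

# Three hypothetical observational studies of high vs. low fat intake:
rr, (lo, hi) = pooled_rr([(1.30, 1.00, 1.69), (1.15, 0.90, 1.47), (1.40, 1.05, 1.87)])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A pooled CI excluding 1.0, as in the review's total-fat result of 1.24 (1.07-1.43), is what makes the summary estimate statistically significant.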
NASA Astrophysics Data System (ADS)
Kahveci, Ajda; Kahveci, Murat; Mansour, Nasser; Alarfaj, Maher Mohammed
2017-06-01
Teachers play a key role in moving reform-based science education practices into the classroom. Based on research that emphasizes the importance of teachers' affective states, this study aimed to explore the constructs of pedagogical discontentment, science teaching self-efficacy, and intentions to reform, and their correlations. It also aimed to provide empirical evidence in light of a previously proposed theoretical model while focusing on an entirely new context in the Middle East. Data were collected in Saudi Arabia from a total of 994 randomly selected science teachers, 656 of whom were female and 338 male. To collect the data, Arabic versions of the Science Teachers' Pedagogical Discontentment scale, the Science Teaching Efficacy Beliefs Instrument and the Intentions to Reform Science Teaching scale were developed. To assure the validity of the instruments in a non-Western context, rigorous cross-cultural validation procedures were followed. Factor analyses were conducted for construct validation, and descriptive statistical analyses were performed, including frequency distributions and normality checks. Univariate analyses of variance were run to explore statistically significant differences between groups of teachers. Cross-tabulation and correlation analyses were conducted to explore relationships. The findings suggest an effect of teacher characteristics, such as age and professional development program attendance, on the affective states. The results demonstrate that teachers who attended a relatively higher number of programs had lower levels of intention to reform, raising issues regarding the conduct and outcomes of professional development. Some of the findings concerning interrelationships among the three constructs challenge, and serve to expand, the previously proposed theoretical model.
Haynos, Ann F.; Pearson, Carolyn M.; Utzinger, Linsey M.; Wonderlich, Stephen A.; Crosby, Ross D.; Mitchell, James E.; Crow, Scott J.; Peterson, Carol B.
2016-01-01
Objective: Evidence suggests that eating disorder subtypes reflecting under-controlled, over-controlled, and low psychopathology personality traits constitute reliable phenotypes that differentiate treatment response. This study is the first to use statistical analyses to identify these subtypes within treatment-seeking individuals with bulimia nervosa (BN) and to use the statistically derived clusters to predict clinical outcomes. Methods: Using variables from the Dimensional Assessment of Personality Pathology-Basic Questionnaire, K-means cluster analyses identified under-controlled, over-controlled, and low psychopathology subtypes within BN patients (n = 80) enrolled in a treatment trial. Generalized linear models examined the impact of personality subtype on Eating Disorder Examination global score, binge eating frequency, and purging frequency cross-sectionally at baseline and longitudinally at end of treatment (EOT) and follow-up. In the longitudinal models, secondary analyses examined personality subtype as a potential moderator of response to Cognitive Behavioral Therapy-Enhanced (CBT-E) or Integrative Cognitive-Affective Therapy for BN (ICAT-BN). Results: There were no baseline clinical differences between groups. In the longitudinal models, personality subtype predicted binge eating (p = .03) and purging (p = .01) frequency at EOT and binge eating frequency at follow-up (p = .045). The over-controlled group demonstrated the best outcomes on these variables. In secondary analyses, there was a treatment-by-subtype interaction for purging at follow-up (p = .04), which indicated a superiority of CBT-E over ICAT-BN for reducing purging in the over-controlled group. Discussion: Empirically derived personality subtyping appears to be a valid classification system with the potential to guide eating disorder treatment decisions. PMID:27611235
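The K-means step used to recover the three subtypes can be illustrated with a minimal, dependency-free sketch. Everything below is synthetic: the two questionnaire scales, the cluster locations, and the deterministic farthest-point initialisation are our assumptions, not the study's procedure or data.

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    # Deterministic farthest-point initialisation, then Lloyd iterations.
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        centers = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two invented scales per patient; three loose simulated groups:
rng = random.Random(1)
pts = ([(rng.gauss(3, .3), rng.gauss(1, .3)) for _ in range(30)]     # "under-controlled"
       + [(rng.gauss(1, .3), rng.gauss(3, .3)) for _ in range(30)]   # "over-controlled"
       + [(rng.gauss(1, .3), rng.gauss(1, .3)) for _ in range(30)])  # "low psychopathology"
centers, clusters = kmeans(pts, 3)
print(sorted(len(c) for c in clusters))
```

With well-separated groups the three recovered clusters closely match the simulated ones; on real questionnaire data the separation, and hence the cluster validity, has to be checked rather than assumed.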
Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime
2009-01-01
The aim of the present study was to identify the game-related statistics that discriminate between basketball winning and losing teams in each of three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two- and three-point field goals (made and missed), free throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exception of three-point field goals made, free throws missed and turnovers (p ≥ 0.05). A main effect of game number was identified only in turnovers, with a statistically significant decrease between the second and third games. No interaction was found in the analysed variables. A discriminant analysis identified two-point field goals made, defensive rebounds and assists as discriminators between winning and losing teams in all three games. In addition to these, only three-point field goals made contributed to discriminating between teams in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and from adjusting offensive and defensive strategies in the third game to exploit or mask three-point field-goal performance. Key points: Overall team performances across the three consecutive games were very similar, not confirming an accumulated fatigue effect.
The results for three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances; this could reflect a higher conditioning status in the winning teams and/or poorer defensive conditioning in the losing teams. PMID:24150011
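A two-group linear discriminant of the kind used above can be sketched with Fisher's rule, w = S_w⁻¹(m_win − m_lose), on two of the statistics the study found discriminant (two-point field goals made, defensive rebounds). All numbers below are invented for illustration; they are not the Spanish U-20 data.

```python
def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def within_scatter(rows, m):
    s = [[0.0, 0.0], [0.0, 0.0]]
    for r in rows:
        d = [r[0] - m[0], r[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

winners = [(28, 25), (30, 27), (27, 24), (31, 26)]   # (2-pt FG made, def. rebounds)
losers  = [(22, 20), (24, 21), (21, 19), (23, 22)]
mw, ml = mean(winners), mean(losers)
sw, sl = within_scatter(winners, mw), within_scatter(losers, ml)
S = [[sw[i][j] + sl[i][j] for j in range(2)] for i in range(2)]
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
Sinv = [[S[1][1] / det, -S[0][1] / det], [-S[1][0] / det, S[0][0] / det]]
d = [mw[0] - ml[0], mw[1] - ml[1]]
w = [Sinv[0][0] * d[0] + Sinv[0][1] * d[1],
     Sinv[1][0] * d[0] + Sinv[1][1] * d[1]]  # Fisher discriminant direction

def score(x):  # project a game onto the discriminant axis
    return w[0] * x[0] + w[1] * x[1]

cut = (score(mw) + score(ml)) / 2  # midpoint decision boundary
print(all(score(g) > cut for g in winners) and all(score(g) < cut for g in losers))
# → True
```

On these toy data the discriminant axis separates the two groups perfectly; with 223 real games the interest is in which variables carry the discriminant weight, not in perfect separation.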
Hospital inpatient self-administration of medicine programmes: a critical literature review.
Wright, Julia; Emerson, Angela; Stephens, Martin; Lennan, Elaine
2006-06-01
The Department of Health, together with pharmaceutical and nursing bodies, has advocated the benefits of self-administration programmes (SAPs), but their implementation within UK hospitals has been limited. Perceived barriers are anticipated increased workload, insufficient resources and patient safety concerns. This review aims to discover whether the benefits of SAPs are supported in the literature, in relation to risk and resource implications. Electronic databases were searched up to March 2004. Published English-language articles that described and evaluated the implementation of an SAP were included. Outcomes reported were: compliance measures, errors, knowledge, patient satisfaction, and nursing and pharmacy time. Most of the 51 papers reviewed had methodological flaws. SAPs varied widely in content and structure. Twelve studies (10 controlled) measured compliance by tablet counts. Of the seven studies subjected to statistical analysis, four demonstrated a significant difference in compliance between SAP and controls. Eight studies (5 controlled) measured errors as an outcome. Of the two evaluated statistically, only one demonstrated significantly fewer medication errors in the SAP group than in controls. Seventeen papers (11 controlled) studied the effect of SAPs on patients' medication knowledge. Ten of the 11 statistically analysed studies showed that SAP participants knew significantly more about some aspects of their medication than did controls. Seventeen studies (5 controlled) measured patient satisfaction. Two studies were statistically analysed, and both suggested that patients were satisfied and preferred the SAP. Seven papers studied pharmacy time and three studied nursing time, but results were not compared with controls. The paucity of well-designed studies, flawed methodology and inadequate reporting in many papers make conclusions hard to draw. Conclusive evidence that SAPs improve compliance was not provided.
Although patients participating in SAPs make errors, small numbers of patients are often responsible for a large number of errors. While most studies suggest that SAPs increase patients' knowledge in part, it is difficult to separate out the effect of the educational component of many SAPs. Most patients who participated in SAPs were satisfied with their care, and many would choose to take part in an SAP in the future. No studies measured the total resource requirement of implementing and maintaining an SAP.
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure-of-merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Information filtering via biased heat conduction.
Liu, Jian-Guo; Zhou, Tao; Guo, Qiang
2011-09-01
The process of heat conduction has recently found application in personalized recommendation [Zhou et al., Proc. Natl. Acad. Sci. USA 107, 4511 (2010)], where it yields high diversity but low accuracy. By decreasing the temperatures of small-degree objects, we present an improved algorithm, called biased heat conduction, which can simultaneously enhance accuracy and diversity. Extensive experimental analyses demonstrate that accuracy on the MovieLens, Netflix, and Delicious datasets is improved by 43.5%, 55.4%, and 19.2%, respectively, compared with the standard heat conduction algorithm, while diversity is increased or approximately unchanged. Further statistical analyses suggest that the present algorithm can simultaneously identify users' mainstream and special tastes, resulting in better performance than the standard heat conduction algorithm. This work provides a credible way to achieve highly efficient information filtering.
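The two-step heat-conduction pass on a user-object bipartite network can be sketched on a toy graph. The network, and the way the exponent `lam` enters the object-averaging step, are our illustrative assumptions: the published biased variant specifically re-tunes the temperatures of small-degree objects, which this sketch does not reproduce exactly (`lam = 1` recovers standard heat conduction).

```python
from collections import defaultdict

# Toy user-object bipartite network (invented): who collected what.
edges = [("u1", "A"), ("u1", "B"), ("u1", "C"),
         ("u2", "B"), ("u2", "C"),
         ("u3", "C"), ("u3", "D")]
user_items, item_users = defaultdict(set), defaultdict(set)
for u, o in edges:
    user_items[u].add(o)
    item_users[o].add(u)

def heat_scores(target, lam=1.0):
    # Objects already collected by the target user start at temperature 1.
    f = {o: float(o in user_items[target]) for o in item_users}
    # Step 1: each user takes the average temperature of its objects.
    h = {u: sum(f[o] for o in its) / len(its) for u, its in user_items.items()}
    # Step 2: each object averages over its users; the exponent lam
    # biases the averaging by object degree (illustrative placement).
    return {o: sum(h[u] for u in us) / len(us) ** lam
            for o, us in item_users.items()}

print(sorted(heat_scores("u1").items()))
```

Uncollected object D ends with a nonzero temperature inherited through shared neighbours, which is what makes heat conduction favour niche, small-degree recommendations.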
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the rate of renal function decline per year with age in adults varies between two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16,628 records (3,946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected over up to 2,364 days (mean: 793 days). A simple linear regression and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that the estimated rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the renal function decline rate with aging, because its estimate is highly dependent on the statistical analysis. From our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
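The CS/LT contrast above can be reproduced on synthetic data: a cross-sectional slope fit to one observation per subject mixes within-person aging with between-cohort differences in baseline function, while averaging each subject's own slope recovers the within-person decline. The simulation below (cohort effect, true slope of −0.9 ml/min/year, simple OLS instead of a random coefficient model) is entirely our assumption, chosen only so that the CS estimate comes out steeper than the LT one, as in the study.

```python
import random

random.seed(2)
subjects = []
for _ in range(300):
    age0 = random.uniform(30, 80)                  # age at first visit
    base = 140 - 1.3 * age0 + random.gauss(0, 5)   # older cohorts start lower
    # Four annual visits; true within-person slope is -0.9 ml/min/year.
    obs = [(age0 + t, base - 0.9 * t + random.gauss(0, 2)) for t in range(4)]
    subjects.append(obs)

def ols_slope(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    return (sum((x - mx) * (y - my) for x, y in pairs)
            / sum((x - mx) ** 2 for x, _ in pairs))

cs = ols_slope([s[0] for s in subjects])                   # one record per subject
lt = sum(ols_slope(s) for s in subjects) / len(subjects)   # average within-person slope
print(round(cs, 2), round(lt, 2))
```

The cross-sectional estimate lands near −1.3 (cohort plus aging) while the longitudinal average recovers roughly −0.9, illustrating why the paper recommends a longitudinal model when individual decline rates matter.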
NASA Astrophysics Data System (ADS)
Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.
2013-04-01
Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations in real time (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method for analysing time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data were used from a workplace study that investigated the potential for inhalation exposure during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis presented here focuses only on data obtained with two handheld condensation particle counters: one measuring at the source of the released particles, the other measuring in parallel far-field. The Bayesian probabilistic approach allows probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions.
The probability distributions derived from the time-resolved data obtained at the source can then be compared with those derived from the time-resolved data obtained far-field, leading to a quantitative estimate of the airborne particles released at the source while the task is performed. Beyond the results obtained, this exploratory study indicates that the analysis requires specific experience in statistics.
Statistics of fully turbulent impinging jets
NASA Astrophysics Data System (ADS)
Wilke, Robert; Sesterhenn, Jörn
2017-08-01
Direct numerical simulations of sub- and supersonic impinging jets with Reynolds numbers of 3300 and 8000 are carried out to analyse their statistical properties. The influence of the parameters Mach number, Reynolds number and ambient temperature on the mean velocity and temperature fields is studied. For the compressible subsonic cold impinging jets into a heated environment, different Reynolds analogies are assessed. It is shown that the (original) Reynolds analogy as well as the Chilton-Colburn analogy are in good agreement with the DNS data outside the impingement area. The generalised Reynolds analogy (GRA) and the Crocco-Busemann relation are not suited to estimating the mean temperature field from the mean velocity field of impinging jets. Furthermore, the prediction of fluctuating temperatures according to the GRA fails. On the contrary, the linear relation between thermodynamic fluctuations of entropy, density and temperature suggested by Lechner et al. (2001) can be confirmed for the entire wall jet. The turbulent heat flux and Reynolds stress tensor are analysed and brought into coherence with the primary and secondary ring vortices of the wall jet. Budget terms of the Reynolds stress tensor are given as a database for the improvement of turbulence models.
Trull, Timothy J; Vergés, Alvaro; Wood, Phillip K; Jahng, Seungmin; Sher, Kenneth J
2012-10-01
We examined the latent structure underlying the criteria for DSM-IV-TR (American Psychiatric Association, 2000, Diagnostic and statistical manual of mental disorders (4th ed., text revision). Washington, DC: Author.) personality disorders in a large nationally representative sample of U.S. adults. Personality disorder symptom data were collected using a structured diagnostic interview from approximately 35,000 adults assessed over two waves of data collection in the National Epidemiologic Survey on Alcohol and Related Conditions. Our analyses suggested that a seven-factor solution provided the best fit for the data, and these factors were marked primarily by one or at most two personality disorder criteria sets. A series of regression analyses that used external validators tapping Axis I psychopathology, treatment for mental health problems, functioning scores, interpersonal conflict, and suicidal ideation and behavior provided support for the seven-factor solution. We discuss these findings in the context of previous studies that have examined the structure underlying the personality disorder criteria as well as the current proposals for DSM-5 personality disorders. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Jones, D. H.
1985-01-01
A new flexible model of pilot instrument scanning behavior is presented, which assumes that the pilot uses a set of deterministic scanning patterns based on the pilot's perception of error in the state of the aircraft and the pilot's knowledge of the interactive nature of the aircraft's systems. Statistical analyses revealed that a three-stage Markov process composed of the pilot's three predicted lookpoints (LP), occurring 1/30, 2/30, and 3/30 of a second prior to each LP, accurately modelled the scanning behavior of 14 commercial airline pilots flying steep turn maneuvers in a Boeing 737 flight simulator. The modelled scanning data for each pilot were not statistically different from the observed scanning data in comparisons of mean dwell time, entropy, and entropy rate. These findings represent the first direct evidence that pilots use deterministic scanning patterns during instrument flight. The results are interpreted as direct support for the error-dependent model, and suggestions are made for further research that could allow identification of the specific scanning patterns suggested by the model.
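The entropy-rate comparison used to validate the scanning model can be sketched by estimating a first-order transition matrix from a lookpoint sequence and computing the conditional entropy H(next | current). The three-symbol sequence below is invented; the study's actual model conditions on three prior lookpoints, so this first-order version is only a simplified illustration.

```python
import math
from collections import Counter, defaultdict

seq = list("ABABCABABCABACABABCABABC")  # A/B/C = three instrument lookpoints (invented)
trans = defaultdict(Counter)
for a, b in zip(seq, seq[1:]):
    trans[a][b] += 1

# Entropy rate: empirical state frequencies weighting the per-state
# conditional entropies of the estimated transition probabilities.
freq = Counter(seq[:-1])
n = len(seq) - 1
rate = 0.0
for a, nxt in trans.items():
    tot = sum(nxt.values())
    h = -sum((c / tot) * math.log2(c / tot) for c in nxt.values())
    rate += (freq[a] / n) * h
print(round(rate, 3))
```

A fitted model's dwell times and entropy rate can then be compared against the same statistics from observed scan data, which is the form of validation the abstract describes.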
Hydrometeorological and statistical analyses of heavy rainfall in Midwestern USA
NASA Astrophysics Data System (ADS)
Thorndahl, S.; Smith, J. A.; Krajewski, W. F.
2012-04-01
During the last two decades, the midwestern states of the USA have repeatedly been afflicted by heavy, flood-producing rainfall. Several of these storms appear to share hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which raises the question of whether general characteristics of the space-time structure of these heavy storms can be derived. This is important for understanding hydrometeorological features, e.g. how storms evolve and with what frequency extreme storms can be expected to occur. In the literature, most studies of extreme rainfall are based on point measurements (rain gauges). However, with high-resolution, high-quality radar observation records now exceeding two decades, long-term spatio-temporal statistical analyses of extremes are possible. This makes it possible to link return periods to distributed rainfall estimates and to study the precipitation structures that cause floods. However, statistical frequency analyses of rainfall based on radar observations introduce challenges in converting radar reflectivity observations to "true" rainfall that do not arise in traditional analyses of rain-gauge data. It is, for example, difficult to distinguish reflectivity from high-intensity rain from reflectivity from other hydrometeors such as hail, especially with the single-polarization radars used in this study. Furthermore, reflectivity from the bright band (melting layer) should be discarded, and anomalous propagation should be corrected, in order to produce valid statistics of extreme radar rainfall. Other challenges include combining observations from several radars into one mosaic, bias correction against rain gauges, range correction, Z-R relationships, etc.
The present study analyzes radar rainfall observations from 1996 to 2011 based on the American NEXRAD network of radars over an area covering parts of Iowa, Wisconsin, Illinois, and Lake Michigan. The radar observations are processed using Hydro-NEXRAD algorithms in order to produce rainfall estimates with a spatial resolution of 1 km and a temporal resolution of 15 min. The rainfall estimates are bias-corrected on a daily basis using a network of rain gauges. Besides a thorough evaluation of the challenges described above, the study includes suggestions for frequency analysis methods as well as studies of hydrometeorological features of single events.
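The link between long observation records and return periods mentioned in the abstract can be illustrated with a minimal sketch. The function below computes empirical return periods from a series of annual maxima using Weibull plotting positions; the rainfall depths are hypothetical, not the study's data.

```python
def return_periods(annual_maxima):
    """Empirical return periods for a series of annual rainfall maxima
    using Weibull plotting positions: the value of rank r (1 = largest)
    in a record of n years has return period T = (n + 1) / r."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return [(depth, (n + 1) / rank) for rank, depth in enumerate(ranked, start=1)]

# Hypothetical annual maximum 15-min rainfall depths (mm)
periods = return_periods([18.0, 42.5, 27.3, 35.1])
```

With four years of record, the largest observed depth is assigned a 5-year return period; longer radar records push the largest assignable return period upward accordingly.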
Publication bias in obesity treatment trials?
Allison, D B; Faith, M S; Gorman, B S
1996-10-01
The present investigation examined the extent of publication bias (namely the tendency to publish significant findings and file away non-significant findings) within the obesity treatment literature. Quantitative literature synthesis of four published meta-analyses from the obesity treatment literature. Interventions in these studies included pharmacological, educational, child, and couples treatments. To assess publication bias, several regression procedures (for example, weighted least-squares, random-effects multi-level modeling, and robust regression methods) were used to regress effect sizes onto their standard errors, or proxies thereof, within each of the four meta-analyses. A significant positive beta weight in these analyses signified publication bias. There was evidence for publication bias within two of the four published meta-analyses, such that reviews of published studies were likely to overestimate clinical efficacy. The lack of evidence for publication bias within the two other meta-analyses might have been due to insufficient statistical power rather than the absence of selection bias. As in other disciplines, publication bias appears to exist in the obesity treatment literature. Suggestions are offered for managing publication bias once identified or reducing its likelihood in the first place.
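The regression procedure the abstract describes, regressing effect sizes on their standard errors, can be sketched as follows. This is a minimal Egger-type weighted least-squares illustration with invented data, not the authors' actual models or data.

```python
import numpy as np

def egger_regression(effects, ses):
    """Weighted least-squares regression of effect sizes on their
    standard errors (an Egger-type test). A significant positive
    slope suggests that small, noisy studies report larger effects,
    i.e. possible publication bias."""
    w = 1.0 / ses ** 2                       # inverse-variance weights
    X = np.column_stack([np.ones_like(ses), ses])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)
    return beta[0], beta[1]                  # intercept, slope

# Hypothetical meta-analytic data: larger effects in noisier studies
effects = np.array([0.20, 0.25, 0.40, 0.55, 0.70])
ses = np.array([0.05, 0.08, 0.15, 0.25, 0.35])
intercept, slope = egger_regression(effects, ses)
```

In a real application the slope would also be tested for significance; here the positive slope alone conveys the funnel-plot asymmetry that the regression is designed to detect.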
Use of Multivariate Linkage Analysis for Dissection of a Complex Cognitive Trait
Marlow, Angela J.; Fisher, Simon E.; Francks, Clyde; MacPhie, I. Laurence; Cherny, Stacey S.; Richardson, Alex J.; Talcott, Joel B.; Stein, John F.; Monaco, Anthony P.; Cardon, Lon R.
2003-01-01
Replication of linkage results for complex traits has been exceedingly difficult, owing in part to the inability to measure the precise underlying phenotype, small sample sizes, genetic heterogeneity, and statistical methods employed in analysis. Often, in any particular study, multiple correlated traits have been collected, yet these have been analyzed independently or, at most, in bivariate analyses. Theoretical arguments suggest that full multivariate analysis of all available traits should offer more power to detect linkage; however, this has not yet been evaluated on a genomewide scale. Here, we conduct multivariate genomewide analyses of quantitative-trait loci that influence reading- and language-related measures in families affected with developmental dyslexia. The results of these analyses are substantially clearer than those of previous univariate analyses of the same data set, helping to resolve a number of key issues. These outcomes highlight the relevance of multivariate analysis for complex disorders for dissection of linkage results in correlated traits. The approach employed here may aid positional cloning of susceptibility genes in a wide spectrum of complex traits. PMID:12587094
Relating triggering processes in lab experiments with earthquakes.
NASA Astrophysics Data System (ADS)
Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.
2016-12-01
Statistical relations such as Gutenberg-Richter's, Omori-Utsu's and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are consequences of previous activity. Although different plausible explanations have been suggested in each system, the reason for the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may exhibit a physical connection with seismology. It has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes at a laboratory scale (so-called labquakes) under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between the background and triggered events, revealing some differences in the statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering.
Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
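Of the statistical relations named above, the Gutenberg-Richter frequency-magnitude law has a standard maximum-likelihood estimator (Aki's formula) that is routinely applied to both earthquake and labquake catalogues. A minimal sketch follows; the magnitudes are hypothetical, not the paper's data.

```python
import math

def gr_b_value(mags, m_c):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter
    b-value for events at or above the completeness magnitude m_c:
    b = log10(e) / (mean(M) - m_c)."""
    above = [m for m in mags if m >= m_c]
    return math.log10(math.e) / (sum(above) / len(above) - m_c)

# Hypothetical catalogue magnitudes above a completeness threshold
b = gr_b_value([1.0, 1.1, 1.2, 1.5, 2.0], m_c=1.0)
```

The same estimator applies to AE event energies once they are converted to a magnitude scale, which is one way the lab-to-field comparison in the abstract is made quantitative.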
NASA Astrophysics Data System (ADS)
Alizee, D.; Bonamy, D.
2017-12-01
In inhomogeneous brittle solids like rocks, concrete or ceramics, one usually distinguishes nominally brittle fracture, driven by the propagation of a single crack, from quasibrittle fracture, resulting from the accumulation of many microcracks. The latter goes along with intermittent sharp noise, as revealed e.g. by the acoustic emission observed in lab-scale compressive fracture experiments or, at the geophysical scale, in seismic activity. In both cases, statistical analyses have revealed a complex time-energy organization into aftershock sequences obeying a range of robust empirical scaling laws (the Omori-Utsu, productivity and Bath's laws) that help carry out seismic hazard analysis and damage mitigation. These laws are usually conjectured to emerge from the collective dynamics of microcrack nucleation. In the experiments presented at AGU, we will show that such a statistical organization is not specific to quasi-brittle multicracking situations, but also rules the acoustic events produced by a single crack slowly driven in an artificial rock made of sintered polymer beads. This simpler situation has advantageous properties (statistical stationarity in particular) permitting us to uncover the origins of these seismic laws: both the productivity law and Bath's law result from the scale-free statistics of event energies, and the Omori-Utsu law results from the scale-free statistics of inter-event times. This yields predictions, derived analytically, on how the associated parameters are related. Surprisingly, the so-obtained relations are also compatible with observations in lab-scale compressive fracture experiments, suggesting that, in these complex multicracking situations too, the organization into aftershock sequences and the associated seismic laws are ruled by the propagation of individual microcrack fronts, and not by the collective, stress-mediated nucleation of microcracks.
Conversely, the relations are not fulfilled in seismological signals, suggesting that additional ingredients should be taken into account.
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
The Impact of Global Budgets on Pharmaceutical Spending and Utilization
Fendrick, A. Mark; Song, Zirui; Landon, Bruce E.; Safran, Dana Gelb; Mechanic, Robert E.; Chernew, Michael E.
2014-01-01
In 2009, Blue Cross Blue Shield of Massachusetts implemented a global budget-based payment system, the Alternative Quality Contract (AQC), in which provider groups assumed accountability for spending. We investigate the impact of global budgets on the utilization of prescription drugs and related expenditures. Our analyses indicate no statistically significant evidence that the AQC reduced the use of drugs. Although the impact may change over time, early evidence suggests that it is premature to conclude that global budget systems may reduce access to medications. PMID:25500751
NASA Astrophysics Data System (ADS)
Zhe, Wang
By using the methods of document literature review, questionnaire survey and mathematical statistics, this paper investigates and analyses the current situation of students' participation in extracurricular sports activities at 36 legally accredited private middle schools in Henan province, through the following aspects: the attitude, motivation, frequency, duration, selection of programs, and influential factors of participating in extracurricular sports activities. Based on the investigation and analysis, this paper points out the existing problems and puts forward suggestions.
Morris, Roisin; MacNeela, Padraig; Scott, Anne; Treacy, Pearl; Hyde, Abbey; O'Brien, Julian; Lehwaldt, Daniella; Byrne, Anne; Drennan, Jonathan
2008-04-01
In a study to establish the interrater reliability of the Irish Nursing Minimum Data Set (I-NMDS) for mental health, difficulties relating to the choice of reliability test statistic were encountered. The objective of this paper is to highlight the difficulties associated with testing interrater reliability for an ordinal scale using a relatively homogeneous sample and the recommended weighted kappa (kw) statistic. One pair of mental health nurses completed the I-NMDS for mental health for a total of 30 clients attending a mental health day centre over a two-week period. Data were analysed using the kw and percentage agreement statistics. A total of 34 of the 38 I-NMDS for mental health variables with lower than acceptable kw reliability scores achieved acceptable levels of reliability according to their percentage agreement scores. The study findings implied that, due to the homogeneity of the sample, low variability within the data resulted in the 'base rate problem' associated with use of the kw statistic. Conclusions point to the interpretation of kw in tandem with percentage agreement scores. Suggestions that kw scores were low due to chance agreement, and that one should strive to use a study sample with known variability, are queried.
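The 'base rate problem' the authors describe can be reproduced in a minimal sketch. With hypothetical ratings (not the study's data) in which nearly all 30 clients fall into one category, two raters can agree over 93% of the time yet yield a near-zero, even negative, weighted kappa.

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat):
    """Linearly weighted kappa (kw) for two raters on an ordinal
    scale, plus raw percentage agreement."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # linear disagreement weights: 0 on the diagonal, 1 at the extremes
    w = np.abs(np.subtract.outer(np.arange(n_cat), np.arange(n_cat))) / (n_cat - 1)
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    kappa = 1 - (w * obs).sum() / (w * expected).sum()
    return kappa, np.trace(obs)

# Hypothetical ratings: 30 clients, nearly all in category 0
rater1 = [0] * 27 + [1, 0, 0]
rater2 = [0] * 27 + [0, 0, 1]
kw, agreement = weighted_kappa(rater1, rater2, n_cat=3)
```

Because chance-expected agreement is itself very high in a homogeneous sample, the kappa denominator shrinks and the statistic collapses even though raters rarely disagree, which is exactly why the paper recommends reading kw alongside percentage agreement.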
Marzulli, F; Maguire, H C
1982-02-01
Several guinea-pig predictive test methods were evaluated by comparison of results with those obtained with human predictive tests, using ten compounds that have been used in cosmetics. The method involves the statistical analysis of the frequency with which guinea-pig tests agree with the findings of tests in humans. In addition, the frequencies of false positive and false negative predictive findings are considered and statistically analysed. The results clearly demonstrate the superiority of adjuvant tests (complete Freund's adjuvant) in determining skin sensitizers and the overall superiority of the guinea-pig maximization test in providing results similar to those obtained by human testing. A procedure is suggested for utilizing adjuvant and non-adjuvant test methods for characterizing compounds as of weak, moderate or strong sensitizing potential.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
Jackson, Dan; Bowden, Jack
2016-09-07
Confidence intervals for the between study variance are useful in random-effects meta-analyses because they quantify the uncertainty in the corresponding point estimates. Methods for calculating these confidence intervals have been developed that are based on inverting hypothesis tests using generalised heterogeneity statistics. Whilst, under the random effects model, these new methods furnish confidence intervals with the correct coverage, the resulting intervals are usually very wide, making them uninformative. We discuss a simple strategy for obtaining 95 % confidence intervals for the between-study variance with a markedly reduced width, whilst retaining the nominal coverage probability. Specifically, we consider the possibility of using methods based on generalised heterogeneity statistics with unequal tail probabilities, where the tail probability used to compute the upper bound is greater than 2.5 %. This idea is assessed using four real examples and a variety of simulation studies. Supporting analytical results are also obtained. Our results provide evidence that using unequal tail probabilities can result in shorter 95 % confidence intervals for the between-study variance. We also show some further results for a real example that illustrates how shorter confidence intervals for the between-study variance can be useful when performing sensitivity analyses for the average effect, which is usually the parameter of primary interest. We conclude that using unequal tail probabilities when computing 95 % confidence intervals for the between-study variance, when using methods based on generalised heterogeneity statistics, can result in shorter confidence intervals. We suggest that those who find the case for using unequal tail probabilities convincing should use the '1-4 % split', where greater tail probability is allocated to the upper confidence bound. The 'width-optimal' interval that we present deserves further investigation.
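A minimal sketch of the idea follows, using the Q-profile method as the generalised heterogeneity statistic and the '1-4% split' discussed above (1% in the lower tail, 4% in the upper, 95% coverage overall). The data, the bracketing interval, and the use of this particular statistic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import stats, optimize

def q_profile_ci(y, v, alpha_low=0.01, alpha_up=0.04):
    """Q-profile confidence interval for the between-study variance
    tau^2 under the random-effects model, with unequal tail
    probabilities (default: the '1-4% split')."""
    k = len(y)
    y, v = np.asarray(y), np.asarray(v)

    def gen_q(tau2):
        # generalised Q statistic evaluated at a candidate tau^2
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)
        return np.sum(w * (y - mu) ** 2)

    def invert(target):
        # gen_q is decreasing in tau^2; invert it at a chi^2 quantile
        if gen_q(0.0) <= target:
            return 0.0
        return optimize.brentq(lambda t: gen_q(t) - target, 0.0, 100.0)

    lower = invert(stats.chi2.ppf(1 - alpha_low, df=k - 1))
    upper = invert(stats.chi2.ppf(alpha_up, df=k - 1))
    return lower, upper

# Hypothetical effect estimates and within-study variances
lo, hi = q_profile_ci([0.30, 0.10, 0.60, -0.20, 0.45, 0.80], [0.01] * 6)
```

Shifting tail probability from the lower to the upper tail moves both chi-square quantiles, and because the upper confidence bound is the most imprecise part of the interval, the net effect is the shorter interval the paper reports.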
NASA Astrophysics Data System (ADS)
Bansal, Sandeep Kumar; Jaiswal, Deepika; Gupta, Nishi; Singh, Kiran; Dada, Rima; Sankhwar, Satya Narayan; Gupta, Gopal; Rajender, Singh
2016-02-01
We analyzed the AZFc region of the Y-chromosome for complete (b2/b4) and distinct partial deletions (gr/gr, b1/b3, b2/b3) in 822 infertile and 225 proven fertile men. We observed complete AZFc deletions in 0.97% and partial deletions in 6.20% of the cases. Among partial deletions, the frequency of gr/gr deletions was the highest (5.84%). The comparison of partial deletion data between cases and controls suggested a significant association of the gr/gr deletions with infertility (P = 0.0004); however, the other partial deletions did not correlate with infertility. In cohort analysis, men with gr/gr deletions had a relatively poor sperm count (54.20 ± 57.45 million/ml) in comparison to those without deletions (72.49 ± 60.06), though the difference was not statistically significant (p = 0.071). Meta-analysis also suggested that gr/gr deletions are significantly associated with male infertility risk (OR = 1.821, 95% CI = 1.39-2.37, p < 0.001). We also performed trial sequential analyses that strengthened the evidence for an overall significant association of gr/gr deletions with the risk of male infertility. Another meta-analysis suggested a significant association of the gr/gr deletions with low sperm count. In conclusion, the gr/gr deletions show a strong correlation with male infertility risk and low sperm count, particularly in the Caucasian populations.
Quadriceps Tendon Autograft in Anterior Cruciate Ligament Reconstruction: A Systematic Review.
Hurley, Eoghan T; Calvo-Gurry, Manuel; Withers, Dan; Farrington, Shane K; Moran, Ray; Moran, Cathal J
2018-05-01
To systematically review the current evidence to ascertain whether quadriceps tendon autograft (QT) is a viable option in anterior cruciate ligament reconstruction. A literature review was conducted in accordance with Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. Cohort studies comparing QT with bone-patellar tendon-bone autograft (BPTB) or hamstring tendon autograft (HT) were included. Clinical outcomes were compared, with all statistical analyses performed using IBM SPSS Statistics for Windows, version 22.0, with P < .05 being considered statistically significant. We identified 15 clinical trials with 1,910 patients. In all included studies, QT resulted in lower rates of anterior knee pain than BPTB. There was no difference in the rate of graft rupture between QT and BPTB or HT in any of the studies reporting this. One study found that QT resulted in greater knee stability than BPTB, and another study found increased stability compared with HT. One study found that QT resulted in improved functional outcomes compared with BPTB, and another found improved outcomes compared with HT, but one study found worse outcomes compared with BPTB. Current literature suggests QT is a viable option in anterior cruciate ligament reconstruction, with published literature showing comparable knee stability, functional outcomes, donor-site morbidity, and rerupture rates compared with BPTB and HT. Level III, systematic review of Level I, II, and III studies. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Ndwandwe, Duduzile; Uthman, Olalekan A; Adamu, Abdu A; Sambala, Evanson Z; Wiyeh, Alison B; Olukade, Tawa; Bishwajit, Ghose; Yaya, Sanni; Okwo-Bele, Jean-Marie; Wiysonge, Charles S
2018-04-24
Understanding the gaps in missed opportunities for vaccination (MOV) in sub-Saharan Africa (SSA) would inform interventions for improving immunisation coverage towards achieving universal childhood immunisation. We aimed to conduct multicountry analyses to decompose the gap in MOV between poor and non-poor households in SSA. We used cross-sectional data from 35 Demographic and Health Surveys in SSA conducted between 2007 and 2016. Descriptive statistics were used to understand the gap in MOV between the urban poor and non-poor, and across the selected covariates. Of the 35 countries included in this analysis, 19 countries showed pro-poor inequality, 5 showed pro-non-poor inequality and the remaining 11 countries showed no statistically significant inequality. Among the countries with statistically significant pro-illiterate inequality, the risk difference ranged from 4.2% in DR Congo to 20.1% in Kenya. Important factors responsible for the inequality varied across countries. In Madagascar, the largest contributors to inequality in MOV were media access, number of under-five children, and maternal education. In Liberia, however, media access narrowed inequality in MOV between poor and non-poor households. The findings indicate that in most SSA countries, children belonging to poor households are most likely to have MOV, and that socio-economic inequality in MOV is determined not only by health system functions but also by factors beyond the scope of health authorities and the care delivery system. The findings suggest the need to address social determinants of health.
Common variants of the EPDR1 gene and the risk of Dupuytren’s disease.
Dębniak, T; Żyluk, A; Puchalski, P; Serrano-Fernandez, P
2013-10-01
The object of this study was to investigate the association of three common single nucleotide polymorphism variants of the ependymin-related 1 (EPDR1) gene with the occurrence of Dupuytren's disease. DNA samples were obtained from the peripheral blood of 508 consecutive patients. The control group comprised 515 healthy adults who were age-matched with the Dupuytren's patients. The three common variants were analysed using TaqMan® genotyping assays and sequencing. The differences in the frequencies of the single nucleotide polymorphism variants between patients and the control group were statistically tested. Additionally, haplotype frequency and linkage disequilibrium were analysed for these variants. A statistically significant association was noted between the rs16879765_CT, rs16879765_TT and rs13240429_AA variants and Dupuytren's disease. Two haplotypes, rs2722280_C+rs13240429_A+rs16879765_C and rs2722280_C+rs13240429_G+rs16879765_T, were found to be statistically significantly associated with Dupuytren's disease. Moreover, we found that the rs13240429 and rs16879765 variants were in strong linkage disequilibrium, while rs2722280 was only in moderate linkage disequilibrium. No significant differences were found in the frequencies of the variants of the gene between the groups with a positive and negative familial history of Dupuytren's disease. In conclusion, the results of this study suggest that the EPDR1 gene can be added to a growing list of genes associated with the development of Dupuytren's disease. © Georg Thieme Verlag KG Stuttgart · New York.
Andrus, J Malia; Porter, Matthew D; Rodríguez, Luis F; Kuehlhorn, Timothy; Cooke, Richard A C; Zhang, Yuanhui; Kent, Angela D; Zilles, Julie L
2014-02-01
Denitrifying biofilters can remove agricultural nitrates from subsurface drainage, reducing nitrate pollution that contributes to coastal hypoxic zones. The performance and reliability of natural and engineered systems dependent upon microbially mediated processes, such as denitrifying biofilters, can be affected by the spatial structure of their microbial communities. Furthermore, our understanding of the relationship between microbial community composition and function is influenced by the spatial distribution of samples. In this study we characterized the spatial structure of bacterial communities in a denitrifying biofilter in central Illinois. Bacterial communities were assessed using automated ribosomal intergenic spacer analysis for bacteria and terminal restriction fragment length polymorphism of nosZ for denitrifying bacteria. Non-metric multidimensional scaling and analysis of similarity (ANOSIM) analyses indicated that bacteria showed statistically significant spatial structure by depth and transect, while denitrifying bacteria did not exhibit significant spatial structure. For determination of spatial patterns, we developed a package of automated functions for the R statistical environment that allows directional analysis of microbial community composition data using either ANOSIM or Mantel statistics. Applying this package to the biofilter data, the flow-path correlation range for the bacterial community was 6.4 m at the shallower, periodically inundated depth and 10.7 m at the deeper, continually submerged depth. These spatial structures suggest a strong influence of hydrology on the microbial community composition in these denitrifying biofilters. Understanding such spatial structure can also guide optimal sample collection strategies for microbial community analyses.
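The ANOSIM statistic used for the spatial comparisons above can be sketched in a few lines. The two-cluster toy data are illustrative only, not the biofilter community measurements, and the authors' actual package is in R rather than Python.

```python
import numpy as np
from itertools import combinations
from scipy.stats import rankdata

def anosim_r(dist, groups):
    """ANOSIM R statistic: difference between the mean ranks of
    between-group and within-group dissimilarities, scaled so that
    R near 1 means well-separated groups and R near 0 means no
    group structure."""
    n = len(groups)
    pairs = list(combinations(range(n), 2))
    ranks = rankdata([dist[i, j] for i, j in pairs])
    between = np.array([groups[i] != groups[j] for i, j in pairs])
    r_between = ranks[between].mean()
    r_within = ranks[~between].mean()
    return (r_between - r_within) / (len(pairs) / 2)

# Toy data: two clearly separated clusters of three samples each
pts = np.array([0.0, 0.1, 0.2, 10.0, 10.1, 10.2])
dist = np.abs(pts[:, None] - pts[None, :])
r = anosim_r(dist, groups=np.array([0, 0, 0, 1, 1, 1]))
```

Directional variants like the one in the paper apply the same statistic to sample pairs binned by separation distance along a chosen axis, which is how a correlation range such as 6.4 m along the flow path can be extracted.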
Assessment of statistical methods used in library-based approaches to microbial source tracking.
Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D
2003-12-01
Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
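For reference, the usual between-groups d that the proposed single-case statistic is designed to be equivalent to can be computed as follows (pooled-SD form; the scores are toy values, and the single-case version itself involves additional corrections not shown here).

```python
import math

def cohens_d(group1, group2):
    """Standardised mean difference d between two groups,
    using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    s1 = sum((v - m1) ** 2 for v in group1) / (n1 - 1)   # variance, group 1
    s2 = sum((v - m2) ** 2 for v in group2) / (n2 - 1)   # variance, group 2
    pooled = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

d = cohens_d([5, 6, 7, 8], [3, 4, 5, 6])
```

Because this d is standardised, effects computed from studies with different outcome measures can be placed on one scale, which is the meta-analytic property the abstract emphasises.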
When ab ≠ c - c': published errors in the reports of single-mediator models.
Petrocelli, John V; Clarkson, Joshua J; Whitmire, Melanie B; Moon, Paul E
2013-06-01
Accurate reports of mediation analyses are critical to the assessment of inferences related to causality, since these inferences are consequential for both the evaluation of previous research (e.g., meta-analyses) and the progression of future research. However, upon reexamination, approximately 15% of published articles in psychology contain at least one incorrect statistical conclusion (Bakker & Wicherts, Behavior Research Methods, 43, 666-678, 2011), a disparity that raises the question of inaccuracy in mediation reports. To quantify this question of inaccuracy, articles reporting standard use of single-mediator models in three high-impact journals in personality and social psychology during 2011 were examined. More than 24% of the 156 models coded failed an equivalence test (i.e., ab = c - c'), suggesting that one or more regression coefficients in mediation analyses are frequently misreported. The authors cite common sources of errors, provide recommendations for enhanced accuracy in reports of single-mediator models, and discuss implications for alternative methods.
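The equivalence test above rests on the fact that ab = c - c' is an algebraic identity of OLS estimates fitted to the same complete data, so reported coefficients that violate it must contain an error. The sketch below verifies the identity numerically on simulated data with hypothetical path values.

```python
import numpy as np

def single_mediator_paths(x, m, y):
    """Fit the three OLS regressions of a single-mediator model and
    return the paths a (X->M), b (M->Y given X), c (total X->Y) and
    c' (direct X->Y given M)."""
    def ols(preds, target):
        X = np.column_stack([np.ones(len(target))] + preds)
        return np.linalg.lstsq(X, target, rcond=None)[0]

    a = ols([x], m)[1]
    c = ols([x], y)[1]
    _, c_prime, b = ols([x, m], y)
    return a, b, c, c_prime

# Simulated data with hypothetical path values (0.5, 0.4, 0.3)
rng = np.random.default_rng(0)
x = rng.normal(size=200)
m = 0.5 * x + rng.normal(size=200)
y = 0.4 * m + 0.3 * x + rng.normal(size=200)
a, b, c, c_prime = single_mediator_paths(x, m, y)
# For OLS on the same complete data, a*b equals c - c' exactly
```

Rounding in published tables introduces small discrepancies, so in practice the check must allow a tolerance consistent with the reported precision rather than demand exact equality.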
Schilbach, Leonhard; Bzdok, Danilo; Timmermans, Bert; Fox, Peter T.; Laird, Angela R.; Vogeley, Kai; Eickhoff, Simon B.
2012-01-01
Previous research suggests overlap between brain regions that show task-induced deactivations and those activated during the performance of social-cognitive tasks. Here, we present results of quantitative meta-analyses of neuroimaging studies, which confirm a statistical convergence in the neural correlates of social and resting state cognition. Based on the idea that both social and unconstrained cognition might be characterized by introspective processes, which are also thought to be highly relevant for emotional experiences, a third meta-analysis was performed investigating studies on emotional processing. By using conjunction analyses across all three sets of studies, we can demonstrate significant overlap of task-related signal change in dorso-medial prefrontal and medial parietal cortex, brain regions that have, indeed, recently been linked to introspective abilities. Our findings, therefore, provide evidence for the existence of a core neural network, which shows task-related signal change during socio-emotional tasks and during resting states. PMID:22319593
Meat consumption and cancer risk: a critical review of published meta-analyses.
Lippi, Giuseppe; Mattiuzzi, Camilla; Cervellin, Gianfranco
2016-01-01
Dietary habits play a substantial role for increasing or reducing cancer risk. We performed a critical review of scientific literature, to describe the findings of meta-analyses that explored the association between meat consumption and cancer risk. Overall, 42 eligible meta-analyses were included in this review, in which meat consumption was assumed from sheer statistics. Convincing association was found between larger intake of red meat and cancer, especially with colorectal, lung, esophageal and gastric malignancies. Increased consumption of processed meat was also found to be associated with colorectal, esophageal, gastric and bladder cancers. Enhanced intake of white meat or poultry was found to be negatively associated with some types of cancers. Larger beef consumption was significantly associated with cancer, whereas the risk was not increased consuming high amounts of pork. Our analysis suggests an increased risk of cancer in subjects consuming large amounts of red and processed meat, but not in those with high intake of white meat or poultry. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
The misleading aspects of the statistical analyses used by Dommenget and Latif, which raise concerns about some of the reported climate modes, are demonstrated. Adopting simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown, and the limitations of varimax and regression analyses in capturing this climate mode are discussed.
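The climate modes at issue are typically extracted as empirical orthogonal functions (EOFs), i.e. the singular vectors of the space-time anomaly matrix. The following sketch illustrates this with an invented dipole field; the grid size, noise level, and dipole pattern are not taken from Dommenget and Latif or from this comment.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical SST anomaly field: 200 monthly maps over 50 grid points,
# driven by a single east-west dipole mode plus spatially white noise.
n_time, n_space = 200, 50
dipole = np.concatenate([np.ones(25), -np.ones(25)])   # true spatial pattern
data = np.outer(rng.normal(0.0, 1.0, n_time), dipole) \
       + rng.normal(0.0, 0.5, (n_time, n_space))

anom = data - data.mean(axis=0)                        # remove the time mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)    # EOFs are the rows of vt
eof1 = vt[0]                                           # leading spatial pattern
var_frac = s**2 / np.sum(s**2)                         # explained-variance fractions
```

With a single strong mode the leading EOF recovers the dipole almost exactly; the debate in the comment concerns what happens when varimax rotation or regression is layered on top of such decompositions.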
NASA Astrophysics Data System (ADS)
Poulos, M. J.; Pierce, J. L.; McNamara, J. P.; Flores, A. N.; Benner, S. G.
2015-12-01
Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior work indicates that reduced insolation on northern (pole-facing) aspects prolongs snow pack persistence and is associated with thicker, finer-grained soils that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north- and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random-forest-based statistical model to predict valley slope asymmetry based upon numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. 
When slope-related statistics are excluded, due to possible autocorrelation, valley slope asymmetry is most strongly predicted by asymmetries of insolation and drainage density, which generally supports a water-balance based conceptual model of valley asymmetry development. Surprisingly, vegetation asymmetries had relatively low predictive importance.
Nieuwenhuys, Angela; Papageorgiou, Eirini; Desloovere, Kaat; Molenaers, Guy; De Laet, Tinne
2017-01-01
Experts recently identified 49 joint motion patterns in children with cerebral palsy during a Delphi consensus study. Pattern definitions were therefore the result of subjective expert opinion. The present study aims to provide objective, quantitative data supporting the identification of these consensus-based patterns. To do so, statistical parametric mapping was used to compare the mean kinematic waveforms of 154 trials of typically developing children (n = 56) to the mean kinematic waveforms of 1719 trials of children with cerebral palsy (n = 356), which were classified following the classification rules of the Delphi study. Three hypotheses stated that: (a) joint motion patterns with 'no or minor gait deviations' (n = 11 patterns) do not differ significantly from the gait pattern of typically developing children; (b) all other pathological joint motion patterns (n = 38 patterns) differ from typically developing gait, and the locations of difference within the gait cycle, highlighted by statistical parametric mapping, concur with the consensus-based classification rules; and (c) all joint motion patterns at the level of each joint (n = 49 patterns) differ from each other during at least one phase of the gait cycle. Results showed that: (a) ten patterns with 'no or minor gait deviations' differed somewhat unexpectedly from typically developing gait, but these differences were generally small (≤3°); (b) all other joint motion patterns (n = 38) differed from typically developing gait, and the significant locations within the gait cycle that were indicated by the statistical analyses coincided well with the classification rules; (c) joint motion patterns at the level of each joint significantly differed from each other, apart from two sagittal plane pelvic patterns. In addition to these results, for several joints, statistical analyses indicated other significant areas during the gait cycle that were not included in the pattern definitions of the consensus study. 
Based on these findings, suggestions to improve pattern definitions were made. PMID:28081229
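Statistical parametric mapping tests the whole waveform rather than a few discrete parameters. A naive pointwise version of that idea can be sketched as below; note that real SPM (e.g. the spm1d package) corrects the threshold via random field theory, and the waveform shape, group sizes, and deviation magnitude here are all invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_pts = 101                                            # samples across 0-100% of the gait cycle
base = 20.0 * np.sin(np.linspace(0.0, np.pi, n_pts))   # invented "typical" waveform

# 30 typically developing (TD) trials vs 30 pathological trials;
# the pathological group deviates by 8 degrees only after 60% of the cycle.
td = base + rng.normal(0.0, 3.0, (30, n_pts))
cp = base + 8.0 * (np.linspace(0.0, 1.0, n_pts) > 0.6) \
         + rng.normal(0.0, 3.0, (30, n_pts))

t_stat, p = stats.ttest_ind(cp, td, axis=0)   # pointwise two-sample t-test
sig = p < 0.05                                # naive threshold; SPM corrects this via random field theory
```

The `sig` mask localizes the difference to late stance, which is exactly the kind of gait-cycle localization the abstract compares against the consensus classification rules.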
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for detecting monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed: 19 periods of streamflow multiplied by the 3 streamflow statistics, plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analyses detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One region with relatively more upward trends for many of the streamflow statistics analyzed comprises the rivers and associated creeks and bayous draining to Galveston Bay in the Houston metropolitan area. 
Lastly, the westernmost river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
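Kendall's Tau is nonparametric: it compares the number of concordant and discordant (year, flow) pairs, so it detects monotonic trends without assuming linearity or normality. A minimal sketch on a synthetic discharge series follows; the drift and noise magnitudes are invented, not taken from the Texas gaging records.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
years = np.arange(1960, 2010)
# Synthetic annual mean daily discharge: upward drift plus year-to-year noise.
flow = 100.0 + 0.8 * (years - years[0]) + rng.normal(0.0, 5.0, years.size)

tau, p = kendalltau(years, flow)   # tau > 0 means mostly concordant pairs
if p < 0.05:
    trend = "upward" if tau > 0 else "downward"
else:
    trend = "no trend"
```

Running this classification once per station and per statistic (mean, maximum, minimum daily discharge) reproduces the kind of 69-test tally described above.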
Description and evaluation of an initiative to develop advanced practice nurses in mainland China.
Wong, Frances Kam Yuet; Peng, Gangyi; Kan, Eva C; Li, Yajie; Lau, Ada T; Zhang, Liying; Leung, Annie F; Liu, Xueqin; Leung, Vilna O; Chen, Weiju; Li, Ming
2010-05-01
This paper describes an initiative to develop Advanced Practice Nurses (APNs) in mainland China and evaluates the outcomes of the described programme. The pioneer project was an APN postgraduate programme involving 38 students conducted in Guangzhou, China during 2004-2005. Data related to curriculum content and process, student performance, self-reported competence and programme effects were collected. Quantitative data, such as demographics and student performance, were analysed using descriptive statistics, and pre- and post-programme self-reported competence was compared using the chi-square test. Qualitative data such as case reports and interviews were examined using thematic analyses. Reflective journals and case studies revealed the attributes of APNs in managing clinical cases at an advanced level, applying theory to practice and exercising evidence-based practice. The relatively modest self-reported competence suggested that the graduates were novice APNs and needed continued development after the completion of the programme. This study reports the experience of an initiative in China and suggests a useful curriculum framework for educating APNs. Copyright 2009 Elsevier Ltd. All rights reserved.
Acid rain, air pollution, and tree growth in southeastern New York
Puckett, L.J.
1982-01-01
This study determined whether dendroecological analyses could be used to detect changes in the relationship of tree growth to climate that might have resulted from chronic exposure to components of the acid rain-air pollution complex. Tree-ring indices of white pine (Pinus strobus L.), eastern hemlock (Tsuga canadensis (L.) Carr.), pitch pine (Pinus rigida Mill.), and chestnut oak (Quercus prinus L.) were regressed against orthogonally transformed values of temperature and precipitation in order to derive a response-function relationship. Results of the regression analyses for three time periods, 1901–1920, 1926–1945, and 1954–1973, suggest that the relationship of tree growth to climate has been altered. Statistical tests of the temperature and precipitation data suggest that this change was nonclimatic. Temporally, the shift in growth response appears to correspond with the suspected increase in acid rain and air pollution in the Shawangunk Mountain area of southeastern New York in the early 1950s. This change could be the result of physiological stress induced by components of the acid rain-air pollution complex, causing climatic conditions to be more limiting to tree growth.
Page, Matthew J; McKenzie, Joanne E; Kirkham, Jamie; Dwan, Kerry; Kramer, Sharon; Green, Sally; Forbes, Andrew
2014-10-01
Systematic reviews may be compromised by selective inclusion and reporting of outcomes and analyses. Selective inclusion occurs when there are multiple effect estimates in a trial report that could be included in a particular meta-analysis (e.g. from multiple measurement scales and time points) and the choice of effect estimate to include in the meta-analysis is based on the results (e.g. statistical significance, magnitude or direction of effect). Selective reporting occurs when the reporting of a subset of outcomes and analyses in the systematic review is based on the results (e.g. a protocol-defined outcome is omitted from the published systematic review). To summarise the characteristics and synthesise the results of empirical studies that have investigated the prevalence of selective inclusion or reporting in systematic reviews of randomised controlled trials (RCTs), investigated the factors (e.g. statistical significance or direction of effect) associated with the prevalence and quantified the bias. We searched the Cochrane Methodology Register (to July 2012), Ovid MEDLINE, Ovid EMBASE, Ovid PsycINFO and ISI Web of Science (each up to May 2013), and the US Agency for Healthcare Research and Quality (AHRQ) Effective Healthcare Program's Scientific Resource Center (SRC) Methods Library (to June 2013). We also searched the abstract books of the 2011 and 2012 Cochrane Colloquia and the article alerts for methodological work in research synthesis published from 2009 to 2011 and compiled in Research Synthesis Methods. We included both published and unpublished empirical studies that investigated the prevalence and factors associated with selective inclusion or reporting, or both, in systematic reviews of RCTs of healthcare interventions. 
We included empirical studies assessing any type of selective inclusion or reporting, such as investigations of how frequently RCT outcome data are selectively included in systematic reviews based on the results, how often outcomes and analyses are discrepant between the protocol and the published review, or how often non-significant outcomes are only partially reported in the full text or summary of systematic reviews. Two review authors independently selected empirical studies for inclusion, extracted the data and performed a risk of bias assessment. A third review author resolved any disagreements about inclusion or exclusion of empirical studies, data extraction and risk of bias. We contacted authors of included studies for additional unpublished data. Primary outcomes included overall prevalence of selective inclusion or reporting, association between selective inclusion or reporting and the statistical significance of the effect estimate, and association between selective inclusion or reporting and the direction of the effect estimate. We combined prevalence estimates and risk ratios (RRs) using a random-effects meta-analysis model. Seven studies met the inclusion criteria. No studies had investigated selective inclusion of results in systematic reviews, or discrepancies in outcomes and analyses between systematic review registry entries and published systematic reviews. Based on a meta-analysis of four studies (including 485 Cochrane Reviews), 38% (95% confidence interval (CI) 23% to 54%) of systematic reviews added, omitted, upgraded or downgraded at least one outcome between the protocol and published systematic review. The association between statistical significance and discrepant outcome reporting between protocol and published systematic review was uncertain. The meta-analytic estimate suggested an increased risk of adding or upgrading (i.e. 
changing a secondary outcome to primary) when the outcome was statistically significant, although the 95% CI included no association and a decreased risk as plausible estimates (RR 1.43, 95% CI 0.71 to 2.85; two studies, n = 552 meta-analyses). Also, the meta-analytic estimate suggested an increased risk of downgrading (i.e. changing a primary outcome to secondary) when the outcome was statistically significant, although the 95% CI included no association and a decreased risk as plausible estimates (RR 1.26, 95% CI 0.60 to 2.62; two studies, n = 484 meta-analyses). None of the included studies had investigated whether the association between statistical significance and adding, upgrading or downgrading of outcomes was modified by the type of comparison, direction of effect or type of outcome; or whether there is an association between direction of the effect estimate and discrepant outcome reporting. Several secondary outcomes were reported in the included studies. Two studies found that reasons for discrepant outcome reporting were infrequently reported in published systematic reviews (6% in one study and 22% in the other). One study (including 62 Cochrane Reviews) found that 32% (95% CI 21% to 45%) of systematic reviews did not report all primary outcomes in the abstract. Another study (including 64 Cochrane and 118 non-Cochrane reviews) found that statistically significant primary outcomes were more likely to be completely reported in the systematic review abstract than non-significant primary outcomes (RR 2.66, 95% CI 1.81 to 3.90). None of the studies included systematic reviews published after 2009 when reporting standards for systematic reviews (Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement, and Methodological Expectations of Cochrane Intervention Reviews (MECIR)) were disseminated, so the results might not be generalisable to more recent systematic reviews. 
Discrepant outcome reporting between the protocol and published systematic review is fairly common, although the association between statistical significance and discrepant outcome reporting is uncertain. Complete reporting of outcomes in systematic review abstracts is associated with statistical significance of the results for those outcomes. Systematic review outcomes and analysis plans should be specified prior to seeing the results of included studies to minimise post-hoc decisions that may be based on the observed results. Modifications that occur once the review has commenced, along with their justification, should be clearly reported. Effect estimates and CIs should be reported for all systematic review outcomes regardless of the results. The lack of research on selective inclusion of results in systematic reviews needs to be addressed and studies that avoid the methodological weaknesses of existing research are also needed.
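The random-effects pooling of RRs described above is commonly done with the DerSimonian-Laird estimator, which inflates each study's variance by an estimated between-study variance before averaging. The sketch below reuses the review's two pooled point estimates (1.43 and 1.26) as toy inputs, but the standard errors are hypothetical, so the output does not reproduce the review's CIs.

```python
import numpy as np

def dersimonian_laird(log_rr, se):
    """Random-effects pooling of log risk ratios (DerSimonian-Laird)."""
    w = 1.0 / se**2                                   # fixed-effect weights
    theta_fe = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - theta_fe)**2)            # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)      # between-study variance, floored at 0
    w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
    theta = np.sum(w_re * log_rr) / np.sum(w_re)
    se_theta = np.sqrt(1.0 / np.sum(w_re))
    rr = np.exp(theta)
    ci = np.exp([theta - 1.96 * se_theta, theta + 1.96 * se_theta])
    return rr, ci

# Two studies with RRs 1.43 and 1.26; the SEs (0.35, 0.37) are invented.
rr, ci = dersimonian_laird(np.log([1.43, 1.26]), np.array([0.35, 0.37]))
```

With these wide hypothetical SEs the pooled CI spans 1.0, mirroring the review's conclusion that the association is uncertain.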
Bayesian phylogenetic estimation of fossil ages.
Drummond, Alexei J; Stadler, Tanja
2016-07-19
Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth-death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. 
We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses. This article is part of the themed issue 'Dating species divergences using rocks and clocks'. © 2016 The Authors. PMID:27325827
Quinn, Michael C J; Wilson, Daniel J; Young, Fiona; Dempsey, Adam A; Arcand, Suzanna L; Birch, Ashley H; Wojnarowicz, Paulina M; Provencher, Diane; Mes-Masson, Anne-Marie; Englert, David; Tonin, Patricia N
2009-07-06
As gene expression signatures may serve as biomarkers, there is a need to develop technologies based on mRNA expression patterns that are adaptable for translational research. Xceed Molecular has recently developed the Ziplex technology, which can assay the expression of a discrete number of genes as a focused array. The present study evaluated the reproducibility of the Ziplex system as applied to ovarian cancer research of genes shown to exhibit distinct expression profiles initially assessed by Affymetrix GeneChip analyses. The new chemiluminescence-based Ziplex gene expression array technology was evaluated for the expression of 93 genes selected based on their Affymetrix GeneChip profiles as applied to ovarian cancer research. Probe design was based on the Affymetrix target sequence that favors the 3' UTR of transcripts in order to maximize reproducibility across platforms. Gene expression analysis was performed using the Ziplex Automated Workstation. Statistical analyses were performed to evaluate reproducibility of both the magnitude of expression and differences between normal and tumor samples by correlation analyses, fold change differences and statistical significance testing. Expression of 82 of 93 (88.2%) genes was highly correlated (p < 0.01) in a comparison of the two platforms. Overall, 75 of 93 (80.6%) genes exhibited consistent results in normal versus tumor tissue comparisons for both platforms (p < 0.001). The fold change differences were concordant for 87 of 93 (94%) genes, where there was agreement between the platforms regarding statistical significance for 71 (76%) of 87 genes. There was a strong agreement between the two platforms as shown by comparisons of log2 fold differences of gene expression between tumor versus normal samples (R = 0.93) and by Bland-Altman analysis, where greater than 90% of expression values fell within the 95% limits of agreement. 
Overall concordance of gene expression patterns based on correlations, statistical significance between tumor and normal ovary data, and fold changes was consistent between the Ziplex and Affymetrix platforms. The reproducibility and ease-of-use of the technology suggests that the Ziplex array is a suitable platform for translational research.
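A Bland-Altman analysis of the kind reported summarizes cross-platform agreement by the mean per-gene difference (bias) and the 95% limits of agreement around it. The sketch below keeps the study's gene count (93) but simulates the expression values and platform noise, so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical log2 expression for the same 93 genes on two platforms;
# the second platform adds small measurement noise around the first.
affy = rng.normal(8.0, 2.0, 93)
ziplex = affy + rng.normal(0.0, 0.4, 93)

diff = ziplex - affy                        # per-gene platform difference
bias = diff.mean()                          # systematic offset between platforms
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
within = np.mean((diff >= loa[0]) & (diff <= loa[1]))
```

For approximately normal differences, around 95% of genes fall within the limits of agreement, which is the benchmark the abstract's ">90% within the 95% limits" statement is read against.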
Bansal, Ravi; Peterson, Bradley S
2018-06-01
Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positive findings across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. 
Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construction, rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Csillik, O.; Evans, I. S.; Drăguţ, L.
2015-03-01
Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
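The Box-Cox step can be sketched with SciPy, which selects the transformation parameter λ by maximum likelihood (approximately normalizing the sample) rather than by direct skewness minimization as the procedure above does; the right-skewed gamma "slope gradient" sample below is synthetic, not one of the paper's nine DEMs.

```python
import numpy as np
from scipy.stats import boxcox, skew

rng = np.random.default_rng(2)
# Right-skewed synthetic "slope gradient" sample; Box-Cox requires strictly
# positive input, which a gamma draw guarantees.
slopes = rng.gamma(shape=2.0, scale=6.0, size=5000)

transformed, lam = boxcox(slopes)   # lambda chosen by maximum likelihood
s_before = skew(slopes)
s_after = skew(transformed)
```

The transformed sample has near-zero skewness, which is the property the paper exploits so that correlation, regression, and principal component analyses can be applied with more confidence.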
Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr
2017-01-01
The aim of the study was to check the quality of a computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as standardization of the computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. A p-value < 0.05 was considered statistically significant. Statistically significant differences were found between all of the investigated sperm parameters, except for non-progressive motility, measured with CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between both assessments of the same probe, neither in the samples analyzed manually nor in those analyzed with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.
Luo, Sean X; Wall, Melanie; Covey, Lirio; Hu, Mei-Chen; Scodes, Jennifer M; Levin, Frances R; Nunes, Edward V; Winhusen, Theresa
2018-01-25
A double blind, placebo-controlled randomized trial (NCT00253747) evaluating osmotic-release oral system methylphenidate (OROS-MPH) for smoking-cessation revealed a significant interaction effect in which participants with higher baseline ADHD severity had better abstinence outcomes with OROS-MPH while participants with lower baseline ADHD severity had worse outcomes. This current report examines secondary outcomes that might bear on the mechanism for this differential treatment effect. Longitudinal analyses were conducted to evaluate the effect of OROS-MPH on three secondary outcomes (ADHD symptom severity, nicotine craving, and withdrawal) in the total sample (N = 255, 56% Male), and in the high (N = 134) and low (N = 121) baseline ADHD severity groups. OROS-MPH significantly improved ADHD symptoms and nicotine withdrawal symptoms in the total sample, and exploratory analyses showed that in both higher and lower baseline severity groups, OROS-MPH statistically significantly improved these two outcomes. No effect on craving overall was detected, though exploratory analyses showed statistically significantly decreased craving in the high ADHD severity participants on OROS-MPH. No treatment by ADHD baseline severity interaction was detected for the outcomes. Methylphenidate improved secondary outcomes during smoking cessation independent of baseline ADHD severity, with no evident treatment-baseline severity interaction. Our results suggest divergent responses to smoking cessation treatment in the higher and lower severity groups cannot be explained by concordant divergence in craving, withdrawal and ADHD symptom severity, and alternative hypotheses may need to be identified.
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry using random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the proportion of significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Clinical lead poisoning in England: an analysis of routine sources of data.
Elliott, P; Arnold, R; Barltrop, D; Thornton, I; House, I M; Henry, J A
1999-12-01
To examine the occurrence of clinical lead poisoning in England based on routine sources of data. Three routine data sources were examined, over different periods according to availability of data: (a) mortality for England, 1981-96; (b) hospital episode statistics data for England, for the 3 years 1 April 1992-31 March 1995; (c) statutory returns to the Health and Safety Executive under the reporting of injuries, diseases, and dangerous occurrences regulations (RIDDOR), also for the period 1 April 1992-31 March 1995. Also, analyses of blood lead concentrations carried out by the Medical Toxicology Unit, Guy's and St Thomas' Hospital Trust in London during the period 1 January 1991-31 December 1997 were examined. The analyses were performed both for industrial screening purposes and in response to clinicians' requests where lead poisoning was suspected. This is one of several laboratories carrying out such analyses in the United Kingdom. One death, of a 2 year old girl, was coded to lead poisoning in England during 1981-96. Analysis of hospital episode statistics data identified 83 hospital cases (124 admissions) over 3 years with any mention of lead poisoning, excluding two with admissions dating from 1965 and 1969. For these 83 cases the median hospital stay per admission was 3 days (range 0-115 days). Five were coded as having received intravenous treatment. Further clinical details of these cases beyond what is routinely recorded on the hospital episode statistics database were not available, except for blood lead concentrations in cases also identified on the Medical Toxicology Unit database. Eighteen cases (22%) were below 5 years of age of whom 10 (56%) came from the most deprived quintile of electoral wards. There was evidence to suggest spatial clustering of cases (p = 0.02). Six occupational cases were reported under RIDDOR in England during the period of study, two of whom were identified on the hospital episode statistics database. 
One further occupational case was identified on hospital episode statistics. Blood lead analyses for 4424 people carried out by the Medical Toxicology Unit (estimated at about 5% of such analyses in England over 7 years) found that among 547 children aged 0-4, 45 (8.2%) had a blood lead concentration in excess of 25 micrograms/dl, the action level in the United Kingdom for investigation, or removal of environmental sources of lead. At all ages, there were 419 (9.5%) such people, including 106 adults with no mention of industrial exposure. Both mortality and hospital admission ascribed to lead poisoning in England are rare, but cases continue to occur and some, at least, seem to be associated with considerable morbidity. Lead poisoning was confirmed as a probable cause of clinical signs and symptoms in only a small proportion of those in whom a blood lead concentration was requested. Where indicated, appropriate remedial action for the safe removal of environmental sources of lead should be taken.
Bull, Marta E.; Heath, Laura M.; McKernan-Mullin, Jennifer L.; Kraft, Kelli M.; Acevedo, Luis; Hitti, Jane E.; Cohn, Susan E.; Tapia, Kenneth A.; Holte, Sarah E.; Dragavon, Joan A.; Coombs, Robert W.; Mullins, James I.; Frenkel, Lisa M.
2013-01-01
Background. Whether unique human immunodeficiency virus type 1 (HIV) genotypes occur in the genital tract is important for vaccine development and management of drug-resistant viruses. Multiple cross-sectional studies suggest HIV is compartmentalized within the female genital tract. We hypothesize that bursts of HIV replication and/or proliferation of infected cells, captured in cross-sectional analyses, drive compartmentalization, but that over time genital-specific viral lineages do not form; rather, viruses mix between the genital tract and blood. Methods. Eight women with ongoing HIV replication were studied over a period of 1.5 to 4.5 years. Multiple viral sequences were derived by single-genome amplification of the HIV C2-V5 region of env from genital secretions and blood plasma. Maximum likelihood phylogenies were evaluated for compartmentalization using 4 statistical tests. Results. In cross-sectional analyses, compartmentalization of genital from blood viruses was detected in three of eight women by all tests; this was associated with tissue-specific clades containing multiple monotypic sequences. In longitudinal analysis, the tissue-specific clades did not persist to form viral lineages. Rather, across women, HIV lineages comprised both genital tract and blood sequences. Conclusions. The observation of genital-specific HIV clades only in cross-sectional analysis, and the absence of genital-specific lineages in longitudinal analyses, suggest a dynamic interchange of HIV variants between the female genital tract and blood. PMID:23315326
Per capita alcohol consumption and sickness absence in Norway.
Norström, Thor; Moan, Inger Synnøve
2009-08-01
There is only one previous study addressing the relationship between population drinking and sickness absence. That study, based on Swedish time-series data, showed a statistically significant relationship between per capita alcohol consumption and the male sickness absence rate. Estimates suggested that a 1-l increase in consumption was associated with a 13% increase in sickness absence among men. In the present study, we aim at replicating and expanding the Swedish study on the basis of data for Norway. The outcome measure comprised annual data for Norway on registered sickness absence for manual employees covering the period 1957-2001. The unemployment rate was included as a control, as this factor may be correlated with alcohol as well as sickness absence. Alcohol consumption was gauged by sales of alcohol (total and beverage specific by beer, spirits and wine) per inhabitant 15 years and above. The data were analysed using the Box-Jenkins method for time-series analysis. The results suggested that a 1-l increase in total consumption was associated with a 13% increase in sickness absence among men (P < 0.05). This corresponds to an elasticity coefficient equal to 0.62. The alcohol effect was not significant for women. Unemployment was negatively associated with the outcome for men as well as for women (P < 0.05). In the beverage-specific analyses, spirits were statistically significant for men (P < 0.05), but not beer and wine. The present findings strengthen the conclusion from the Swedish study, that sickness absence may be added to the list of indicators of alcohol-related harm.
The impact of tinnitus on daily activities in adult tinnitus sufferers: A pilot study.
Moroe, Nomfundo F; Khoza-Shangase, Katijah
2014-08-27
Few South African studies have been published on the impact of tinnitus on quality of life of tinnitus sufferers, although evidence suggests that a large portion of the general population suffers from tinnitus. The current study aimed at describing the effects of tinnitus on the quality of life of the participants as measured by the Tinnitus Handicap Inventory (THI). In a cross-sectional descriptive study design, 27 participants took part in the study by completing a self-administered THI questionnaire and participating in a semi-structured interview. Descriptive and inferential statistics were used to analyse the data. Descriptively, content analysis was used to organise and convey results from the interviews. Participants reported a wide range of perceived disability on the THI. Results ranged from mild to catastrophic, with functional disability being most prominent in all participants, although there were differences when results were analysed according to gender. There was an association between gender and the type of perceived disability, although this was statistically non-significant (p > 0.05). Only 26% of the participants reported no effect on occupational performance and quality of life, with the remainder of the participants reporting a significant effect. Limited effective management strategies were reported to have been implemented - a significant implication for the audiologists. The results have implications for audiologists as they suggest that audiologists should take a detailed case history to determine the extent to which tinnitus affects the individual. Furthermore, audiologists should administer a scale such as the THI in the management of tinnitus.
Examining overlap in behavioral and neural representations of morals, facts, and preferences.
Theriault, Jordan; Waytz, Adam; Heiphetz, Larisa; Young, Liane
2017-11-01
Metaethical judgments refer to judgments about the information expressed by moral claims. Moral objectivists generally believe that moral claims are akin to facts, whereas moral subjectivists generally believe that moral claims are more akin to preferences. Evidence from developmental and social psychology has generally favored an objectivist view; however, this work has typically relied on few examples, and analyses have disallowed statistical generalizations beyond these few stimuli. The present work addresses whether morals are represented as fact-like or preference-like, using behavioral and neuroimaging methods, in combination with statistical techniques that can (a) generalize beyond our sample stimuli, and (b) test whether particular item features are associated with neural activity. Behaviorally, and contrary to prior work, morals were perceived as more preference-like than fact-like. Neurally, morals and preferences elicited common magnitudes and spatial patterns of activity, particularly within the dorsal-medial prefrontal cortex (DMPFC), a critical region for social cognition. This common DMPFC activity for morals and preferences was present across whole-brain conjunctions, and in individually localized functional regions of interest (targeting the theory of mind network). By contrast, morals and facts did not elicit any neural activity in common. Follow-up item analyses suggested that the activity elicited in common by morals and preferences was explained by their shared tendency to evoke representations of mental states. We conclude that morals are represented as far more subjective than prior work has suggested. This conclusion is consistent with recent theoretical research, which has argued that morality is fundamentally about regulating social relationships. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
High statistical heterogeneity is more frequent in meta-analysis of continuous than binary outcomes.
Alba, Ana C; Alexander, Paul E; Chang, Joanne; MacIsaac, John; DeFry, Samantha; Guyatt, Gordon H
2016-02-01
We compared the distribution of heterogeneity in meta-analyses of binary and continuous outcomes. We searched citations in MEDLINE and Cochrane databases for meta-analyses of randomized trials published in 2012 that reported a measure of heterogeneity of either binary or continuous outcomes. Two reviewers independently performed eligibility screening and data abstraction. We evaluated the distribution of I² in meta-analyses of binary and continuous outcomes and explored hypotheses explaining the difference in distributions. After full-text screening, we selected 671 meta-analyses evaluating 557 binary and 352 continuous outcomes. Heterogeneity as assessed by I² proved higher in continuous than in binary outcomes: the proportion of continuous and binary outcomes reporting an I² of 0% was 34% vs. 52%, respectively, and reporting an I² of 60-100% was 39% vs. 14%. In continuous but not binary outcomes, I² increased with larger number of studies included in a meta-analysis. Increased precision and sample size do not explain the larger I² found in meta-analyses of continuous outcomes with a larger number of studies. Meta-analyses evaluating continuous outcomes showed substantially higher I² than meta-analyses of binary outcomes. Results suggest differing standards for interpreting I² in continuous vs. binary outcomes may be appropriate. Copyright © 2016 Elsevier Inc. All rights reserved.
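The I² statistic compared above has a simple closed form once Cochran's Q is computed. A minimal sketch, assuming inverse-variance fixed-effect weights; the effect sizes and variances below are invented toy data, not values from the review:

```python
def i_squared(effects, variances):
    # Cochran's Q for an inverse-variance fixed-effect pooled estimate,
    # then Higgins' I^2 = max(0, (Q - df) / Q) * 100.
    w = [1.0 / v for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - pooled) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    if q <= df or q == 0:
        return 0.0
    return 100.0 * (q - df) / q

# Homogeneous studies -> I^2 of 0; one outlying study -> I^2 rises sharply.
low = i_squared([0.10, 0.12, 0.09, 0.11], [0.01] * 4)
high = i_squared([0.10, 0.12, 0.09, 0.80], [0.01] * 4)
print(low, high)
```

The truncation at zero is why so many meta-analyses report I² = 0%: whenever Q falls below its degrees of freedom, the statistic bottoms out.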
Haynos, Ann F; Pearson, Carolyn M; Utzinger, Linsey M; Wonderlich, Stephen A; Crosby, Ross D; Mitchell, James E; Crow, Scott J; Peterson, Carol B
2017-05-01
Evidence suggests that eating disorder subtypes reflecting under-controlled, over-controlled, and low psychopathology personality traits constitute reliable phenotypes that differentiate treatment response. This study is the first to use statistical analyses to identify these subtypes within treatment-seeking individuals with bulimia nervosa (BN) and to use these statistically derived clusters to predict clinical outcomes. Using variables from the Dimensional Assessment of Personality Pathology-Basic Questionnaire, K-means cluster analyses identified under-controlled, over-controlled, and low psychopathology subtypes within BN patients (n = 80) enrolled in a treatment trial. Generalized linear models examined the impact of personality subtypes on Eating Disorder Examination global score, binge eating frequency, and purging frequency cross-sectionally at baseline and longitudinally at end of treatment (EOT) and follow-up. In the longitudinal models, secondary analyses were conducted to examine personality subtype as a potential moderator of response to Cognitive Behavioral Therapy-Enhanced (CBT-E) or Integrative Cognitive-Affective Therapy for BN (ICAT-BN). There were no baseline clinical differences between groups. In the longitudinal models, personality subtype predicted binge eating (p = 0.03) and purging (p = 0.01) frequency at EOT and binge eating frequency at follow-up (p = 0.045). The over-controlled group demonstrated the best outcomes on these variables. In secondary analyses, there was a treatment by subtype interaction for purging at follow-up (p = 0.04), which indicated a superiority of CBT-E over ICAT-BN for reducing purging among the over-controlled group. Empirically derived personality subtyping appears to be a valid classification system with potential to guide eating disorder treatment decisions. © 2016 Wiley Periodicals, Inc.(Int J Eat Disord 2017; 50:506-514). © 2016 Wiley Periodicals, Inc.
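K-means clustering, as used above to derive the personality subtypes, can be illustrated with a minimal Lloyd's-algorithm sketch. The 2-D points and deterministic initial centroids below are invented for illustration; the study clustered Dimensional Assessment of Personality Pathology scores, not these numbers:

```python
def kmeans(points, centroids, iters=20):
    # Lloyd's algorithm with caller-supplied initial centroids
    # (deterministic, so the toy run below is reproducible).
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Toy 2-D "trait score" data: three loose groups standing in for
# under-controlled, over-controlled, and low-psychopathology profiles.
pts = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0),   # low on both traits
       (5.0, 0.2), (5.2, 0.1), (4.9, 0.3),   # high trait 1
       (0.2, 5.1), (0.1, 4.9), (0.3, 5.0)]   # high trait 2
cents, clusters = kmeans(pts, [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)])
print(sorted(len(c) for c in clusters))  # → [3, 3, 3]
```

In practice, k-means is sensitive to initialization; clinical applications typically rerun the algorithm from many random starts and keep the best within-cluster sum of squares.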
gHRV: Heart rate variability analysis made easy.
Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P
2014-08-01
In this paper, the gHRV software tool is presented. It is a simple, free and portable tool developed in Python for analysing heart rate variability. It includes a graphical user interface and can import files in multiple formats, analyse time intervals in the signal, test statistical significance and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether heart rate variability indexes change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also reported: in general, the application was positively valued, and the results reflect a high level of satisfaction. gHRV is in continuous development and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Information filtering via biased heat conduction
NASA Astrophysics Data System (ADS)
Liu, Jian-Guo; Zhou, Tao; Guo, Qiang
2011-09-01
The process of heat conduction has recently found application in personalized recommendation [Zhou et al., Proc. Natl. Acad. Sci. USA 107, 4511 (2010)], where it yields high diversity but low accuracy. By decreasing the temperatures of small-degree objects, we present an improved algorithm, called biased heat conduction, which can simultaneously enhance accuracy and diversity. Extensive experimental analyses demonstrate that accuracy on the MovieLens, Netflix, and Delicious datasets is improved by 43.5%, 55.4% and 19.2%, respectively, compared with the standard heat conduction algorithm, while diversity is increased or approximately unchanged. Further statistical analyses suggest that the present algorithm can simultaneously identify users' mainstream and special tastes, resulting in better performance than the standard heat conduction algorithm. This work provides a credible route to highly efficient information filtering.
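The heat-conduction scoring scheme described above can be sketched on a toy user-object bipartite graph. The degree-dependent damping below is a simplified stand-in for the paper's bias toward large-degree objects; the exact published weighting may differ, and the graph is invented for illustration:

```python
from collections import defaultdict

def heat_conduction(links, target_user, theta=0.0):
    # Plain heat conduction on a user-object bipartite graph:
    # objects collected by target_user start at temperature 1, others at 0;
    # temperatures are averaged object -> user -> object.
    # theta > 0 damps small-degree objects (a sketch of the "bias").
    users = defaultdict(set)   # user -> set of objects
    objs = defaultdict(set)    # object -> set of users
    for u, o in links:
        users[u].add(o)
        objs[o].add(u)
    f = {o: (1.0 if o in users[target_user] else 0.0) for o in objs}
    h = {u: sum(f[o] for o in its) / len(its) for u, its in users.items()}
    scores = {}
    for o, us in objs.items():
        base = sum(h[u] for u in us) / len(us)
        scores[o] = base * len(us) ** theta   # degree-biased damping
    return scores

links = [("u1", "a"), ("u1", "b"), ("u2", "b"), ("u2", "c"),
         ("u3", "c"), ("u3", "d"), ("u3", "b")]
plain = heat_conduction(links, "u1", theta=0.0)
biased = heat_conduction(links, "u1", theta=0.5)
print(plain["c"], biased["c"])
```

With theta = 0 this reduces to standard heat conduction, which favors small-degree (niche) objects; raising theta shifts scores back toward popular objects, which is the accuracy/diversity trade-off the abstract describes.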
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
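Of the statistics compared above, the nonoverlap core of Tau-U (the Tau component contrasting phase A with phase B, without baseline-trend correction) is straightforward to compute directly. A minimal sketch with invented single-case data:

```python
def tau_ab(baseline, treatment):
    # Pairwise nonoverlap between phases: each (baseline, treatment) pair
    # scores +1 if the treatment point is higher, -1 if lower, 0 on ties.
    # The result ranges from -1 (complete overlap in the wrong direction)
    # to +1 (complete nonoverlap).
    pos = sum(1 for b in baseline for t in treatment if t > b)
    neg = sum(1 for b in baseline for t in treatment if t < b)
    return (pos - neg) / (len(baseline) * len(treatment))

base = [2, 3, 2, 4]          # baseline phase of a hypothetical graph
treat = [5, 6, 7, 6, 8]      # treatment phase
print(tau_ab(base, treat))   # complete nonoverlap → 1.0
```

The full Tau-U statistic additionally subtracts pairs reflecting improving baseline trend; this sketch shows only the basic A-versus-B comparison that dichotomized visual analysis is judged against.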
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane
2016-03-01
Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) on cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and impact per publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that had met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, there are more research questions to answer to enhance our understanding of how MedSD affects CVD risk or how these effects may be modified by the participant or MedSD characteristics. 
To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.
Shafer, Steven L; Lemmer, Bjoern; Boselli, Emmanuel; Boiste, Fabienne; Bouvet, Lionel; Allaouchiche, Bernard; Chassard, Dominique
2010-10-01
The duration of analgesia from epidural administration of local anesthetics to parturients has been shown to follow a rhythmic pattern according to the time of drug administration. We studied whether there was a similar pattern after intrathecal administration of bupivacaine in parturients. In the course of the analysis, we came to believe that some data points coincident with provider shift changes were influenced by nonbiological, health care system factors, thus incorrectly suggesting a periodic signal in duration of labor analgesia. We developed graphical and analytical tools to help assess the influence of individual points on the chronobiological analysis. Women with singleton term pregnancies in vertex presentation, cervical dilation 3 to 5 cm, pain score >50 mm (of 100 mm), and requesting labor analgesia were enrolled in this study. Patients received 2.5 mg of intrathecal bupivacaine in 2 mL using a combined spinal-epidural technique. Analgesia duration was the time from intrathecal injection until the first request for additional analgesia. The duration of analgesia was analyzed by visual inspection of the data, application of smoothing functions (Supersmoother; LOWESS and LOESS [locally weighted scatterplot smoothing functions]), analysis of variance, Cosinor (Chronos-Fit), Excel, and NONMEM (nonlinear mixed effect modeling). Confidence intervals (CIs) were determined by bootstrap analysis (1000 replications with replacement) using PLT Tools. Eighty-two women were included in the study. Examination of the raw data using 3 smoothing functions revealed a bimodal pattern, with a peak at approximately 0630 and a subsequent peak in the afternoon or evening, depending on the smoother. Analysis of variance did not identify any statistically significant difference between the duration of analgesia when intrathecal injection was given from midnight to 0600 compared with the duration of analgesia after intrathecal injection at other times. 
Chronos-Fit, Excel, and NONMEM produced identical results, with a mean duration of analgesia of 38.4 minutes (95% CI: 35.4-41.6 minutes), an 8-hour periodic waveform with an amplitude of 5.8 minutes (95% CI: 2.1-10.7 minutes), and a phase offset of 6.5 hours (95% CI: 5.4-8.0 hours) relative to midnight. The 8-hour periodic model did not reach statistical significance in 40% of bootstrap analyses, implying that statistical significance of the 8-hour periodic model was dependent on a subset of the data. Two data points before the change of shift at 0700 contributed most strongly to the statistical significance of the periodic waveform. Without these data points, there was no evidence of an 8-hour periodic waveform for intrathecal bupivacaine analgesia. Chronobiology includes the influence of external daily rhythms in the environment (e.g., nursing shifts) as well as human biological rhythms. We were able to distinguish the influence of an external rhythm by combining several novel analyses: (1) graphical presentation superimposing the raw data, external rhythms (e.g., nursing and anesthesia provider shifts), and smoothing functions; (2) graphical display of the contribution of each data point to the statistical significance; and (3) bootstrap analysis to identify whether the statistical significance was highly dependent on a data subset. These approaches suggested that 2 data points were likely artifacts of the change in nursing and anesthesia shifts. When these points were removed, there was no suggestion of biological rhythm in the duration of intrathecal bupivacaine analgesia.
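The Cosinor analysis mentioned above reduces to ordinary least squares once the period is fixed, since y = M + A·cos(ωt + φ) is linear in cos(ωt) and sin(ωt). A self-contained sketch; the synthetic data are built from the reported mesor (38.4 min) and amplitude (5.8 min) purely to show that the fit recovers them, and do not reproduce the study's raw data:

```python
import math

def cosinor_fit(times, values, period):
    # Single-component cosinor: y = M + a*cos(w t) + b*sin(w t),
    # fit by ordinary least squares via the 3x3 normal equations.
    w = 2.0 * math.pi / period
    rows = [[1.0, math.cos(w * t), math.sin(w * t)] for t in times]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, values)) for i in range(3)]
    aug = [xtx[i] + [xty[i]] for i in range(3)]
    for i in range(3):                      # Gauss-Jordan elimination
        p = max(range(i, 3), key=lambda k: abs(aug[k][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for k in range(3):
            if k != i:
                fct = aug[k][i] / aug[i][i]
                aug[k] = [a - fct * b for a, b in zip(aug[k], aug[i])]
    mesor, a, b = (aug[i][3] / aug[i][i] for i in range(3))
    amplitude = math.hypot(a, b)
    acrophase = math.atan2(-b, a)  # radians; the peak occurs at t = -acrophase/w
    return mesor, amplitude, acrophase

# Synthetic "duration of analgesia" with a known 24-h rhythm peaking at 06:30.
ts = [i * 0.5 for i in range(48)]
ys = [38.4 + 5.8 * math.cos(2 * math.pi * (t - 6.5) / 24.0) for t in ts]
m, amp, phi = cosinor_fit(ts, ys, 24.0)
print(round(m, 1), round(amp, 1))  # → 38.4 5.8
```

Because the model is linear given the period, the study's bootstrap strategy amounts to refitting this regression on resampled data and asking how often the amplitude's confidence interval excludes zero.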
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdős-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α = 2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c = e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c = 1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α ≥ 3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c = e/(α-1), where the replica symmetry is broken.
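The LP-versus-IP gap at the heart of the record above can be seen on the smallest example. For min-VC the LP relaxation is half-integral (an optimal solution exists with every variable in {0, 1/2, 1}), so on tiny graphs the LP optimum can be found by brute force; this is a didactic sketch, unrelated to the paper's statistical-mechanics analysis:

```python
from itertools import product

def vc_lp_opt(n, edges):
    # LP relaxation of minimum vertex cover:
    #   minimize sum(x_v)  s.t.  x_u + x_v >= 1 per edge, 0 <= x <= 1.
    # This LP is half-integral, so brute force over {0, 1/2, 1}^n
    # finds an exact optimum (feasible only for tiny graphs).
    best = None
    for x in product((0.0, 0.5, 1.0), repeat=n):
        if all(x[u] + x[v] >= 1.0 for u, v in edges):
            s = sum(x)
            if best is None or s < best:
                best = s
    return best

def vc_ip_opt(n, edges):
    # Integer optimum by brute force, for comparison.
    best = None
    for x in product((0, 1), repeat=n):
        if all(x[u] + x[v] >= 1 for u, v in edges):
            s = sum(x)
            if best is None or s < best:
                best = s
    return best

# A triangle: the LP assigns 1/2 to every vertex (cost 1.5),
# while any integer cover needs two vertices (cost 2).
triangle = [(0, 1), (1, 2), (0, 2)]
print(vc_lp_opt(3, triangle), vc_ip_opt(3, triangle))  # → 1.5 2
```

Odd cycles like this triangle are exactly the structures on which the relaxation loosens, which is why LP accuracy degrades as random graphs become denser.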
Shin, S M; Kim, Y-I; Choi, Y-S; Yamaguchi, T; Maki, K; Cho, B-H; Park, S-B
2015-01-01
To evaluate axial cervical vertebral (ACV) shape quantitatively and to build a prediction model for skeletal maturation level using statistical shape analysis for Japanese individuals. The sample included 24 female and 19 male patients with hand-wrist radiographs and CBCT images. Through generalized Procrustes analysis and principal components (PCs) analysis, the meaningful PCs were extracted from each ACV shape and used in the estimation regression model. Each ACV shape had meaningful PCs, except for the second axial cervical vertebra. Based on these models, the smallest prediction intervals (PIs) came from the combination of the shape-space PCs, age, and gender. Overall, the PIs of the male group were smaller than those of the female group. There was no significant correlation between centroid size, as a size factor, and skeletal maturation level. Our findings suggest that the ACV maturation method based on statistical shape analysis can quantify skeletal maturation in Japanese individuals and may be as useful a quantitative method as the skeletal maturation index.
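As a rough illustration of the alignment step underlying generalized Procrustes analysis, the numpy sketch below (with hypothetical landmarks; the study's own pipeline is not reproduced here) centers a landmark configuration, scales it to unit centroid size, and rotates it onto a reference via SVD. PCA on the aligned coordinates would then yield the shape-space PCs used in the regression model.

```python
import numpy as np

def align(shape, ref):
    """Center, scale to unit centroid size, then rotate onto `ref`
    (orthogonal Procrustes solution via SVD)."""
    X = shape - shape.mean(axis=0)
    X /= np.linalg.norm(X)                 # centroid size = Frobenius norm after centering
    U, _, Vt = np.linalg.svd(X.T @ ref)
    return X @ U @ Vt

rng = np.random.default_rng(0)
base = rng.normal(size=(5, 2))             # five hypothetical 2-D landmarks
ref = base - base.mean(axis=0)
ref /= np.linalg.norm(ref)

# a translated, scaled, rotated copy of the same shape aligns back onto ref
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
copy = 3.0 * base @ R + np.array([10.0, -2.0])
aligned = align(copy, ref)
print(np.allclose(aligned, ref, atol=1e-8))
```

A full GPA iterates this alignment against the evolving mean shape until convergence; the plain SVD solution here also permits reflections, which shape analyses usually suppress.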
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes the statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain-weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions under which the experimental data were generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties characterized, using a commercially available statistical package, are the ultimate strength, modulus, and Poisson's ratio. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
Sultan, Amira; Kaminski, Juliane; Mojzisch, Andreas
2018-01-01
In recent years, an increasing number of studies have investigated majority influence in nonhuman animals. However, due to both terminological and methodological issues, evidence for conformity in nonhuman animals is scarce and controversial. Preliminary evidence suggests that wild birds, wild monkeys, and fish show conformity, that is, forgoing personal information in order to copy the majority. By contrast, chimpanzees seem to lack this tendency. The present study is the first to examine whether dogs (Canis familiaris) show conformity. Specifically, we tested whether dogs conform to a majority of conspecifics rather than stick to what they have previously learned. After dogs had acquired a behavioral preference via training (i.e., shaping), they were confronted with the counter-preferential behavior of no, one, or three conspecifics. Traditional frequentist analyses show that the dogs' behavior did not differ significantly between the three conditions. Complementary Bayesian analyses suggest that our data provide moderate evidence for the null hypothesis. In conclusion, our results suggest that dogs stick to what they have learned rather than conform to the counter-preferential behavior of others. We discuss the possible statistical and methodological limitations of this finding. Furthermore, we take a functional perspective on conformity and discuss under which circumstances dogs might show conformity after all. PMID:29570747
Nonlinear dynamic analysis of voices before and after surgical excision of vocal polyps
NASA Astrophysics Data System (ADS)
Zhang, Yu; McGilligan, Clancy; Zhou, Liang; Vig, Mark; Jiang, Jack J.
2004-05-01
Phase space reconstruction, correlation dimension, and second-order entropy, methods from nonlinear dynamics, are used to analyze sustained vowels generated by patients before and after surgical excision of vocal polyps. Two conventional acoustic perturbation parameters, jitter and shimmer, are also employed to analyze voices before and after surgery. Presurgical and postsurgical analyses of jitter, shimmer, correlation dimension, and second-order entropy are statistically compared. Correlation dimension and second-order entropy show a statistically significant decrease after surgery, indicating reduced complexity and higher predictability of postsurgical voice dynamics. There is not a significant postsurgical difference in shimmer, although jitter shows a significant postsurgical decrease. The results suggest that jitter and shimmer should be applied to analyze disordered voices with caution; however, nonlinear dynamic methods may be useful for analyzing abnormal vocal function and quantitatively evaluating the effects of surgical excision of vocal polyps.
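Jitter and shimmer, the two perturbation measures compared above, reduce to simple cycle-to-cycle statistics. A minimal sketch with made-up period and amplitude sequences (local, percent form; clinical analysers compute more elaborate variants over many more cycles):

```python
def jitter(periods):
    """Local jitter (%): mean absolute difference of consecutive glottal
    periods, divided by the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def shimmer(amps):
    """Local shimmer (%): the same statistic applied to cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amps, amps[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(amps) / len(amps))

# hypothetical cycle-to-cycle measurements (seconds; arbitrary amplitude units)
T = [0.0100, 0.0102, 0.0099, 0.0101, 0.0100]
A = [1.00, 0.97, 1.02, 0.99, 1.00]
print(round(jitter(T), 2), round(shimmer(A), 2))
```

Lower postsurgical values of these statistics would correspond to the significant jitter decrease reported above.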
Keywords and Co-Occurrence Patterns in the Voynich Manuscript: An Information-Theoretic Analysis
Montemurro, Marcelo A.; Zanette, Damián H.
2013-01-01
The Voynich manuscript has so far remained a mystery for linguists and cryptologists. While the text, written on medieval parchment in an unknown script system, shows basic statistical patterns that bear resemblance to those of real languages, some features have suggested to some researchers that the manuscript was a forgery intended as a hoax. Here we analyse the long-range structure of the manuscript using methods from information theory. We show that the Voynich manuscript presents a complex organization in the distribution of words that is compatible with that found in real language sequences. We are also able to extract some of the most significant semantic word networks in the text. These results, together with some previously known statistical features of the Voynich manuscript, support the presence of a genuine message inside the book. PMID:23805215
Universal properties of mythological networks
NASA Astrophysics Data System (ADS)
Mac Carron, Pádraig; Kenna, Ralph
2012-07-01
As in statistical physics, the concept of universality plays an important, albeit qualitative, role in the field of comparative mythology. Here we apply statistical mechanical tools to analyse the networks underlying three iconic mythological narratives with a view to identifying common and distinguishing quantitative features. Of the three narratives, an Anglo-Saxon and a Greek text are mostly believed by antiquarians to be partly historically based, while the third, an Irish epic, is often considered to be fictional. Here we use network analysis in an attempt to discriminate real from imaginary social networks and place mythological narratives on the spectrum between them. This suggests that the perceived artificiality of the Irish narrative can be traced back to anomalous features associated with six characters. Speculating that these are amalgams of several entities or proxies renders the plausibility of the Irish text comparable to the others from a network-theoretic point of view.
Bispectral analysis of equatorial spread F density irregularities
NASA Technical Reports Server (NTRS)
Labelle, J.; Lund, E. J.
1992-01-01
Bispectral analysis has been applied to density irregularities at frequencies 5-30 Hz observed with a sounding rocket launched from Peru in March 1983. Unlike the power spectrum, the bispectrum contains statistical information about the phase relations between the Fourier components which make up the waveform. In the case of spread F data from 475 km, the 5-30 Hz portion of the spectrum displays overall enhanced bicoherence relative to that of the background instrumental noise and to that expected due to statistical considerations, implying that the observed f^-2.5 power-law spectrum has a significant non-Gaussian component. This is consistent with previous qualitative analyses. The bicoherence has also been calculated for simulated equatorial spread F density irregularities in approximately the same wavelength regime, and the resulting bispectrum has some features in common with that of the rocket data. The implications of this analysis for equatorial spread F are discussed, and some future investigations are suggested.
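The phase information that the bispectrum adds over the power spectrum can be illustrated with a segment-averaged bicoherence estimate. In this hypothetical numpy sketch (not the rocket-data processing chain), a tone whose phase is the sum of two others (quadratic phase coupling) yields bicoherence near 1 at that bin pair, while an independently phased tone does not; practical estimators also window and overlap the segments and add noise handling.

```python
import numpy as np

def bicoherence(x, nseg, k1, k2):
    """Squared bicoherence at FFT-bin pair (k1, k2), averaged over segments."""
    num, d1, d2 = 0j, 0.0, 0.0
    for seg in np.array_split(x, nseg):
        X = np.fft.fft(seg)
        t = X[k1] * X[k2]
        num += t * np.conj(X[k1 + k2])
        d1 += abs(t) ** 2
        d2 += abs(X[k1 + k2]) ** 2
    return abs(num) ** 2 / (d1 * d2)

def make_signal(coupled, rng, nseg=100, n=64):
    """Tones at bins 5 and 9 with random per-segment phases; the bin-14 tone
    either carries the sum phase (coupled) or an independent phase."""
    segs = []
    for _ in range(nseg):
        p1, p2, p3 = rng.uniform(0, 2 * np.pi, 3)
        t = 2 * np.pi * np.arange(n) / n
        segs.append(np.cos(5 * t + p1) + np.cos(9 * t + p2)
                    + np.cos(14 * t + (p1 + p2 if coupled else p3)))
    return np.concatenate(segs)

rng = np.random.default_rng(1)
b_coupled = bicoherence(make_signal(True, rng), 100, 5, 9)
b_random = bicoherence(make_signal(False, rng), 100, 5, 9)
print(round(b_coupled, 3), round(b_random, 3))
```

The large coupled-versus-random gap is what "enhanced bicoherence" relative to a Gaussian background refers to.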
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
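The simulation logic can be sketched loosely as below (hypothetical effect-free data and a z-approximation, rather than the study's actual progression models): even when the true carrier vs non-carrier effect is exactly zero, a nontrivial fraction of small subgroup comparisons comes out "statistically significant".

```python
import numpy as np

rng = np.random.default_rng(7)

def spurious_rate(n_per_arm, n_trials=2000):
    """Fraction of simulated trials declaring a 'significant' (|z| > 1.96)
    carrier vs non-carrier difference when the null is true."""
    hits = 0
    for _ in range(n_trials):
        carriers = rng.normal(0.0, 9.0, n_per_arm)      # hypothetical ADAS-cog change, SD 9
        noncarriers = rng.normal(0.0, 9.0, n_per_arm)   # same true mean as carriers
        se = np.sqrt(carriers.var(ddof=1) / n_per_arm
                     + noncarriers.var(ddof=1) / n_per_arm)
        hits += abs(carriers.mean() - noncarriers.mean()) / se > 1.96
    return hits / n_trials

small, large = spurious_rate(12), spurious_rate(100)
print(small, large)
```

With small arms the normal approximation also inflates the nominal 5% rate slightly, compounding the heterogeneity problem the abstract describes.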
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
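The recommendation to choose a pooling model on clinical and methodological grounds still requires knowing what the two models compute. A minimal sketch with made-up study effects (fixed-effect inverse-variance pooling, plus DerSimonian-Laird random effects via Cochran's Q):

```python
def pool(effects, ses):
    """Inverse-variance pooling: fixed-effect estimate, DerSimonian-Laird
    random-effects estimate, and the between-study variance tau^2."""
    w = [1 / s**2 for s in ses]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q statistic and the DL moment estimator of tau^2
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    tau2 = max(0.0, (q - df) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))
    wr = [1 / (s**2 + tau2) for s in ses]
    rand = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return fixed, rand, tau2

# hypothetical log-risk-ratios and standard errors from five studies
fx, rnd, tau2 = pool([0.30, 0.10, 0.45, -0.05, 0.25], [0.10, 0.15, 0.20, 0.12, 0.08])
print(round(fx, 3), round(rnd, 3), round(tau2, 4))
```

The random-effects estimate down-weights precise studies relative to the fixed-effect estimate whenever tau^2 > 0; the statement's point is that the choice between the two should not hinge on whether Q happens to reach significance.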
Youngstrom, Eric A
2014-03-01
To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
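Interval diagnostic likelihood ratios like the <8 and >30 figures above are simple conditional-probability ratios. A toy sketch with invented checklist scores and diagnoses (not the study's data):

```python
def interval_lr(scores, labels, lo, hi):
    """Interval diagnostic likelihood ratio for scores in [lo, hi):
    P(score in range | case) / P(score in range | non-case)."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    ctrls = [s for s, y in zip(scores, labels) if y == 0]
    p_case = sum(lo <= s < hi for s in cases) / len(cases)
    p_ctrl = sum(lo <= s < hi for s in ctrls) / len(ctrls)
    return p_case / p_ctrl

# hypothetical Internalizing scores: mood-disorder cases cluster high
scores = [2, 5, 7, 9, 12, 32, 22, 25, 31, 35, 4, 6, 8, 28, 33, 40]
labels = [0, 0, 0, 0, 0,  0,  1,  1,  1,  1, 0, 0, 0, 1,  1,  1]
print(round(interval_lr(scores, labels, 30, 100), 2),   # high range: LR well above 1
      interval_lr(scores, labels, 0, 10))               # low range: LR of 0 here
```

Multiplying the pre-test odds by such an interval LR gives the post-test odds, which is the clinical-decision step the ROC framework supports.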
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
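The point that statistical mediation can arise when theoretical mediation is improbable is easy to reproduce: give X, M, and Y a single common cause and no causal paths among them, and the Baron-Kenny regressions still show the classic pattern. A hypothetical numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
c = rng.normal(size=n)            # unmeasured common cause
x = c + rng.normal(size=n)        # "predictor"
m = c + rng.normal(size=n)        # "mediator": not caused by x
y = c + rng.normal(size=n)        # "outcome": not caused by x or m

def slope(target, *predictors):
    """OLS coefficients (excluding intercept) via least squares."""
    X = np.column_stack([np.ones(n), *predictors])
    return np.linalg.lstsq(X, target, rcond=None)[0][1:]

a = slope(m, x)[0]                # path X -> M
total = slope(y, x)[0]            # total "effect" of X on Y
direct = slope(y, x, m)[0]        # direct "effect", adjusting for M
print(round(a, 2), round(total, 2), round(direct, 2))
```

The X coefficient shrinks once M is added and the indirect product a*b is positive, the textbook signature of mediation, even though by construction there is no causal chain at all; this is the cross-sectional "atemporal association" trap the abstract warns about.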
Chung, Dongjun; Kim, Hang J; Zhao, Hongyu
2017-02-01
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients through novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging, as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.
Statistical palaeomagnetic field modelling and dynamo numerical simulation
NASA Astrophysics Data System (ADS)
Bouligand, C.; Hulot, G.; Khokhlov, A.; Glatzmaier, G. A.
2005-06-01
By relying on two numerical dynamo simulations for which such investigations are possible, we test the validity and sensitivity of a statistical palaeomagnetic field modelling approach known as the giant gaussian process (GGP) modelling approach. This approach is currently used to analyse palaeomagnetic data at times of stable polarity and infer some information about the way the main magnetic field (MF) of the Earth has been behaving in the past and has possibly been influenced by core-mantle boundary (CMB) conditions. One simulation has been run with homogeneous CMB conditions, the other with more realistic non-homogeneous symmetry breaking CMB conditions. In both simulations, it is found that, as required by the GGP approach, the field behaves as a short-term memory process. Some severe non-stationarity is however found in the non-homogeneous case, leading to very significant departures of the Gauss coefficients from a Gaussian distribution, in contradiction with the assumptions underlying the GGP approach. A similar but less severe non-stationarity is found in the case of the homogeneous simulation, which happens to display a more Earth-like temporal behaviour than the non-homogeneous case. This suggests that a GGP modelling approach could nevertheless be applied to try and estimate the mean μ and covariance matrix γ(τ) (first- and second-order statistical moments) of the field produced by the geodynamo. A detailed study of both simulations is carried out to assess the possibility of detecting statistical symmetry breaking properties of the underlying dynamo process by inspection of estimates of μ and γ(τ). As expected (because of the role of the rotation of the Earth in the dynamo process), those estimates reveal spherical symmetry breaking properties. Equatorial symmetry breaking properties are also detected in both simulations, showing that such symmetry breaking properties can occur spontaneously under homogeneous CMB conditions. 
By contrast, axial symmetry breaking is detected only in the non-homogeneous simulation, testifying to the constraints imposed by the CMB conditions. The signature of this axial symmetry breaking is, however, found to be much weaker than the signature of equatorial symmetry breaking. We note that this could be the reason why only equatorial symmetry breaking properties (in the form of the well-known axial quadrupole term in the time-averaged field) have unambiguously been found so far by analysing the real data. However, this could also be because those analyses have all assumed too simple a form for γ(τ) when attempting to estimate μ. Suggestions are provided to make sure future attempts at GGP modelling with real data are carried out in a more consistent and perhaps more efficient way.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casar, B; Carot, I Mendez; Peterlin, P
2016-06-15
Purpose: The aim of this multi-centre study was to analyse the beam hardening effect of the Integral Quality Monitor (IQM) for high-energy photon beams used in radiotherapy with linear accelerators. Generic values for the attenuation coefficient k(IQM) of the IQM system were additionally investigated. Methods: The beam hardening effect of the IQM system was studied for a set of standard nominal photon energies (6 MV to 18 MV) and two flattening-filter-free (FFF) energies (6 MV FFF and 10 MV FFF). PDD curves were measured and analysed for various square radiation fields, with and without the IQM in place. Differences between PDD curves were statistically analysed through comparison of the respective PDD20,10 values. Attenuation coefficients k(IQM) were determined for the same range of photon energies. Results: Statistically significant differences in beam quality were found for all evaluated high-energy photon beams when comparing PDD20,10 values derived from PDD curves with and without the IQM in place. The beam hardening effect was statistically significant with high confidence (p < 0.01) for all analysed photon beams except 15 MV (p = 0.078), although relative differences in beam quality were minimal, ranging from 0.1% to 0.5%. Attenuation of the IQM system showed negligible dependence on radiation field size. However, a clinically important dependence of k(IQM) on TPR20,10 was found, from 0.941 for 6 MV photon beams to 0.959 for 18 MV photon beams, with the highest uncertainty below 0.006. Values of k(IQM) versus TPR20,10 were tabulated, and a polynomial equation for the determination of k(IQM) is suggested for clinical use. Conclusion: There was no clinically relevant beam hardening when the IQM system was mounted on linear accelerators. Consequently, no additional commissioning is needed for the IQM system regarding the determination of beam qualities. Generic values for k(IQM) are proposed and can be used as tray factors for the complete range of examined photon beam energies.
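Beam quality here is tracked through PDD20,10, the ratio of percentage depth doses at 20 cm and 10 cm depth. A trivial sketch with invented 6 MV values showing a sub-0.5% shift of the size reported above:

```python
def pdd_ratio(pdd20, pdd10):
    """Beam quality index PDD20,10: ratio of percentage depth doses
    at 20 cm and 10 cm depth."""
    return pdd20 / pdd10

open_beam = pdd_ratio(58.4, 91.3)   # hypothetical 6 MV values, no IQM in beam
with_iqm = pdd_ratio(58.6, 91.3)    # hypothetical values with IQM in place
change = 100 * (with_iqm - open_beam) / open_beam
print(round(open_beam, 4), round(with_iqm, 4), round(change, 2))
```

A slightly larger PDD20,10 with the attenuator in place is the signature of beam hardening: low-energy photons are preferentially removed, so the beam penetrates marginally deeper.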
Rivkin, Michael J; Davis, Peter E; Lemaster, Jennifer L; Cabral, Howard J; Warfield, Simon K; Mulkern, Robert V; Robson, Caroline D; Rose-Jacobs, Ruth; Frank, Deborah A
2008-04-01
The objective of this study was to use volumetric MRI to study brain volumes in 10- to 14-year-old children with and without intrauterine exposure to cocaine, alcohol, cigarettes, or marijuana. Volumetric MRI was performed on 35 children (mean age: 12.3 years; 14 with intrauterine exposure to cocaine, 21 with no intrauterine exposure to cocaine) to determine the effect of prenatal drug exposure on volumes of cortical gray matter, white matter, subcortical gray matter, and cerebrospinal fluid, and on total parenchymal volume. Head circumference was also obtained. Analyses of each individual substance were adjusted for demographic characteristics and the remaining 3 prenatal substance exposures. Regression analyses adjusted for demographic characteristics showed that children with intrauterine exposure to cocaine had lower mean cortical gray matter and total parenchymal volumes and smaller mean head circumference than comparison children. After adjustment for other prenatal exposures, these volumes remained smaller but lost statistical significance. Similar analyses conducted for prenatal ethanol exposure adjusted for demographics showed significant reductions in mean cortical gray matter, total parenchymal volume, and head circumference, which remained smaller but lost statistical significance after adjustment for the remaining 3 exposures. Notably, prenatal cigarette exposure was associated with significant reductions in cortical gray matter and total parenchymal volumes and head circumference after adjustment for demographics that retained marginal significance after adjustment for the other 3 exposures. Finally, as the number of prenatal substance exposures grew, cortical gray matter and total parenchymal volumes and head circumference declined significantly, with the smallest measures found among children exposed to all 4.
Conclusions: These data suggest that intrauterine exposures to cocaine, alcohol, and cigarettes are individually related to reduced head circumference, cortical gray matter volume, and total parenchymal volume as measured by MRI at school age. Adjustment for other substance exposures precludes determination of a statistically significant individual substance effect on brain volume in this small sample; however, these substances may act cumulatively during gestation to exert lasting effects on brain size and volume.
Pekala, Ronald J; Kumar, V K; Maurer, Ronald; Elliott-Carter, Nancy; Moon, Edward; Mullen, Karen
2010-04-01
This study sought to determine if self-reported hypnotic depth (srHD) could be predicted from the variables of the Phenomenology of Consciousness Inventory - Hypnotic Assessment Procedure (PCI-HAP) (Pekala, 1995a, 1995b; Pekala & Kumar, 2007; Pekala et al., 2010), assessing several of the processes theorized by researchers to be associated with hypnotism: trance (altered state effects), suggestibility, and expectancy. One hundred and eighty participants completed the PCI-HAP. Using regression analyses, srHD scores were predicted from the PCI-HAP pre-hypnotic and post-hypnotic assessment items, and several other variables. The results suggested that the srHD scores were found to be a function of imagoic suggestibility, expectancy (both estimated hypnotic depth and expected therapeutic efficacy), and trance state and eye catalepsy effects; effects that appear to be additive and not (statistically) interactive. The results support the theorizing of many investigators concerning the involvement of the aforementioned component processes with this particular aspect of hypnotism, the self-reported hypnotic depth score.
Zhou, Zheng; Dai, Cong; Liu, Wei-Xin
2015-01-01
TNF-α has an important role in the pathogenesis of ulcerative colitis (UC). It seems that anti-TNF-α therapy is beneficial in the treatment of UC. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The Pubmed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but not on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had statistically significant effects on induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) compared with conventional therapy. Our meta-analyses suggest that, compared with conventional therapy, infliximab significantly improves induction of clinical response, and adalimumab significantly improves induction of both clinical remission and clinical response, in UC.
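The pooled risk ratios above follow the standard log-normal (Katz) construction. A sketch with invented 2x2 counts chosen to land near the reported RR for clinical response (the actual pooled studies are not reproduced here):

```python
import math

def risk_ratio(events_t, n_t, events_c, n_c):
    """Risk ratio with 95% CI via the log-normal (Katz) approximation."""
    rr = (events_t / n_t) / (events_c / n_c)
    se = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# hypothetical trial: 60/100 responders on anti-TNF vs 36/100 on conventional therapy
rr, lo, hi = risk_ratio(60, 100, 36, 100)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Because the lower confidence limit exceeds 1, this hypothetical result would count as a statistically significant benefit, the same criterion behind the RR = 1.67 (1.12 to 2.50) finding.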
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
[Mass anomalies of the extremities in anurans].
Kovalenko, E E
2000-01-01
The author analyses literature data on limb anomalies in Anura. It is shown that the published data are usually insufficient to discuss either the conditions of appearance or the causes of anomalies. Traditional statistical methods do not adequately characterise the frequency of anomalies. The author suggests a new criterion for ascertaining the occurrence of mass anomalies. A number of experimental findings do not correspond to current theoretical ideas about the nature of anomalies. It is proposed to distinguish "background" from "mass" anomalies; "background" anomalies cannot serve as a good indicator of unfavourable developmental conditions.
VCSEL-based fiber optic link for avionics: implementation and performance analyses
NASA Astrophysics Data System (ADS)
Shi, Jieqin; Zhang, Chunxi; Duan, Jingyuan; Wen, Huaitao
2006-11-01
A Gb/s fiber optic link with built-in test (BIT) capability, based on vertical-cavity surface-emitting laser (VCSEL) sources, is presented in this paper for next-generation military avionics buses. To accurately predict link performance, statistical methods and bit error rate (BER) measurements have been examined. The results show that the 1 Gb/s fiber optic link meets the BER requirement, and the link margin can reach up to 13 dB. The analysis shows that the suggested photonic network may provide a high-performance, low-cost interconnection alternative for future military avionics.
Afendulis, Christopher C; Fendrick, A Mark; Song, Zirui; Landon, Bruce E; Safran, Dana Gelb; Mechanic, Robert E; Chernew, Michael E
2014-01-01
In 2009, Blue Cross Blue Shield of Massachusetts implemented a global budget-based payment system, the Alternative Quality Contract (AQC), in which provider groups assumed accountability for spending. We investigate the impact of global budgets on the utilization of prescription drugs and related expenditures. Our analyses indicate no statistically significant evidence that the AQC reduced the use of drugs. Although the impact may change over time, early evidence suggests that it is premature to conclude that global budget systems may reduce access to medications. © The Author(s) 2014.
Reconceptualizing the classification of PNAS articles
Airoldi, Edoardo M.; Erosheva, Elena A.; Fienberg, Stephen E.; Joutard, Cyrille; Love, Tanzy; Shringarpure, Suyash
2010-01-01
PNAS article classification is rooted in long-standing disciplinary divisions that do not necessarily reflect the structure of modern scientific research. We reevaluate that structure using latent pattern models from statistical machine learning, also known as mixed-membership models, that identify semantic structure in co-occurrence of words in the abstracts and references. Our findings suggest that the latent dimensionality of patterns underlying PNAS research articles in the Biological Sciences is only slightly larger than the number of categories currently in use, but it differs substantially in the content of the categories. Further, the number of articles that are listed under multiple categories is only a small fraction of what it should be. These findings together with the sensitivity analyses suggest ways to reconceptualize the organization of papers published in PNAS. PMID:21078953
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Goldmann, Andrew; Kalinich, Donald A.
2016-12-01
In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup in contrast with low burnup on fission product releases to the containment. Supporting this emphasis, recent data available on fission product release from high-burnup (HBU) fuel from the French VERCOR project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ approximate order statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) is greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment.
With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU analyses. Additionally, current analyses suggest that the NUREG-1465 release fractions are conservative by about a factor of 2 in terms of release fractions and that release durations for in-vessel and late in-vessel release periods are in fact longer than the NUREG-1465 durations. It is currently planned that a subsequent report will further characterize these results using more refined statistical methods, permitting a more precise reformulation of the NUREG-1465 alternative source term for both LBU and HBU fuels, with the most important finding being that the NUREG-1465 formula appears to embody significant conservatism compared to current best-estimate analyses. ACKNOWLEDGEMENTS: This work was supported by the United States Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The authors would like to thank Dr. Ian Gauld and Dr. Germina Ilas, of Oak Ridge National Laboratory, for their contributions to this work. In addition to development of core fission product inventory and decay heat information for use in MELCOR models, their insights related to fuel management practices and resulting effects on spatial distribution of fission products in the core were instrumental in completion of our work.
Development of a model of the tobacco industry's interference with tobacco control programmes
Trochim, W; Stillman, F; Clark, P; Schmitt, C
2003-01-01
Objective: To construct a conceptual model of tobacco industry tactics to undermine tobacco control programmes for the purposes of: (1) developing measures to evaluate industry tactics, (2) improving tobacco control planning, and (3) supplementing current or future frameworks used to classify and analyse tobacco industry documents. Design: Web based concept mapping was conducted, including expert brainstorming, sorting, and rating of statements describing industry tactics. Statistical analyses used multidimensional scaling and cluster analysis. Interpretation of the resulting maps was accomplished by an expert panel during a face-to-face meeting. Subjects: 34 experts, selected because of their previous encounters with industry resistance or because of their research into industry tactics, took part in some or all phases of the project. Results: Maps with eight non-overlapping clusters in two dimensional space were developed, with importance ratings of the statements and clusters. Cluster and quadrant labels were agreed upon by the experts. Conclusions: The conceptual maps summarise the tactics used by the industry and their relationships to each other, and suggest a possible hierarchy for measures that can be used in statistical modelling of industry tactics and for review of industry documents. Finally, the maps support hypotheses about a likely progression of industry reactions as public health programmes become more successful, and therefore more threatening to industry profits. PMID:12773723
Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.
Cole, Steve W; Galic, Zoran; Zack, Jerome A
2003-09-22
Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
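The Boolean conjunction idea behind the PRIM-derived rules described above can be sketched as a simple two-threshold filter. The thresholds and gene values below are hypothetical; the actual PRIM algorithm learns such "boxes" from hold-out cross-validation rather than fixing them by hand.

```python
def call_differential(fold_change, raw_fluorescence,
                      fold_min=1.5, fluo_min=200.0):
    """Conjunctive rule in the spirit of PRIM-derived thresholds:
    call a gene differentially expressed only if BOTH its fold-induction
    and its raw fluorescence exceed their (hypothetical) thresholds."""
    return fold_change >= fold_min and raw_fluorescence >= fluo_min

# Hypothetical (fold change, raw fluorescence) measurements per gene
genes = {
    "geneA": (2.1, 450.0),  # strong induction, bright signal -> called
    "geneB": (3.0, 90.0),   # large fold change but dim signal -> filtered out
    "geneC": (1.2, 800.0),  # bright but barely induced -> filtered out
}
called = [g for g, (fc, fl) in genes.items() if call_differential(fc, fl)]
```

The conjunction protects against calling dim, noisy probes on fold change alone, which is one way such rules keep false-positive rates under control while lowering the fold-change bar for bright, reliable signals.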
Ekins, Sean; Olechno, Joe; Williams, Antony J.
2013-01-01
Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research. PMID:23658723
Bernard, Nicola K; Kashy, Deborah A; Levendosky, Alytia A; Bogat, G Anne; Lonstein, Joseph S
2017-03-01
Attunement between mothers and infants in their hypothalamic-pituitary-adrenal (HPA) axis responsiveness to acute stressors is thought to benefit the child's emerging physiological and behavioral self-regulation, as well as their socioemotional development. However, there is no universally accepted definition of attunement in the literature, which appears to have resulted in inconsistent statistical analyses for determining its presence or absence, and contributed to discrepant results. We used a series of data analytic approaches, some previously used in the attunement literature and others not, to evaluate the attunement between 182 women and their 1-year-old infants in their HPA axis responsivity to acute stress. Cortisol was measured in saliva samples taken from mothers and infants before and twice after a naturalistic laboratory stressor (infant arm restraint). The results of the data analytic approaches were mixed, with some analyses suggesting attunement while others did not. The strengths and weaknesses of each statistical approach are discussed, and an analysis using a cross-lagged model that considered both time and interactions between mother and infant appeared the most appropriate. Greater consensus in the field about the conceptualization and analysis of physiological attunement would be valuable in order to advance our understanding of this phenomenon. © 2016 Wiley Periodicals, Inc.
A weighted U statistic for association analyses considering genetic heterogeneity.
Wei, Changshuai; Elston, Robert C; Lu, Qing
2016-07-20
Converging evidence suggests that common complex diseases with the same or similar clinical manifestations could have different underlying genetic etiologies. While current research interests have shifted toward uncovering rare variants and structural variations predisposing to human diseases, the impact of heterogeneity in genetic studies of complex diseases has been largely overlooked. Most existing statistical methods assume the disease under investigation has a homogeneous genetic effect and could, therefore, have low power if the disease undergoes heterogeneous pathophysiological and etiological processes. In this paper, we propose a heterogeneity-weighted U (HWU) method for association analyses considering genetic heterogeneity. HWU can be applied to various types of phenotypes (e.g., binary and continuous) and is computationally efficient for high-dimensional genetic data. Through simulations, we showed the advantage of HWU when the underlying genetic etiology of a disease was heterogeneous, as well as the robustness of HWU against different model assumptions (e.g., phenotype distributions). Using HWU, we conducted a genome-wide analysis of nicotine dependence from the Study of Addiction: Genetics and Environments dataset. The genome-wide analysis of nearly one million genetic markers took 7 hours, identifying heterogeneous effects of two new genes (i.e., CYP3A5 and IKBKB) on nicotine dependence. Copyright © 2016 John Wiley & Sons, Ltd.
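A generic weighted U statistic of the kind HWU builds on can be sketched as below: a phenotype-similarity kernel evaluated over all subject pairs, weighted by a pairwise (here, genetic) similarity. The kernel and weights are illustrative stand-ins, not the HWU definitions from the paper.

```python
def weighted_u(phenotypes, weights):
    """Generic weighted U statistic over subject pairs.

    Sums a phenotype kernel h(y_i, y_j) over all pairs i < j, each pair
    weighted by a similarity weight w_ij (symmetric matrix), and
    normalizes by the total weight. The toy kernel here is the product
    of phenotype values; HWU itself uses different, heterogeneity-aware
    choices."""
    n = len(phenotypes)
    u, norm = 0.0, 0.0
    for i in range(n):
        for j in range(i + 1, n):
            h = phenotypes[i] * phenotypes[j]   # toy phenotype kernel
            u += weights[i][j] * h
            norm += weights[i][j]
    return u / norm if norm else 0.0

# Three subjects with a hypothetical symmetric similarity matrix
u = weighted_u([1.0, 2.0, 3.0],
               [[0, 1, 2],
                [1, 0, 1],
                [2, 1, 0]])
```

Because the statistic is a single pass over pairwise similarities, it scales to high-dimensional genetic data without fitting a per-marker regression model, which is part of what makes the genome-wide scan described above feasible.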
Additive interaction between heterogeneous environmental ...
BACKGROUND: Environmental exposures often occur in tandem; however, epidemiological research often focuses on singular exposures. Statistical interactions among broad, well-characterized environmental domains have not yet been evaluated in association with health. We address this gap by conducting a county-level cross-sectional analysis of interactions between Environmental Quality Index (EQI) domain indices on preterm birth in the United States from 2000-2005. METHODS: The EQI, a county-level index for the 2000-2005 time period, was constructed from five domain-specific indices (air, water, land, built and sociodemographic) using principal component analyses. County-level preterm birth rates (n=3141) were estimated using live births from the National Center for Health Statistics. Linear regression was used to estimate prevalence differences (PD) and 95% confidence intervals (CI) comparing worse environmental quality to better quality for (a) each individual domain main effect, (b) the interaction contrast, and (c) the two main effects plus the interaction effect (i.e. the "net effect"), to show departure from additive interaction for all U.S. counties. Analyses were also performed for subgroupings by four urban/rural strata. RESULTS: We found the suggestion of antagonistic interactions but no synergism, along with several purely additive (i.e., no interaction) associations. In the non-stratified model, we observed antagonistic interactions.
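The interaction-contrast logic for departure from additivity used above can be illustrated with a small sketch. The prevalences are hypothetical, not EQI estimates; in the study itself these quantities come from linear regression with confidence intervals.

```python
def interaction_contrast(p11, p10, p01, p00):
    """Additive interaction contrast from four group prevalences:

        IC = (p11 - p00) - [(p10 - p00) + (p01 - p00)]

    where p11 is the prevalence with both exposures worse, p10/p01 with
    only one worse, and p00 with both better. IC = 0 means purely
    additive effects; IC < 0 indicates antagonism on the additive scale,
    IC > 0 synergism."""
    return (p11 - p00) - (p10 - p00) - (p01 - p00)

# Hypothetical preterm-birth prevalences for two environmental domains
ic = interaction_contrast(p11=0.12, p10=0.12, p01=0.11, p00=0.10)
```

Here the joint effect (0.02) is smaller than the sum of the two separate effects (0.02 + 0.01), so the contrast is negative: an antagonistic departure from additivity, the pattern the abstract reports.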
Zierenberg-Ripoll, A; Pollard, R E; Stewart, S L; Allstadt, S D; Barrett, L E; Gillem, J M; Skorupski, K A
2018-06-01
To estimate prevalence of exposure to environmental tobacco smoke and other environmental toxins in dogs with primary lung tumours and to analyse the association between exposure and lung tumour development. In this case-control study, an owner survey was developed to collect data on patient characteristics, general health care and environmental exposures. Dogs diagnosed with primary lung carcinomas formed the Case group. Dogs diagnosed with mast cell tumours served as Control Group 1 and dogs diagnosed with neurologic disease served as Control Group 2. Associations between diagnosis of primary lung tumour and patient and environmental exposure variables were analysed using bivariate and multivariate statistical methods. A total of 1178 owner surveys were mailed and 470 surveys were returned and included in statistical analysis, including 135 Cases, 169 dogs in Control Group 1 and 166 dogs in Control Group 2. An association between exposure to second-hand smoke and prevalence of primary lung cancer was not identified in this study. Second-hand smoke is associated with primary lung cancer in people but a definitive association has not been found in dogs. The results of this study suggest that tobacco smoke exposure may not be associated with primary lung cancer development in dogs, but study limitations may have precluded detection of an association. © 2017 British Small Animal Veterinary Association.
Is there a genetic cause for cancer cachexia? – a clinical validation study in 1797 patients
Solheim, T S; Fayers, P M; Fladvad, T; Tan, B; Skorpen, F; Fearon, K; Baracos, V E; Klepstad, P; Strasser, F; Kaasa, S
2011-01-01
Background: Cachexia has major impact on cancer patients' morbidity and mortality. Future development of cachexia treatment needs methods for early identification of patients at risk. The aim of the study was to validate nine single-nucleotide polymorphisms (SNPs) previously associated with cachexia, and to explore 182 other candidate SNPs with the potential to be involved in the pathophysiology. Method: A total of 1797 cancer patients, classified as either having severe cachexia, mild cachexia or no cachexia, were genotyped. Results: After allowing for multiple testing, there was no statistically significant association between any of the SNPs analysed and the cachexia groups. However, consistent with prior reports, two SNPs from the acylpeptide hydrolase (APEH) gene showed suggestive statistical significance (P=0.02; OR, 0.78). Conclusion: This study failed to detect any significant association between any of the SNPs analysed and cachexia; although two SNPs from the APEH gene had a trend towards significance. The APEH gene encodes the enzyme APEH, postulated to be important in the endpoint of the ubiquitin system and thus the breakdown of proteins into free amino acids. In cachexia, there is an extensive breakdown of muscle proteins and an increase in the production of acute phase proteins in the liver. PMID:21934689
Siddall, James; Huebner, E Scott; Jiang, Xu
2013-01-01
This study examined the cross-sectional and prospective relationships between three sources of school-related social support (parent involvement, peer support for learning, and teacher-student relationships) and early adolescents' global life satisfaction. The participants were 597 middle school students from 1 large school in the southeastern United States who completed measures of school social climate and life satisfaction on 2 occasions, 5 months apart. The results revealed that school-related experiences in terms of social support for learning contributed substantial amounts of variance to individual differences in adolescents' satisfaction with their lives as a whole. Cross-sectional multiple regression analyses of the differential contributions of the sources of support demonstrated that family and peer support for learning contributed statistically significant, unique variance to global life satisfaction reports. Prospective multiple regression analyses demonstrated that only family support for learning continued to contribute statistically significant, unique variance to the global life satisfaction reports at Time 2. The results suggest that school-related experiences, especially family-school interactions, spill over into adolescents' overall evaluations of their lives at a time when direct parental involvement in schooling and adolescents' global life satisfaction are generally declining. Recommendations for future research and educational policies and practices are discussed. © 2013 American Orthopsychiatric Association.
Maternal smoking and newborn sex, birth weight and breastfeeding: a population-based study.
Timur Taşhan, Sermin; Hotun Sahin, Nevin; Omaç Sönmez, Mehtap
2017-11-01
Today, it is acknowledged that smoking during pregnancy and/or the postnatal period carries significant risks for the foetus and newborn child. This research examines the relationship between smoking only postnatally, or both during pregnancy and postnatally, and newborn sex, birth weight and breastfeeding. A total of 664 women from five randomly selected primary healthcare centres were included in the research between 20 February 2010 and 20 July 2010. Statistical analyses were performed with SPSS for Windows 19.0 (Statistical Package for Social Sciences). Data were described as means, standard deviations and percentages; Chi-square tests and backward stepwise logistic regression were used for analysis. It was found that the percentage of smokers among women with daughters was 2.5 times higher than among women with sons. Women who smoke are 3.9 times more likely to start feeding their baby with supplementary infant foods at 4 months or earlier than those who do not smoke. Finally, the risk of a birth weight under 2500 g is 3.8 times higher for maternal smokers. This study suggests that women who expect a girl smoke more heavily than those who expect a boy, that the birth weight of maternal smokers' newborns is lower, and that women who smoke while breastfeeding start feeding their babies with supplementary infant foods at an earlier age.
DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.
Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin
2015-10-01
To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever-increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which, unlike the summary statistics provided by virtually all studies, is not publicly available. While there are much less demanding methods that avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genetic Consortium Schizophrenia Phase 2 suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of the computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. Contact: dlee4@vcu.edu. Supplementary Data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
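The core of direct summary-statistic imputation in the DIST family is the conditional mean of a multivariate normal: the z-score at an unmeasured SNP is predicted from measured z-scores via reference-panel LD correlations, z_u = r_um · r_mm⁻¹ · z_m. A minimal two-measured-SNP sketch follows; the correlations are made up for illustration and are not from a real panel.

```python
def impute_z(z_measured, r_mm, r_um):
    """Impute the z-score of one unmeasured SNP from two measured SNPs.

    r_mm: 2x2 LD correlation matrix among the measured SNPs.
    r_um: correlations of the unmeasured SNP with each measured SNP.
    Computes z_u = r_um @ inverse(r_mm) @ z_m, the conditional mean of
    a multivariate normal (the idea underlying DIST-style methods)."""
    (a, b), (c, d) = r_mm
    det = a * d - b * c                     # 2x2 inverse by hand
    inv = [[d / det, -b / det], [-c / det, a / det]]
    w = [r_um[0] * inv[0][0] + r_um[1] * inv[1][0],
         r_um[0] * inv[0][1] + r_um[1] * inv[1][1]]
    return w[0] * z_measured[0] + w[1] * z_measured[1]

# If the unmeasured SNP is perfectly correlated with the first measured
# SNP, the imputation returns that SNP's z-score unchanged.
z = impute_z([3.2, 1.0], r_mm=[[1.0, 0.2], [0.2, 1.0]], r_um=[1.0, 0.2])
```

Because only summary statistics and panel correlations are needed, no subject-level genotypes ever enter the computation, which is exactly the access advantage the abstract describes.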
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. 
The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
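One of the resampling methods mentioned, the bootstrap, can be illustrated for a diagnostic sensitivity estimate. This is a generic percentile-bootstrap sketch with hypothetical data, not a procedure endorsed by any regulator.

```python
import random

def bootstrap_ci(outcomes, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap CI for a proportion such as diagnostic
    sensitivity (outcomes: 1 = diseased case correctly detected).

    Resamples the cases with replacement, recomputes the sensitivity
    on each resample, and reads the CI off the empirical percentiles."""
    rng = random.Random(seed)
    n = len(outcomes)
    stats = sorted(
        sum(rng.choice(outcomes) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical: 90 true positives among 100 diseased cases (sensitivity 0.90)
cases = [1] * 90 + [0] * 10
lo, hi = bootstrap_ci(cases)
```

For multiple-reader multiple-case designs the resampling must respect the correlation structure (resampling readers and cases jointly), which is one reason the study-design questions discussed above require more than this simple sketch.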
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dharmarajan, Guha; Beasley, James C.; Beatty, William S.
Many aspects of parasite biology critically depend on their hosts, and understanding how host-parasite populations are co-structured can help improve our understanding of the ecology of parasites, their hosts, and host-parasite interactions. Here, this study utilized genetic data collected from raccoons (Procyon lotor), and a specialist parasite, the raccoon tick (Ixodes texanus), to test for genetic co-structuring of host-parasite populations at both landscape and host scales. At the landscape scale, our analyses revealed a significant correlation between genetic and geographic distance matrices (i.e., isolation by distance) in ticks, but not their hosts. While there are several mechanisms that could lead to a stronger pattern of isolation by distance in tick vs. raccoon datasets, our analyses suggest that at least one reason for the above pattern is the substantial increase in statistical power (due to the ≈8-fold increase in sample size) afforded by sampling parasites. Host-scale analyses indicated higher relatedness between ticks sampled from related vs. unrelated raccoons trapped within the same habitat patch, a pattern likely driven by increased contact rates between related hosts. Lastly, by utilizing fine-scale genetic data from both parasites and hosts, our analyses help improve our understanding of epidemiology and host ecology.
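Correlating genetic and geographic distance matrices, as in the isolation-by-distance analysis above, is typically done with a Mantel test. Here is a minimal sketch with toy 4-population matrices (not the study's data); it is the generic procedure, not necessarily the exact implementation the authors used.

```python
import random

def mantel(dist_a, dist_b, n_perm=999, seed=1):
    """Simple Mantel test: Pearson correlation between the upper
    triangles of two symmetric distance matrices, with a p-value from
    jointly permuting the rows/columns of one matrix."""
    n = len(dist_a)

    def upper(m, order):
        return [m[order[i]][order[j]] for i in range(n) for j in range(i + 1, n)]

    def corr(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    ident = list(range(n))
    r_obs = corr(upper(dist_a, ident), upper(dist_b, ident))
    rng = random.Random(seed)
    count = 0
    for _ in range(n_perm):
        perm = ident[:]
        rng.shuffle(perm)          # relabel populations in one matrix
        if corr(upper(dist_a, ident), upper(dist_b, perm)) >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

# Toy pairwise geographic and genetic distances for four populations
geo = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
gen = [[0, 1.1, 1.9, 3.2], [1.1, 0, 0.9, 2.1],
       [1.9, 0.9, 0, 1.2], [3.2, 2.1, 1.2, 0]]
r_obs, p = mantel(gen, geo)
```

The permutation must relabel whole populations (rows and columns together) rather than shuffling individual matrix entries, because entries within a row are not independent.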
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
Further characterisation of the functional neuroanatomy associated with prosodic emotion decoding.
Mitchell, Rachel L C
2013-06-01
Current models of prosodic emotion comprehension propose a three-stage process mediated by temporal lobe auditory regions through to inferior and orbitofrontal regions. Cumulative evidence suggests that its mediation may be more flexible, though, with a facility to respond in a graded manner based on the need for executive control. The location of this fine-tuning system is unclear, as is its similarity to the cognitive control system. In the current study, the need for executive control was manipulated in a block-design functional MRI study by systematically altering the proportion of incongruent trials across time, i.e., trials for which participants identified prosodic emotions in the face of conflicting lexico-semantic emotion cues. Resultant Blood Oxygenation Level Dependent contrast data were analysed according to standard procedures using Statistical Parametric Mapping v8 (Ashburner et al., 2009). In the parametric analyses, superior (medial) frontal gyrus activity increased linearly with increased need for executive control. In the separate analyses of each level of incongruity, results suggested that the baseline prosodic emotion comprehension system was sufficient to deal with low proportions of incongruent trials, whereas a more widespread frontal lobe network was required for higher proportions. These results suggest that an executive control system for prosodic emotion comprehension exists which has the capability to recruit superior (medial) frontal gyrus in a graded manner, and other frontal regions once demand exceeds a certain threshold. The need to revise current models of prosodic emotion comprehension and add a fourth processing stage is discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Quantitative Susceptibility Mapping after Sports-Related Concussion.
Koch, K M; Meier, T B; Karr, R; Nencka, A S; Muftuler, L T; McCrea, M
2018-06-07
Quantitative susceptibility mapping using MR imaging can assess changes in brain tissue structure and composition. This report presents preliminary results demonstrating changes in tissue magnetic susceptibility after sports-related concussion. Longitudinal quantitative susceptibility mapping metrics were produced from imaging data acquired from cohorts of concussed and control football athletes. One hundred thirty-six quantitative susceptibility mapping datasets were analyzed across 3 separate visits (24 hours after injury, 8 days postinjury, and 6 months postinjury). Longitudinal quantitative susceptibility mapping group analyses were performed on stability-thresholded brain tissue compartments and selected subregions. Clinical concussion metrics were also measured longitudinally in both cohorts and compared with the measured quantitative susceptibility mapping. Statistically significant increases in white matter susceptibility were identified in the concussed athlete group during the acute (24 hour) and subacute (day 8) period. These effects were most prominent at the 8-day visit but recovered and showed no significant difference from controls at the 6-month visit. The subcortical gray matter showed no statistically significant group differences. Observed susceptibility changes after concussion appeared to outlast self-reported clinical recovery metrics at a group level. At an individual subject level, susceptibility increases within the white matter showed statistically significant correlations with return-to-play durations. The results of this preliminary investigation suggest that sports-related concussion can induce physiologic changes to brain tissue that can be detected using MR imaging-based magnetic susceptibility estimates. In group analyses, the observed tissue changes appear to persist beyond those detected on clinical outcome assessments and were associated with return-to-play duration after sports-related concussion. 
© 2018 by American Journal of Neuroradiology.
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
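One pragmatic sensitivity analysis of the kind the authors advocate can be sketched as a leave-one-out re-estimation: drop each study in turn and re-pool, revealing how strongly any single (possibly nonindependent) study drives the overall estimate. This is a minimal fixed-effect sketch; the effect sizes and variances below are hypothetical, and a real ecological meta-analysis would typically use a random-effects or multilevel model.

```python
def pooled(effects, variances):
    """Fixed-effect (inverse-variance weighted) pooled estimate."""
    w = [1.0 / v for v in variances]
    return sum(wi * e for wi, e in zip(w, effects)) / sum(w)

def leave_one_out(effects, variances):
    """Re-pool after dropping each study in turn."""
    out = []
    for i in range(len(effects)):
        e = effects[:i] + effects[i + 1:]
        v = variances[:i] + variances[i + 1:]
        out.append(pooled(e, v))
    return out

# Hypothetical effect sizes (e.g. log response ratios) and variances;
# the third study is a deliberate outlier
effects = [0.30, 0.25, 0.90, 0.28]
variances = [0.02, 0.03, 0.02, 0.04]

overall = pooled(effects, variances)
loo = leave_one_out(effects, variances)  # how much each study moves the estimate
```

A large gap between `overall` and one of the `loo` values flags a study whose design (or dependence on other studies) deserves scrutiny before trusting the pooled result.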
NASA Astrophysics Data System (ADS)
Lanfredi, M.; Simoniello, T.; Cuomo, V.; Macchiato, M.
2009-02-01
This study originated from recent results reported in the literature, which support the existence of long-range (power-law) persistence in atmospheric temperature fluctuations on monthly and inter-annual scales. We investigated the results of Detrended Fluctuation Analysis (DFA) carried out on twenty-two historical daily time series recorded in Europe in order to evaluate the reliability of such findings in depth. More detailed inspections emphasized systematic deviations from power-law behaviour and high statistical confidence for functional-form misspecification. Rigorous analyses did not support scale-free correlation as an operative concept for climate modelling, as instead suggested in the literature. In order to better understand the physical implications of our results, we designed a bivariate Markov process, parameterised on the basis of the atmospheric observational data by introducing a slow dummy variable. The time series generated by this model, analysed in both the time and frequency domains, tallied with the real ones very well. They accounted for both the deceptive scaling found in the literature and the correlation details revealed by our analysis. Our results suggest the presence of slow fluctuations from another climatic sub-system, such as the ocean, which inflates temperature variance on scales of up to several months. They recommend more precise re-analyses of temperature time series before suggesting dynamical paradigms for climate modelling and for the assessment of climate change.
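The DFA procedure examined above can be sketched in a few lines: integrate the mean-removed series, detrend it linearly within windows of size n, and record the root-mean-square fluctuation F(n). A power law F(n) ~ n^alpha would indicate scale-free persistence; the systematic curvature the authors report in log F versus log n is what argues against a single scaling exponent. The sketch below only validates itself on white noise (alpha near 0.5); it is not the authors' implementation.

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis: RMS fluctuation F(n) per window size n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    fluct = []
    for n in scales:
        nwin = len(y) // n
        f2 = []
        for k in range(nwin):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear detrend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

# Sanity check on white noise, which should give alpha close to 0.5
rng = np.random.default_rng(0)
x = rng.standard_normal(10000)
scales = [16, 32, 64, 128, 256]
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # scaling exponent estimate
```

The paper's point is that fitting `alpha` as a single slope can be deceptive: goodness-of-fit of the power-law form itself should be checked before interpreting the exponent.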
Using venlafaxine to treat behavioral disorders in patients with autism spectrum disorder.
Carminati, Giuliana Galli; Gerber, Fabienne; Darbellay, Barbara; Kosel, Markus Mathaus; Deriaz, Nicolas; Chabert, Jocelyne; Fathi, Marc; Bertschy, Gilles; Ferrero, François; Carminati, Federico
2016-02-04
To test the efficacy of venlafaxine at a dose of 18.75 mg/day on the reduction of behavioral problems such as irritability and hyperactivity/noncompliance in patients with intellectual disabilities and autism spectrum disorder (ASD). Our secondary hypothesis was that the usual doses of zuclopenthixol and/or clonazepam would decrease in the venlafaxine-treated group. In a randomized double-blind study, we compared six patients who received venlafaxine along with their usual treatment (zuclopenthixol and/or clonazepam) with seven patients who received placebo plus usual care. Irritability, hyperactivity/noncompliance, and overall clinical improvement were measured after 2 and 8 weeks, using validated clinical scales. Univariate analyses showed that the symptom of irritability improved in the entire sample (p = 0.023 after 2 weeks, p = 0.061 at study endpoint), although no difference was observed between the venlafaxine and placebo groups. No significant decrease in hyperactivity/noncompliance was observed during the study. At the end of the study, global improvement was observed in 33% of participants treated with venlafaxine and in 71% of participants in the placebo group (p = 0.29). The study found that decreased cumulative doses of clonazepam and zuclopenthixol were required for the venlafaxine group. Multivariate analyses (principal component analyses) with at least three combinations of variables showed that the two populations could be clearly separated (p < 0.05). Moreover, in all cases, the venlafaxine population had lower values for the Aberrant Behavior Checklist (ABC), Behavior Problems Inventory (BPI), and levels of urea with respect to the placebo group. In one case, a reduction in the dosage of clonazepam was also suggested.
For an additional set of variables (ABC factor 2, BPI frequency of aggressive behaviors, hematic ammonia at Day 28, and zuclopenthixol and clonazepam intake), the separation between the two samples was statistically significant as was the Bartlett's test, but the Kaiser–Meyer–Olkin Measure of Sampling Adequacy was below the accepted threshold. This set of variables showed a reduction in the cumulative intake of both zuclopenthixol and clonazepam. Despite the small sample sizes, this study documented a statistically significant effect of venlafaxine. Moreover, we showed that lower doses of zuclopenthixol and clonazepam were needed in the venlafaxine group, although this difference was not statistically significant. This was confirmed by multivariate analyses, where this difference reached statistical significance when using a combination of variables involving zuclopenthixol. Larger-scale studies are recommended to better investigate the effectiveness of venlafaxine treatment in patients with intellectual disabilities and ASD.
Bynum, T E; Koch, G G
1991-08-08
We sought to compare the efficacy of sucralfate to placebo for the prevention of duodenal ulcer recurrence and to determine that the efficacy of sucralfate was due to a true reduction in ulcer prevalence and not due to secondary effects such as analgesic activity or accelerated healing. This was a double-blind, randomized, placebo-controlled, parallel groups, multicenter clinical study with 254 patients. All patients had a past history of at least two duodenal ulcers with at least one ulcer diagnosed by endoscopic examination 3 months or less before the start of the study. Complete ulcer healing without erosions was required to enter the study. Sucralfate or placebo were dosed as a 1-g tablet twice a day for 4 months, or until ulcer recurrence. Endoscopic examinations once a month and when symptoms developed determined the presence or absence of duodenal ulcers. If a patient developed an ulcer between monthly scheduled visits, the patient was dosed with a 1-g sucralfate tablet twice a day until the next scheduled visit. Statistical analyses of the results determined the efficacy of sucralfate compared with placebo for preventing duodenal ulcer recurrence. Comparisons of therapeutic agents for preventing duodenal ulcers have usually been made by testing for statistical differences in the cumulative rates for all ulcers developed during a follow-up period, regardless of the time of detection. Statistical experts at the United States Food and Drug Administration (FDA) and on the FDA Advisory Panel expressed doubts about clinical study results based on this type of analysis. They suggested three possible mechanisms for reducing the number of observed ulcers: (a) analgesic effects, (b) accelerated healing, and (c) true ulcer prevention. Traditional ulcer analysis could miss recurring ulcers due to an analgesic effect or accelerated healing. Point-prevalence analysis could miss recurring ulcers due to accelerated healing between endoscopic examinations. 
Maximum ulcer analyses, a novel statistical method, eliminated analgesic effects by regularly scheduled endoscopies and accelerated healing of recurring ulcers by frequent endoscopies and an open-label phase. Maximum ulcer analysis reflects true ulcer recurrence and prevention. Sucralfate was significantly superior to placebo in reducing ulcer prevalence by all analyses. Significance (p less than 0.05) was found at months 3 and 4 for all analyses. All months were significant in the traditional analysis, months 2-4 in point-prevalence analysis, and months 3-4 in the maximal ulcer prevalence analysis. Sucralfate was shown to be effective for the prevention of duodenal ulcer recurrence by a true reduction in new ulcer development.
Smith, Gillian E; Elliot, Alex J; Ibbotson, Sue; Morbey, Roger; Edeghere, Obaghe; Hawker, Jeremy; Catchpole, Mike; Endericks, Tina; Fisher, Paul; McCloskey, Brian
2017-09-01
Syndromic surveillance aims to provide early warning and real-time estimates of the extent of incidents, and reassurance about the lack of impact of mass gatherings. We describe a novel public health risk assessment process to ensure those leading the response to the 2012 Olympic Games were alerted to unusual activity that was of potential public health importance, and not inundated with multiple statistical 'alarms'. Statistical alarms were assessed to identify those which needed to result in 'alerts' as reliably as possible. There was no previously developed method for this. We identified factors that increased our concern about an alarm, suggesting that an 'alert' should be made. Between 2 July and 12 September 2012, 350 674 signals were analysed, resulting in 4118 statistical alarms. Using the risk assessment process, 122 'alerts' were communicated to Olympic incident directors. Use of a novel risk assessment process enabled the interpretation of a large number of statistical alarms in a manageable way for the period of a sustained mass gathering. This risk assessment process guided the prioritization and could be readily adapted to other surveillance systems. The process, which is novel to our knowledge, continues as a legacy of the Games. © Crown copyright 2016.
Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J
2012-05-01
We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used of graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the updating prioritization of a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
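The contours described above rest on a simple underlying computation: re-pool the meta-analysis with one hypothetical extra study over a grid of candidate effects and standard errors, and record where the pooled z statistic crosses ±1.96. A minimal fixed-effect sketch of that single update step follows; all study values are hypothetical, and in this particular configuration the new study does flip significance, whereas the paper's contours can equally identify meta-analyses where no single study could.

```python
import math

def pooled_z(effects, ses):
    """Fixed-effect pooled estimate and its z statistic (inverse-variance weights)."""
    w = [1.0 / se ** 2 for se in ses]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se_pooled = math.sqrt(1.0 / sum(w))
    return est, est / se_pooled

# Hypothetical existing meta-analysis: borderline non-significant
effects = [0.10, 0.25, 0.05]
ses = [0.10, 0.15, 0.12]
est0, z0 = pooled_z(effects, ses)

# Add one hypothetical new study (effect 0.30, SE 0.08) and re-pool
est1, z1 = pooled_z(effects + [0.30], ses + [0.08])
```

Sweeping the new study's effect and standard error over a grid, and shading the region where |z1| exceeds 1.96, reproduces the significance-contour idea on the funnel-plot axes.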
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
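The five-step template can be illustrated with a one-sample t-test; the exam scores, null value, and critical value (t with 9 degrees of freedom at the two-sided 0.05 level, approximately 2.262) are all hypothetical values supplied for the example.

```python
import math
from statistics import mean, stdev

# Hypothetical data: ten exam scores
scores = [78, 82, 71, 77, 80, 74, 79, 85, 73, 76]

# Step 1: state the hypotheses.  H0: mu = 75  vs  H1: mu != 75
mu0 = 75.0

# Step 2: choose the significance level.
alpha = 0.05

# Step 3: compute the test statistic (one-sample t; stdev is the sample SD).
n = len(scores)
t_stat = (mean(scores) - mu0) / (stdev(scores) / math.sqrt(n))

# Step 4: state the decision rule using the critical value t(0.025, 9) ~ 2.262.
t_crit = 2.262

# Step 5: make the decision and state the conclusion.
reject = abs(t_stat) > t_crit  # False here: insufficient evidence against H0
```

Writing each step out explicitly, rather than jumping to the p-value, is the organizing discipline the article recommends.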
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analyses techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
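The left-sided test described above can be sketched with Cochran's Q: under homogeneity, Q asymptotically follows a chi-square distribution with k−1 degrees of freedom, so an unusually small Q (small left-sided P-value) flags extreme homogeneity. The study values below are hypothetical log risk ratios; the closed-form CDF covers only even degrees of freedom, and the paper itself prefers a Monte Carlo reference distribution over this asymptotic one.

```python
import math

def cochran_q(effects, variances):
    """Cochran's Q: inverse-variance weighted squared deviations from the pooled mean."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))

def chi2_cdf_even_df(q, df):
    """Chi-square CDF, closed form (Poisson sum) valid for even df only."""
    m = df // 2
    term, s = 1.0, 1.0
    for i in range(1, m):
        term *= (q / 2) / i
        s += term
    return 1.0 - math.exp(-q / 2) * s

# Five hypothetical log risk ratios that agree suspiciously well
effects = [0.200, 0.201, 0.199, 0.200, 0.202]
variances = [0.04, 0.05, 0.03, 0.06, 0.04]

q = cochran_q(effects, variances)
left_p = chi2_cdf_even_df(q, df=len(effects) - 1)  # small => extreme homogeneity
```

A tiny `left_p` does not prove anything wrong by itself; as the abstract notes, it is a prompt to check the effect metric, correlated data, and potential bias or fraud.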
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
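The first-factor structure reported above (Ca, K, Mg and Na loading together, separately from COD) can be illustrated with a principal-axis style extraction from a correlation matrix. The data below are simulated under an assumed one-latent-factor structure and are not the reservoir measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical samples: four metals driven by one latent source, COD independent
latent = rng.standard_normal(n)
X = np.column_stack([
    latent + 0.3 * rng.standard_normal(n),   # Ca
    latent + 0.3 * rng.standard_normal(n),   # K
    latent + 0.3 * rng.standard_normal(n),   # Mg
    latent + 0.3 * rng.standard_normal(n),   # Na
    rng.standard_normal(n),                  # COD
])

# Extract loadings on the first factor from the correlation matrix
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)          # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[0]] * np.sqrt(eigvals[order[0]])
```

Variables with large absolute loadings on the same factor (here the four metals) are the candidates for a shared input such as sewage or run-off; eigenvector sign is arbitrary, so only |loading| is interpreted.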
Evans, Jamie; Fitch, Christopher; Collard, Sharon; Henderson, Claire
2018-04-27
In recent years, the UK debt collection industry has taken steps to improve its policies and practices in relation to customers with mental health problems. Little data, however, have been collected to evidence change. This paper examines whether the reported attitudes and practices of debt collection staff when working with customers with mental health problems have changed between 2010 and 2016. This paper draws on descriptive and regression analyses of two cross-sectional surveys of debt collection staff: one conducted in 2010 and one conducted in 2016. All variables analysed show statistically significant changes between 2010 and 2016 indicative of improved reported attitudes and practices. While results suggest an improvement in attitudes and practice may have occurred between 2010 and 2016, research is required to understand this potential shift, its likely causes, and concrete impact on customers.
Visual Exploration of Genetic Association with Voxel-based Imaging Phenotypes in an MCI/AD Study
Kim, Sungeun; Shen, Li; Saykin, Andrew J.; West, John D.
2010-01-01
Neuroimaging genomics is a new transdisciplinary research field, which aims to examine genetic effects on brain via integrated analyses of high throughput neuroimaging and genomic data. We report our recent work on (1) developing an imaging genomic browsing system that allows for whole genome and entire brain analyses based on visual exploration and (2) applying the system to the imaging genomic analysis of an existing MCI/AD cohort. Voxel-based morphometry is used to define imaging phenotypes. ANCOVA is employed to evaluate the effect of the interaction of genotypes and diagnosis in relation to imaging phenotypes while controlling for relevant covariates. Encouraging experimental results suggest that the proposed system has substantial potential for enabling discovery of imaging genomic associations through visual evaluation and for localizing candidate imaging regions and genomic regions for refined statistical modeling. PMID:19963597
Tree-ring variation in western larch (Larix occidentalis Nutt.) exposed to sulfur dioxide emissions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, C.A.; Kincaid, W.B.; Nash, T.H. III
1984-12-01
Tree-ring analysis of western larch (Larix occidentalis Nutt.) demonstrated both direct and indirect effects of sulfur dioxide emissions from the lead/zinc smelter at Trail, B.C. Tree cores were collected from 5 stands known to have been polluted and from 3 control stands. Age effects were removed by fitting theoretical growth curves, and macroclimate was modeled using the average of the controls and two lagged values thereof. Separate analyses were performed for years before and after installation of two tall stacks, for drought and nondrought years, and for years prior to initiation of smelting. Regression analyses revealed a negative effect on annual growth that diminished with increasing distance from the smelter and during drought years. Furthermore, chronology statistics suggested an increase in sensitivity to climate that persisted decades beyond implementation of pollution controls, which reduced emissions 10-fold. 38 references, 6 figures, 3 tables.
Family Caregiver Role and Burden Related to Gender and Family Relationships
Friedemann, Marie-Luise; Buckwalter, Kathleen C.
2015-01-01
This study described and contrasted family caregivers and explored the effect of gender and family relationship on the caregiver’s role perception, workload, burden, and family help. Home care agencies and community organizations assisted with the recruitment of 533 multicultural, predominantly Latino caregivers who were interviewed at home. The Caregiver Identity Theory guided the study. Survey instruments were standardized tools or were constructed and pretested for this study. Descriptive statistics and t-test analyses assisted in describing the sample and multivariate analyses were used to contrast the caregiver groups. Findings suggested a gendered approach to self-appraisal and coping. Men in this predominantly Latino and Caribbean sample felt less burden and depression than women who believed caregiving is a female duty. Family nurses should pay attention to the most vulnerable groups: older spouses resistant to using family and community resources and hard-working female adult children, and assess each family situation individually. PMID:24777069
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met, and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. Trial registration: ISRCTN70923932.
Application of Numerical Weather Models to Mitigating Atmospheric Artifacts in InSAR
NASA Astrophysics Data System (ADS)
Foster, J. H.; Kealy, J.; Businger, S.; Cherubini, T.; Brooks, B. A.; Albers, S. C.; Lu, Z.; Poland, M. P.; Chen, S.; Mass, C.
2011-12-01
A high-resolution weather "hindcasting" system to model the atmosphere at the time of SAR scene acquisitions has been established to investigate and mitigate the impact of atmospheric water vapor on InSAR deformation maps. Variations in the distributions of water vapor in the atmosphere between SAR acquisitions lead to artifacts in interferograms that can mask real ground motion signals. A database of regional numerical weather prediction model outputs generated by the University of Washington and U.C. Davis for times matching SAR acquisitions was used as "background" for higher resolution analyses of the atmosphere for Mount St Helens volcano in Washington, and Los Angeles in southern California. Using this background, we use LAPS to incrementally incorporate all other available meteorological data sets, including GPS, to explore the impact of additional observations on model accuracy. Our results suggest that, even with significant quantities of contemporaneously measured data, high-resolution atmospheric analyses are unable to model the timing and location of water vapor perturbations accurately enough to produce robust and reliable phase screens that can be directly subtracted from interferograms. Despite this, the analyses are able to reproduce the statistical character of the atmosphere with some confidence, suggesting that, in the absence of unusually dense in-situ measurements (such as is the case with GPS data for Los Angeles), weather analysis can play a valuable role in constraining the power-spectrum expected in an interferogram due to the troposphere. This could be used to provide objective weights to scenes during traditional stacking or to tune the filter parameters in time-series analyses.
Hansen, Anne-Sophie K; Madsen, Ida E H; Thorsen, Sannie Vester; Melkevik, Ole; Bjørner, Jakob Bue; Andersen, Ingelise; Rugulies, Reiner
2018-05-01
Most previous prospective studies have examined workplace social capital as a resource of the individual. However, literature suggests that social capital is a collective good. In the present study we examined whether a high level of workplace aggregated social capital (WASC) predicts a decreased risk of individual-level long-term sickness absence (LTSA) in Danish private sector employees. A sample of 2043 employees (aged 18-64 years, 38.5% women) from 260 Danish private-sector companies filled in a questionnaire on workplace social capital and covariates. WASC was calculated by assigning the company-averaged social capital score to all employees of each company. We derived LTSA, defined as sickness absence of more than three weeks, from a national register. We examined if WASC predicted employee LTSA using multilevel survival analyses, while excluding participants with LTSA in the three months preceding baseline. We found no statistically significant association in any of the analyses. The hazard ratio for LTSA in the fully adjusted model was 0.93 (95% CI 0.77-1.13) per one standard deviation increase in WASC. When using WASC as a categorical exposure we found a statistically non-significant tendency towards a decreased risk of LTSA in employees with medium WASC (fully adjusted model: HR 0.78 (95% CI 0.48-1.27)). Post hoc analyses with workplace social capital as a resource of the individual showed similar results. WASC did not predict LTSA in this sample of Danish private-sector employees.
Frey, N; Hügle, T; Jick, S S; Meier, C R; Spoendlin, J
2016-09-01
Emerging evidence suggests that diabetes may be a risk factor for osteoarthritis (OA). However, previous results on the association between diabetes and all OA were conflicting. We aimed to comprehensively analyse the association between type II diabetes mellitus (T2DM) and osteoarthritis of the hand (HOA) specifically. We conducted a matched (1:1) case-control study using the UK-based Clinical Practice Research Datalink (CPRD) of cases aged 30-90 years with an incident diagnosis of HOA from 1995 to 2013. In multivariable conditional logistic regression analyses, we calculated odds ratios (OR) for incident HOA in patients with T2DM, categorized by T2DM severity (HbA1C), duration, and pharmacological treatment. We further performed sensitivity analyses in patients with and without other metabolic diseases (hypertension (HT), hyperlipidaemia (HL), obesity). Among 13,500 cases and 13,500 controls, we observed no statistically significant association between T2DM and HOA (OR 0.95, 95% confidence interval (CI) 0.87-1.04), regardless of T2DM severity, duration, or pharmacological treatment. Having HT did not change the OR. Although we observed slightly increased ORs in overweight T2DM patients with co-occurring HL with or without coexisting HT, none of these ORs were statistically significant. Our results provide evidence that T2DM is not an independent risk factor for HOA. Concurrence of T2DM with HT, HL, and/or obesity did not change this association significantly. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Oliveira, J T; Santos, T C; Martins, L; Silva, M A; Marques, A P; Castro, A G; Neves, N M; Reis, R L
2009-10-01
Gellan gum is a polysaccharide that has been recently proposed by our group for cartilage tissue-engineering applications. It is commonly used in the food and pharmaceutical industry and has the ability to form stable gels without the use of harsh reagents. Gellan gum can function as a minimally invasive injectable system, gelling inside the body in situ under physiological conditions and efficiently adapting to the defect site. In this work, gellan gum hydrogels were combined with human articular chondrocytes (hACs) and were subcutaneously implanted in nude mice for 4 weeks. The implants were collected for histological (haematoxylin and eosin and Alcian blue staining), biochemical [dimethylmethylene blue (GAG) assay], molecular (real-time PCR analyses for collagen types I, II and X, aggrecan) and immunological analyses (immunolocalization of collagen types I and II). The results showed a homogeneous cell distribution and the typical round-shaped morphology of the chondrocytes within the matrix upon implantation. Proteoglycan synthesis was detected by Alcian blue staining, and a statistically significant increase in proteoglycan content was measured with the GAG assay from 1 to 4 weeks of implantation. Real-time PCR analyses showed a statistically significant upregulation of collagen type II and aggrecan levels in the same periods. The immunological assays suggest deposition of collagen type II along with some collagen type I. The overall data show that gellan gum hydrogels adequately support the growth and ECM deposition of human articular chondrocytes when implanted subcutaneously in nude mice. Copyright (c) 2009 John Wiley & Sons, Ltd.
Isotropy analyses of the Planck convergence map
NASA Astrophysics Data System (ADS)
Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.
2018-01-01
The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contaminations, and compare them with the features expected in the set of simulated convergence maps, also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
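The local variance-versus-simulations test described above can be sketched as follows. The patch geometry, ensemble size, and the injected anomaly are illustrative stand-ins, not the Planck data (which are HEALPix-pixelised and Galactic-masked).

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins: one "observed" map and an ensemble of simulated maps, each
# divided into sky patches (here 32 patches of 100 pixels each).
n_sims, n_patches, n_pix = 300, 32, 100
sims = rng.normal(size=(n_sims, n_patches, n_pix))
obs = rng.normal(size=(n_patches, n_pix))
obs[5] *= 1.8  # inject one anomalously high-variance patch

obs_var = obs.var(axis=1)        # variance estimator per observed patch
sim_var = sims.var(axis=2)       # same estimator applied to each simulation
mu, sigma = sim_var.mean(axis=0), sim_var.std(axis=0)

# Chi-square-like deviation of each observed patch from the simulation average.
chi2 = ((obs_var - mu) / sigma) ** 2
anomalous = np.flatnonzero(chi2 > 4.0)   # beyond 2 sigma, as in the local analysis
```

Patches whose variance departs from the simulation ensemble by more than 2σ are flagged; with 32 patches, one or two chance exceedances are expected even for a clean map.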
Hansen, Anne-Sophie K.; Madsen, Ida E. H.; Thorsen, Sannie Vester; Melkevik, Ole; Bjørner, Jakob Bue; Andersen, Ingelise; Rugulies, Reiner
2017-01-01
Aims: Most previous prospective studies have examined workplace social capital as a resource of the individual. However, literature suggests that social capital is a collective good. In the present study we examined whether a high level of workplace aggregated social capital (WASC) predicts a decreased risk of individual-level long-term sickness absence (LTSA) in Danish private sector employees. Methods: A sample of 2043 employees (aged 18–64 years, 38.5% women) from 260 Danish private-sector companies filled in a questionnaire on workplace social capital and covariates. WASC was calculated by assigning the company-averaged social capital score to all employees of each company. We derived LTSA, defined as sickness absence of more than three weeks, from a national register. We examined if WASC predicted employee LTSA using multilevel survival analyses, while excluding participants with LTSA in the three months preceding baseline. Results: We found no statistically significant association in any of the analyses. The hazard ratio for LTSA in the fully adjusted model was 0.93 (95% CI 0.77–1.13) per one standard deviation increase in WASC. When using WASC as a categorical exposure we found a statistically non-significant tendency towards a decreased risk of LTSA in employees with medium WASC (fully adjusted model: HR 0.78 (95% CI 0.48–1.27)). Post hoc analyses with workplace social capital as a resource of the individual showed similar results. Conclusions: WASC did not predict LTSA in this sample of Danish private-sector employees. PMID:28784025
Assogbadjo, A E; Kyndt, T; Sinsin, B; Gheysen, G; van Damme, P
2006-05-01
Baobab (Adansonia digitata) is a multi-purpose tree used daily by rural African communities. The present study aimed at investigating the level of morphometric and genetic variation and spatial genetic structure within and between threatened baobab populations from the three climatic zones of Benin. A total of 137 individuals from six populations were analysed using morphometric data as well as molecular marker data generated using the AFLP technique. Five primer pairs resulted in a total of 217 scored bands with 78.34% of them being polymorphic. A two-level AMOVA of 137 individuals from six baobab populations revealed 82.37% of the total variation within populations and 17.63% among populations (P < 0.001). Analysis of population structure with allele-frequency based F-statistics revealed a global F(ST) of 0.127 +/- 0.072 (P < 0.001). The mean gene diversity within populations (H(S)) and the average gene diversity between populations (D(ST)) were estimated at 0.309 +/- 0.000 and 0.045 +/- 0.072, respectively. Baobabs in the Sudanian and Sudan-Guinean zones of Benin were short and produced the highest yields of pulp, seeds and kernels, in contrast to the ones in the Guinean zone, which were tall and produced only a small number of fruits with a low pulp, seed and kernel productivity. A statistically significant correlation with the observed patterns of genetic diversity was observed for three morphological characteristics: height of the trees, number of branches and thickness of the capsules. The results indicate some degree of physical isolation of the populations collected in the different climatic zones and suggest a substantial amount of genetic structuring between the analysed populations of baobab. Sampling options of the natural populations are suggested for in situ or ex situ conservation.
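The reported H(S), D(ST), and global F(ST) follow Nei's partitioning of gene diversity into within- and between-population components. A minimal sketch with invented allele frequencies for one biallelic locus in three hypothetical populations:

```python
import numpy as np

# Toy allele frequencies for one biallelic locus (rows: populations;
# columns: allele frequencies summing to 1). Values are illustrative only.
freqs = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
    [0.2, 0.8],
])

# Nei's partitioning of gene diversity:
#   H_S: mean within-population expected heterozygosity
#   H_T: total expected heterozygosity from pooled (mean) allele frequencies
#   D_ST = H_T - H_S;  G_ST (an F_ST analogue) = D_ST / H_T
h_within = 1.0 - (freqs ** 2).sum(axis=1)
H_S = h_within.mean()
p_bar = freqs.mean(axis=0)
H_T = 1.0 - (p_bar ** 2).sum()
D_ST = H_T - H_S
G_ST = D_ST / H_T
```

In the study these quantities are averaged over all polymorphic AFLP bands; here a single locus suffices to show the decomposition.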
Southern hemisphere low level wind circulation statistics from the Seasat scatterometer
NASA Technical Reports Server (NTRS)
Levy, Gad
1994-01-01
Analyses of remotely sensed low-level wind vector data over the Southern Ocean are performed. Five-day averages and monthly means are created and the month-to-month variability during the winter (July-September) of 1978 is investigated. The remotely sensed winds are compared to the Australian Bureau of Meteorology (ABM) and the National Meteorological Center (NMC) surface analyses. In southern latitudes the remotely sensed winds are stronger than what the weather services' analyses suggest, indicating under-estimation by ABM and NMC in these regions. The evolution of the low-level jet and the major storm tracks during the season are studied and different flow regimes are identified. The large-scale variability of the meridional flow is studied with the aid of empirical orthogonal function (EOF) analysis. The dominance of quasi-stationary wavenumbers 3, 4 and 5 in the winter flows is evident in both the EOF analysis and the mean flow. The signature of an exceptionally strong blocking situation is evident in July and the special conditions leading to it are discussed. A very large intraseasonal variability with different flow regimes at different months is documented.
Martiniano, Rui; McLaughlin, Russell; Silva, Nuno M.; Manco, Licinio; Pereira, Tania; Coelho, Maria J.; Serra, Miguel; Burger, Joachim; Parreira, Rui; Moran, Elena; Valera, Antonio C.; Silva, Ana M.
2017-01-01
We analyse new genomic data (0.05–2.95x) from 14 ancient individuals from Portugal distributed from the Middle Neolithic (4200–3500 BC) to the Middle Bronze Age (1740–1430 BC) and impute genomewide diploid genotypes in these together with published ancient Eurasians. While discontinuity is evident in the transition to agriculture across the region, sensitive haplotype-based analyses suggest a significant degree of local hunter-gatherer contribution to later Iberian Neolithic populations. A more subtle genetic influx is also apparent in the Bronze Age, detectable from analyses including haplotype sharing with both ancient and modern genomes, D-statistics and Y-chromosome lineages. However, the limited nature of this introgression contrasts with the major Steppe migration turnovers within third Millennium northern Europe and echoes the survival of non-Indo-European language in Iberia. Changes in genomic estimates of individual height across Europe are also associated with these major cultural transitions, and ancestral components continue to correlate with modern differences in stature. PMID:28749934
Sharma, Swarkar; Saha, Anjana; Rai, Ekta; Bhat, Audesh; Bamezai, Ramesh
2005-01-01
We have analysed the hypervariable regions (HVR I and II) of human mitochondrial DNA (mtDNA) in individuals from Uttar Pradesh (UP), Bihar (BI) and Punjab (PUNJ), belonging to the Indo-European linguistic group, and from South India (SI), that have their linguistic roots in Dravidian language. Our analysis revealed the presence of known and novel mutations in both hypervariable regions in the studied population groups. Median-joining network analyses based on mtDNA showed extensive overlap in mtDNA lineages despite the extensive cultural and linguistic diversity. MDS plot analysis based on Fst distances suggested increased maternal genetic proximity for the studied population groups compared with other world populations. Mismatch distribution curves, the corresponding neighbour-joining trees, and other statistical analyses indicated significant population expansions. The study revealed an ancient common ancestry for the studied population groups, most probably through common founder female lineage(s), and also indicated that human migrations occurred (maybe across and within the Indian subcontinent) even after the initial phase of female migration to India.
CWD prevalence, perceived human health risks, and state influences on deer hunting participation.
Vaske, Jerry J; Lyon, Katie M
2011-03-01
This study examined factors predicted by previous research to influence hunters' decisions to stop hunting deer in a state. Data were obtained from mail surveys of resident and nonresident deer hunters in Arizona, North Dakota, South Dakota, and Wisconsin (n = 3,518). Hunters were presented with six scenarios depicting hypothetical CWD prevalence levels and human health risks from the disease (e.g., death), and asked if they would continue or stop hunting deer in the state. Bivariate analyses examined the influence of five predictor variables: (a) CWD prevalence, (b) hypothetical human death from CWD, (c) perceived human health risks from CWD, (d) state, and (e) residency. In the bivariate analyses, prevalence was the strongest predictor of quitting hunting in the state followed by hypothetical human death and perceived risk. The presence of CWD in a state and residency were weak, but statistically significant, predictors. Interactions among these predictors increased the potential for stopping hunting in the state. Multivariate analyses suggested that 64% of our respondents would quit hunting in the worst-case scenario. © 2010 Society for Risk Analysis.
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have extensively been used in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all information needed to answer even a simple question. However, multivariate analyses, such as cluster analysis or multiple regression, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses about changes in species richness across localities, or changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved robust for standardizing all samples to a common size. Even the simplest method, such as reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
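The rarefaction method mentioned above has a closed form (Hurlbert's expected species richness in a subsample drawn without replacement). A minimal sketch with hypothetical community counts:

```python
from math import comb

def rarefied_richness(counts, n):
    """Hurlbert's rarefaction: expected number of species observed in a
    random subsample of n individuals drawn without replacement."""
    N = sum(counts)
    return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in counts)

# Hypothetical communities: A has more individuals but fewer species.
community_a = [50, 30, 20]        # 100 individuals, 3 species
community_b = [10, 9, 8, 7, 6]    # 40 individuals, 5 species

# Standardise both samples to a common size of 40 individuals.
s_a = rarefied_richness(community_a, 40)
s_b = rarefied_richness(community_b, 40)
```

At a common sample size the raw richness comparison becomes fair: community B remains richer than A even though its total sample is smaller.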
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
Chan, S C
2012-03-01
This study aimed to determine the views of Malaysian interns and their supervisors on whether undergraduate clinical skills training adequately equipped them for internship and their suggestions for improvement. Pre-tested questionnaires covering demographic characteristics, the participants' views on clinical skills training (communication, history taking, physical examination, diagnosis, patient management and procedures) and their suggestions for improvement were sent to all interns and their supervisors through the hospital directors. The compiled data were analysed to determine any significant associations. Out of the 32 hospitals with interns, 22 participated in the study. 521 completed questionnaires (350 interns, 171 supervisors) were analysed. The majority of interns felt that their undergraduate clinical skills training was adequate in all the aspects studied. The majority of supervisors, however, felt that it was grossly inadequate to poor in the areas of communication: breaking bad news (77% supervisors versus 13% interns), dealing with angry patients (75% versus 20%), giving information (59% versus 3%), communicating with patients' families (53% versus 7%); adult resuscitation: intubation (72% versus 23%), defibrillation (77% versus 31%), use of drugs (62% versus 19%); and all aspects of child resuscitation. This was statistically significant (p < 0.05). Suggestions for improvement included more clinical exposure, communication skills workshops and monitoring of logbooks. This study suggests that there are deficiencies, particularly in communication and resuscitation skills training, in undergraduate clinical skills training. In-depth studies are required to identify ways to improve training.
[Clinical research=design*measurements*statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have knowledge of study design, measurements and statistical analyses: the first is taught by epidemiology, the second by psychometrics and the third by biostatistics.
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
Genome-wide identification and analysis of the chicken basic helix-loop-helix factors.
Liu, Wu-Yi; Zhao, Chun-Jiang
2010-01-01
Members of the basic helix-loop-helix (bHLH) family of transcription factors play important roles in a wide range of developmental processes. In this study, we conducted a genome-wide survey using the chicken (Gallus gallus) genomic database, and identified 104 bHLH sequences belonging to 42 gene families in an effort to characterize the chicken bHLH transcription factor family. Phylogenetic analyses revealed that chicken has 50, 21, 15, 4, 8, and 3 bHLH members in groups A, B, C, D, E, and F, respectively, while three members belonging to none of these groups were classified as "orphans". A comparison between chicken and human bHLH repertoires suggested that both organisms have a number of lineage-specific bHLH members in the proteomes. Chromosome distribution patterns and phylogenetic analyses strongly suggest that the bHLH members should have arisen through gene duplication at an early date. Gene Ontology (GO) enrichment statistics showed the 51 most frequent GO annotations of biological processes. The present study deepens our understanding of the chicken bHLH transcription factor family and provides much useful information for further studies using chicken as a model system.
Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas
2016-11-01
In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as data source for health services research. Within this data hospitals can be separated via their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. Aiming to investigate direction and extent of possible bias in hospital-level analyses this study examines the continuity of the hospital identifier within a cross-sectional and longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013 the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volume. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., in 2013 1,550 IK vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted in the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data the separation of hospitals via the hospital identifier might lead to underestimating the number of hospitals and a consequent overestimation of caseload per hospital.
Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
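The two most frequent tests identified by the review can be illustrated with scipy, reporting exact P-values rather than "p < 0.05" (imprecise P-value reporting being the most common error found). All counts and measurements below are hypothetical.

```python
from scipy import stats

# Pearson chi-square test of independence on a 2x2 contingency table
# (hypothetical counts: complication yes/no in two treatment groups).
table = [[30, 70],
         [50, 50]]
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Mann-Whitney U test comparing a continuous outcome between two groups
# (hypothetical scar-width measurements, in mm).
group_a = [2.1, 2.4, 2.0, 2.8, 2.2, 2.5]
group_b = [3.0, 3.4, 2.9, 3.6, 3.1, 2.7]
u_stat, p_mwu = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

# Report exact test statistics and P-values.
print(f"chi-square: chi2={chi2:.2f}, dof={dof}, p={p_chi2:.4f}")
print(f"Mann-Whitney: U={u_stat:.1f}, p={p_mwu:.4f}")
```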
Bailey-Wilson, Joan E; Childs, Erica J; Cropp, Cheryl D; Schaid, Daniel J; Xu, Jianfeng; Camp, Nicola J; Cannon-Albright, Lisa A; Farnham, James M; George, Asha; Powell, Isaac; Carpten, John D; Giles, Graham G; Hopper, John L; Severi, Gianluca; English, Dallas R; Foulkes, William D; Mæhle, Lovise; Møller, Pål; Eeles, Rosalind; Easton, Douglas; Guy, Michelle; Edwards, Steve; Badzioch, Michael D; Whittemore, Alice S; Oakley-Girvan, Ingrid; Hsieh, Chih-Lin; Dimitrov, Latchezar; Stanford, Janet L; Karyadi, Danielle M; Deutsch, Kerry; McIntosh, Laura; Ostrander, Elaine A; Wiley, Kathleen E; Isaacs, Sarah D; Walsh, Patrick C; Thibodeau, Stephen N; McDonnell, Shannon K; Hebbring, Scott; Lange, Ethan M; Cooney, Kathleen A; Tammela, Teuvo L J; Schleutker, Johanna; Maier, Christiane; Bochum, Sylvia; Hoegel, Josef; Grönberg, Henrik; Wiklund, Fredrik; Emanuelsson, Monica; Cancel-Tassin, Geraldine; Valeri, Antoine; Cussenot, Olivier; Isaacs, William B
2012-06-19
Genetic variants are likely to contribute to a portion of prostate cancer risk. Full elucidation of the genetic etiology of prostate cancer is difficult because of incomplete penetrance and genetic and phenotypic heterogeneity. Current evidence suggests that genetic linkage to prostate cancer has been found on several chromosomes including the X; however, identification of causative genes has been elusive. Parametric and non-parametric linkage analyses were performed using 26 microsatellite markers in each of 11 groups of multiple-case prostate cancer families from the International Consortium for Prostate Cancer Genetics (ICPCG). Meta-analyses of the resultant family-specific linkage statistics across the entire 1,323 families and in several predefined subsets were then performed. Meta-analyses of linkage statistics resulted in a maximum parametric heterogeneity lod score (HLOD) of 1.28, and an allele-sharing lod score (LOD) of 2.0 in favor of linkage to Xq27-q28 at 138 cM. In subset analyses, families with average age at onset less than 65 years exhibited a maximum HLOD of 1.8 (at 138 cM) versus a maximum regional HLOD of only 0.32 in families with average age at onset of 65 years or older. Surprisingly, the subset of families with only 2-3 affected men and some evidence of male-to-male transmission of prostate cancer gave the strongest evidence of linkage to the region (HLOD = 3.24, 134 cM). For this subset, the HLOD was slightly increased (HLOD = 3.47 at 134 cM) when families used in the original published report of linkage to Xq27-28 were excluded. Although there was not strong support for linkage to the Xq27-28 region in the complete set of families, the subset of families with earlier age at onset exhibited more evidence of linkage than families with later onset of disease. A subset of families with 2-3 affected individuals and with some evidence of male to male disease transmission showed stronger linkage signals. 
Our results suggest that the genetic basis for prostate cancer in our families is much more complex than a single susceptibility locus on the X chromosome, and that future explorations of the Xq27-28 region should focus on the subset of families identified here with the strongest evidence of linkage to this region.
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Empirical Nature and Statistical Treatment of Missing Data
ERIC Educational Resources Information Center
Tannenbaum, Christyn E.
2009-01-01
Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
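The one-parameter Rasch model used in these analyses specifies the probability of a correct response as a logistic function of the gap between person ability and item difficulty. A minimal sketch with assumed logit values:

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response under the 1-parameter Rasch model:
    P(correct) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability matches an item's difficulty answers it correctly
# half the time; easier items (lower b) are answered correctly more often.
p_matched = rasch_prob(theta=0.0, b=0.0)
p_easy = rasch_prob(theta=0.0, b=-2.0)
p_hard = rasch_prob(theta=0.0, b=2.0)
```

Fitting the model estimates one difficulty per item and one ability per student on a common logit scale, which is what allows items that "vary widely in difficulty" to be placed on a single unidimensional construct.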
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discovering the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01 and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9%, 43% and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci.
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
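The core of summary statistics imputation is the conditional-expectation formula z_u = c^T C^{-1} z_t, where z_t holds the z-scores of typed variants, c the LD correlations between the untyped variant and the typed ones, and C the LD matrix of the typed variants. A minimal two-variant sketch (names are illustrative; the method described above additionally regularises C and accommodates variable sample size):

```python
def impute_z(c, C, z):
    # Impute the z-score of an untyped variant from two typed variants:
    # z_u = c^T C^{-1} z, using the explicit inverse of the 2x2 LD matrix.
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    inv = [[ C[1][1] / det, -C[0][1] / det],
           [-C[1][0] / det,  C[0][0] / det]]
    w = [c[0] * inv[0][0] + c[1] * inv[1][0],
         c[0] * inv[0][1] + c[1] * inv[1][1]]
    return w[0] * z[0] + w[1] * z[1]

# A variant in perfect LD with the first typed SNP inherits its z-score.
print(impute_z([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [3.2, 1.1]))  # → 3.2
```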
Rushton, Paul R P; Grevitt, Michael P
2013-04-20
Review and statistical analysis of studies evaluating the effect of surgery on the health-related quality of life of adolescents with adolescent idiopathic scoliosis, using Scoliosis Research Society (SRS) outcomes. We apply published minimum clinically important difference (MCID) values for the SRS22r questionnaire to the literature to identify which areas of health-related quality of life are consistently affected by surgery and whether the changes are clinically meaningful. The interpretation of published studies using the SRS outcomes has been limited by the lack of MCID values for the questionnaire domains. The recent publication of these values allows the clinical importance of the changes reported in these studies to be examined for the first time. A literature search was undertaken to locate suitable studies, which were then analyzed. Statistically significant differences from baseline to 2 years postoperatively were ascertained by narratively reporting the analyses within the included studies. Where possible, clinical significance was assessed using 95% confidence intervals for the change in mean domain score: if the lower bound of the confidence interval for the change exceeded the MCID for that domain, the change was considered clinically significant. The number of cohorts available for the different analyses varied (5-16). Eighty-one percent and 94% of included cohorts experienced statistically significant improvements in the pain and self-image domains, respectively. In terms of clinical significance, only self-image regularly improved by more than the MCID, doing so in 4 of 5 included cohorts (80%), compared with 1 of 12 cohorts (8%) for pain. No clinically relevant changes occurred in the mental health or activity domains. The evidence suggests that surgery can lead to clinically important improvement in patient self-image. Surgeons and patients should be aware of the limited evidence for improvements in domains other than self-image after surgery. 
Surgical decision-making will also be influenced by the natural history of adolescent idiopathic scoliosis.
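The clinical-significance criterion described above, that the lower bound of the 95% confidence interval for the mean change must exceed the MCID, can be written as a one-line check. A sketch assuming a normal approximation for the CI (the function name and numbers are illustrative, not cohort data from the review):

```python
import math

def clinically_significant(mean_change, sd_change, n, mcid, z=1.96):
    # A domain change is deemed clinically significant when the lower
    # bound of the 95% CI for the mean change exceeds the domain's MCID.
    lower = mean_change - z * sd_change / math.sqrt(n)
    return lower > mcid

print(clinically_significant(1.0, 0.5, 100, 0.8))  # → True
print(clinically_significant(0.9, 1.0, 25, 0.8))   # → False
```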
Substance use and delinquency during adolescence: a prospective look at an at-risk sample.
Paradise, Matthew J; Cauce, Ana Mari
2003-01-01
This paper focuses on the relationship between adolescent substance use and delinquent behavior in a sample of homeless young people. Confirmatory factor analyses indicated that delinquency and substance use are best described as discrete factors, and competing theoretical models of the longitudinal association between these two factors were examined using structural equations modeling techniques. The results suggest that delinquent behavior is associated with changes in alcohol, marijuana, and drug use across time. This effect was statistically significant over relatively brief lags in time of six months or less. Combined with previous results, these findings challenge the utility of single-factor explanations of adolescent deviance for at-risk populations and suggest that the relationship between substance use and externalizing across time may be more dynamic than previously thought. Implications for intervention are also discussed.
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the programming skills needed to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Most components of GWA analysis can be divided into four groups based on the type of input data and statistical output. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; its input data are the individual SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example summary statistics of genotype quality for each sample; its input data are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; its input data are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; its input data are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. ParallABEL is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), comprising 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. 
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
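The first group of computations, statistics evaluated independently per SNP, is embarrassingly parallel. A stdlib sketch of the partition-compute-merge pattern (a thread pool stands in for ParallABEL's Rmpi workers, and the call-rate statistic is an illustrative stand-in; real CPU-bound GWA work would use processes or MPI):

```python
from concurrent.futures import ThreadPoolExecutor

def snp_call_rate(genotypes):
    # Per-SNP summary statistic: fraction of non-missing genotype calls.
    return sum(g is not None for g in genotypes) / len(genotypes)

def parallel_snp_stats(snp_matrix, workers=4):
    # Distribute one task per SNP across the pool and merge the results
    # in input order, mirroring the SNP-wise ("group one") computations.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(snp_call_rate, snp_matrix))

print(parallel_snp_stats([[0, 1, None, 2], [1, 1, 1, 1]]))  # → [0.75, 1.0]
```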
Stewart, Samuel Alan; Abidi, Syed Sibte Raza
2012-12-04
Knowledge Translation (KT) plays a vital role in the modern health care community, facilitating the incorporation of new evidence into practice. Web 2.0 tools provide a useful mechanism for establishing an online KT environment in which health practitioners share their practice-related knowledge and experiences with an online community of practice. We have implemented a Web 2.0 based KT environment--an online discussion forum--for pediatric pain practitioners across seven different hospitals in Thailand. The online discussion forum enabled the pediatric pain practitioners to share and translate their experiential knowledge to help improve the management of pediatric pain in hospitals. The goal of this research is to investigate the knowledge sharing dynamics of a community of practice through an online discussion forum. We evaluated the communication patterns of the community members using statistical and social network analysis methods in order to better understand how the online community engages to share experiential knowledge. Statistical analyses and visualizations provide a broad overview of the communication patterns within the discussion forum. Social network analysis provides the tools to delve deeper into the social network, identifying the most active members of the community, reporting the overall health of the social network, isolating the potential core members of the social network, and exploring the inter-group relationships that exist across institutions and professions. The statistical analyses revealed a network dominated by a single institution and a single profession, and found a varied relationship between reading and posting content to the discussion forum. The social network analysis discovered a healthy network with strong communication patterns, while identifying which users are at the center of the community in terms of facilitating communication. 
The group-level analysis suggests that there is strong interprofessional and interregional communication, but a dearth of non-nurse participants has been identified as a shortcoming. The results of the analysis suggest that the discussion forum is active and healthy, and that, though few, the interprofessional and interinstitutional ties are strong.
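One standard social network measure for identifying members at the center of such a community is degree centrality. A minimal sketch for an undirected reply network (the edge list and normalisation are illustrative assumptions; the study's exact metrics are not detailed in the abstract):

```python
from collections import defaultdict

def degree_centrality(edges):
    # Normalised degree centrality: each member's number of ties
    # divided by (n - 1), where n is the number of members.
    neighbours = defaultdict(set)
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    n = len(neighbours)
    return {v: len(nbrs) / (n - 1) for v, nbrs in neighbours.items()}

# In a star-shaped reply network, the hub has centrality 1.0.
print(degree_centrality([("a", "b"), ("a", "c"), ("a", "d")])["a"])  # → 1.0
```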
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background: The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective: To describe the data management process and statistical analysis plan. Methods: The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion: According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration: ClinicalTrials.gov number, NCT01374022. PMID:28977255
NASA Astrophysics Data System (ADS)
Erfanifard, Y.; Rezayan, F.
2014-10-01
Vegetation heterogeneity biases second-order summary statistics, e.g. Ripley's K-function, applied for spatial pattern analysis in ecology. Second-order investigation based on Ripley's K-function and related statistics (the L- and pair correlation function g) is widely used in ecology to develop hypotheses on underlying processes by characterizing the spatial patterns of vegetation. The aim of this study was to demonstrate the effects of the underlying heterogeneity of wild pistachio (Pistacia atlantica Desf.) trees on the second-order summary statistics of point pattern analysis in a part of the Zagros woodlands, Iran. The spatial distribution of 431 wild pistachio trees was accurately mapped in a 40 ha stand in the Wild Pistachio & Almond Research Site, Fars province, Iran. Three commonly used second-order summary statistics (the K-, L-, and g-functions) were applied to analyse their spatial pattern. The two-sample Kolmogorov-Smirnov goodness-of-fit test showed that the observed pattern significantly followed an inhomogeneous Poisson process null model in the study region. The results also showed that the heterogeneous pattern of the wild pistachio trees biased the homogeneous forms of the K-, L-, and g-functions, indicating stronger aggregation of the trees at scales of 0-50 m than actually existed, and indicating aggregation at scales of 150-200 m where the trees were in fact regularly distributed. Consequently, we showed that heterogeneity of point patterns may bias the results of homogeneous second-order summary statistics, and we suggest applying inhomogeneous summary statistics with related null models for the spatial pattern analysis of heterogeneous vegetation.
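For a homogeneous process, Ripley's K(r) can be estimated naively as the mean number of further points within distance r of a typical point, scaled by the inverse intensity. A sketch without edge correction (real analyses apply edge corrections and, as this study argues, inhomogeneous variants; the toy coordinates below are illustrative):

```python
import math

def ripley_k(points, r, area):
    # Naive estimate of Ripley's K(r) with no edge correction:
    # (mean count of other points within r) / lambda, lambda = n / area.
    n = len(points)
    lam = n / area
    count = 0
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            if i != j and math.hypot(xi - xj, yi - yj) <= r:
                count += 1
    return count / (n * lam)
```

Under complete spatial randomness K(r) is close to pi * r**2, so values above that curve suggest aggregation and values below it suggest regularity.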
Formalizing the definition of meta-analysis in Molecular Ecology.
ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E
2015-08-01
Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
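The kind of a priori power computation such a program performs can be illustrated for a two-sided test of a single correlation via the Fisher z transformation. This is a textbook normal approximation, not G*Power's exact routine:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def correlation_power(r, n, z_alpha=1.959963984540054):
    # Approximate power of a two-sided alpha = 0.05 test of H0: rho = 0
    # when the true correlation is r, using atanh(r) with SE 1/sqrt(n-3).
    effect = math.atanh(r) * math.sqrt(n - 3)
    return 1.0 - norm_cdf(z_alpha - effect)

# The classic rule of thumb: r = 0.3 needs roughly n = 84 for 80% power.
print(round(correlation_power(0.3, 84), 2))
```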
Cornejo-Romero, Amelia; Vargas-Mendoza, Carlos Fabián; Aguilar-Martínez, Gustavo F.; Medina-Sánchez, Javier; Rendón-Aguilar, Beatriz; Valverde, Pedro Luis; Zavala-Hurtado, Jose Alejandro; Serrato, Alejandra; Rivas-Arancibia, Sombra; Pérez-Hernández, Marco Aurelio; López-Ortega, Gerardo; Jiménez-Sierra, Cecilia
2017-01-01
Historic demography changes of plant species adapted to New World arid environments could be consistent with either the Glacial Refugium Hypothesis (GRH), which posits that populations contracted to refuges during the cold-dry glacial and expanded in warm-humid interglacial periods, or with the Interglacial Refugium Hypothesis (IRH), which suggests that populations contracted during interglacials and expanded in glacial times. These contrasting hypotheses are developed in the present study for the giant columnar cactus Cephalocereus columna-trajani in the intertropical Mexican drylands where the effects of Late Quaternary climatic changes on phylogeography of cacti remain largely unknown. In order to determine if the historic demography and phylogeographic structure of the species are consistent with either hypothesis, sequences of the chloroplast regions psbA-trnH and trnT-trnL from 110 individuals from 10 populations comprising the full distribution range of this species were analysed. Standard estimators of genetic diversity and structure were calculated. The historic demography was analysed using a Bayesian approach and the palaeodistribution was derived from ecological niche modelling to determine if, in the arid environments of south-central Mexico, glacial-interglacial cycles drove the genetic divergence and diversification of this species. Results reveal low but statistically significant population differentiation (FST = 0.124, P < 0.001), although very clear geographic clusters are not formed. Genetic diversity, haplotype network and Approximate Bayesian Computation (ABC) demographic analyses suggest a population expansion estimated to have taken place in the Last Interglacial (123.04 kya, 95% CI 115.3–130.03). The species palaeodistribution is consistent with the ABC analyses and indicates that the potential area of palaeodistribution and climatic suitability were larger during the Last Interglacial and Holocene than in the Last Glacial Maximum. 
Overall, these results suggest that C. columna-trajani experienced an expansion following the warm conditions of interglacials, in accordance with the GRH. PMID:28426818
Reduction of Fasting Blood Glucose and Hemoglobin A1c Using Oral Aloe Vera: A Meta-Analysis.
Dick, William R; Fletcher, Emily A; Shah, Sachin A
2016-06-01
Diabetes mellitus is a global epidemic and one of the leading causes of morbidity and mortality. Additional medications that are novel, affordable, and efficacious are needed to treat this rampant disease. This meta-analysis was performed to ascertain the effectiveness of oral aloe vera consumption on the reduction of fasting blood glucose (FBG) and hemoglobin A1c (HbA1c). PubMed, CINAHL, Natural Medicines Comprehensive Database, and Natural Standard databases were searched. Studies of aloe vera's effect on FBG, HbA1c, homeostasis model assessment-estimated insulin resistance (HOMA-IR), fasting serum insulin, fructosamine, and oral glucose tolerance test (OGTT) in prediabetic and diabetic populations were examined. After data extraction, the parameters of FBG and HbA1c had appropriate data for meta-analyses. Extracted data were verified and then analyzed by StatsDirect Statistical Software. Reductions of FBG and HbA1c were reported as the weighted mean differences from baseline, calculated by a random-effects model with 95% confidence intervals. Subgroup analyses to determine clinical and statistical heterogeneity were also performed. Publication bias was assessed by using the Egger bias statistic. Nine studies were included in the FBG parameter (n = 283); 5 of these studies included HbA1c data (n = 89). Aloe vera decreased FBG by 46.6 mg/dL (p < 0.0001) and HbA1c by 1.05% (p = 0.004). Significant reductions of both endpoints were maintained in all subgroup analyses. Additionally, the data suggest that patients with an FBG ≥200 mg/dL may see a greater benefit. A mean FBG reduction of 109.9 mg/dL was observed in this population (p ≤ 0.0001). The Egger statistic showed publication bias with FBG but not with HbA1c (p = 0.010 and p = 0.602, respectively). These results support the use of oral aloe vera for significantly reducing FBG (46.6 mg/dL) and HbA1c (1.05%). 
Further clinical studies that are more robust and better controlled are warranted to further explore these findings.
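The random-effects pooling used for the weighted mean differences can be sketched with the DerSimonian-Laird estimator (the effect sizes and variances below are illustrative, not the trial data from this meta-analysis):

```python
def random_effects_pooled(effects, variances):
    # DerSimonian-Laird random-effects pooled estimate of a weighted
    # mean difference, with a 95% CI based on a normal approximation.
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

With equal study variances the pooled estimate reduces to the simple mean of the study effects, while unequal variances down-weight the less precise studies.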
SEER Cancer Query Systems (CanQues)
These applications provide access to cancer statistics including incidence, mortality, survival, prevalence, and probability of developing or dying from cancer. Users can display reports of the statistics or extract them for additional analyses.
Facilitating the Transition from Bright to Dim Environments
2016-03-04
For the parametric data, a multivariate ANOVA was used in determining the systematic presence of any statistically significant performance differences...performed. All significance levels were p < 0.05, and statistical analyses were performed with the Statistical Package for Social Sciences (SPSS)...1950. Age changes in rate and level of visual dark adaptation. Journal of Applied Physiology, 2, 407-411. Field, A. 2009. Discovering statistics
Trait humor and longevity: do comics have the last laugh?
Rotton, J
1992-01-01
Four sets of biographical data were analyzed in order to test the hypothesis that the ability to generate humor is associated with longevity. Although steps were taken to ensure that tests had high levels of statistical power, analyses provided very little support for the idea that individuals with a well-developed sense of humor live longer than serious writers and other entertainers. In addition, a subsidiary analysis revealed that those in the business of entertaining others died at an earlier age than those in other lines of endeavor. These findings suggest that researchers should turn their attention from trait humor to the effects of humorous material.
Associating an ionospheric parameter with major earthquake occurrence throughout the world
NASA Astrophysics Data System (ADS)
Ghosh, D.; Midya, S. K.
2014-02-01
With time, analysis of ionospheric variation is gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where 'Ms' is the earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is highest when the defined parameter lies within the range of 0-75 (lower range). In the higher ranges, the probability of earthquake occurrence gradually decreases. A probable explanation is also suggested.
A model to predict accommodations needed by disabled persons.
Babski-Reeves, Kari; Williams, Sabrina; Waters, Tzer Nan; Crumpton-Young, Lesia L; McCauley-Bell, Pamela
2005-09-01
In this paper, several approaches to assist employers in the accommodation process for disabled employees are discussed and a mathematical model is proposed to assist employers in predicting the accommodation level needed by an individual with a mobility-related disability. This study investigates the validity and reliability of this model in assessing the accommodation level needed by individuals utilizing data collected from twelve individuals with mobility-related disabilities. Based on the results of the statistical analyses, this proposed model produces a feasible preliminary measure for assessing the accommodation level needed for persons with mobility-related disabilities. Suggestions for practical application of this model in an industrial setting are addressed.
Coastal and Marine Bird Data Base
Anderson, S.H.; Geissler, P.H.; Dawson, D.K.
1980-01-01
Summary: This report discusses the development of a coastal and marine bird data base at the Migratory Bird and Habitat Research Laboratory. The system is compared with other data bases, and suggestions for future development, such as possible adaptations for other taxonomic groups, are included. The data base is based on the Statistical Analysis System but includes extensions programmed in PL/I. The Appendix shows how the system evolved. Output examples are given for heron data and pelagic bird data which indicate the types of analyses that can be conducted and output figures. The Appendixes include a retrieval language user's guide and description of the retrieval process and listing of translator program.
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
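At its simplest, calling a differentially methylated locus from bisulfite counts amounts to comparing two methylation proportions at one site. A two-proportion z-test sketch (a stand-in only; the tools discussed above model overdispersion, e.g. with beta-binomial models, and the counts below are invented):

```python
import math

def dml_z_test(meth1, total1, meth2, total2):
    # Two-proportion z-test at a single CpG: meth/total are methylated
    # and total read counts in each condition; returns the z statistic.
    p1, p2 = meth1 / total1, meth2 / total2
    p = (meth1 + meth2) / (total1 + total2)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / total1 + 1 / total2))
    return (p1 - p2) / se

print(dml_z_test(5, 10, 10, 20))  # → 0.0 (identical proportions)
```

The choice of per-site model matters precisely because invertebrate and plant methylomes have count and coverage distributions unlike mammalian ones.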
Imaging Depression in Adults with ASD
2017-10-01
collected temporally close enough to imaging data in Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk attrition between Phase 1 and 2, we chose to hold... supervision is ongoing (since 9/2014). Co-I Dr. Lerner's 2nd-year Clinical Psychology PhD students have participated in ADOS-2 Introductory Clinical...
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a major worldwide environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
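The rank-correlation trend tests mentioned in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' code: Spearman's rank correlation applied to flag a monotonic temporal trend in litter abundances. The data values in the test are hypothetical, and ties are ignored for simplicity.

```python
def spearman_rho(x, y):
    # Spearman rank correlation (no ties assumed):
    # rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the rank difference
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

A rho near +1 or -1 over survey years would indicate a monotonic increase or decrease in litter counts; significance would still need to be assessed against the appropriate null distribution.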
Searching Choices: Quantifying Decision-Making Processes Using Search Engine Data.
Moat, Helen Susannah; Olivola, Christopher Y; Chater, Nick; Preis, Tobias
2016-07-01
When making a decision, humans consider two types of information: information they have acquired through their prior experience of the world, and further information they gather to support the decision in question. Here, we present evidence that data from search engines such as Google can help us model both sources of information. We show that statistics from search engines on the frequency of content on the Internet can help us estimate the statistical structure of prior experience; and, specifically, we outline how such statistics can inform psychological theories concerning the valuation of human lives, or choices involving delayed outcomes. Turning to information gathering, we show that search query data might help measure human information gathering, and it may predict subsequent decisions. Such data enable us to compare information gathered across nations, where analyses suggest, for example, a greater focus on the future in countries with a higher per capita GDP. We conclude that search engine data constitute a valuable new resource for cognitive scientists, offering a fascinating new tool for understanding the human decision-making process. Copyright © 2016 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Ipsen, Andreas
2015-02-03
Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.
Log-Normality and Multifractal Analysis of Flame Surface Statistics
NASA Astrophysics Data System (ADS)
Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.
2013-11-01
The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming an isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at the corresponding area-ratio pdfs. Both pdfs are found to be nearly log-normally distributed and show self-similar behavior with increasing radius. The near log-normality and rather intermittent behavior of the flame length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.
Impact of South American heroin on the US heroin market 1993-2004.
Ciccarone, Daniel; Unick, George J; Kraus, Allison
2009-09-01
The past two decades have seen an increase in heroin-related morbidity and mortality in the United States. We report on trends in US heroin retail price and purity, including the effect of entry of Colombian-sourced heroin on the US heroin market. The average standardized price ($/mg-pure) and purity (% by weight) of heroin from 1993 to 2004 were obtained from US Drug Enforcement Administration retail purchase data for 20 metropolitan statistical areas. Univariate statistics, robust Ordinary Least Squares regression and mixed fixed and random effect growth curve models were used to predict the price and purity data in each metropolitan statistical area over time. Over the 12 study years, heroin price decreased 62%. The median percentage of all heroin samples that are of South American origin increased an absolute 7% per year. Multivariate models suggest percent South American heroin is a significant predictor of lower heroin price and higher purity, adjusting for time and demographics. These analyses reveal trends to historically low-cost heroin in many US cities. These changes correspond to the entrance into and rapid domination of the US heroin market by Colombian-sourced heroin. The implications of these changes are discussed.
Martian cratering 11. Utilizing decameter scale crater populations to study Martian history
NASA Astrophysics Data System (ADS)
Hartmann, W. K.; Daubar, I. J.
2017-03-01
New information has been obtained in recent years regarding formation rates and the production size-frequency distribution (PSFD) of decameter-scale primary Martian craters formed during recent orbiter missions. Here we compare the PSFD of the currently forming small primaries (P) with new data on the PSFD of the total small crater population that includes primaries and field secondaries (P + fS), which represents an average over longer time periods. The two data sets, if used in a combined manner, have extraordinary potential for clarifying not only the evolutionary history and resurfacing episodes of small Martian geological formations (as small as one or a few km²) but also possible episodes of recent climatic change. In response to recent discussions of statistical methodologies, we point out that crater counts do not produce idealized statistics, and that inherent uncertainties limit the improvements that can be made by more sophisticated statistical analyses. We propose three mutually supportive procedures for interpreting counts of small craters in this context. Applications of these procedures support suggestions that topographic features in the upper meters of mid-latitude ice-rich areas date only from the last few periods of extreme Martian obliquity and the associated predicted climate excursions.
NASA Astrophysics Data System (ADS)
Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle
2017-08-01
In order to obtain cassava starch films with improved mechanical properties relative to the synthetic polymers used in packaging, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol, and modified clay contents. Modified bentonite clay was used as the filler of the biofilm, and glycerol was the plasticizer used to thermoplastify the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film by maximizing the tensile strength. The reliability of the regression model was tested by its correlation with the experimental data, using a Pareto chart as the statistical analysis. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
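A 2³ full factorial design of the kind used in this study can be laid out programmatically. The sketch below is illustrative only (the run order, coded levels, and response values are hypothetical, not the authors' data): it generates the eight coded runs and estimates a main effect as the difference between mean responses at the high and low levels of a factor.

```python
from itertools import product

# coded levels (-1 = low, +1 = high) for three factors:
# index 0 = cassava starch, 1 = glycerol, 2 = modified clay
runs = list(product([-1, 1], repeat=3))  # 2^3 = 8 runs

def main_effect(responses, factor):
    # main effect of one factor: mean response at +1 minus mean response at -1
    hi = [y for run, y in zip(runs, responses) if run[factor] == 1]
    lo = [y for run, y in zip(runs, responses) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

With real tensile-strength measurements in `responses`, the largest absolute main effect (here reported to be the modified clay content) is the bar that dominates a Pareto chart of effects.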
Reproducibility of ZrO2-based freeze casting for biomaterials.
Naleway, Steven E; Fickas, Kate C; Maker, Yajur N; Meyers, Marc A; McKittrick, Joanna
2016-04-01
The processing technique of freeze casting has been intensely researched for its potential to create porous scaffold and infiltrated composite materials for biomedical implants and structural materials. However, in order for this technique to be employed medically or commercially, it must be able to reliably produce materials in great quantities with similar microstructures and properties. Here we investigate the reproducibility of the freeze-casting process by independently fabricating three sets of eight ZrO2-epoxy composite scaffolds with the same processing conditions but varying solid loading (10, 15 and 20 vol.%). Statistical analyses (one-way ANOVA and Tukey's HSD tests) run on measurements of the microstructural dimensions of these composite scaffold sets show that, while the majority of microstructures are similar, in all cases the composite scaffolds display statistically significant variability. In addition, the composite scaffolds were mechanically compressed and statistically analyzed. Similar to the microstructures, almost all of the resultant properties displayed significant variability, though most composite scaffolds were similar. These results suggest that additional research to improve control of the freeze casting technique is required before scaffolds and composite scaffolds can reliably be reproduced for commercial or medical applications. Copyright © 2015 Elsevier B.V. All rights reserved.
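The one-way ANOVA used to compare the scaffold sets tests whether between-group variance exceeds within-group variance. A minimal, dependency-free sketch of the F statistic follows; the three small groups in the test are made-up numbers, and the study's pairwise follow-up (Tukey's HSD) is not shown.

```python
def f_oneway(*groups):
    # one-way ANOVA F statistic: ratio of between-group to within-group
    # mean squares; a large F suggests the group means differ
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total sample size
    grand = sum(sum(g) for g in groups) / n       # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

In practice one would compare F against the F distribution with (k-1, n-k) degrees of freedom (e.g. via `scipy.stats.f_oneway`, which also returns the p-value).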
Lung cancer mortality and exposure to polycyclic aromatic hydrocarbons in British coke oven workers.
Miller, Brian G; Doust, Emma; Cherrie, John W; Hurley, J Fintan
2013-10-16
Workers on coke oven plants may be exposed to potentially carcinogenic polycyclic aromatic hydrocarbons (PAHs), particularly during work on the ovens tops. Two cohorts, employees of National Smokeless Fuels (NSF) and the British Steel Corporation (BSC) totalling more than 6,600 British coke plant workers employed in 1967, had been followed up to mid-1987 for mortality. Previous analyses suggested an excess in lung cancer risk of around 25%, or less when compared with Social Class IV ('partly skilled'). Analyses based on internal comparisons within the cohorts identified statistical associations with estimates of individual exposures, up to the start of follow-up, to benzene-soluble materials (BSM), widely used as a metric for mixtures of PAHs. Some associations were also found with times spent in certain coke ovens jobs with specific exposure scenarios, but results were not consistent across the two cohorts and limitations in the exposure estimates were noted. The present study was designed to reanalyse the existing data on lung cancer mortality, incorporating revised and improved exposure estimates to BSM and to benzo[a]pyrene (B[a]P), including increments during the follow-up and a lag for latency. Mean annual average concentrations of both BSM and B[a]P were estimated by analysis of variance (ANOVA) from concentration measurements at all NSF and six BSC plants, and summarised by job and plant, with a temporal trend (for the BSM only). These were combined with subjects' work histories, to produce exposure estimates in each year of follow-up, with a 10-year lag to allow for latency. Exposures to BSM and to B[a]P were sufficiently uncorrelated to permit analysis in relation to each variable separately. Lung cancer death risks during the follow-up were analysed in relation to the estimated time-dependent exposures, both continuous and grouped, using Cox regression models, with adjustment for age.
Changing the exposure estimates changed the estimated relative risks compared with earlier results, but the new analyses showed no significant trends with continuous measures of exposure to either BSM or B[a]P, nor with time spent on ovens tops. Analyses with grouped exposures showed mixed results. Across all BSC plants, the relative risk coefficient for working 5 or more years on ovens tops, where the exposures were highest, was 1.81, which was statistically significant. However, results for those with 0-5 years on ovens tops did not suggest a trend; the evidence for an underlying relationship was thus suggestive but not strong. The new results are in line with previous findings; they show some signs consistent with an effect of coke ovens work on lung cancer risk, especially on ovens tops, but the preponderant absence of significant results, and the inconsistencies between results for NSF and BSC, highlight how little evidence there is in these data of any effect.
Assessment of the beryllium lymphocyte proliferation test using statistical process control.
Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M
2006-10-01
Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. 
During the period of this study, all laboratories displayed variation in test results beyond what would be expected due to chance alone. Patterns of test results suggested that the variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
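A Shewhart-style individuals control chart of the kind applied to the stimulation index can be sketched as follows. This is an illustration under stated assumptions (a baseline period and the three-sigma rule), not the study's actual control-charting procedure, and the numbers in the test are invented.

```python
def control_limits(baseline):
    # Shewhart individuals chart: center line +/- 3 standard deviations,
    # estimated from a baseline (in-control) period
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(series, lcl, ucl):
    # indices of results beyond the control limits, flagged for investigation
    return [i for i, x in enumerate(series) if x < lcl or x > ucl]
```

Systematic (non-random) variation of the kind the study reports would show up as runs or shifts of points relative to these limits, rather than isolated chance excursions.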
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often assume known distributions, including the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design the tolerance. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerances. We use the percentile method (PM) to estimate the distribution parameters. The findings indicated that, when the distribution of the component data is unknown, the proposed method can be used to expedite the design of component tolerance. Moreover, in the case of assembled sets, more extensive tolerance for each component with the same target performance can be utilized.
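Once the four lambda parameters have been estimated (by the percentile method, in the paper), tolerance limits come directly from the quantile function. The sketch below assumes the Ramberg-Schmeiser parameterization of the generalized lambda distribution, Q(u) = λ1 + (u^λ3 − (1−u)^λ4)/λ2; the parameter values in the test are illustrative, not fitted values from the paper.

```python
def gld_quantile(u, l1, l2, l3, l4):
    # Ramberg-Schmeiser generalized lambda distribution quantile function:
    # Q(u) = l1 + (u**l3 - (1 - u)**l4) / l2,  for 0 < u < 1
    return l1 + (u ** l3 - (1 - u) ** l4) / l2

def tolerance_interval(params, coverage=0.99):
    # symmetric tolerance limits covering the central `coverage`
    # fraction of the fitted distribution
    a = (1 - coverage) / 2
    return gld_quantile(a, *params), gld_quantile(1 - a, *params)
```

Because the GLD can mimic many shapes (including heavy tails and skew), the same two lines of tolerancing code apply whether or not the component data look normal, which is the point of the method.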
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
2015-08-01
the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types... constructs. SPSS was again used to conduct statistical analysis on the sample; this time factor analysis was conducted. Factor analysis attempts to... Business Research Methods and Statistics using SPSS, p. 432. [11] IBM SPSS Statistics (2012). [12] Burns, R.B., Burns, R.A. (2008) 'Business Research...
Graham, Matthew R; Jaeger, Jef R; Prendini, Lorenzo; Riddle, Brett R
2013-12-01
The distribution of Beck's Desert Scorpion, Paruroctonus becki (Gertsch and Allred, 1965), spans the 'warm' Mojave Desert and the western portion of the 'cold' Great Basin Desert. We used genetic analyses and species distribution modeling to test whether P. becki persisted in the Great Basin Desert during the Last Glacial Maximum (LGM), or colonized the area as glacial conditions retreated and the climate warmed. Phylogenetic and network analyses of mitochondrial cytochrome c oxidase 1 (cox1), 16S rDNA, and nuclear internal transcribed spacer (ITS-2) DNA sequences uncovered five geographically-structured groups in P. becki with varying degrees of statistical support. Molecular clock estimates and the geographical arrangement of three of the groups suggested that Pliocene geological events in the tectonically dynamic Eastern California Shear Zone may have driven diversification by vicariance. Diversification was estimated to have continued through the Pleistocene, during which a group endemic to the western Great Basin diverged from a related group in the eastern Mojave Desert and western Colorado Plateau. Demographic and network analyses suggested that P. becki underwent a recent expansion in the Great Basin. According to a landscape interpolation of genetic distances, this expansion appears to have occurred from the northwest, implying that P. becki may have persisted in part of the Great Basin during the LGM. This prediction is supported by species distribution models which suggest that climate was unsuitable throughout most of the Great Basin during the LGM, but that small patches of suitable climate may have remained in areas of the Lahontan Trough. Published by Elsevier Inc.
Luo, Xiong-Jian; Mattheisen, Manuel; Li, Ming; Huang, Liang; Rietschel, Marcella; Børglum, Anders D.; Als, Thomas D.; van den Oord, Edwin J.; Aberg, Karolina A.; Mors, Ole; Mortensen, Preben Bo; Luo, Zhenwu; Degenhardt, Franziska; Cichon, Sven; Schulze, Thomas G.; Nöthen, Markus M.; Su, Bing; Zhao, Zhongming; Gan, Lin; Yao, Yong-Gang
2015-01-01
Genome-wide association studies have identified multiple risk variants and loci that show robust association with schizophrenia. Nevertheless, it remains unclear how these variants confer risk to schizophrenia. In addition, the driving force that maintains the schizophrenia risk variants in the human gene pool is poorly understood. To investigate whether expression-associated genetic variants contribute to schizophrenia susceptibility, we systematically integrated brain expression quantitative trait loci and genome-wide association data of schizophrenia using Sherlock, a Bayesian statistical framework. Our analyses identified ZNF323 as a schizophrenia risk gene (P = 2.22×10−6). Subsequent analyses confirmed the association of ZNF323 and its expression-associated single nucleotide polymorphism rs1150711 in independent samples (gene expression: P = 1.40×10−6; single-marker meta-analysis in the combined discovery and replication sample comprising 44123 individuals: P = 6.85×10−10). We found that ZNF323 was significantly downregulated in the hippocampus and frontal cortex of schizophrenia patients (P = .0038 and P = .0233, respectively). Evidence for pleiotropic effects was detected (association of rs1150711 with lung function and gene expression of ZNF323 in lung: P = 6.62×10−5 and P = 9.00×10−5, respectively), with the risk allele (T allele) for schizophrenia acting as a protective allele for lung function. Subsequent population genetics analyses suggest that the risk allele (T) of rs1150711 might have undergone recent positive selection in the human population. Our findings suggest that ZNF323 is a schizophrenia susceptibility gene whose expression may influence schizophrenia risk. Our study also illustrates a possible mechanism for maintaining schizophrenia risk variants in the human gene pool. PMID:25759474
Souza, Isys Mascarenhas; Funch, Ligia Silveira; de Queiroz, Luciano Paganucci
2014-01-01
Hymenaea is a genus of the Resin-producing Clade of the tribe Detarieae (Leguminosae: Caesalpinioideae) with 14 species. Hymenaea courbaril is the most widespread species of the genus, ranging from southern Mexico to southeastern Brazil. As currently circumscribed, Hymenaea courbaril is a polytypic species with six varieties: var. altissima, var. courbaril, var. longifolia, var. stilbocarpa, var. subsessilis, and var. villosa. These varieties are distinguishable mostly by traits related to leaflet shape and indumentation, and calyx indumentation. We carried out morphometric analyses of 14 quantitative (continuous) leaf characters in order to assess the taxonomy of Hymenaea courbaril under the Unified Species Concept framework. Cluster analysis used the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) based on Bray-Curtis dissimilarity matrices. Principal Component Analyses (PCA) were carried out based on the same morphometric matrix. Two sets of Analyses of Similarity and Non Parametric Multivariate Analysis of Variance were carried out to evaluate statistical support (1) for the major groups recovered using UPGMA and PCA, and (2) for the varieties. All analyses recovered three major groups coincident with (1) var. altissima, (2) var. longifolia, and (3) all other varieties. These results, together with geographical and habitat information, were taken as evidence of three separate metapopulation lineages recognized here as three distinct species. Nomenclatural adjustments, including reclassifying formerly misapplied types, are proposed. PMID:25009440
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)].
Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), from 82.2% (263/320) to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), from 32.5% (104/320) to 17.1% (84/490), though some serious errors were still present. Indian medical research appears to have made no major progress in the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems. PMID:25856194
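The 2×2 proportion comparisons reported in this abstract (e.g., erroneous analyses in 80/320 papers in 2003 vs. 111/490 in 2013) can be reproduced with a Pearson chi-square test. A minimal pure-Python sketch, using the closed-form shortcut for 2×2 tables; the `chi_square_2x2` helper is ours, not the paper's:

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic and phi coefficient for the 2x2
    table [[a, b], [c, d]], via the closed-form shortcut."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    phi = math.sqrt(chi2 / n)
    return chi2, phi

# Erroneous statistical analyses: 80 of 320 (2003) vs. 111 of 490 (2013)
chi2, phi = chi_square_2x2(80, 240, 111, 379)
print(round(chi2, 3), round(phi, 3))  # 0.592 0.027, matching the reported values
```

The recomputed statistic and effect size agree with the χ2=0.592, Φ=0.027 reported in the abstract.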
COGNATE: comparative gene annotation characterizer.
Wilbrandt, Jeanne; Misof, Bernhard; Niehuis, Oliver
2017-07-17
The comparison of gene and genome structures across species has the potential to reveal major trends of genome evolution. However, such a comparative approach is currently hampered by a lack of standardization (e.g., Elliott TA, Gregory TR, Philos Trans Royal Soc B: Biol Sci 370:20140331, 2015). For example, testing the hypothesis that the total amount of coding sequences is a reliable measure of potential proteome diversity (Wang M, Kurland CG, Caetano-Anollés G, PNAS 108:11954, 2011) requires the application of standardized definitions of coding sequence and genes to create both comparable and comprehensive data sets and corresponding summary statistics. However, such standard definitions either do not exist or are not consistently applied. These circumstances call for a standard at the descriptive level using a minimum of parameters, a consistent use of standardized terms, and software that infers the required data under these strict definitions. The acquisition of a comprehensive, descriptive, and standardized set of parameters and summary statistics for genome publications and further analyses can thus greatly benefit from the availability of an easy-to-use standard tool. We developed a new open-source command-line tool, COGNATE (Comparative Gene Annotation Characterizer), which uses a given genome assembly and its annotation of protein-coding genes for a detailed description of the respective gene and genome structure parameters. Additionally, we revised the standard definitions of gene and genome structures and provide the definitions used by COGNATE as a working draft suggestion for further reference. Complete parameter lists and summary statistics are inferred using this set of definitions to allow downstream analyses and to provide an overview of the genome and gene repertoire characteristics.
COGNATE is written in Perl and freely available at the ZFMK homepage ( https://www.zfmk.de/en/COGNATE ) and on github ( https://github.com/ZFMK/COGNATE ). The tool COGNATE allows comparison of genome assemblies and structural elements on multiple levels (e.g., scaffold or contig sequence, gene). It clearly enhances comparability between analyses. Thus, COGNATE can provide the important standardization of both genome and gene structure parameter disclosure as well as data acquisition for future comparative analyses. With the establishment of comprehensive descriptive standards and the extensive availability of genomes, an encompassing database will become possible.
2013-01-01
Background Malnutrition is one of the principal causes of child mortality in developing countries including Bangladesh. According to our knowledge, most of the available studies that addressed the issue of malnutrition among under-five children considered categorical (dichotomous/polychotomous) outcome variables and applied logistic regression (binary/multinomial) to find their predictors. In this study the malnutrition variable (i.e. outcome) is defined as the number of under-five malnourished children in a family, which is a non-negative count variable. The purposes of the study are (i) to demonstrate the applicability of the generalized Poisson regression (GPR) model as an alternative to other statistical methods and (ii) to find some predictors of this outcome variable. Methods The data is extracted from the Bangladesh Demographic and Health Survey (BDHS) 2007. Briefly, this survey employs a nationally representative sample which is based on a two-stage stratified sample of households. A total of 4,460 under-five children were analysed using various statistical techniques, namely the Chi-square test and the GPR model. Results The GPR model (as compared to the standard Poisson regression and negative Binomial regression) is found to be justified for studying the above-mentioned outcome variable because of its under-dispersion (variance < mean) property. Our study also identifies several significant predictors of the outcome variable, namely mother’s education, father’s education, wealth index, sanitation status, source of drinking water, and total number of children ever born to a woman. Conclusions The consistency of our findings with many other studies suggests that the GPR model is an ideal alternative to other statistical models for analysing the number of under-five malnourished children in a family. Strategies based on significant predictors may improve the nutritional status of children in Bangladesh. PMID:23297699
The cost of an emergency department visit and its relationship to emergency department volume.
Bamezai, Anil; Melnick, Glenn; Nawathe, Amar
2005-05-01
This article addresses 2 questions: (1) to what extent do emergency departments (EDs) exhibit economies of scale; and (2) to what extent do publicly available accounting data understate the marginal cost of an outpatient ED visit? Understanding the appropriate role for EDs in the overall health care system is crucially dependent on answers to these questions. The literature on these issues is sparse and somewhat dated and fails to differentiate between trauma and nontrauma hospitals. We believe a careful review of these questions is necessary because several changes (greater managed care penetration, increased price competition, cost of compliance with Emergency Medical Treatment and Active Labor Act regulations, and so on) may have significantly altered ED economics in recent years. We use a two-pronged approach, one based on descriptive analyses of publicly available accounting data and one based on statistical cost models estimated from a 9-year panel of hospital data, to address the above-mentioned questions. Neither the descriptive analyses nor the statistical models support the existence of significant scale economies. Furthermore, the marginal cost of outpatient ED visits, even without the emergency physician component, appears quite high--in 1998 dollars, US$295 and US$412 for nontrauma and trauma EDs, respectively. These statistical estimates exceed the accounting estimates of per-visit costs by a factor of roughly 2. Our findings suggest that the marginal cost of an outpatient ED visit is higher than is generally believed. Hospitals thus need to carefully review how EDs fit within their overall operations and cost structure and may need to pay special attention to policies and procedures that guide the delivery of nonurgent care through the ED.
Increased mortality associated with extreme-heat exposure in King County, Washington, 1980-2010
NASA Astrophysics Data System (ADS)
Isaksen, Tania Busch; Fenske, Richard A.; Hom, Elizabeth K.; Ren, You; Lyons, Hilary; Yost, Michael G.
2016-01-01
Extreme heat has been associated with increased mortality, particularly in temperate climates. Few epidemiologic studies have considered the Pacific Northwest region in their analyses. This study quantified the historical (May to September, 1980-2010) heat-mortality relationship in the most populous Pacific Northwest county, King County, Washington. A relative risk (RR) analysis was used to explore the relationship between heat and all-cause mortality on 99th percentile heat days, while a time series analysis, using a piece-wise linear model fit, was used to estimate the effect of heat intensity on mortality, adjusted for temporal trends. For all ages and all causes, we found a 10 % (1.10 (95 % confidence interval (CI), 1.06, 1.14)) increase in the risk of death on a heat day versus a non-heat day. When considering the intensity effect of heat on all-cause mortality, we found a 1.69 % (95 % CI, 0.69, 2.70) increase in the risk of death per unit of humidex above 36.0 °C. Mortality stratified by cause and age produced statistically significant results using both types of analyses for: all-cause, non-traumatic, circulatory, cardiovascular, cerebrovascular, and diabetes causes of death. All-cause mortality was significantly modified by synoptic weather type. These results demonstrate that heat, expressed as humidex, is associated with increased mortality on heat days, and that risk increases with heat's intensity. While age was the only individual-level characteristic found to modify mortality risks, statistically significant increases in diabetes-related mortality for the 45-64 age group suggest that underlying health status may contribute to these risks.
Outlier Removal and the Relation with Reporting Errors and Quality of Psychological Research
Bakker, Marjan; Wicherts, Jelte M.
2014-01-01
Background The removal of outliers to acquire a significant result is a questionable research practice that appears to be commonly used in psychology. In this study, we investigated whether the removal of outliers in psychology papers is related to weaker evidence (against the null hypothesis of no effect), a higher prevalence of reporting errors, and smaller sample sizes in these papers compared to papers in the same journals that did not report the exclusion of outliers from the analyses. Methods and Findings We retrieved a total of 2667 statistical results of null hypothesis significance tests from 153 articles in main psychology journals, and compared results from articles in which outliers were removed (N = 92) with results from articles that reported no exclusion of outliers (N = 61). We preregistered our hypotheses and methods and analyzed the data at the level of articles. Results show no significant difference between the two types of articles in median p value, sample sizes, or prevalence of all reporting errors, large reporting errors, and reporting errors that concerned the statistical significance. However, we did find a discrepancy between the reported degrees of freedom of t tests and the reported sample size in 41% of articles that did not report removal of any data values. This suggests common failure to report data exclusions (or missingness) in psychological articles. Conclusions We failed to find that the removal of outliers from the analysis in psychological articles was related to weaker evidence (against the null hypothesis of no effect), sample size, or the prevalence of errors. However, our control sample might be contaminated due to nondisclosure of excluded values in articles that did not report exclusion of outliers. Results therefore highlight the importance of more transparent reporting of statistical analyses. PMID:25072606
Goldman, S A
1996-10-01
Neurotoxicity in relation to concomitant administration of lithium and neuroleptic drugs, particularly haloperidol, has been an ongoing issue. This study examined whether use of lithium with neuroleptic drugs enhances neurotoxicity leading to permanent sequelae. The Spontaneous Reporting System database of the United States Food and Drug Administration and the extant literature were reviewed for spectrum cases of lithium/neuroleptic neurotoxicity. Groups taking lithium alone (Li), lithium/haloperidol (LiHal) and lithium/non-haloperidol neuroleptics (LiNeuro), each paired for recovery and sequelae, were established for 237 cases. Statistical analyses included pairwise comparisons of lithium levels using the Wilcoxon Rank Sum procedure and logistic regression to analyze the relationship between independent variables and development of sequelae. The Li and LiNeuro groups showed statistically significant differences in median lithium levels between recovery and sequelae pairs, whereas the LiHal pair did not differ significantly. Lithium level was associated with sequelae development overall and within the Li and LiNeuro groups; no such association was evident in the LiHal group. On multivariable logistic regression analysis, lithium level and taking lithium/haloperidol were significant factors in the development of sequelae, with multiple possibly confounding factors (e.g., age, sex) not statistically significant. Multivariable logistic regression analyses with neuroleptic dose as five discrete dose ranges or actual dose did not show an association between development of sequelae and dose. Database limitations notwithstanding, the lack of apparent impact of serum lithium level on the development of sequelae in patients treated with haloperidol contrasts notably with results in the Li and LiNeuro groups. These findings may suggest a possible effect of pharmacodynamic factors in lithium/neuroleptic combination therapy.
Statistical analyses support power law distributions found in neuronal avalanches.
Klaus, Andreas; Yu, Shan; Plenz, Dietmar
2011-01-01
The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
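The maximum-likelihood exponent estimate described above (a slope near -1.5) has a simple closed form for continuous data: α̂ = 1 + n / Σ ln(xᵢ/xmin). A hedged sketch assuming a continuous power law with known xmin, rather than the paper's full discrete pipeline with finite-size scaling and cut-off comparisons:

```python
import math
import random

def fit_power_law_mle(xs, xmin):
    """Continuous-case MLE of the power-law exponent alpha for
    samples xs >= xmin: alpha_hat = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Sanity check: draw from p(x) ~ x^(-1.5) via inverse-transform sampling
random.seed(0)
alpha_true, xmin = 1.5, 1.0
xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
      for _ in range(50000)]
print(round(fit_power_law_mle(xs, xmin), 2))  # close to the true exponent 1.5
```

In practice one would also select xmin by minimizing the Kolmogorov-Smirnov distance and compare against alternative distributions with a log-likelihood ratio test, as the authors do.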
Liu, Shiyuan
2013-01-01
Purpose To compare the diagnostic performance of computed tomography angiography (CTA) and magnetic resonance angiography (MRA) for detection and assessment of stenosis in patients with autologous hemodialysis access. Materials and Methods We searched the PubMed, MEDLINE, EMBASE and Cochrane Library databases from January 1984 to May 2013 for studies comparing CTA or MRA with digital subtraction angiography (DSA) or surgery for autologous hemodialysis access. Eligible studies were in the English language, aimed to detect more than 50% stenosis or occlusion of autologous vascular access in hemodialysis patients with CTA and MRA technology, and provided sufficient data about diagnostic performance. Methodological quality was assessed with the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) instrument. Sensitivities (SEN), specificities (SPE), positive likelihood ratios (PLR), negative likelihood ratios (NLR), diagnostic odds ratios (DOR) and areas under the receiver operating characteristic curve (AUC) were pooled statistically. Potential threshold effects, heterogeneity and publication bias were evaluated. The clinical utility of CTA and MRA in detection of stenosis was also investigated. Results Sixteen eligible studies were included, with a total of 500 patients. Both CTA and MRA were accurate modalities for hemodialysis vascular access (sensitivity, 96.2% and 95.4%, respectively; specificity, 97.1% and 96.1%, respectively; DOR, 393.69 and 211.47, respectively). No significant difference was detected between the diagnostic performance of CTA (AUC, 0.988) and MRA (AUC, 0.982). Meta-regression and subgroup analyses revealed no statistical difference. Deeks' funnel plots suggested publication bias. Conclusion The diagnostic performance of CTA and MRA for detecting stenosis of hemodialysis vascular access showed no statistical difference.
Both techniques may function as an alternative or an important complement to conventional DSA and may be able to help guide medical management. PMID:24194928
Statistical Literacy in the Data Science Workplace
ERIC Educational Resources Information Center
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN-coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage are described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties, and control system characteristics are included. This routine can be applied to many types of on-off reaction-controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
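A toy version of such a Monte Carlo fuel-statistics routine might look like the following; the disturbance magnitudes and the closed-form impulse model are hypothetical placeholders, not values or equations from the original FORTRAN program:

```python
import random
import statistics

def fuel_per_trial(rng):
    """One Monte Carlo trial: sample the disturbance sources and return
    propellant mass from a closed-form impulse estimate. All numbers
    here are hypothetical placeholders."""
    thrust_misalign = abs(rng.gauss(0.0, 0.002))  # rad, motor thrust misalignment
    unbalance = abs(rng.gauss(0.0, 0.008))        # effective rad, static unbalance
    aero = abs(rng.gauss(0.0, 1.5))               # N*m, aerodynamic disturbance
    thrust, burn_time, lever_arm, isp_g = 2000.0, 300.0, 1.0, 2200.0
    torque = thrust * (thrust_misalign + unbalance) + aero  # N*m disturbance torque
    impulse = torque * burn_time / lever_arm                # N*s of correction
    return impulse / isp_g                                  # kg of propellant

rng = random.Random(42)
fuel = sorted(fuel_per_trial(rng) for _ in range(10000))
# Summary statistics in place of the program's output histograms
print(statistics.mean(fuel), fuel[int(0.99 * len(fuel))])  # mean and 99th percentile
```

The closed-form impulse estimate is what makes each trial cheap, so many thousands of trials can be run to build the fuel-consumption histogram.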
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. 
Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
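The Getis-Ord Gi* statistic at the heart of the hotspot step can be computed directly from its definition; a minimal sketch on a toy 1-D row of "tracts" with binary weights (illustrative only, not the study's actual spatial weighting scheme):

```python
import math

def getis_ord_gi_star(values, i, neighbors):
    """Getis-Ord Gi* z-score for unit i using binary weights over
    `neighbors`, which must include i itself (the 'star' variant)."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    wi = len(neighbors)      # sum of binary weights
    si2 = wi                 # sum of squared binary weights (0/1 weights)
    num = sum(values[j] for j in neighbors) - xbar * wi
    den = s * math.sqrt((n * si2 - wi ** 2) / (n - 1))
    return num / den

# Toy 1-D row of 9 "census tracts"; tract 4 sits in a high-count cluster
counts = [2, 3, 2, 40, 55, 42, 2, 3, 2]
hot = getis_ord_gi_star(counts, 4, [3, 4, 5])
cold = getis_ord_gi_star(counts, 1, [0, 1, 2])
print(hot > 1.96 > cold)  # the cluster tract exceeds the 5% z threshold
```

In the study, the same logic runs over 1,464 tracts with distance-based neighborhoods chosen via incremental spatial autocorrelation, and significant positive z-scores mark the hotspot clusters.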
Statistical Application and Cost Saving in a Dental Survey.
Chyou, Po-Huang; Schroeder, Dixie; Schwei, Kelsey; Acharya, Amit
2017-06-01
To effectively achieve a robust survey response rate in a timely manner, an alternative approach to survey distribution, informed by statistical modeling, was applied to efficiently and cost-effectively achieve the targeted rate of return. A prospective environmental scan surveying adoption of health information technology within dental practices was undertaken in a national pool of dental professionals (N=8000) using an alternative method of sampling. The piloted approach targeted 400 completed surveys from randomly selected eligible providers, who were contacted through replicated subsampling of mailed surveys. Two replicated subsample mailings (n=1000 surveys/mailing) were undertaken to project the true response rate and estimate the total number of surveys required to achieve the final target. Cost-effectiveness and non-response bias analyses were performed. The final mailing required approximately 24% fewer mailings compared to targeting the entire cohort, with a final survey capture exceeding the expected target. An estimated $5000 in cost savings was projected by applying the alternative approach. Non-response analyses found no evidence of bias relative to demographics, practice demographics, or topically related survey questions. The outcome of this pilot study suggests that this approach to survey studies will accomplish targeted enrollment in a cost-effective manner. Future studies are needed to validate this approach in the context of other survey studies. © 2017 Marshfield Clinic.
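The projection logic behind replicated subsampling reduces to simple arithmetic: estimate the response rate from the pilot mailings, then mail only enough surveys to hit the target. A sketch with hypothetical counts (not the study's actual pilot figures):

```python
import math

def mailings_needed(target_returns, pilot_sent, pilot_returned):
    """Project total mailings required to hit `target_returns`, given
    the response rate observed in replicated pilot subsamples."""
    rate = pilot_returned / pilot_sent
    return math.ceil(target_returns / rate)

# Hypothetical: two pilot mailings of 1,000 surveys return 132 completed surveys
total = mailings_needed(target_returns=400, pilot_sent=2000, pilot_returned=132)
full_cohort = 8000
print(total, round(1 - total / full_cohort, 2))  # 6061 mailings, ~24% fewer than 8000
```

With these illustrative numbers the projected savings fraction lands near the ~24% reduction the abstract reports; the real study would also pad the projection for sampling error in the pilot rate.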
Cano, Miguel Ángel; Sánchez, Mariana; Trepka, Mary Jo; Dillon, Frank R; Sheehan, Diana M; Rojas, Patria; Kanamori, Mariano J; Huang, Hui; Auf, Rehab; De La Rosa, Mario
2017-03-01
Identifying and understanding determinants of alcohol use behavior among Hispanic immigrants is an increasingly significant public health concern. Although prior research has examined associations of cultural stressors with alcohol use among Hispanics, few studies have tested these associations among recent adult immigrants. As such, this study aimed to examine (a) the association of immigration stress on alcohol use severity among recently immigrated Hispanic adults (≤ 1 year in the United States) and (b) the moderating effects of gender, immigration status, and social support. A hierarchical multiple regression and moderation analyses were conducted on a sample of 527 participants in South Florida. Results indicated that, after controlling for demographic variables, preimmigration drinking behavior, and dimensions of social support, the association of higher immigration stress with higher alcohol use severity was statistically significant. Moderation analyses indicated that immigration stress had a statistically significant association with alcohol use severity among men, but not women. Also, dimensions of social support consistently reduced the deleterious effect of immigration stress on alcohol use severity. This study adds to the scarce literature on cultural stressors and alcohol use among recent Hispanic immigrants. Findings suggest that it may be important to design gender-specific interventions and that increasing levels of social support may offset the effects of immigration stress on alcohol use. © 2016 Wiley Periodicals, Inc.
Wöhl, C; Siebert, H; Blättner, B
2017-08-01
Among residents of nursing homes, physical activity might be beneficial in maintaining health-related quality of life, because impairment is caused in particular by functional decline. The aim was to evaluate the effectiveness of universal preventive interventions directed at increasing physical activity on activities of daily living in nursing home residents. Relevant studies were identified through database searches in MEDLINE, the Cochrane Library, EMBASE, CINAHL, PsycINFO and PEDro. Two review authors independently selected articles, assessed the risk of bias and extracted data. Results were combined in random effects meta-analyses. Across the 14 included primary studies, nursing home residents participating in physical activities showed statistically significantly greater physical functioning compared to controls (standardized mean difference [SMD] = 0.48, 95% confidence interval [95% CI] 0.26-0.71, p < 0.0001). Subgroup analyses suggest that especially nursing home residents with severe physical and cognitive impairment might benefit from participation in physical activities. Results after non-training periods substantiate the necessity of sustained implementation. Due to the high risk of bias in the included studies, the results must be interpreted with caution. Physical activity for nursing home residents can be effective. Considering the low-quality evidence, performance of high-quality studies is essential in order to verify the statistical results.
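Random-effects pooling of standardized mean differences, as used in such meta-analyses, is commonly done with the DerSimonian-Laird method; a minimal sketch with toy effect sizes and variances (not the review's actual study-level data):

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes;
    returns (pooled estimate, 95% CI lower, 95% CI upper)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # heterogeneity Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = 1.0 / math.sqrt(sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Toy SMDs and variances for four hypothetical trials
smd, lo, hi = random_effects_pool([0.3, 0.6, 0.5, 0.4], [0.04, 0.05, 0.03, 0.06])
print(round(smd, 2), round(lo, 2), round(hi, 2))
```

When between-study heterogeneity (τ²) is zero, the result collapses to the fixed-effect inverse-variance estimate; otherwise the weights are flattened toward equality.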
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erdmann, Christine A.; Apte, Michael G.
Using the US EPA 100-office-building BASE Study dataset, the authors conducted multivariate logistic regression analyses to quantify the relationship between indoor CO₂ concentrations (dCO₂) and mucous membrane (MM) and lower respiratory system (LResp) building-related symptoms, adjusting for age, sex, smoking status, presence of carpet in the workspace, thermal exposure, relative humidity, and a marker for entrained automobile exhaust. In addition, they tested the hypothesis that certain environmentally mediated health conditions (e.g., allergies and asthma) confer increased susceptibility to building-related symptoms within office buildings. Adjusted odds ratios (ORs) for statistically significant, dose-dependent associations (p < 0.05) of dry eyes, sore throat, nose/sinus congestion, and wheeze symptoms with 100 ppm increases in dCO₂ ranged from 1.1 to 1.2. These results suggest that increases in the ventilation rates per person among typical office buildings will, on average, reduce the prevalence of several building-related symptoms by up to 70%, even when these buildings meet the existing ASHRAE ventilation standards for office buildings. Building occupants with certain environmentally mediated health conditions are more likely to experience building-related symptoms than those without these conditions (statistically significant ORs ranged from 2 to 11).
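One way to see how per-100-ppm odds ratios of 1.1-1.2 can translate into a symptom reduction "up to 70%" is to compound them multiplicatively over a larger dCO₂ change. A hedged illustration with a hypothetical 600 ppm reduction; the multiplicative-compounding assumption is ours, not necessarily the paper's exact model:

```python
def compounded_odds_reduction(or_per_100ppm, delta_ppm):
    """Fractional reduction in symptom odds when dCO2 drops by
    `delta_ppm`, assuming the per-100-ppm odds ratio compounds
    multiplicatively across the whole change."""
    steps = delta_ppm / 100.0
    return 1.0 - or_per_100ppm ** (-steps)

# Hypothetical 600 ppm ventilation-driven reduction at the upper reported OR of 1.2
print(round(compounded_odds_reduction(1.2, 600), 2))  # 0.67, i.e. roughly 70% lower odds
```

For small baseline prevalences the odds reduction approximates the prevalence reduction, which is consistent in magnitude with the abstract's "up to 70%" figure.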
Thomas, Jennifer J; Vartanian, Lenny R; Brownell, Kelly D
2009-05-01
Eating disorder not otherwise specified (EDNOS) is the most prevalent eating disorder (ED) diagnosis. In this meta-analysis, the authors aimed to inform Diagnostic and Statistical Manual of Mental Disorders revisions by comparing the psychopathology of EDNOS with that of the officially recognized EDs: anorexia nervosa (AN), bulimia nervosa (BN), and binge eating disorder (BED). A comprehensive literature search identified 125 eligible studies (published and unpublished) appearing in the literature from 1987 to 2007. Random effects analyses indicated that whereas EDNOS did not differ significantly from AN and BED on eating pathology or general psychopathology, BN exhibited greater eating and general psychopathology than EDNOS. Moderator analyses indicated that EDNOS groups who met all diagnostic criteria for AN except for amenorrhea did not differ significantly from full syndrome cases. Similarly, EDNOS groups who met all criteria for BN or BED except for binge frequency did not differ significantly from full syndrome cases. Results suggest that EDNOS represents a set of disorders associated with substantial psychological and physiological morbidity. Although certain EDNOS subtypes could be incorporated into existing Diagnostic and Statistical Manual of Mental Disorders (4th ed.; American Psychiatric Association, 1994) categories, others, such as purging disorder and non-fat-phobic AN, may be best conceptualized as distinct syndromes. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
On the reliable probing of discrete ‘plasma bullet’ propagation
NASA Astrophysics Data System (ADS)
Svarnas, P.; Gazeli, K.; Gkelios, A.; Amanatides, E.; Mataras, D.
2018-04-01
This report is devoted to the imaging of the spatiotemporal evolution of ‘plasma bullets’ during their propagation at atmospheric pressure. Although numerous studies have been realized on this topic with high gating rate cameras, triggering issues and statistical analyses of single-shot events over different cycles of the driving high voltage have not been discussed properly. The present work demonstrates the related difficulties faced due to the inherently erratic propagation of the bullets. A way of capturing and statistically analysing discrete bullet events is introduced, which is reliable even when low gating rate cameras are used and multiple bullets are formed within the voltage cycle. The method is based on plasma observations by means of two photoelectron multiplier tubes. It is suggested that these signals correlate better with bullet propagation events than the driving voltage or bullet current waveforms do, and allow either the elimination of issues arising from erratic propagation and hardware delays or at least the quantification of certain uncertainties. Herein, the entire setup, the related concept and the limits of accuracy are discussed in detail. Snapshots of the bullets are captured and commented on, with the bullets being produced by a sinusoidally driven single-electrode plasma jet reactor operating with helium. Finally, the instantaneous velocities of bullets on the order of 10⁴-10⁵ m s⁻¹ are measured and propagation phases are distinguished in good agreement with the bibliography.
Statistical Application and Cost Saving in a Dental Survey
Chyou, Po-Huang; Schroeder, Dixie; Schwei, Kelsey; Acharya, Amit
2017-01-01
Objective To achieve a robust survey response rate in a timely manner, an alternative approach to survey distribution, informed by statistical modeling, was applied to reach the targeted rate of return efficiently and cost-effectively. Design A prospective environmental scan surveying the adoption of health information technology in dental practices was undertaken in a national pool of dental professionals (N=8000) using an alternative sampling method. The piloted cohort-sampling approach targeted 400 completed surveys from randomly selected eligible providers, who were contacted through replicated subsampling using mailed surveys. Methods Two replicated subsample mailings (n=1000 surveys/mailing) were undertaken to project the true response rate and estimate the total number of surveys required to achieve the final target. Cost-effectiveness and non-response bias analyses were performed. Results The final mailing required approximately 24% fewer mailings than targeting the entire cohort, with a final survey capture exceeding the expected target. An estimated $5000 in cost savings was projected from the alternative approach. Non-response analyses found no evidence of bias with respect to respondent demographics, practice demographics, or topically related survey questions. Conclusion The outcome of this pilot study suggests that this approach to survey studies will accomplish targeted enrollment in a cost-effective manner. Future studies are needed to validate this approach in the context of other survey studies. PMID:28373286
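The projection step, using pilot subsample mailings to estimate how many total surveys must be sent to hit a target number of completes, can be sketched as follows. The pilot counts below are hypothetical, not the study's figures:

```python
import math

def mailings_needed(target, pilot_sent, pilot_returned, margin=0.0):
    """Project the number of surveys to mail to reach a target number of
    completed returns, given the response rate observed in pilot mailings.
    An optional safety margin inflates the projection."""
    rate = pilot_returned / pilot_sent
    return math.ceil(target * (1 + margin) / rate)

# Hypothetical pilot: two 1000-survey mailings returned 330 completes in total
n = mailings_needed(target=400, pilot_sent=2000, pilot_returned=330)
```

Any shortfall between the projected and full-cohort mailing counts translates directly into printing and postage savings, which is the mechanism behind the cost reduction the study reports.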
Wright, Aidan G C; Simms, Leonard J
2014-01-01
The current study examines the relations among contemporary models of pathological and normal range personality traits. Specifically, we report on (a) conjoint exploratory factor analyses of the Computerized Adaptive Test of Personality Disorder static form (CAT-PD-SF) with the Personality Inventory for the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (PID-5) and the NEO Personality Inventory-3 First Half (NEO-PI-3FH), and (b) unfolding hierarchical analyses of the three measures in a large general psychiatric outpatient sample (n = 628; 64% Female). A five-factor solution provided conceptually coherent alignment among the CAT-PD-SF, PID-5, and NEO-PI-3FH scales. Hierarchical solutions suggested that higher-order factors bear strong resemblance to dimensions that emerge from structural models of psychopathology (e.g., Internalizing and Externalizing spectra). These results demonstrate that the CAT-PD-SF adheres to the consensual structure of broad trait domains at the five-factor level. Additionally, patterns of scale loadings further inform questions of structure and bipolarity of facet and domain level constructs. Finally, hierarchical analyses strengthen the argument for using broad dimensions that span normative and pathological functioning to scaffold a quantitatively derived phenotypic structure of psychopathology to orient future research on explanatory, etiological, and maintenance mechanisms.
Genetic co-structuring in host-parasite systems: Empirical data from raccoons and raccoon ticks
Dharmarajan, Guha; Beasley, James C.; Beatty, William S.; ...
2016-03-31
Many aspects of parasite biology critically depend on their hosts, and understanding how host-parasite populations are co-structured can help improve our understanding of the ecology of parasites, their hosts, and host-parasite interactions. Here, this study utilized genetic data collected from raccoons (Procyon lotor), and a specialist parasite, the raccoon tick (Ixodes texanus), to test for genetic co-structuring of host-parasite populations at both landscape and host scales. At the landscape scale, our analyses revealed a significant correlation between genetic and geographic distance matrices (i.e., isolation by distance) in ticks, but not their hosts. While there are several mechanisms that could lead to a stronger pattern of isolation by distance in tick vs. raccoon datasets, our analyses suggest that at least one reason for the above pattern is the substantial increase in statistical power (due to the ≈8-fold increase in sample size) afforded by sampling parasites. Host-scale analyses indicated higher relatedness between ticks sampled from related vs. unrelated raccoons trapped within the same habitat patch, a pattern likely driven by increased contact rates between related hosts. Lastly, by utilizing fine-scale genetic data from both parasites and hosts, our analyses help improve our understanding of epidemiology and host ecology.
Waits, L P; Sullivan, J; O'Brien, S J; Ward, R H
1999-10-01
The bear family (Ursidae) presents a number of phylogenetic ambiguities as the evolutionary relationships of the six youngest members (ursine bears) are largely unresolved. Recent mitochondrial DNA analyses have produced conflicting results with respect to the phylogeny of ursine bears. In an attempt to resolve these issues, we obtained 1916 nucleotides of mitochondrial DNA sequence data from six gene segments for all eight bear species and conducted maximum likelihood and maximum parsimony analyses on all fragments separately and combined. All six single-region gene trees gave different phylogenetic estimates; however, only for control region data was this significantly incongruent with the results from the combined data. The optimal phylogeny for the combined data set suggests that the giant panda is most basal followed by the spectacled bear. The sloth bear is the basal ursine bear, and there is weak support for a sister taxon relationship of the American and Asiatic black bears. The sun bear is sister taxon to the youngest clade containing brown bears and polar bears. Statistical analyses of alternate hypotheses revealed a lack of strong support for many of the relationships. We suggest that the difficulties surrounding the resolution of the evolutionary relationships of the Ursidae are linked to the existence of sequential rapid radiation events in bear evolution. Thus, unresolved branching orders during these time periods may represent an accurate representation of the evolutionary history of bear species. Copyright 1999 Academic Press.
Evolution of the oligopeptide transporter family.
Gomolplitinant, Kenny M; Saier, Milton H
2011-03-01
The oligopeptide transporter (OPT) family of peptide and iron-siderophore transporters includes members from both prokaryotes and eukaryotes but with restricted distribution in the latter domain. Eukaryotic members were found only in fungi and plants with a single slime mold homologue clustering with the fungal proteins. All functionally characterized eukaryotic peptide transporters segregate from the known iron-siderophore transporters on a phylogenetic tree. Prokaryotic members are widespread, deriving from many different phyla. Although they belong only to the iron-siderophore subdivision, genome context analyses suggest that many of them are peptide transporters. OPT family proteins have 16 or occasionally 17 transmembrane-spanning α-helical segments (TMSs). We provide statistical evidence that the 16-TMS topology arose via three sequential duplication events followed by a gene-fusion event for proteins with a seventeenth TMS. The proposed pathway is as follows: 2 TMSs → 4 TMSs → 8 TMSs → 16 TMSs → 17 TMSs. The seventeenth C-terminal TMS, which probably arose just once, is found in just one phylogenetic group of these homologues. Analyses for orthology revealed that a few phylogenetic clusters consist exclusively of orthologues but most have undergone intermixing, suggestive of horizontal transfer. It appears that in this family horizontal gene transfer was frequent among prokaryotes, rare among eukaryotes and largely absent between prokaryotes and eukaryotes as well as between plants and fungi. These observations provide guides for future structural and functional analyses of OPT family members.
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Knorr, Ulla; Koefoed, Pernille; Soendergaard, Mia H Greisen; Vinberg, Maj; Gether, Ulrik; Gluud, Christian; Wetterslev, Jørn; Winkel, Per; Kessing, Lars V
2016-04-01
Brain-derived neurotrophic factor (BDNF) seems to play an important role in the course of depression including the response to antidepressants in patients with depression. We aimed to study the effect of an antidepressant intervention on peripheral BDNF in healthy individuals with a family history of depression. We measured changes in BDNF messenger RNA (mRNA) expression and whole-blood BDNF levels in 80 healthy first-degree relatives of patients with depression randomly allocated to receive daily tablets of escitalopram 10 mg versus placebo for 4 weeks. We found no statistically significant difference between the escitalopram and the placebo group in the change in BDNF mRNA expression and whole-blood BDNF levels. Post hoc analyses showed a statistically significant negative correlation between plasma escitalopram concentration and change in whole-blood BDNF levels in the escitalopram-treated group. The results of this randomised trial suggest that escitalopram 10 mg has no effect on peripheral BDNF levels in healthy individuals.
Teixeira, R D; Scheltinga, D M; Trauth, S E; Colli, G R; Báo, S N
2002-06-01
The ultrastructure of the spermatozoa of Cnemidophorus gularis gularis, Cnemidophorus ocellifer, and Kentropyx altamazonica is described for the first time. Mature spermatozoa of Cnemidophorus spp. and K. altamazonica differ in the occurrence of a perforatorial base plate, the enlargement of axonemal fibers 3 and 8, and the shape of mitochondria. Comparisons of the sperm ultrastructure of Cnemidophorus spp. and K. altamazonica with that of Ameiva ameiva [J. Morphol. (2002) in press] suggest that Ameiva and Cnemidophorus are more similar to each other than either is to Kentropyx. Statistical analyses reveal that the sperm of all three species studied differ significantly in the following dimensions: head, acrosome, and distal centriole length, and nuclear shoulders width. No variable differed statistically between the two Cnemidophorus species alone. The lengths of the tail, midpiece, entire sperm, and nuclear rostrum are significantly different between K. altamazonica and Cnemidophorus spp. Our results indicate that sperm ultrastructure exhibits intra- and intergeneric variability.
Effect of tulle on the mechanical properties of a maxillofacial silicone elastomer.
Gunay, Yumushan; Kurtoglu, Cem; Atay, Arzu; Karayazgan, Banu; Gurbuz, Cihan Cem
2008-11-01
The purpose of this research was to investigate if physical properties could be improved by incorporating a tulle reinforcement material into a maxillofacial silicone elastomer. A-2186 silicone elastomer was used in this study. The study group consisted of 20 elastomer specimens incorporated with tulle and fabricated in dumbbell-shaped silicone patterns using ASTM D412 and D624 standards. The control group consisted of 20 elastomer specimens fabricated without tulle. Tensile strength, ultimate elongation, and tear strength of all specimens were measured and analyzed. Statistical analyses were performed using the Mann-Whitney U test with statistical significance at the 95% confidence level. It was found that the tensile and tear strengths of the tulle-incorporated maxillofacial silicone elastomer were higher than those without tulle incorporation (p < 0.05). Therefore, findings of this study suggested that tulle successfully reinforced a maxillofacial silicone elastomer by providing it with better mechanical properties and augmented strength, especially for the delicate edges of maxillofacial prostheses.
NASA Astrophysics Data System (ADS)
Afifah, M. R. Nurul; Aziz, A. Che; Roslan, M. Kamal
2015-09-01
Sediment samples were collected from the shallow marine area off Kuala Besar, Kelantan, outwards to the basin floor of the South China Sea, which consisted of Quaternary bottom sediments. Sixty-five samples were analysed for their grain-size distributions and statistical relationships. Basic statistical parameters such as mean, standard deviation, skewness and kurtosis were calculated and used to differentiate the depositional environment of the sediments and to assess whether deposition was uniform between beach and river environments. The sediments of all areas varied in sorting from very well sorted to poorly sorted, from strongly negatively skewed to strongly positively skewed, and from extremely leptokurtic to very platykurtic in nature. Bivariate plots between the grain-size parameters were then interpreted, and the Coarsest-Median (CM) pattern showed a trend suggesting that the sediments are influenced by three ongoing hydrodynamic factors, namely turbidity currents, littoral drift and wave dynamics, which control the sediment distribution pattern in various ways.
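The four grain-size parameters named above can be computed by the method of moments on phi-scale sizes. A minimal sketch with a hypothetical sample (the phi values are invented for illustration, not from the study):

```python
import math

def moment_statistics(phi):
    """Method-of-moments grain-size statistics from phi-scale values:
    mean, standard deviation (sorting), skewness, and kurtosis."""
    n = len(phi)
    mean = sum(phi) / n
    var = sum((x - mean) ** 2 for x in phi) / n
    sd = math.sqrt(var)  # sorting: larger values mean more poorly sorted
    skew = sum((x - mean) ** 3 for x in phi) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in phi) / (n * sd ** 4)
    return mean, sd, skew, kurt

# Hypothetical phi sizes for one sediment sample (illustration only)
mean, sorting, skewness, kurtosis = moment_statistics([1.0, 1.5, 2.0, 2.5, 3.0, 4.5])
```

A positive skewness here indicates a tail of fine (high-phi) material, and a sorting value above 1 phi would class the sample as poorly sorted, matching the kind of variability the abstract describes.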
The Matching Relation and Situation-Specific Bias Modulation in Professional Football Play Selection
Stilling, Stephanie T; Critchfield, Thomas S
2010-01-01
The utility of a quantitative model depends on the extent to which its fitted parameters vary systematically with environmental events of interest. Professional football statistics were analyzed to determine whether play selection (passing versus rushing plays) could be accounted for with the generalized matching equation, and in particular whether variations in play selection across game situations would manifest as changes in the equation's fitted parameters. Statistically significant changes in bias were found for each of five types of game situations; no systematic changes in sensitivity were observed. Further analyses suggested relationships between play selection bias and both turnover probability (which can be described in terms of punishment) and yards-gained variance (which can be described in terms of variable-magnitude reinforcement schedules). The present investigation provides a useful demonstration of association between face-valid, situation-specific effects in a domain of everyday interest, and a theoretically important term of a quantitative model of behavior. Such associations, we argue, are an essential focus in translational extensions of quantitative models. PMID:21119855
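The generalized matching equation fitted above takes the form log(B1/B2) = a*log(r1/r2) + log b, where a is sensitivity and log b is bias. A minimal least-squares sketch in log space, using invented play-selection ratios rather than the paper's football data:

```python
import math

def fit_generalized_matching(behavior_ratios, reinforcer_ratios):
    """Least-squares fit of the generalized matching equation
    log(B1/B2) = a * log(r1/r2) + log(b).
    Returns sensitivity a and bias log(b)."""
    xs = [math.log10(r) for r in reinforcer_ratios]
    ys = [math.log10(b) for b in behavior_ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_b = my - a * mx
    return a, log_b

# Hypothetical pass/rush ratios across four situations (illustration only)
a, log_b = fit_generalized_matching([1.8, 1.2, 0.9, 0.6], [2.0, 1.3, 0.8, 0.5])
```

In this sketch a < 1 corresponds to undermatching, and a nonzero log b would indicate a situation-specific bias of the kind the study found to vary across game situations.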
Assessing knowledge on fibromyalgia among Internet users.
Moretti, Felipe Azevedo; Heymann, Roberto Ezequiel; Marvulle, Valdecir; Pollak, Daniel Feldman; Riera, Rachel
2011-01-01
To assess knowledge on fibromyalgia in a sample of patients, their families, and professionals interested in the theme from some Brazilian states. Analysis of the results of an electronic fibromyalgia knowledge questionnaire completed by 362 adults who had access to the support group for fibromyalgia site (www.unifesp.br/grupos/fibromialgia). The answers were grouped according to age, sex, years of schooling, and type of interest in the condition. Of the responders, 92% were women and 62% had a higher educational level. The worst results were observed in the "joint protection and energy conservation" domain, followed by the "medication in fibromyalgia" domain. The best results were recorded in the "exercises in fibromyalgia" domain. The answers differed significantly between sexes, with women achieving a higher percentage of correct answers. The female sex accounted for a statistically superior result in five statistical analyses (four questions and one domain). The study suggests the need for strategic planning of an educational approach to fibromyalgia in Brazil.
Passive smoking: directions for health education among Malaysian college students.
Kurtz, M E; Johnson, S M; Ross-Lee, B
1992-01-01
This study investigated knowledge, attitudes, and preventive efforts of Malaysian college students regarding health risks associated with passive smoking, as well as possible directions for intervention and health education programs. Students responded anonymously to a structured written questionnaire. Statistical analyses were conducted to examine (1) differences in knowledge, attitudes, and preventive efforts between smokers and nonsmokers and between men and women; (2) the relationship between smoking by parents, siblings, and friends, and students' knowledge, attitudes, and preventive efforts; and (3) relationships between knowledge, attitudes, and preventive efforts. Peer groups and siblings had a substantial influence on students' attitudes toward passive smoking and their preventive efforts when exposed to passive smoke. A regression analysis revealed a statistically significant linear dependence of preventive efforts on knowledge and attitudes, with the attitude component playing the dominant role. This research suggests that educational efforts on passive smoking, directed toward young college students in developing countries such as Malaysia, should concentrate heavily on changing attitudes and reducing the effects of peer group and sibling influences.
Huang, Rong; Tian, Sai; Cai, Rongrong; Sun, Jie; Xia, Wenqing; Dong, Xue; Shen, Yanjue; Wang, Shaohua
2017-08-01
Saitohin (STH) Q7R polymorphism has been reported to influence the individual's susceptibility to Alzheimer's disease (AD); however, conclusions remain controversial. Therefore, we performed this meta-analysis to explore the association between STH Q7R polymorphism and AD risk. Systematic literature searches were performed in the PubMed, Embase, Cochrane Library and Web of Science for studies published before 31 August 2016. Pooled odds ratios (ORs) and 95% confidence intervals (CIs) were calculated to assess the strength of the association using a fixed- or random-effects model. Subgroup analyses, Galbraith plot and sensitivity analyses were also performed. All statistical analyses were performed with STATA Version 12.0. A total of 19 case-control studies from 17 publications with 4387 cases and 3972 controls were included in our meta-analysis. The results showed that the Q7R polymorphism was significantly associated with an increased risk of AD in a recessive model (RR versus QQ+QR, OR = 1.27, 95% CI = 1.01-1.60, P = 0.040). After excluding the four studies not carried out in Caucasians, the overall association was unchanged in all comparison models. Further subgroup analyses stratified by the time of AD onset, and the quality of included studies provided statistical evidence of significant increased risk of AD in RR versus QQ+QR model only in late-onset subjects (OR = 1.56, 95% CI = 1.07-2.26, P = 0.021) and in studies with high quality (OR = 1.37, 95% CI = 1.01-1.86, P = 0.043). This meta-analysis suggests that the RR genotype in saitohin Q7R polymorphism may be a human-specific risk factor for AD, especially among late-onset AD subjects and Caucasian populations. © 2017 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.
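The per-study ORs pooled in such a meta-analysis each come from a 2x2 table of genotype counts under the recessive model (RR versus QQ+QR). A minimal sketch with a Woolf (log-scale) confidence interval, using invented counts rather than any included study's data:

```python
import math

def odds_ratio(case_rr, case_other, ctrl_rr, ctrl_other):
    """Odds ratio for a recessive genetic model (RR vs QQ+QR) from a 2x2
    table of counts, with a Woolf 95% confidence interval."""
    orr = (case_rr * ctrl_other) / (case_other * ctrl_rr)
    se = math.sqrt(1 / case_rr + 1 / case_other + 1 / ctrl_rr + 1 / ctrl_other)
    lo = math.exp(math.log(orr) - 1.96 * se)
    hi = math.exp(math.log(orr) + 1.96 * se)
    return orr, lo, hi

# Hypothetical genotype counts (illustration only)
orr, lo, hi = odds_ratio(case_rr=60, case_other=340, ctrl_rr=40, ctrl_other=360)
```

A lower CI bound above 1.0 corresponds to a statistically significant increase in risk at the 0.05 level; the meta-analysis then combines such log-ORs with inverse-variance weights.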
Arizpe, Joseph; Kravitz, Dwight J; Walsh, Vincent; Yovel, Galit; Baker, Chris I
2016-01-01
The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results from the AOI analyses, reflecting how, in certain contexts, AOI analyses can be more sensitive in detecting differential fixation patterns than spatial density analyses, due to spatial pooling of data within AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by the differences in the statistical sensitivity associated with the different analysis methods employed across studies.
Overall, our results suggest that detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis.
Alcohol Control Policies and Alcohol Consumption by Youth: A Multi-National Study
Paschall, Mallie J.; Grube, Joel W.; Kypri, Kypros
2009-01-01
Aims The study examined relationships between alcohol control policies and adolescent alcohol use in 26 countries. Design Cross-sectional analyses of alcohol policy ratings based on the Alcohol Policy Index (API), per capita consumption, and national adolescent survey data. Setting Data are from 26 countries. Participants Adolescents (15-17 years old) who participated in the 2003 ESPAD (European countries) or national secondary school surveys in Spain, Canada, Australia, New Zealand and the USA. Measurements Alcohol control policy ratings based on the API; prevalence of alcohol use, heavy drinking, and first drink by age 13 based on national secondary school surveys; per capita alcohol consumption for each country in 2003. Analysis Correlational and linear regression analyses were conducted to examine relationships between alcohol control policy ratings and past-30-day prevalence of adolescent alcohol use, heavy drinking, and having first drink by age 13. Per capita consumption of alcohol was included as a covariate in regression analyses. Findings More comprehensive API ratings and alcohol availability and advertising control ratings were inversely related to the past-30-day prevalence of alcohol use and prevalence rates for drinking 3-5 times and 6 or more times in the past 30 days. Alcohol advertising control was also inversely related to the prevalence of past-30-day heavy drinking and having first drink by age 13. Most of the relationships between API, alcohol availability and advertising control and drinking prevalence rates were attenuated and no longer statistically significant when controlling for per capita consumption in regression analyses, suggesting that alcohol use in the general population may confound or mediate observed relationships between alcohol control policies and youth alcohol consumption. Several of the inverse relationships remained statistically significant when controlling for per capita consumption. 
Conclusions More comprehensive and stringent alcohol control policies, particularly policies affecting alcohol availability and marketing, are associated with lower prevalence and frequency of adolescent alcohol consumption and age of first alcohol use. PMID:19832785
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables which could impact the performance of the motors during the ignition transient and thirty-eight variables which could impact the performance of the motors during steady state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
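The Monte Carlo scheme described above, sampling many motor pairs with both shared (motor-to-motor) and independent (within-pair) variation and extracting an imbalance envelope, can be sketched as follows. The nominal thrust and variation magnitudes are hypothetical placeholders, not the study's ballistics-code outputs:

```python
import random
import statistics

def imbalance_envelope(n_pairs=1000, seed=1):
    """Monte Carlo sketch of a thrust-imbalance envelope: sample thrust for
    motor pairs with a shared lot-level perturbation plus independent
    within-pair perturbations, then take approximate 1st/99th percentiles."""
    random.seed(seed)
    nominal = 16.0e6  # hypothetical nominal thrust in newtons (illustration only)
    imbalances = []
    for _ in range(n_pairs):
        lot_shift = random.gauss(0.0, 0.01)  # variation shared by the pair
        t1 = nominal * (1.0 + lot_shift + random.gauss(0.0, 0.005))
        t2 = nominal * (1.0 + lot_shift + random.gauss(0.0, 0.005))
        imbalances.append(t1 - t2)
    qs = statistics.quantiles(imbalances, n=100)
    return qs[0], qs[-1]  # approximate 1st and 99th percentiles

low, high = imbalance_envelope()
```

Note that the shared lot-level term cancels in the pair difference, so only the within-pair variation drives the envelope width; the envelope would then be compared against the design-specification imbalance limits.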
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures attempt to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable across independent Latin hypercube samples.
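A rank correlation coefficient of the kind used in step (2) can be computed from scratch as the Pearson correlation of the ranks (Spearman's rho). The cubic test relationship below is an invented example, not data from the two-phase flow model:

```python
import random

def ranks(xs):
    """1-based average ranks; tied values receive the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# A monotonic but nonlinear relationship: the rank correlation captures
# the monotone trend that a linear coefficient partly misses.
random.seed(0)
x = [random.uniform(0.0, 3.0) for _ in range(200)]
y = [v ** 3 + random.gauss(0.0, 0.5) for v in x]
print(f"Pearson r    = {pearson(x, y):.3f}")
print(f"Spearman rho = {spearman(x, y):.3f}")
```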
Yang, Xin-Wei; Wang, Zhi-Ming; Jin, Tai-Yi
2006-05-01
This study assessed occupational stress across groups defined by gender, age, work duration, educational level and marital status. The revised Occupational Stress Inventory (OSI-R) was administered to 4278 participants. Males scored higher than females on occupational role, interpersonal strain and physical strain, and the differences were statistically significant (P < 0.01). The recreation score was higher in males, whereas the self-care score was higher in females; these differences were also statistically significant (P < 0.01). Differences in occupational role and personal resource scores among age groups were significant (P < 0.01), as were vocational and interpersonal strain scores (P < 0.05). Analyses by educational level showed statistically significant differences in occupational stress and strain scores (P < 0.05), whereas coping resources did not differ significantly among groups (P > 0.05). Different measures should be taken to reduce occupational stress and thereby improve the work ability of different groups.
Siren, J; Ovaskainen, O; Merilä, J
2017-10-01
The genetic variance-covariance matrix (G) is a quantity of central importance in evolutionary biology due to its influence on the rate and direction of multivariate evolution. However, the predictive power of empirically estimated G-matrices is limited for two reasons. First, phenotypes are high-dimensional, whereas traditional statistical methods are tuned to estimate and analyse low-dimensional matrices. Second, the stability of G to environmental effects and over time remains poorly understood. Using Bayesian sparse factor analysis (BSFG), designed to estimate high-dimensional G-matrices, we analysed levels of variation and covariation in 10,527 expressed genes in a large (n = 563) half-sib breeding design of three-spined sticklebacks subjected to two temperature treatments. We found significant differences in the structure of G between the treatments: heritabilities and evolvabilities were higher in the warm than in the low-temperature treatment, suggesting greater and faster capacity to evolve under warm (stressful) conditions. Furthermore, comparison of G and its phenotypic equivalent P revealed that the latter is a poor substitute for the former. Most strikingly, the results suggest that the expected impact of G on evolvability, as well as the similarity among G-matrices, may depend strongly on the number of traits included in the analyses. In our results, including only a few traits in the analyses leads to underestimation of the differences between the G-matrices and of their predicted impacts on evolution. While the results highlight the challenges involved in estimating G, they also illustrate that, by enabling the estimation of large G-matrices, the BSFG method can improve predicted evolutionary responses to selection. © 2017 John Wiley & Sons Ltd.
Jouet, Agathe; McMullan, Mark; van Oosterhout, Cock
2015-06-01
Plant immune genes, or resistance genes, are involved in a co-evolutionary arms race with a diverse range of pathogens. In agronomically important grasses, such R genes have been extensively studied because of their role in pathogen resistance and in the breeding of resistant cultivars. In this study, we evaluate the importance of recombination, mutation and selection on the evolution of the R gene complex Rp1 of Sorghum, Triticum, Brachypodium, Oryza and Zea. Analyses show that recombination is widespread, and we detected 73 independent instances of sequence exchange, involving on average 1567 of 4692 nucleotides analysed (33.4%). We were able to date 24 interspecific recombination events and found that four occurred postspeciation, which suggests that genetic introgression took place between different grass species. Other interspecific events seemed to have been maintained over long evolutionary time, suggesting the presence of balancing selection. Significant positive selection (i.e. a relative excess of nonsynonymous substitutions, dN/dS > 1) was detected in 17-95 codons (0.42-2.02%). Recombination was significantly associated with areas with high levels of polymorphism but not with an elevated dN/dS ratio. Finally, phylogenetic analyses show that recombination results in a general overestimation of the divergence time (mean = 14.3%) and an alteration of the gene tree topology if the tree is not calibrated. Given that the statistical power to detect recombination is determined by the level of polymorphism of the amplicon as well as the number of sequences analysed, it is likely that many studies have underestimated the importance of recombination relative to the mutation rate. © 2015 John Wiley & Sons Ltd.
Preliminary testing of flow-ecology hypotheses developed for the GCP LCC region
Brewer, Shannon K.; Davis, Mary
2014-01-01
The Ecological Limits of Hydrological Alteration (ELOHA) framework calls for the development of flow-ecology hypotheses to support protection of the flow regime from ecologically harmful alteration due to human activities. As part of a larger instream flow project for the Gulf Coast Prairie Landscape Conservation Cooperative (GCP LCC), regional flow-ecology hypotheses were developed for fish, mussels, birds, and riparian vegetation (Davis and Brewer 2014). The objective of this study was to assess the usefulness of existing ecological and hydrological data for testing these hypotheses or others that may be developed in the future. Several databases of biological collections and hydrologic data from Oklahoma, Texas, and Louisiana were compiled. State fish-community data from Oklahoma and Louisiana were summarized and paired with existing USGS gage data having at least a 40-year period of record that could be separated into reference and current conditions for comparison. The aim was not to conduct exhaustive analyses of these data, the hypotheses, or their interpretation, but rather to determine whether existing data were adequate to statistically test the regional flow-ecology hypotheses, which were developed for the GCP LCC by a committee chaired by Shannon Brewer and Mary Davis (Davis and Brewer 2014). Existing data were useful for informing the hypotheses and suggest support for some of them, but they also highlight the need for additional testing and development, as some results contradicted hypotheses. The results presented here suggest existing data are adequate to support some flow-ecology hypotheses; however, the lack of sampling effort reported with the fish collections and the need for ecoregion-specific analyses suggest more data would be beneficial in some ecoregions.
Additional fish sampling data from Texas and Louisiana will be available for future analyses and may ameliorate some of the data concerns and improve hypothesis interpretation. If the regional hydrologic model currently under development by the U.S. Geological Survey for the South-Central Climate Science Center is improved to produce daily hydrographs, it will enable use of fish data at ungaged locations. In future efforts, exhaustive analyses using these data, in addition to the development of more complex multivariate hypotheses, would be beneficial to understanding data gaps, particularly as relevant to species of conservation concern.
Rate, Andrew W
2018-06-15
Urban environments are dynamic and highly heterogeneous, and multiple additions of potential contaminants are likely on timescales that are short relative to natural processes. The likely sources and locations of soil or sediment contamination in urban environments should therefore be detectable using multielement geochemical composition combined with rigorously applied multivariate statistical techniques. Soil, wetland sediment, and street dust were sampled along intersecting transects in Robertson Park in metropolitan Perth, Western Australia. Samples were analysed for near-total concentrations of multiple elements (including Cd, Ce, Co, Cr, Cu, Fe, Gd, La, Mn, Nd, Ni, Pb, Y, and Zn), as well as pH and electrical conductivity. Samples at some locations within Robertson Park had high concentrations of potentially toxic elements (Pb above Health Investigation Limits; As, Ba, Cu, Mn, Ni, Pb, V, and Zn above Ecological Investigation Limits). However, these concentrations carry low risk due to the main land use as recreational open space, the low proportion of samples exceeding guideline values, and a tendency for the highest concentrations to be located within the less accessible wetland basin. The different spatial distributions of different groups of contaminants were consistent with different inputs of contaminants related to changes in land use and technology over the history of the site. Multivariate statistical analyses reinforced the spatial information, with principal component analysis identifying geochemical associations of elements that were also spatially related. A multivariate linear discriminant model was able to discriminate samples into a priori types, and could predict sample type with 84% accuracy based on multielement composition. The findings suggest substantial advantages to characterising a site using multielement and multivariate analyses, an approach that could benefit investigations of other sites of concern. Copyright © 2018 Elsevier B.V. All rights reserved.
Lajus, Dmitry; Sukhikh, Natalia; Alekseev, Victor
2015-01-01
Interest in cryptic species has increased significantly with recent progress in genetic methods. The large number of cryptic species suggests that the resolution of traditional morphological techniques may be insufficient for taxonomic research. However, some species now considered cryptic may, in fact, be designated pseudocryptic after close morphological examination. Thus the “cryptic or pseudocryptic” dilemma speaks to the resolution of morphological analysis and its utility for identifying species. We address this dilemma first by systematically reviewing data published from 1980 to 2013 on cryptic species of Copepoda and then by performing an in-depth morphological study of the former Eurytemora affinis complex of cryptic species. Analysis of the published data showed that, in 5 of 24 revisions eligible for systematic review, cryptic species assignment was based solely on the genetic variation of forms, without detailed morphological analysis to confirm the assignment. Therefore, some newly described cryptic species might be designated pseudocryptic under more detailed morphological analysis, as happened with the Eurytemora affinis complex. Recent genetic analyses of the complex found high levels of heterogeneity without morphological differences, so the complex was argued to be cryptic. However, subsequent detailed morphological analyses allowed several valid species to be described. Our study of this species complex, using in-depth statistical analyses not usually applied when describing new species, confirmed considerable differences between the former cryptic species. In particular, fluctuating asymmetry (FA), the random variation of left and right structures, differed significantly between forms and provided independent information about their status. Our work showed that multivariate statistical approaches, such as principal component analysis, can be powerful techniques for the morphological discrimination of cryptic taxa. Despite increasing cryptic species designations, morphological techniques retain great potential for copepod taxonomy. PMID:26120427
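As an illustration of the fluctuating-asymmetry measure discussed above, the sketch below separates signed left-right trait differences into directional asymmetry (their mean) and fluctuating asymmetry (their variance about that mean). The trait values and sample sizes are invented for the example, not the Eurytemora data:

```python
import random
import statistics

def fluctuating_asymmetry(left, right):
    """Split signed (L - R) differences into directional asymmetry (DA,
    the mean difference) and fluctuating asymmetry (FA, the variance of
    the differences around DA)."""
    d = [l - r for l, r in zip(left, right)]
    da = statistics.mean(d)
    fa = statistics.pvariance(d, mu=da)
    return da, fa

random.seed(3)
# Two hypothetical forms measured for the same bilateral trait
left_a = [random.gauss(5.0, 0.3) for _ in range(100)]
right_a = [l + random.gauss(0.0, 0.05) for l in left_a]   # low asymmetry noise
left_b = [random.gauss(5.0, 0.3) for _ in range(100)]
right_b = [l + random.gauss(0.0, 0.20) for l in left_b]   # higher asymmetry noise

da_a, fa_a = fluctuating_asymmetry(left_a, right_a)
da_b, fa_b = fluctuating_asymmetry(left_b, right_b)
print(f"form A: DA={da_a:+.4f}  FA={fa_a:.4f}")
print(f"form B: DA={da_b:+.4f}  FA={fa_b:.4f}")
```

In practice, measurement error must also be partitioned out before FA differences between forms can be interpreted; this sketch omits that step.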
Expression of Vascular Notch Ligand Delta-Like 4 and Inflammatory Markers in Breast Cancer
Jubb, Adrian M.; Soilleux, Elizabeth J.; Turley, Helen; Steers, Graham; Parker, Andrew; Low, Irene; Blades, Jennifer; Li, Ji-Liang; Allen, Paul; Leek, Russell; Noguera-Troise, Irene; Gatter, Kevin C.; Thurston, Gavin; Harris, Adrian L.
2010-01-01
Delta-like ligand 4 (Dll4) is a Notch ligand that is predominantly expressed in the endothelium. Evidence from xenografts suggests that inhibiting Dll4 may overcome resistance to antivascular endothelial growth factor therapy. The aims of this study were to characterize the expression of Dll4 in breast cancer and assess whether it is associated with inflammatory markers and prognosis. We examined 296 breast adenocarcinomas and 38 ductal carcinoma in situ tissues that were represented in tissue microarrays. Additional whole sections representing 10 breast adenocarcinomas, 10 normal breast tissues, and 16 angiosarcomas were included. Immunohistochemistry was then performed by using validated antibodies against Dll4, CD68, CD14, Dendritic Cell-Specific Intercellular adhesion molecule-3-Grabbing Non-integrin (DC-SIGN), CD123, neutrophil elastase, CD31, and carbonic anhydrase 9. Dll4 was selectively expressed by intratumoral endothelial cells in 73% to 100% of breast adenocarcinomas, 18% of in situ ductal carcinomas, and all lactating breast cases, but not normal nonlactating breast. High intensity of endothelial Dll4 expression was a statistically significant adverse prognostic factor in univariate (P = 0.002 and P = 0.01) and multivariate analyses (P = 0.03 and P = 0.04) of overall survival and relapse-free survival, respectively. Among the inflammatory markers, only CD68 and DC-SIGN were significant prognostic factors in univariate (but not multivariate) analyses of overall survival (P = 0.01 and 0.002, respectively). In summary, Dll4 was expressed by endothelium associated with breast cancer cells. In these retrospective subset analyses, endothelial Dll4 expression was a statistically significant multivariate prognostic factor. PMID:20167860
Dwan, Kerry; Altman, Douglas G.; Clarke, Mike; Gamble, Carrol; Higgins, Julian P. T.; Sterne, Jonathan A. C.; Williamson, Paula R.; Kirkham, Jamie J.
2014-01-01
Background Most publications about selective reporting in clinical trials have focussed on outcomes. However, selective reporting of analyses for a given outcome may also affect the validity of findings. If analyses are selected on the basis of the results, reporting bias may occur. The aims of this study were to review and summarise the evidence from empirical cohort studies that assessed discrepant or selective reporting of analyses in randomised controlled trials (RCTs). Methods and Findings A systematic review was conducted and included cohort studies that assessed any aspect of the reporting of analyses of RCTs by comparing different trial documents, e.g., protocol compared to trial report, or different sections within a trial publication. The Cochrane Methodology Register, Medline (Ovid), PsycInfo (Ovid), and PubMed were searched on 5 February 2014. Two authors independently selected studies, performed data extraction, and assessed the methodological quality of the eligible studies. Twenty-two studies (containing 3,140 RCTs) published between 2000 and 2013 were included. Twenty-two studies reported on discrepancies between information given in different sources. Discrepancies were found in statistical analyses (eight studies), composite outcomes (one study), the handling of missing data (three studies), unadjusted versus adjusted analyses (three studies), handling of continuous data (three studies), and subgroup analyses (12 studies). Discrepancy rates varied, ranging from 7% (3/42) to 88% (7/8) in statistical analyses, 46% (36/79) to 82% (23/28) in adjusted versus unadjusted analyses, and 61% (11/18) to 100% (25/25) in subgroup analyses. This review is limited in that none of the included studies investigated the evidence for bias resulting from selective reporting of analyses. It was not possible to combine studies to provide overall summary estimates, and so the results of studies are discussed narratively. 
Conclusions Discrepancies in analyses between publications and other study documentation were common, but the reasons for these discrepancies were not discussed in the trial reports. To ensure transparency, protocols and statistical analysis plans need to be published, and investigators should adhere to these or explain any discrepancies. PMID:24959719
Escorza-Treviño, S; Dizon, A E
2000-08-01
Mitochondrial DNA (mtDNA) control-region sequences and microsatellite loci length polymorphisms were used to estimate phylogeographical patterns (historical patterns underlying contemporary distribution), intraspecific population structure and gender-biased dispersal of Phocoenoides dalli dalli across its entire range. One hundred and thirteen animals from several geographical strata were sequenced over 379 bp of mtDNA, resulting in 58 mtDNA haplotypes. Analysis using F(ST) values (based on haplotype frequencies) and phi(ST) values (based on frequencies and genetic distances between haplotypes) yielded statistically significant separation (bootstrap values P < 0.05) among most of the stocks currently used for management purposes. A minimum spanning network of haplotypes showed two very distinctive clusters, differentially occupied by western and eastern populations, with some common widespread haplotypes. This suggests some degree of phyletic radiation from west to east, superimposed on gene flow. Highly male-biased migration was detected for several population comparisons. Nuclear microsatellite DNA markers (119 individuals and six loci) provided additional support for the population subdivision and gender-biased dispersal detected in the mtDNA sequences. Analysis using F(ST) values (based on allelic frequencies) yielded statistically significant separation between some, but not all, populations distinguished by mtDNA analysis. R(ST) values (based on frequencies of and genetic distances between alleles) showed no statistically significant subdivision. Again, highly male-biased dispersal was detected for all population comparisons, suggesting, together with morphological and reproductive data, the existence of sexual selection. Our molecular results argue for nine distinct dalli-type populations that should be treated as separate units for management purposes.
NASA Astrophysics Data System (ADS)
Telesca, Luciano; Lovallo, Michele; Lopez, Carmen; Marti Molist, Joan
2016-03-01
A detailed statistical investigation of the seismicity that occurred at El Hierro volcano (Canary Islands) from 2011 to 2014 has been performed by analysing the time variation of four parameters: the Gutenberg-Richter b-value, the local coefficient of variation, the scaling exponent of the magnitude distribution and the main periodicity of the earthquake sequence calculated using Schuster's test. These four parameters are good descriptors of the time and magnitude distributions of the seismic sequence, and their variation indicates dynamical changes in the volcanic system. These variations can be attributed to the causes and types of seismicity, allowing us to distinguish between different host-rock fracturing processes caused by intrusions of magma at different depths and overpressures. The statistical patterns observed among the studied unrest episodes, and between them and the eruptive episode of 2011-2012, indicate that the response of the host rock to the deformation imposed by magma intrusion did not differ significantly from one episode to another, suggesting that no significant local stress changes were induced by magma intrusion across the episodes. Therefore, although the studied unrest episodes were caused by intrusions of magma at different depths and locations below El Hierro island, the mechanical response of the lithosphere was similar in all cases. This suggests that the reason the first unrest culminated in an eruption while the others did not may be related to the regional and local tectonics acting at that moment, rather than to the force of the magma intrusion.
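Two of the four descriptors above have simple closed forms: the Gutenberg-Richter b-value (here via the Aki maximum-likelihood estimator) and the coefficient of variation of inter-event times (roughly 1 for Poissonian seismicity, above 1 for clustered, below 1 for quasi-periodic). The catalogue below is synthetic with an assumed completeness magnitude, not the El Hierro data:

```python
import math
import random
import statistics

def b_value(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value for events with M >= m_c:
    b = log10(e) / (mean(M) - m_c)."""
    m = [x for x in magnitudes if x >= m_c]
    return math.log10(math.e) / (statistics.mean(m) - m_c)

def coeff_of_variation(event_times):
    """CV of inter-event times: ~1 Poissonian, >1 clustered, <1 quasi-periodic."""
    dt = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    return statistics.pstdev(dt) / statistics.mean(dt)

random.seed(7)
M_C = 1.5
# Synthetic catalogue with true b = 1.0 (exponential magnitudes above M_C)
mags = [M_C + random.expovariate(math.log(10) * 1.0) for _ in range(5000)]
b_hat = b_value(mags, M_C)

# Poisson-like occurrence times: inter-event CV should be near 1
times, t = [], 0.0
for _ in range(2000):
    t += random.expovariate(1.0)
    times.append(t)
cv = coeff_of_variation(times)
print(f"b-value estimate = {b_hat:.2f}, inter-event CV = {cv:.2f}")
```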
Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos
2017-06-01
Dignity therapy is a psychotherapy intended to relieve psychological and existential distress in patients at the end of life. Little is known about its effect. To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases, a systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized with respect to the study objectives. PubMed, CINAHL, the Cochrane Library and PsycINFO were searched from 2002 (the year dignity therapy was developed) to January 2016, using 'dignity therapy' as the search term. Studies of patients with advanced life-threatening diseases were included. Of 121 studies, 28 were included, and their quality is high. Results were grouped into effectiveness; satisfaction; suitability and feasibility; and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients' anxiety and depression scores over time. The other showed a statistically significant decrease in anxiety scores pre-post dignity therapy, but not in depression. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measures. Patients, relatives and professionals perceived that it improved the end-of-life experience. The evidence suggests that dignity therapy is beneficial. One randomized controlled trial of patients with high levels of psychological distress shows the efficacy of dignity therapy on anxiety and depression scores. Studies of other designs report beneficial outcomes in terms of the end-of-life experience. Further research should examine how dignity therapy functions, establish a means for measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy.
Silverman, Michael J
2007-01-01
Educational and therapeutic objectives are often paired with music to facilitate the recall of information. The purpose of this study was to isolate and determine the effect of paired pitch, rhythm, and speech on undergraduates' memory as measured by sequential digit recall performance. Participants (N = 120) listened to 4 completely counterbalanced treatment conditions, each consisting of 9 randomized monosyllabic digits paired with speech, pitch, rhythm, and the combination of pitch and rhythm. No statistically significant learning or order effects were found across the 4 trials. A 3-way repeated-measures ANOVA indicated a statistically significant difference in digit recall performance across treatment conditions, positions, groups, and treatment by position. No other comparisons resulted in statistically significant differences. Participants recalled digits from the rhythm condition most accurately, and digits from the speech and pitch-only conditions least accurately. Consistent with previous research, music majors scored significantly higher than non-music majors, and the main effect of serial position indicated that recall performance was best at the primacy and recency positions. Analyses indicated an interaction between serial position and treatment condition, also consistent with previous research. The results suggest that pairing information with rhythm can facilitate recall, but pairing information with pitch, or with the combination of pitch and rhythm, may not enhance recall more than speech when participants hear an unfamiliar musical selection only once. Implications for practice in therapy and education are given, as well as suggestions for future research.
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
Using DEWIS and R for Multi-Staged Statistics e-Assessments
ERIC Educational Resources Information Center
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T; Pereira, Carol; Rosenkranz, Susan L; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu Jeanne; Wang, Rui; Lok, Judith; Evans, Scott R
2017-03-15
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.
Glass-Kaastra, Shiona K.; Pearl, David L.; Reid-Smith, Richard J.; McEwen, Beverly; Slavic, Durda; McEwen, Scott A.; Fairles, Jim
2014-01-01
Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection. PMID:24688133
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
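A hypothetical simulation (not from the article; the group labels, sample sizes, and coefficients below are invented) can illustrate the pitfall: two groups share exactly the same relationship between an outcome y and a scaling variable x, yet a t-test on the ratio y/x signals a group difference that an ANCOVA-style regression correctly attributes to x alone.

```python
import numpy as np
from scipy import stats

# Hypothetical example: outcome y scales identically with x in both groups
# (y = 2 + 0.1*x + noise); group B simply has larger x values.
rng = np.random.default_rng(0)
n = 200
x_a = rng.normal(50, 5, n)
x_b = rng.normal(60, 5, n)
y_a = 2 + 0.1 * x_a + rng.normal(0, 0.5, n)
y_b = 2 + 0.1 * x_b + rng.normal(0, 0.5, n)

# Ratio-index approach: a t-test on y/x suggests a group difference,
# driven purely by the nonzero intercept interacting with different x ranges.
t_ratio, p_ratio = stats.ttest_ind(y_a / x_a, y_b / x_b)

# ANCOVA-style approach: regress y on x plus a group dummy;
# the group coefficient is near zero, matching the data-generating model.
X = np.column_stack([np.ones(2 * n),
                     np.concatenate([x_a, x_b]),
                     np.repeat([0, 1], n)])
y = np.concatenate([y_a, y_b])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
group_effect = beta[2]
```

With this seed the ratio test is highly significant while the adjusted group effect is negligible, reproducing Fisher's point that the two analytical strategies can support contradictory conclusions from the same data.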
Mitchem, Dorian G.; Zietsch, Brendan P.; Wright, Margaret J.; Martin, Nicholas G.; Hewitt, John K.; Keller, Matthew C.
2015-01-01
Theories in both evolutionary and social psychology suggest that a positive correlation should exist between facial attractiveness and general intelligence, and several empirical observations appear to corroborate this expectation. Using highly reliable measures of facial attractiveness and IQ in a large sample of identical and fraternal twins and their siblings, we found no evidence for a phenotypic correlation between these traits. Likewise, neither the genetic nor the environmental latent factor correlations were statistically significant. We supplemented our analyses of new data with a simple meta-analysis that found evidence of publication bias among past studies of the relationship between facial attractiveness and intelligence. In view of these results, we suggest that previously published reports may have overestimated the strength of the relationship and that the theoretical bases for the predicted attractiveness-intelligence correlation may need to be reconsidered. PMID:25937789
A quasi-experimental study of maternal smoking during pregnancy and offspring academic achievement
D'Onofrio, Brian M.; Singh, Amber L.; Iliadou, Anastasia; Lambe, Mats; Hultman, Christina M.; Neiderhiser, Jenae M.; Långström, Niklas; Lichtenstein, Paul
2013-01-01
Maternal smoking during pregnancy (SDP) is associated with lower academic achievement in offspring. The current study, which was based on all births in Sweden from 1983 through 1991, explored the possible causal processes underlying the association between SDP and offspring school grades and a standardized assessment of mathematic proficiency at age 15. The analyses compared relatives who varied in their exposure to SDP and who varied in their genetic relatedness. Although SDP was statistically associated with academic achievement when comparing unrelated individuals, the results suggest that SDP does not cause poorer academic performance, as full siblings differentially exposed to SDP did not differ in their academic scores. The pattern of results suggests that genetic factors shared by parents and their offspring explain significant variance in why offspring exposed to SDP have lower levels of academic achievement. Nevertheless, SDP impacts pregnancy-related outcomes. Reducing SDP, therefore, remains a major public health issue. PMID:20331655
Laser-induced tissue fluorescence in radiofrequency tissue-fusion characterization.
Su, Lei; Fonseca, Martina B; Arya, Shobhit; Kudo, Hiromi; Goldin, Robert; Hanna, George B; Elson, Daniel S
2014-01-01
Heat-induced tissue fusion is an important procedure in modern surgery and can greatly reduce trauma, complications, and mortality during minimally invasive surgical blood vessel anastomosis, but it may also have further benefits if applied to other tissue types such as small and large intestine anastomoses. We present a tissue-fusion characterization technology using laser-induced fluorescence spectroscopy, which provides further insight into tissue constituent variations at the molecular level. In particular, an increase of fluorescence intensity in the 450- to 550-nm range for 375- and 405-nm excitation suggests that the collagen cross-linking in fused tissues increased. Our experimental and statistical analyses showed that, by using fluorescence spectral data, good fusion could be differentiated from other cases with an accuracy of more than 95%. This suggests that fluorescence spectroscopy could potentially be used as a feedback control method in online tissue-fusion monitoring.
An empirical generative framework for computational modeling of language acquisition.
Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
2010-06-01
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.
Ratio-based lengths of intervals to improve fuzzy time series forecasting.
Huarng, Kunhuang; Yu, Tiffany Hui-Kuang
2006-04-01
The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.
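The core idea of ratio-based interval lengths can be sketched in a few lines. Note this is a simplified illustration, not the paper's algorithm: the study derives the growth ratio from percentiles of the observed data, whereas the sketch below takes the ratio as a given parameter and simply grows interval bounds geometrically so that larger observations fall into wider intervals.

```python
def ratio_intervals(low, high, ratio=0.1):
    """Partition [low, high] into intervals whose bounds grow by a fixed
    ratio, so intervals covering larger values are proportionally wider.
    Simplified sketch; the growth ratio is assumed given, not estimated."""
    bounds = [float(low)]
    while bounds[-1] * (1 + ratio) < high:
        bounds.append(bounds[-1] * (1 + ratio))
    bounds.append(float(high))  # clip the final interval at the data maximum
    return list(zip(bounds[:-1], bounds[1:]))

# Example universe of discourse for an index ranging over [100, 200]:
intervals = ratio_intervals(100, 200, ratio=0.1)
```

In contrast to equal-length partitioning, each successive interval here is about 10% wider than the previous one (except the last, which is clipped), matching the intuition that fluctuations in growth data scale with the level of the series.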
Silva, Bruna Mariáh da S E; Morales, Gundisalvo P; Gutjahr, Ana Lúcia N; Freitas Faial, Kelson do C; Carneiro, Bruno S
2018-03-14
In this study, trace element concentrations were measured in cheliped and gill samples of the crab U. cordatus by inductively coupled plasma optical emission spectrometry (ICP OES). The element average concentrations between the structures were statistically compared. Gill concentrations of Cu and Zn were higher in female crabs, while in chelipeds, Pb concentrations were higher in males. The concentration of Zn in crabs from Curuçá City was higher than recommended by health agencies, but contributions to the provisional tolerable daily intake (PTDI) were only 10% for Zn and 23% for Cu. The bioaccumulation factor was higher than 1 for Cu (gills and chelipeds) and Zn (chelipeds only), which suggests bioaccumulation of these elements. Further metallomic and oxidative stress analyses are suggested, in order to evaluate possible protein and/or enzymatic biomarkers of toxicity.
Ferro, Ana; Morais, Samantha; Rota, Matteo; Pelucchi, Claudio; Bertuccio, Paola; Bonzi, Rossella; Galeone, Carlotta; Zhang, Zuo-Feng; Matsuo, Keitaro; Ito, Hidemi; Hu, Jinfu; Johnson, Kenneth C; Yu, Guo-Pei; Palli, Domenico; Ferraroni, Monica; Muscat, Joshua; Malekzadeh, Reza; Ye, Weimin; Song, Huan; Zaridze, David; Maximovitch, Dmitry; Fernández de Larrea, Nerea; Kogevinas, Manolis; Vioque, Jesus; Navarrete-Muñoz, Eva M; Pakseresht, Mohammadreza; Pourfarzi, Farhad; Wolk, Alicja; Orsini, Nicola; Bellavia, Andrea; Håkansson, Niclas; Mu, Lina; Pastorino, Roberta; Kurtz, Robert C; Derakhshan, Mohammad H; Lagiou, Areti; Lagiou, Pagona; Boffetta, Paolo; Boccia, Stefania; Negri, Eva; La Vecchia, Carlo; Peleteiro, Bárbara; Lunet, Nuno
2018-05-01
Individual participant data pooled analyses allow access to non-published data and statistical reanalyses based on more homogeneous criteria than meta-analyses based on systematic reviews. We quantified the impact of publication-related biases and heterogeneity in data analysis and presentation in summary estimates of the association between alcohol drinking and gastric cancer. We compared estimates obtained from conventional meta-analyses, using only data available in published reports from studies that take part in the Stomach Cancer Pooling (StoP) Project, with individual participant data pooled analyses including the same studies. A total of 22 studies from the StoP Project assessed the relation between alcohol intake and gastric cancer, 19 had specific data for levels of consumption and 18 according to cancer location; published reports addressing these associations were available from 18, 5 and 5 studies, respectively. The summary odds ratios [OR, (95%CI)] estimate obtained with published data for drinkers vs. non-drinkers was 10% higher than the one obtained with individual StoP data [18 vs. 22 studies: 1.21 (1.07-1.36) vs. 1.10 (0.99-1.23)] and more heterogeneous (I 2 : 63.6% vs 54.4%). In general, published data yielded less precise summary estimates (standard errors up to 2.6 times higher). Funnel plot analysis suggested publication bias. Meta-analyses of the association between alcohol drinking and gastric cancer tended to overestimate the magnitude of the effects, possibly due to publication bias. Additionally, individual participant data pooled analyses yielded more precise estimates for different levels of exposure or cancer subtypes. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.
2010-01-01
The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. This data was pre-existing and was given to the Evaluator by email from the Dean. Then, in-depth statistical analyses were run using the SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…
A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
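The processing chain the SPA program describes (trend removal, digital filtering, then time- and frequency-domain characterization) maps directly onto modern library routines. The sketch below is a hedged reconstruction in Python/SciPy, not the original FORTRAN program; the synthetic signal, filter order, and cutoff are illustrative choices.

```python
import numpy as np
from scipy import signal

# Synthetic record for illustration: linear trend + 5 Hz sinusoid + noise,
# sampled at 100 Hz for 10 seconds.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
x = 0.3 * t + np.sin(2 * np.pi * 5 * t) + 0.1 * rng.normal(size=t.size)

# Preanalysis data preparation: linear trend removal, then digital filtering
# (4th-order low-pass Butterworth at 20 Hz, applied forward-backward).
x_dt = signal.detrend(x, type="linear")
b, a = signal.butter(4, 20, btype="low", fs=fs)
x_f = signal.filtfilt(b, a, x_dt)

# Time-domain statistical characterization.
mean, var = x_f.mean(), x_f.var()

# Frequency-domain statistical characterization (Welch power spectral density).
freqs, psd = signal.welch(x_f, fs=fs)
peak_freq = freqs[np.argmax(psd)]
```

After detrending, the spectral peak recovers the embedded 5 Hz component, which is the kind of frequency-domain summary the SPA program was designed to produce.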
Sherrouse, Benson C.; Semmens, Darius J.; Clement, Jessica M.
2014-01-01
Despite widespread recognition that social-value information is needed to inform stakeholders and decision makers regarding trade-offs in environmental management, it too often remains absent from ecosystem service assessments. Although quantitative indicators of social values need to be explicitly accounted for in the decision-making process, they need not be monetary. Ongoing efforts to map such values demonstrate how they can also be made spatially explicit and relatable to underlying ecological information. We originally developed Social Values for Ecosystem Services (SolVES) as a tool to assess, map, and quantify nonmarket values perceived by various groups of ecosystem stakeholders. With SolVES 2.0 we have extended the functionality by integrating SolVES with Maxent maximum entropy modeling software to generate more complete social-value maps from available value and preference survey data and to produce more robust models describing the relationship between social values and ecosystems. The current study has two objectives: (1) evaluate how effectively the value index, a quantitative, nonmonetary social-value indicator calculated by SolVES, reproduces results from more common statistical methods of social-survey data analysis and (2) examine how the spatial results produced by SolVES provide additional information that could be used by managers and stakeholders to better understand more complex relationships among stakeholder values, attitudes, and preferences. To achieve these objectives, we applied SolVES to value and preference survey data collected for three national forests, the Pike and San Isabel in Colorado and the Bridger–Teton and the Shoshone in Wyoming. Value index results were generally consistent with results found through more common statistical analyses of the survey data such as frequency, discriminant function, and correlation analyses. 
In addition, spatial analysis of the social-value maps produced by SolVES provided information that was useful for explaining relationships between stakeholder values and forest uses. Our results suggest that SolVES can effectively reproduce information derived from traditional statistical analyses while adding spatially explicit, social-value information that can contribute to integrated resource assessment, planning, and management of forests and other ecosystems.
Methods for meta-analysis of multiple traits using GWAS summary statistics.
Ray, Debashree; Boehnke, Michael
2018-03-01
Genome-wide association studies (GWAS) for complex diseases have focused primarily on single-trait analyses for disease status and disease-related quantitative traits. For example, GWAS on risk factors for coronary artery disease analyze genetic associations of plasma lipids such as total cholesterol, LDL-cholesterol, HDL-cholesterol, and triglycerides (TGs) separately. However, traits are often correlated and a joint analysis may yield increased statistical power for association over multiple univariate analyses. Recently several multivariate methods have been proposed that require individual-level data. Here, we develop metaUSAT (where USAT is unified score-based association test), a novel unified association test of a single genetic variant with multiple traits that uses only summary statistics from existing GWAS. Although the existing methods either perform well when most correlated traits are affected by the genetic variant in the same direction or are powerful when only a few of the correlated traits are associated, metaUSAT is designed to be robust to the association structure of correlated traits. metaUSAT does not require individual-level data and can test genetic associations of categorical and/or continuous traits. One can also use metaUSAT to analyze a single trait over multiple studies, appropriately accounting for overlapping samples, if any. metaUSAT provides an approximate asymptotic P-value for association and is computationally efficient for implementation at a genome-wide level. Simulation experiments show that metaUSAT maintains proper type-I error at low error levels. It has similar and sometimes greater power to detect association across a wide array of scenarios compared to existing methods, which are usually powerful for some specific association scenarios only. 
When applied to plasma lipids summary data from the METSIM and the T2D-GENES studies, metaUSAT detected genome-wide significant loci beyond the ones identified by univariate analyses. Evidence from larger studies suggest that the variants additionally detected by our test are, indeed, associated with lipid levels in humans. In summary, metaUSAT can provide novel insights into the genetic architecture of a common disease or traits. © 2017 WILEY PERIODICALS, INC.
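metaUSAT itself combines score statistics adaptively and is not reproduced here. As a hedged illustration of the general idea it builds on (testing one variant against several correlated traits using only summary statistics and the trait correlation matrix), the classical chi-squared combination of Z-scores can be sketched; the Z-values and correlation matrix below are invented.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical summary data for one variant: per-trait association Z-scores
# and the correlation matrix of the traits (in practice estimable from
# genome-wide null summary statistics).
z = np.array([2.1, 1.8, -0.4])
R = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]])

# Wald-type multivariate statistic: z' R^{-1} z follows a chi-squared
# distribution with k = len(z) degrees of freedom under the null of no
# association with any trait.
stat = float(z @ np.linalg.solve(R, z))
p_multi = float(chi2.sf(stat, df=z.size))
```

This fixed combination is powerful when several traits are moderately associated; metaUSAT's contribution is robustness across association structures where fixed rules like this lose power.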
Mitochondrial genomes suggest that hexapods and crustaceans are mutually paraphyletic
Cook, Charles E; Yue, Qiaoyun; Akam, Michael
2005-01-01
For over a century the relationships between the four major groups of the phylum Arthropoda (Chelicerata, Crustacea, Hexapoda and Myriapoda) have been debated. Recent molecular evidence has confirmed a close relationship between the Crustacea and the Hexapoda, and has included the suggestion of a paraphyletic Hexapoda. To test this hypothesis we have sequenced the complete or near-complete mitochondrial genomes of three crustaceans (Parhyale hawaiensis, Squilla mantis and Triops longicaudatus), two collembolans (Onychiurus orientalis and Podura aquatica) and the insect Thermobia domestica. We observed rearrangement of transfer RNA genes only in O. orientalis, P. aquatica and P. hawaiensis. Of these, only the rearrangement in O. orientalis, an apparent autapomorphy for the collembolan family Onychiuridae, was phylogenetically informative. We aligned the nucleotide and amino acid sequences from the mitochondrial protein-encoding genes of these taxa with their homologues from other arthropod taxa for phylogenetic analysis. Our dataset contains many more Crustacea than previous molecular phylogenetic analyses of the arthropods. Neighbour-joining, maximum-likelihood and Bayesian posterior probabilities all suggest that crustaceans and hexapods are mutually paraphyletic. A crustacean clade of Malacostraca and Branchiopoda emerges as sister to the Insecta sensu stricto, and the Collembola group with the maxillopod crustaceans. Some, but not all, analyses strongly support this mutual paraphyly but statistical tests do not reject the null hypotheses of a monophyletic Hexapoda or a monophyletic Crustacea. The dual monophyly of the Hexapoda and Crustacea has rarely been questioned in recent years but the idea of both groups' paraphyly dates back to the nineteenth century. We suggest that the mutual paraphyly of both groups should seriously be considered. PMID:16024395
Lewis, Gregory F.; Furman, Senta A.; McCool, Martha F.; Porges, Stephen W.
2011-01-01
Three frequently used RSA metrics are investigated to document violations of assumptions for parametric analyses, moderation by respiration, influences of nonstationarity, and sensitivity to vagal blockade. Although all metrics are highly correlated, new findings illustrate that the metrics are noticeably different on the above dimensions. Only one method conforms to the assumptions for parametric analyses, is not moderated by respiration, is not influenced by nonstationarity, and reliably generates stronger effect sizes. Moreover, this method is also the most sensitive to vagal blockade. Specific features of this method may provide insights into improving the statistical characteristics of other commonly used RSA metrics. These data provide the evidence to question, based on statistical grounds, published reports using particular metrics of RSA. PMID:22138367
Confidence crisis of results in biomechanics research.
Knudson, Duane
2017-11-01
Many biomechanics studies have small sample sizes and incorrect statistical analyses, so reporting of inaccurate inferences and inflated magnitude of effects are common in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase the confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
Bennett, Bradley C; Husby, Chad E
2008-03-28
Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
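The per-family binomial approach can be sketched as follows. The family names and counts below are invented for illustration (they are not the Shuar data); the null proportion p0 is the flora-wide fraction of medicinal species, and each family is tested against it with an exact binomial test.

```python
from scipy.stats import binomtest  # SciPy >= 1.7

# Hypothetical flora: total species and medicinal species overall, giving
# the null proportion p0 under random selection across families.
flora_species, flora_medicinal = 3000, 600
p0 = flora_medicinal / flora_species  # null: 20% of any family is medicinal

# Hypothetical per-family counts: (species in flora, medicinal species).
families = {
    "Asteraceae": (120, 40),   # over-represented in this toy example
    "Poaceae":    (150, 10),   # under-represented
    "Rubiaceae":  (80, 17),    # close to the flora-wide proportion
}

results = {}
for name, (n_total, n_med) in families.items():
    res = binomtest(n_med, n_total, p0)  # exact two-sided binomial test
    results[name] = (n_med / n_total, res.pvalue)
```

Unlike the regression approach, each family is evaluated directly against the flora-wide null, so significance statements apply to individual families rather than to residuals from a fitted line.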
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores; Table 4.7. Descriptive Statistics for Analysis Criteria. Predictors of Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness.
FHWA statistical program : a customer's guide to using highway statistics
DOT National Transportation Integrated Search
1995-08-01
The appropriate level of spatial and temporal data aggregation for highway vehicle emissions analyses is one of several important analytical questions that has received considerable interest following passage of the Clean Air Act Amendments (CAAA) of...
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. 
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
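Two of the simpler approximations discussed, estimating a missing SD from a reported range and a missing mean from the quartiles, can be written as one-line formulas. These are commonly cited variants; the review evaluates several alternatives, and the exact formulas it recommends may differ.

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from a reported range: SD ≈ (max - min) / 4.
    Assumes roughly normal data; accuracy degrades for small samples."""
    return (maximum - minimum) / 4

def mean_from_quartiles(q1, median, q3):
    """Mean approximation from reported quartiles: (q1 + median + q3) / 3.
    For symmetric distributions this is close to the median itself."""
    return (q1 + median + q3) / 3

sd_est = sd_from_range(10, 30)            # -> 5.0
mean_est = mean_from_quartiles(4, 6, 11)  # -> 7.0
```

As the illustrative meta-analyses above indicate, approximations like these can preserve more of the meta-analytic precision than simply omitting trials with incomplete reporting, though omission can occasionally perform better.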
Freudenstein, John V; Chase, Mark W
2015-03-01
The largest subfamily of orchids, Epidendroideae, represents one of the most significant diversifications among flowering plants in terms of pollination strategy, vegetative adaptation and number of species. Although many groups in the subfamily have been resolved, significant relationships in the tree remain unclear, limiting conclusions about diversification and creating uncertainty in the classification. This study brings together DNA sequences from nuclear, plastid and mitochondrial genomes in order to clarify relationships, to test associations of key characters with diversification and to improve the classification. Sequences from seven loci were concatenated in a supermatrix analysis for 312 genera representing most of epidendroid diversity. Maximum-likelihood and parsimony analyses were performed on this matrix and on subsets of the data to generate trees and to investigate the effect of missing values. Statistical character-associated diversification analyses were performed. Likelihood and parsimony analyses yielded highly resolved trees that are in strong agreement and show significant support for many key clades. Many previously proposed relationships among tribes and subtribes are supported, and some new relationships are revealed. Analyses of subsets of the data suggest that the relatively high number of missing data for the full analysis is not problematic. Diversification analyses show that epiphytism is most strongly associated with diversification among epidendroids, followed by expansion into the New World and anther characters that are involved with pollinator specificity, namely early anther inflexion, cellular pollinium stalks and the superposed pollinium arrangement. All tested characters show significant association with speciation in Epidendroideae, suggesting that no single character accounts for the success of this group. 
Rather, it appears that a succession of key features appeared that have contributed to diversification, sometimes in parallel. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A duplicate gene rooting of seed plants and the phylogenetic position of flowering plants
Mathews, Sarah; Clements, Mark D.; Beilstein, Mark A.
2010-01-01
Flowering plants represent the most significant branch in the tree of land plants, with respect to the number of extant species, their impact on the shaping of modern ecosystems and their economic importance. However, unlike so many persistent phylogenetic problems that have yielded to insights from DNA sequence data, the mystery surrounding the origin of angiosperms has deepened with the advent and advance of molecular systematics. Strong statistical support for competing hypotheses and recent novel trees from molecular data suggest that the accuracy of current molecular trees requires further testing. Analyses of phytochrome amino acids using a duplicate gene-rooting approach yield trees that unite cycads and angiosperms in a clade that is sister to a clade in which Ginkgo and Cupressophyta are successive sister taxa to gnetophytes plus Pinaceae. Application of a cycads + angiosperms backbone constraint in analyses of a morphological dataset yields better resolved trees than do analyses in which extant gymnosperms are forced to be monophyletic. The results have implications both for our assessment of uncertainty in trees from sequence data and for our use of molecular constraints as a way to integrate insights from morphological and molecular evidence. PMID:20047866
Fan, Liping; Fu, Danhui; Hong, Jinquan; He, Wenqian; Zeng, Feng; Lin, Qiuyan; Xie, Qianling
2018-01-01
The current study sought to evaluate whether blood transfusions affect survival of elderly patients with primary diffuse large B-cell lymphoma (DLBCL). A total of 104 patients aged 60 years and over were enrolled and divided into two groups: 24 patients who received transfusions and 80 patients who did not. Statistical analyses showed significant differences in LDH levels, platelet (Plt) counts, and hemoglobin (Hb) and albumin (Alb) levels between the two groups. Univariate analyses showed that LDH level ≥ 245 IU/L, cell of origin (germinal center/nongerminal center), and blood transfusion were associated with both overall survival (OS) and progression-free survival (PFS). Higher IPI (3–5), Alb level < 35 g/L, and rituximab usage were associated with OS. Appearance of B symptoms was associated with PFS. Multivariate analyses showed that cell of origin and rituximab usage were independent factors for OS and LDH level was an independent factor for PFS. Blood transfusion was an independent factor for PFS, but not for OS. Our preliminary results suggested that elderly patients with primary DLBCL may benefit from a restrictive blood transfusion strategy. PMID:29750167
Cross-Sectional Analysis of Longitudinal Mediation Processes.
O'Laughlin, Kristine D; Martin, Monica J; Ferrer, Emilio
2018-01-01
Statistical mediation analysis can help to identify and explain the mechanisms behind psychological processes. Examining a set of variables for mediation effects is a ubiquitous process in the social sciences literature; however, despite evidence suggesting that cross-sectional data can misrepresent the mediation of longitudinal processes, cross-sectional analyses continue to be used in this manner. Alternative longitudinal mediation models, including those rooted in a structural equation modeling framework (cross-lagged panel, latent growth curve, and latent difference score models) are currently available and may provide a better representation of mediation processes for longitudinal data. The purpose of this paper is twofold: first, we provide a comparison of cross-sectional and longitudinal mediation models; second, we advocate using models to evaluate mediation effects that capture the temporal sequence of the process under study. Two separate empirical examples are presented to illustrate differences in the conclusions drawn from cross-sectional and longitudinal mediation analyses. Findings from these examples yielded substantial differences in interpretations between the cross-sectional and longitudinal mediation models considered here. Based on these observations, researchers should use caution when attempting to use cross-sectional data in place of longitudinal data for mediation analyses.
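The cross-sectional product-of-coefficients estimate that the paper critiques can be sketched in a few lines. This is a hypothetical illustration on simulated data (the variable names and effect sizes are invented, not the authors' analysis), showing the indirect effect a*b from two OLS regressions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated cross-sectional data with a true X -> M -> Y chain.
# Illustrative only; the paper argues such cross-sectional estimates
# can misrepresent longitudinal mediation processes.
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + rng.normal(size=n)

def ols_slope(pred, outcome, covar=None):
    """Slope of `pred` from an OLS regression of `outcome` on predictors."""
    cols = [np.ones(len(pred)), pred]
    if covar is not None:
        cols.append(covar)
    design = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return beta[1]

a = ols_slope(x, m)            # path X -> M
b = ols_slope(m, y, covar=x)   # path M -> Y, controlling for X
indirect = a * b               # product-of-coefficients estimate
print(round(indirect, 3))      # close to the true 0.5 * 0.4 = 0.2
```

The longitudinal models the authors advocate (cross-lagged panel, latent growth curve, latent difference score) replace these two regressions with models that respect temporal ordering.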
Intracolonial genetic variation in the scleractinian coral Seriatopora hystrix
NASA Astrophysics Data System (ADS)
Maier, E.; Buckenmaier, A.; Tollrian, R.; Nürnberger, B.
2012-06-01
In recent years, increasing numbers of studies have revealed intraorganismal genetic variation, primarily in modular organisms like plants or colonial marine invertebrates. Two underlying mechanisms are distinguished: mosaicism is caused by somatic mutation, whereas chimerism originates from allogeneic fusion. We investigated the occurrence of intracolonial genetic variation at microsatellite loci in five natural populations of the scleractinian coral Seriatopora hystrix on the Great Barrier Reef. This coral is a widely distributed, brooding species that is at present a target of intensive population genetic research on reproduction and dispersal patterns. From each of 155 S. hystrix colonies, either two or three samples were genotyped at five or six loci. Twenty-seven (~17%) genetically heterogeneous colonies were found. Statistical analyses indicated the occurrence of both mosaicism and chimerism. In most cases, intracolonial variation was found only at a single allele. Our analyses suggest that somatic mutations represent a major source of genetic heterogeneity within a single colony. Moreover, we observed large, apparently stable chimeric colonies that harbored clearly distinct genotypes and contrast these findings with the patterns typically observed in laboratory-based experiments. We discuss the error that mosaicism and chimerism introduce into population genetic analyses.
Evaluation of atpB nucleotide sequences for phylogenetic studies of ferns and other pteridophytes.
Wolf, P
1997-10-01
Inferring basal relationships among vascular plants poses a major challenge to plant systematists. The divergence events that describe these relationships occurred long ago and considerable homoplasy has since accrued for both molecular and morphological characters. A potential solution is to examine phylogenetic analyses from multiple data sets. Here I present a new source of phylogenetic data for ferns and other pteridophytes. I sequenced the chloroplast gene atpB from 23 pteridophyte taxa and used maximum parsimony to infer relationships. A 588-bp region of the gene appeared to contain a statistically significant amount of phylogenetic signal and the resulting trees were largely congruent with similar analyses of nucleotide sequences from rbcL. However, a combined analysis of atpB plus rbcL produced a better resolved tree than did either data set alone. In the shortest trees, leptosporangiate ferns formed a monophyletic group. Also, I detected a well-supported clade of Psilotaceae (Psilotum and Tmesipteris) plus Ophioglossaceae (Ophioglossum and Botrychium). The demonstrated utility of atpB suggests that sequences from this gene should play a role in phylogenetic analyses that incorporate data from chloroplast genes, nuclear genes, morphology, and fossil data.
NASA Astrophysics Data System (ADS)
Kim, H.-K.; Woo, J.-H.; Park, R. S.; Song, C. H.; Kim, J.-H.; Ban, S.-J.; Park, J.-H.
2013-09-01
Plant functional type (PFT) distributions affect the results of biogenic emission modeling as well as O3 and PM simulations using chemistry-transport models (CTMs). This paper analyzes the variations of both surface biogenic VOC emissions and O3 concentrations due to changes in the PFT distributions in the Seoul Metropolitan Areas, Korea. It also attempts to provide important implications for biogenic emissions modeling studies for CTM simulations. MM5-MEGAN-SMOKE-CMAQ model simulations were implemented over the Seoul Metropolitan Areas in Korea to predict surface O3 concentrations for the period of 1 May to 30 June 2008. Starting from a MEGAN biogenic emissions analysis with three different sources of PFT input data, US EPA CMAQ O3 simulation results were evaluated against surface O3 monitoring datasets and further considered on the basis of geospatial and statistical analyses. The three PFT datasets considered were (1) KORPFT, developed with a region-specific vegetation database; (2) CDP, adopted from the US NCAR; and (3) MODIS, reclassified from the NASA Terra and Aqua combined land cover products. Comparisons of MEGAN biogenic emission results with the three different PFT data showed that broadleaf trees (BT) are the most significant contributor to the total biogenic volatile organic compounds (BVOCs), followed by needleleaf trees (NT), shrubs (SB), and herbaceous plants (HB). In addition, isoprene from BT and terpene from NT were recognized as significant primary and secondary BVOC species in terms of BVOC emission distributions and O3-forming potentials in the study domain. Multiple regression analyses with the different PFT data (δO3 vs. δPFTs) suggest that KORPFT can provide reasonable information to the framework of MEGAN biogenic emissions modeling and CTM O3 predictions. Analyses of the CMAQ performance statistics suggest that deviations of BT areas can significantly affect CMAQ isoprene and O3 predictions. 
From further evaluations of the isoprene and O3 prediction results, we explored the PFT area-loss artifact that occurs due to geographical disparity between the PFT and leaf area index distributions, and can cause increased bias in CMAQ O3. The PFT-loss artifact is thus a limitation of the MEGAN biogenic emission modeling and of the CTM O3 simulation results. Temporal changes of CMAQ O3 distributions under the different PFT scenarios suggest that hourly and local impacts of the different PFT distributions on occasional inter-scenario deviations of O3 are quite noticeable, reaching up to 10 ppb. Exponentially diverging hourly BVOC emissions and O3 concentrations with increasing ambient temperature suggest that the use of representative PFT distributions becomes more critical for O3 air quality modeling (or forecasting) in support of air quality decision-making and human health studies.
The impact of tobacco prices on smoking onset in Vietnam: duration analyses of retrospective data.
Guindon, G Emmanuel
2014-01-01
The benefits of preventing smoking onset are well known, and even just delaying smoking onset conveys benefits. Tobacco control policies are of critical importance to low-income countries with high smoking rates such as Vietnam, where smoking prevalence is greater than 55% in young men between the ages of 25 and 45. Using a survey of teens and young adults, I conducted duration analyses to explore the impact of tobacco price on smoking onset. The results suggest that tobacco prices in Vietnam have a statistically significant and fairly substantial effect on the onset of smoking. Increases in average tobacco prices, measured by an index of tobacco prices and by the prices of two popular brands, are found to delay smoking onset. Of particular interest is the finding that Vietnamese youth are more sensitive to changes in prices of a popular international brand that has had favourable tax treatment since the late 1990s.
Systematics and distribution of Cristaria plicata (Bivalvia, Unionidae) from the Russian Far East
Klishko, Olga K.; Lopes-Lima, Manuel; Froufe, Elsa; Bogan, Arthur E.; Abakumova, Vera Y.
2016-01-01
The number of anodontine bivalve species placed in the genus Cristaria (Bivalvia, Unionidae) from the Russian Far East remains unsettled among authors. Some recognize only one valid species, Cristaria plicata (Leach, 1815), while others accept two additional species, Cristaria tuberculata Schumacher, 1817 and Cristaria herculea (Middendorff, 1847). In the present study, these taxonomic doubts are addressed using analyses of mitochondrial DNA sequences and shell morphometry. No significant differences were revealed by the COI DNA sequences or the main statistical morphometric indices among the three Cristaria forms. In the specimens analysed, changes in shell morphometry with age suggest that original descriptions of the different forms may be attributed solely to differences in age and sex. We consider that Cristaria plicata, Cristaria tuberculata and Cristaria herculea from the Russian Far East should be treated as a single species, namely Cristaria plicata (Leach, 1815), with Cristaria tuberculata and Cristaria herculea as junior synonyms. The geographic range of Cristaria plicata and its conservation status are also presented here. PMID:27110206
Analysis of the impact of trap-neuter-return programs on populations of feral cats.
Foley, Patrick; Foley, Janet E; Levy, Julie K; Paik, Terry
2005-12-01
To evaluate 2 county trap-neuter-return (TNR) programs for feral cat population management via mathematical modeling. Theoretical population model. Feral cats assessed from 1992 to 2003 in San Diego County, California (n = 14,452), and from 1998 to 2004 in Alachua County, Florida (11,822). Data were analyzed with a mathematical Ricker model to describe population dynamics of the feral cats and modifications to the dynamics that occurred as a result of the TNR programs. In both counties, results of analyses did not indicate a consistent reduction in per capita growth, the population multiplier, or the proportion of female cats that were pregnant. Success of feral cat management programs that use TNR can be monitored with an easily collected set of data and statistical analyses facilitated by population modeling techniques. Results may be used to suggest possible future monitoring and modification of TNR programs, which could result in greater success controlling and reducing feral cat populations.
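The Ricker model used here has the form N[t+1] = N[t] * exp(r * (1 - N[t]/K)), which makes the log per-capita growth rate linear in N[t]. A minimal sketch of this idea follows; the parameter values and the regression shortcut are illustrative assumptions, not the authors' fitted values:

```python
import numpy as np

# Ricker dynamics: N[t+1] = N[t] * exp(r * (1 - N[t] / K)).
# Hypothetical parameters; the paper fits this class of model to county
# trap-count data to ask whether TNR lowered per-capita growth.
r, K = 0.4, 10_000
n = [2_000.0]
for _ in range(30):
    n.append(n[-1] * np.exp(r * (1.0 - n[-1] / K)))

# Because log(N[t+1]/N[t]) = r - (r/K) * N[t], the parameters can be
# recovered by a linear regression of log growth on abundance.
nt = np.array(n[:-1])
growth = np.log(np.array(n[1:]) / nt)
slope, intercept = np.polyfit(nt, growth, 1)
r_hat, k_hat = intercept, -intercept / slope
print(round(r_hat, 3), round(k_hat))  # recovers r = 0.4, K = 10000
```

With real trap counts the regression residuals carry the noise, and a TNR effect would appear as a shift in the fitted r or K between program phases.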
Rowell, Candace; Kuiper, Nora; Preud'Homme, Hugues
2016-07-01
The knowledge base on bottled water leachate is highly contradictory due to varying methodologies and limited multi-elemental and/or molecular analyses; understanding the range of contaminants and their pathways is required. This study determined the leaching potential and leaching kinetics of trace elements, using consistent comprehensive quantitative and semi-quantitative (79 elements total) analyses, and of BPA, using isotopic dilution and MEPS pre-concentration with UHPLC-ESI-QTOF. Statistical methods were used to determine confounders and predictors of leaching and human health risk throughout 12 days of UV exposure and after exposure to elevated temperature. Various types of water were used to assess the impact of water quality. Results suggest Sb leaching is primarily dependent upon water quality, not container type. Bottle type is a predictor of elemental leaching for Pb, Ba, Cr, Cu, Mn and Sr; BPA was detected in samples from polycarbonate containers. Health risks from the consumption of bottled water increase after UV exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.
Rindermann, Heiner; Thompson, James
2011-06-01
Traditional economic theories stress the relevance of political, institutional, geographic, and historical factors for economic growth. In contrast, human-capital theories suggest that people's competences, mediated by technological progress, are the deciding factor in a nation's wealth. Using three large-scale assessments, we calculated cognitive-competence sums for the mean and for upper- and lower-level groups for 90 countries and compared the influence of each group's intellectual ability on gross domestic product. In our cross-national analyses, we applied different statistical methods (path analyses, bootstrapping) and measures developed by different research groups to various country samples and historical periods. Our results underscore the decisive relevance of cognitive ability--particularly of an intellectual class with high cognitive ability and accomplishments in science, technology, engineering, and math--for national wealth. Furthermore, this group's cognitive ability predicts the quality of economic and political institutions, which further determines the economic affluence of the nation. Cognitive resources enable the evolution of capitalism and the rise of wealth.
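The bootstrapping mentioned among the statistical methods can be sketched as a resampled confidence interval for a country-level association. The data below are synthetic stand-ins (the study's actual variables and path models are richer):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical country-level data: mean cognitive score vs. log GDP
# per capita for 90 countries (synthetic; invented effect size).
n = 90
ability = rng.normal(100, 10, n)
log_gdp = 0.03 * ability + rng.normal(0, 0.5, n)

# Nonparametric bootstrap of the correlation: resample countries with
# replacement and take percentile bounds of the resampled statistic.
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boots.append(np.corrcoef(ability[idx], log_gdp[idx])[0, 1])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(round(lo, 2), round(hi, 2))  # 95% percentile bootstrap CI
```

A CI that excludes zero under resampling is the kind of robustness check the authors report across country samples and historical periods.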
[Musculoskeletal disorders in piano students of a conservatory].
Bruno, S; Lorusso, A; Caputo, F; Pranzo, S; L'Abbate, N
2006-01-01
A four-part questionnaire administered to piano students at the Conservatory "T. Schipa" of Lecce, southern Italy, was used to determine the prevalence of instrument-related problems. Among 121 responders, 48 (39.6%) were considered affected according to pre-established criteria. Univariate analyses showed statistical differences in mean age, number of hours spent playing per week, interval without breaks, lack of sport practice and acceptance of the "No pain, no gain" criterion between students with music-related pain and unaffected pianists. No association with hand size was found in pianists with only upper limb disorders. The multivariate analyses performed by logistic regression confirmed the independent association of the risk factors age, lack of sport practice and acceptance of the "No pain, no gain" criterion. In contrast to several studies, older students were more frequently affected, and no difference in the prevalence rate was found for females. Findings suggest a probable causal contribution of fixed postures, in addition to repetitive movements of the upper limbs, to the development of playing-related musculoskeletal disorders (PRMDs) in pianists.
Time pressure and regulations on hospital-in-the-home (HITH) nurses: An on-the-road study.
Cœugnet, Stéphanie; Forrierre, Justine; Naveteur, Janick; Dubreucq, Catherine; Anceaux, Françoise
2016-05-01
This study investigated both causal factors and consequences of time pressure in hospital-in-the-home (HITH) nurses. These nurses may experience additional stress from the time pressure they encounter while driving to patients' homes, which may result in greater risk taking while driving. From observation in natural settings, data related to the nurses' driving behaviours and emotions were collected and analysed statistically; semi-directed interviews with the nurses were analysed qualitatively. The results suggest that objective time constraints alone do not necessarily elicit subjective time pressure. The challenges and uncertainty associated with healthcare and the driving period contribute to the emergence of this time pressure, which has a negative impact on both the nurses' driving and their emotions. Finally, the study focuses on anticipated and in situ regulations. These findings provide guidelines for organizational and technical solutions allowing the reduction of time pressure among HITH nurses. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Scholz, Jonathan; Triantafyllou, Christina; Whitfield-Gabrieli, Susan; Brown, Emery N; Saxe, Rebecca
2009-01-01
In functional magnetic resonance imaging (fMRI) studies, a cortical region in the right temporo-parietal junction (RTPJ) is recruited when participants read stories about people's thoughts ('Theory of Mind'). Both fMRI and lesion studies suggest that a region near the RTPJ is associated with attentional reorienting in response to an unexpected stimulus. Do Theory of Mind and attentional reorienting recruit a single population of neurons, or are there two neighboring but distinct neural populations in the RTPJ? One recent study compared these activations, and found evidence consistent with a single common region. However, the apparent overlap may have been due to the low resolution of the previous technique. We tested this hypothesis using a high-resolution protocol, within-subjects analyses, and more powerful statistical methods. Strict conjunction analyses revealed that the area of overlap was small and on the periphery of each activation. In addition, a bootstrap analysis identified a reliable 6-10 mm spatial displacement between the peak activations of the two tasks; the same magnitude and direction of displacement was observed in within-subjects comparisons. In all, these results suggest that there are neighboring but distinct regions within the RTPJ implicated in Theory of Mind and orienting attention.
Hydrological alteration along the Missouri River Basin: A time series approach
Pegg, M.A.; Pierce, C.L.; Roy, A.
2003-01-01
Human alteration of large rivers is commonplace, often resulting in significant changes in flow characteristics. We used a time series approach to examine daily mean flow data from locations throughout the main-stem Missouri River. Data from a pre-alteration period (1925-1948) were compared with a post-alteration period (1967-1996), with separate analyses conducted using either data from the entire year or restricted to the spring fish spawning period (1 April-30 June). Daily mean flows were significantly higher during the post-alteration period at all locations. Flow variability was markedly reduced during the post-alteration period as a probable result of flow regulation and climatological shifts. Daily mean flow during the spring fish spawning period was significantly lower during the post-alteration period at the most highly altered locations in the middle portion of the river, but unchanged at the least altered locations in the upper and lower portions of the river. Our data also corroborate other analyses, using alternate statistical approaches, that suggest similar changes to the Missouri River system. Our results suggest that human alterations on the Missouri River, particularly in the middle portion most strongly affected by impoundments and channelization, have resulted in changes to the natural flow regime.
Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong
2017-03-01
Two powerful methods for statistical inference on MRI brain images have been proposed recently: a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory, and threshold-free cluster enhancement (TFCE), which calculates the level of local support for a cluster and then uses permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image, and so they are strongly recommended for group-level fMRI analysis over other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal-to-noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data, and both are superior to the current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding for fMRI data analysis than TFCE with permutation testing and by also being applicable to single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.
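The TFCE statistic has a compact definition: each voxel's score integrates cluster extent raised to a power E, weighted by threshold height raised to a power H, over all thresholds below the voxel's value. A toy 1-D sketch follows, using the commonly cited defaults E = 0.5, H = 2 (the input array is invented; real implementations work on images and obtain p-values by permutation):

```python
import numpy as np

def tfce_1d(stat, dh=0.1, e=0.5, h_power=2.0):
    """Threshold-free cluster enhancement for a 1-D statistic map:
    TFCE(v) = sum over thresholds h of extent(v, h)**e * h**h_power * dh,
    where extent(v, h) is the size of the supra-threshold run containing
    v at threshold h. Toy sketch only."""
    out = np.zeros_like(stat, dtype=float)
    for h in np.arange(dh, stat.max() + dh, dh):
        above = stat >= h
        idx = 0
        while idx < len(stat):
            if above[idx]:
                j = idx
                while j < len(stat) and above[j]:
                    j += 1                       # find the run's end
                extent = j - idx
                out[idx:j] += extent ** e * h ** h_power * dh
                idx = j
            else:
                idx += 1
    return out

# A broad, tall cluster versus an isolated spike of moderate height:
stat = np.array([0.0, 1.0, 3.0, 3.5, 3.0, 1.0, 0.0, 2.0])
scores = tfce_1d(stat)
print(scores.argmax())  # the peak of the wide cluster dominates
```

The point of the construction is visible here: the isolated value at the end gets little support because its extent is 1 at every threshold, while the wide cluster accumulates support across many thresholds without any arbitrary cluster-forming threshold being chosen.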
Gaylor, David W; Lutz, Werner K; Conolly, Rory B
2004-01-01
Statistical analyses of nonmonotonic dose-response curves are proposed, experimental designs to detect low-dose effects of J-shaped curves are suggested, and sample sizes are provided. For quantal data such as cancer incidence rates, much larger numbers of animals are required than for continuous data such as biomarker measurements. For example, 155 animals per dose group are required to have at least an 80% chance of detecting a decrease from a 20% incidence in controls to an incidence of 10% at a low dose. For a continuous measurement, only 14 animals per group are required to have at least an 80% chance of detecting a change of the mean by one standard deviation of the control group. Experimental designs based on three dose groups plus controls are discussed to detect nonmonotonicity or to estimate the zero equivalent dose (ZED), i.e., the dose that produces a response equal to the average response in the controls. Cell proliferation data in the nasal respiratory epithelium of rats exposed to formaldehyde by inhalation are used to illustrate the statistical procedures. Statistically significant departures from a monotonic dose response were obtained for time-weighted average labeling indices, with an estimated ZED at a formaldehyde dose of 5.4 ppm and a lower 95% confidence limit of 2.7 ppm. It is concluded that demonstration of a statistically significant biphasic dose-response curve, together with estimation of the resulting ZED, could serve as a point of departure in establishing a reference dose for low-dose risk assessment.
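The quantal sample size quoted above can be approximated with the standard normal-approximation formula for comparing two proportions. This is a sketch with hard-coded z quantiles and a one-sided test assumed; it returns 157 rather than the paper's 155, since published formula variants (continuity corrections, arcsine transforms) differ slightly:

```python
import math

def n_per_group(p1, p2, one_sided=True):
    """Normal-approximation sample size per group for detecting a
    difference between two proportions at alpha = 0.05, 80% power
    (no continuity correction). z quantiles hard-coded for a
    stdlib-only sketch."""
    z_alpha = 1.6449 if one_sided else 1.9600  # standard-normal quantile
    z_beta = 0.8416                            # quantile for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return math.ceil(num ** 2 / (p1 - p2) ** 2)

# Detecting a drop from 20% incidence in controls to 10% at a low dose:
print(n_per_group(0.20, 0.10))  # -> 157 (one-sided; cf. 155 in the text)
```

The continuous-data case is far cheaper for the reason the abstract states: detecting a one-standard-deviation mean shift needs only (z_alpha + z_beta)^2 * 2 ≈ 13-16 animals per group under the same approximation.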
Search for Trends and Periodicities in Inter-hemispheric Sea Surface Temperature Difference
NASA Astrophysics Data System (ADS)
Rajesh, R.; Tiwari, R. K.
2018-02-01
Understanding the role of coupled solar and internal ocean dynamics on hemispheric climate variability is critical to climate modelling. We have analysed here 165-year-long annual northern hemispheric (NH) and southern hemispheric (SH) sea surface temperature (SST) data, employing spectral and statistical techniques to identify the imprints of solar and ocean-atmospheric processes, if any. We reconstructed the eigen modes of NH-SST and SH-SST to reveal non-linear oscillations superimposed on the monotonic trend. Our analysis reveals that the first eigen modes of NH-SST and SH-SST, representing the long-term trend of SST variability, account for 15-23% of the variance. Interestingly, these components match the first eigen mode (99% variance) of the total solar irradiance (TSI), suggesting a possible impact of solar activity on long-term SST variation. Furthermore, spectral analysis of the SSA-reconstructed signal revealed statistically significant periodicities of 63 ± 5, 22 ± 2, 10 ± 1, 7.6, 6.3, 5.2, 4.7, and 4.2 years in both the NH-SST and SH-SST data. The major harmonics centred at 63 ± 5, 22 ± 2, and 10 ± 1 years are similar to solar periodicities and hence may represent solar forcing, while the components peaking at around 7.6, 6.3, 5.2, 4.7, and 4.2 years apparently fall in the frequency bands of the El Niño-Southern Oscillation, linked to internal oceanic processes. Our analyses also suggest evidence for the amplitude modulation of the 9-11 and 21-22 year solar cycles by, respectively, 104- and 163-year cycles in the northern and southern hemispheric SST data. The absence of these periodic oscillations in CO2 does not support a role for CO2 in the observed inter-hemispheric SST difference. The cross-plot analysis also revealed a strong influence of solar activity on the linear trends of NH- and SH-SST, in addition to a small contribution from CO2. 
Our study concludes that (1) the long-term trends in northern and southern hemispheric SST variability show considerable synchronicity with cyclic warming and cooling phases, and (2) differences in cyclic forcing and non-linear modulations stemming from solar variability are a possible source of hemispheric SST differences.
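The core step of pulling a periodicity out of an annual series can be illustrated with a plain FFT periodogram (the authors used SSA and related spectral techniques; the 11-year cycle, amplitudes and noise level below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 165-year annual series with an 11-year cycle plus noise,
# loosely echoing the solar-cycle band discussed in the abstract.
years = np.arange(165)
series = 0.5 * np.sin(2 * np.pi * years / 11.0) + 0.3 * rng.normal(size=165)

# Periodogram: squared FFT magnitude at each Fourier frequency.
detrended = series - series.mean()
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(165, d=1.0)        # cycles per year

peak = freqs[1:][np.argmax(power[1:])]     # skip the zero frequency
print(round(1.0 / peak, 1))                # dominant period in years
```

Significance testing of such peaks (against a red-noise or white-noise null) is what separates a claimed periodicity from a chance fluctuation; the abstract's "statistically significant periodicities" refers to exactly that step.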
Stamate, Mirela Cristina; Todor, Nicolae; Cosgarea, Marcel
2015-01-01
The clinical utility of otoacoustic emissions as a noninvasive objective test of cochlear function has long been studied. Both transient otoacoustic emissions and distortion products can be used to identify hearing loss, but to what extent they can be used as predictors for hearing loss is still debated. Most studies agree that multivariate analyses have better test performance than univariate analyses. The aim of the study was to determine the performance of transient otoacoustic emissions and distortion products in identifying normal and impaired hearing, using the pure-tone audiogram as a gold standard procedure and different multivariate statistical approaches. The study included 105 adult subjects with normal hearing and hearing loss who underwent the same test battery: pure-tone audiometry, tympanometry, and otoacoustic emission tests. We chose to use logistic regression as the multivariate statistical technique. Three logistic regression models were developed to characterize the relations between different risk factors (age, sex, tinnitus, demographic features, cochlear status defined by otoacoustic emissions) and hearing status defined by pure-tone audiometry. The multivariate analyses allow the calculation of a logistic score, which is a combination of the inputs weighted by coefficients calculated within the analyses. The accuracy of each model was assessed using receiver operating characteristic curve analysis. We used the logistic score to generate receiver operating characteristic curves and to estimate the areas under the curves in order to compare the different multivariate analyses. We compared the performance of each otoacoustic emission (transient, distortion product) using three different multivariate analyses for each ear, when multi-frequency gold standards were used. We demonstrated that all multivariate analyses provided high values of the area under the curve, proving the performance of the otoacoustic emissions. 
Each otoacoustic emission test presented high values of the area under the curve, suggesting that implementing a multivariate approach to evaluate the performance of each otoacoustic emission test would increase the accuracy in identifying normal and impaired ears. We encountered the highest area under the curve for the combined multivariate analysis, suggesting that both otoacoustic emission tests should be used in assessing hearing status. Our multivariate analyses revealed that age is a constant predictor of auditory status for both ears, but the presence of tinnitus was the most important predictor of hearing level for the left ear only. Age presented similar coefficients, but tinnitus coefficients, by their high value, produced the highest variations of the logistic scores, only for the left ear group, thus increasing the risk of hearing loss. We did not find gender differences between ears for any otoacoustic emission test, but studies still debate this question, as the results are contradictory. Neither gender nor environmental origin had any predictive value for hearing status, according to the results of our study. Like any other audiological test, using otoacoustic emissions to identify hearing loss is not without error. Even when applying multivariate analysis, perfect test performance is never achieved. Although most studies have demonstrated the benefit of using multivariate analysis, it has not been incorporated into clinical decisions, perhaps because of the idiosyncratic nature of multivariate solutions or because of the lack of validation studies.
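The area-under-the-curve comparison described here rests on the rank-sum (Mann-Whitney) identity for the ROC area, which is easy to sketch. The score distributions below are invented stand-ins for the study's logistic scores:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical logistic scores for normal (0) and impaired (1) ears;
# the study derives such scores from age, tinnitus and OAE predictors.
labels = np.repeat([0, 1], [60, 45])
scores = np.concatenate([rng.normal(0.0, 1.0, 60),   # normal ears
                         rng.normal(1.5, 1.0, 45)])  # impaired ears

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum identity:
    AUC = (sum of positive ranks - n_pos*(n_pos+1)/2) / (n_pos*n_neg)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(round(auc(labels, scores), 2))
```

Comparing the AUCs of competing logistic models on the same ears, as the authors do, additionally requires a correlated-ROC test or resampling, since the models share the subjects.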
Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.
Groppe, David M; Urbach, Thomas P; Kutas, Marta
2011-12-01
Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative: mass univariate analyses, consisting of thousands of statistical tests combined with powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
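As an illustration of the first correction the review lists, strong FWER control via permutation tests, the sketch below implements the common "tmax" variant for paired ERP data: sign-flip permutations of subject difference waves, with each time point's corrected p-value taken from the permutation distribution of the maximum absolute t-statistic. Array shapes and names are assumptions for the example; the authors' free MATLAB software implements these methods in full.

```python
import numpy as np

def tmax_permutation(cond_a, cond_b, n_perm=2000, seed=0):
    """Paired tmax permutation test over (subjects, time points) arrays.

    Strong FWER control: each point's corrected p-value is the fraction of
    sign-flip permutations whose maximum |t| across all points exceeds it.
    """
    rng = np.random.default_rng(seed)
    diffs = cond_a - cond_b
    n = diffs.shape[0]

    def tstat(d):
        return d.mean(axis=0) / (d.std(axis=0, ddof=1) / np.sqrt(n))

    t_obs = tstat(diffs)
    t_max = np.empty(n_perm)
    for i in range(n_perm):
        # Randomly exchange conditions within each subject (sign flips).
        flips = rng.choice([-1.0, 1.0], size=(n, 1))
        t_max[i] = np.abs(tstat(diffs * flips)).max()
    pvals = (t_max[None, :] >= np.abs(t_obs)[:, None]).mean(axis=1)
    return t_obs, pvals
```

Because every time point is compared against the same null distribution of the maximum, the chance of even one false positive across all points is held at the nominal level, which is what "strong control" means here.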
Impact of ontology evolution on functional analyses.
Groß, Anika; Hartung, Michael; Prüfer, Kay; Kelso, Janet; Rahm, Erhard
2012-10-15
Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
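The functional enrichment analyses whose stability the paper examines are typically based on a hypergeometric (Fisher-type) test: given N genes with K annotated to a GO category, how surprising is it to see k annotated among the n genes in a study set? A minimal stdlib version, for illustration only; real GO tools additionally handle multiple-testing correction and the ontology's DAG structure:

```python
from math import comb

def enrichment_p(k, n, K, N):
    """Upper-tail hypergeometric probability P(X >= k): the chance of drawing
    at least k category members in a sample of n, given K members among N."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(n, K) + 1)
    ) / comb(N, n)
```

Ontology evolution enters exactly through K and N: as annotations are added or terms restructured, the same experimental gene list can yield different p-values, which is the instability the authors quantify.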
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
Refraction Measurement … 4.0 Results … 4.1 Descriptive Statistics … 4.2 Predictive Statistics … mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91 … relatively equal numbers of participants from all classes will become obvious within the results. … 4.1 Descriptive Statistics In the original plan
Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis
Maguire, Kelly; Sheriff, Glenn
2011-01-01
Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146
Improving preschoolers' mathematics achievement with tablets: a randomized controlled trial
NASA Astrophysics Data System (ADS)
Schacter, John; Jo, Booil
2017-09-01
With a randomized field experiment of 433 preschoolers, we tested a tablet mathematics program designed to increase young children's mathematics learning. Intervention students played Math Shelf, a comprehensive iPad preschool and year 1 mathematics app, while comparison children received research-based hands-on mathematics instruction delivered by their classroom teachers. After 22 weeks, there was a large and statistically significant effect on mathematics achievement for Math Shelf students (Cohen's d = .94). Moderator analyses demonstrated an even larger effect for low achieving children (Cohen's d = 1.27). These results suggest that early education teachers can improve their students' mathematics outcomes by integrating experimentally proven tablet software into their daily routines.
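The effect sizes reported here (Cohen's d = .94 overall, 1.27 for low achievers) are standardized mean differences. A minimal sketch using the usual pooled-standard-deviation form, with hypothetical sample data:

```python
import numpy as np

def cohens_d(treat, control):
    """Standardized mean difference with the pooled standard deviation."""
    treat = np.asarray(treat, dtype=float)
    control = np.asarray(control, dtype=float)
    n1, n2 = len(treat), len(control)
    # Pooled variance weights each group's sample variance by its df.
    pooled_var = (
        (n1 - 1) * treat.var(ddof=1) + (n2 - 1) * control.var(ddof=1)
    ) / (n1 + n2 - 2)
    return (treat.mean() - control.mean()) / np.sqrt(pooled_var)
```

By the common rule of thumb, d around 0.8 or above counts as a large effect, which is why the paper describes both reported values that way.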
Unusual properties of aqueous solutions of L-proline: A molecular dynamics study
NASA Astrophysics Data System (ADS)
Civera, Monica; Sironi, Maurizio; Fornili, Sandro L.
2005-11-01
Aqueous solutions of the bioprotectant proline are simulated for solute molar fractions ranging from 2.0 × 10⁻³ to 2.3 × 10⁻¹. Statistical analyses show that proline affects the water structure more strongly than glycine betaine and trimethylamine-N-oxide, two of the most effective bioprotectants widespread in nature, and as strongly as tert-butyl alcohol, a protein denaturant which self-aggregates at high concentration. No evidence is found, however, that proline self-aggregates, as has previously been suggested to explain experimental findings on concentrated proline solutions. Nevertheless, the behavior of the diffusion coefficients of proline and water vs. solute concentration qualitatively agrees with those results.
NASA Astrophysics Data System (ADS)
Coletta, Vincent P.; Phillips, Jeffrey A.; Savinainen, Antti; Steinert, Jeffrey J.
2008-09-01
In a recent article, Ates and Cataloglu (2007 Eur. J. Phys. 28 1161-71), in analysing results for a course in introductory mechanics for prospective science teachers, found no statistically significant correlation between students' pre-instruction scores on the Lawson classroom test of scientific reasoning ability (CTSR) and post-instruction scores on the force concept inventory (FCI). As a possible explanation, the authors suggest that the FCI does not probe for skills required to determine reasoning abilities. Our previously published research directly contradicts the authors' finding. We summarize our research and present a likely explanation for their observation of no correlation.
[Psychological results of mental performance in sleep deprivation].
Dahms, P; Schaad, G; Gorges, W; von Restorff, W
1996-01-01
To quantify the effects of sleep periods of different lengths during continuous operations (CONOPS), two independent groups of subjects performed several cognitive tasks for 3 days. The 72-h trial period contained three 60-min sleep periods for the 10 subjects of the experimental group and three sleep periods of 4 h each for the 14 subjects of the control group. With the exception of a single subtest, the statistical analyses of the test results of the two groups showed no significant differences in cognitive performance. It is suggested that high motivation, sustained largely by a monetary pay system rewarding successful test performance, was responsible for the comparable performance of the subjects.
Comparison of open versus closed group interventions for sexually abused adolescent girls.
Tourigny, Marc; Hébert, Martine
2007-01-01
A first aim of this study was to evaluate the efficacy of an open group therapy for sexually abused teenagers using a quasi-experimental pretest/posttest treatment design. A second aim was to explore whether differential gains were linked to an open versus a closed group format. Results indicate that sexually abused girls involved in open group therapy showed significant gains relative to girls in the control group for the majority of the variables considered. Analyses contrasting the two formats of group therapy failed to identify statistically significant differences, suggesting that both open and closed group formats are likely to be associated with the same significant gains for sexually abused teenagers.
Statistical reporting of clinical pharmacology research.
Ring, Arne; Schall, Robert; Loke, Yoon K; Day, Simon
2017-06-01
Research in clinical pharmacology covers a wide range of experiments, trials and investigations: clinical trials; systematic reviews and meta-analyses of drug usage after market approval; the investigation of pharmacokinetic-pharmacodynamic relationships; and the search for mechanisms of action or for potential efficacy and safety signals using biomarkers. Often these investigations are exploratory in nature, which has implications for the way the data should be analysed and presented. Here we summarize some of the statistical issues that are of particular importance in clinical pharmacology research. © 2017 The British Pharmacological Society.
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
2011-01-01
Background Studies on allele length polymorphism designate several glacial refugia for Norway spruce (Picea abies) in the South Carpathian Mountains, but infer only limited expansion from these refugia after the last glaciation. To better understand the genetic dynamics of a South Carpathian spruce lineage, we compared ancient DNA from 10,700- and 11,000-year-old spruce pollen and macrofossils retrieved from Holocene lake sediment in the Retezat Mountains with DNA extracted from extant material from the same site. We used eight primer pairs that amplified short and variable regions of the spruce cpDNA. In addition, from the same lake sediment we obtained a 15,000-year-long pollen accumulation rate (PAR) record for spruce that helped us to infer changes in population size at this site. Results We obtained successful amplifications for Norway spruce from 17 out of 462 pollen grains tested, while the macrofossil material provided 22 DNA sequences. Two fossil sequences were found to be unique to the ancient material. Population genetic statistics showed higher genetic diversity in the ancient individuals compared to the extant ones. Similarly, statistically significant Ks and Kst values showed a considerable level of differentiation between extant and ancient populations at the same loci. Lateglacial and Holocene PAR values suggested that the population size of the ancient population was small, in the range of 1/10 to 1/5 of the extant population. PAR analysis also detected two periods of rapid population growth (beginning ca. 11,100 and 3900 calibrated years before present (cal yr BP)) and three bottlenecks (around 9180, 7200 and 2200 cal yr BP), likely triggered by climatic change and human impact. Conclusion Our results suggest that the paternal lineages observed today in the Retezat Mountains have persisted at this site at least since the early Holocene.
Combination of the results from the genetic and the PAR analyses furthermore suggests that the higher level of genetic variation found in the ancient populations and the loss of ancient allele types in the extant individuals were likely due to repeated bottlenecks during the Holocene; however, our limited sample size did not allow us to exclude a sampling effect. This study demonstrates how past population size changes inferred from PAR records can be efficiently used in combination with ancient DNA studies. The joint application of palaeoecological and population genetics analyses proved to be a powerful tool for understanding the influence of past population demographic changes on the haplotype diversity and genetic composition of forest tree species. PMID:21392386
NASA Astrophysics Data System (ADS)
Emoto, K.; Saito, T.; Shiomi, K.
2017-12-01
Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed with statistical methods that consider the interaction between seismic waves and randomly distributed small-scale heterogeneities, and the statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. Generally, however, small-scale random heterogeneity is not taken into account in the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan in the period bands of 8-16 s, 4-8 s and 2-4 s and model them using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, the FD simulation yields more realistic synthetics: it does not require assuming a uniform background velocity, a separation into body or surface waves, or the simplified scattering properties considered in general scattering theories. By taking the ratio of the energy in the coda area to that in the entire area, we can separately estimate the scattering and intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity over a wide wavenumber range, including the intensity around the corner wavenumber, as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner.
Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.
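The fitted form P(m) = 8πε²a³/(1 + a²m²)² can be evaluated directly; beyond the corner wavenumber near m = 1/a it rolls off roughly as m⁻⁴. A small sketch using the study's estimates (ε = 0.05, a = 3.1 km) as defaults:

```python
import numpy as np

def psdf(m, eps=0.05, a=3.1):
    """Power spectral density of the random inhomogeneity,
    P(m) = 8*pi*eps**2 * a**3 / (1 + a**2 * m**2)**2,
    with the study's estimates eps = 0.05 and a = 3.1 km as defaults.
    m is the wavenumber in km**-1."""
    m = np.asarray(m, dtype=float)
    return 8.0 * np.pi * eps**2 * a**3 / (1.0 + (a * m) ** 2) ** 2

# Beyond the corner wavenumber (about 1/a = 0.32 km**-1) the spectrum
# falls roughly as m**-4: doubling m cuts P(m) by about a factor of 16.
```

This roll-off is what shorter-period studies could not resolve: their usable wavenumbers sat well above the corner, where only the power-law tail is visible.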
Proliferative changes in the bronchial epithelium of former smokers treated with retinoids.
Hittelman, Walter N; Liu, Diane D; Kurie, Jonathan M; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C; Walsh, Garrett; Roth, Jack A; Minna, John; Ro, Jae Y; Broxson, Anita; Hong, Waun Ki; Lee, J Jack
2007-11-07
Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and alpha-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and alpha-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67-positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per-biopsy-site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and alpha-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17).
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = -0.72, 95% CI = -1.24 to -0.20; P = .007) compared with placebo, and after 13-cis-RA and alpha-tocopherol treatment (coefficient estimate = -0.66, 95% CI = -1.15 to -0.17; P = .008). In per-subject analyses, treatment with 13-cis-RA and alpha-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments.
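The trial's two endpoints, the Ki-67 labeling index and the percentage of biopsy sites with an index of at least 5%, reduce to simple counting. A minimal sketch with hypothetical counts; the paper's inference additionally used mixed-effects and GEE models to adjust for covariates:

```python
import numpy as np

def ki67_index(positive, total):
    """Ki-67 labeling index: percentage of cells with Ki-67-positive nuclei."""
    return 100.0 * positive / total

def high_site_fraction(pos_counts, cell_counts, threshold=5.0):
    """Fraction of biopsy sites whose labeling index meets the threshold
    (the trial's per-site endpoint used a Ki-67 index of at least 5%)."""
    idx = 100.0 * np.asarray(pos_counts, dtype=float) / np.asarray(cell_counts, dtype=float)
    return float((idx >= threshold).mean())
```

Comparing this fraction before and after treatment, within subject and within site, is what the per-subject and per-site analyses above formalize.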