Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
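The abstract does not spell out the algorithm, so the following is only a minimal R sketch of the general idea of a neighbour-based, nonparametric, iterative outlier screen for gauge readings; the neighbourhood size, cutoff and stopping rule are illustrative assumptions, not the values used by the actual algorithm.

```r
# Minimal sketch of a neighbour-based, nonparametric outlier screen for
# rain-gauge readings (illustrative assumptions, not the NTRS algorithm).
set.seed(1)
gauges <- data.frame(x = runif(50), y = runif(50),
                     rain = rgamma(50, shape = 2, scale = 5))
gauges$rain[7] <- 60                        # plant one suspicious reading

flag_outliers <- function(g, k = 5, cutoff = 5, max_iter = 10) {
  d  <- as.matrix(dist(g[, c("x", "y")]))   # pairwise gauge distances
  ok <- rep(TRUE, nrow(g))
  for (iter in seq_len(max_iter)) {
    newly_flagged <- FALSE
    for (i in which(ok)) {
      nb  <- order(d[i, ])[-1]              # other gauges, nearest first
      nb  <- head(nb[ok[nb]], k)            # k nearest gauges still trusted
      med <- median(g$rain[nb])
      s   <- mad(g$rain[nb])
      if (s > 0 && abs(g$rain[i] - med) / s > cutoff) {
        ok[i] <- FALSE
        newly_flagged <- TRUE
      }
    }
    if (!newly_flagged) break               # stop when no new reading is flagged
  }
  which(!ok)                                # indices of suspect readings
}

flag_outliers(gauges)
```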
Maternal Involvement and Academic Achievement.
ERIC Educational Resources Information Center
Lopez, Linda C.; Holmes, William M.
The potential impact of several maternal involvement behaviors on teachers' ratings of children's academic skills was examined through statistical analyses. Data, based on mothers' responses to selected questions concerning maternal involvement and on teachers' ratings on the Classroom Behavior Inventory, were obtained for 115 kindergarten…
A Meta-Analysis: The Relationship between Father Involvement and Student Academic Achievement
ERIC Educational Resources Information Center
Jeynes, William H.
2015-01-01
A meta-analysis was undertaken, including 66 studies, to determine the relationship between father involvement and the educational outcomes of urban school children. Statistical analyses were done to determine the overall impact and specific components of father involvement. The possible differing effects of paternal involvement by race were also…
Using DEWIS and R for Multi-Staged Statistics e-Assessments
ERIC Educational Resources Information Center
Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.
2016-01-01
We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…
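Neither DEWIS nor its embedded R code is shown in the abstract; the fragment below is only a sketch of the general idea of generating a bespoke data set and the SPSS-style summary figures an e-assessment could mark against, using base R (the seed, group sizes and effect size are illustrative assumptions).

```r
# Sketch only: a bespoke data set and the statistics an SPSS-style
# independent-samples t-test table would contain (not the DEWIS code).
set.seed(42)                             # DEWIS would vary this per student
group <- rep(c("A", "B"), each = 20)
score <- c(rnorm(20, 50, 8), rnorm(20, 55, 8))

tt <- t.test(score ~ group, var.equal = TRUE)
vr <- var.test(score ~ group)            # variance-ratio check, a stand-in for
                                         # the homogeneity test SPSS reports
data.frame(t           = unname(tt$statistic),
           df          = unname(tt$parameter),
           p           = tt$p.value,
           mean_diff   = unname(diff(rev(tt$estimate))),
           var_ratio_p = vr$p.value)
```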
ERIC Educational Resources Information Center
Jeynes, William H.
2007-01-01
A meta-analysis is undertaken, including 52 studies, to determine the influence of parental involvement on the educational outcomes of urban secondary school children. Statistical analyses are done to determine the overall impact of parental involvement as well as specific components of parental involvement. Four different measures of educational…
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design Cross-sectional study Methods All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
Zyluk, A; Walaszek, I
2012-06-01
The Levine questionnaire is a disease-oriented instrument developed for outcome measurement of carpal tunnel syndrome (CTS) management. The objective of this study was to compare Levine scores in patients with unilateral CTS, involving the dominant or non-dominant hand, before and after carpal tunnel release. Records of 144 patients, 126 women (87%) and 18 men (13%) with a mean age of 58 years and unilateral CTS, treated operatively, were analysed. The dominant hand was involved in 100 patients (69%), the non-dominant in 44 (31%). The parameters were analysed pre-operatively, and at 1 and 6 months post-operatively. A comparison of Levine scores in patients with involvement of the dominant or non-dominant hand showed no statistically significant differences at baseline or at any of the follow-up measurements. Statistically significant differences were noted in total grip strength at the baseline and 6-month assessments and in key-pinch strength at 1 and 6 months.
Code of Federal Regulations, 2010 CFR
2010-01-01
... and diary entries, maps, graphs, pamphlets, notes, charts, tabulations, analyses, statistical or... involved in legal proceedings. (i) Official business means the authorized business of the Department. (j...
Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points
ERIC Educational Resources Information Center
Ekol, George
2015-01-01
This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…
Family Early Literacy Practices Questionnaire: A Validation Study for a Spanish-Speaking Population
ERIC Educational Resources Information Center
Lewis, Kandia
2012-01-01
The purpose of the current study was to evaluate the psychometric validity of a Spanish translated version of a family involvement questionnaire (the FELP) using a mixed-methods design. Thus, statistical analyses (i.e., factor analysis, reliability analysis, and item analysis) and qualitative analyses (i.e., focus group data) were assessed.…
SMARTE'S SITE CHARACTERIZATION TOOL
Site Characterization involves collecting environmental data to evaluate the nature and extent of contamination. Environmental data could consist of chemical analyses of soil, sediment, water or air samples. Typically site characterization data are statistically evaluated for thr...
Community Health Centers: Providers, Patients, and Content of Care
... Statistics (NCHS). NAMCS uses a multistage probability sample design involving geographic primary sampling units (PSUs), physician practices ... 05 level. To account for the complex sample design during variance estimation, all analyses were performed using ...
Statistical methods for the beta-binomial model in teratology.
Yamamoto, E; Yanagimoto, T
1994-01-01
The beta-binomial model is widely used for analyzing teratological data involving littermates. Recent developments in statistical analyses of teratological data are briefly reviewed with emphasis on the model. For statistical inference of the parameters in the beta-binomial distribution, separation of the likelihood introduces a likelihood-based inference. This leads to reduced biases of estimators and improved accuracy of the empirical significance levels of tests. Separate inference of the parameters can be conducted in a unified way. PMID:8187716
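For concreteness, the sketch below fits the beta-binomial model itself to invented littermate data by direct maximum likelihood; it is a generic illustration of the model, not the separated-likelihood inference developed in the paper.

```r
# Direct maximum-likelihood fit of the beta-binomial model to toy litter data.
loglik_bb <- function(par, y, n) {
  a <- exp(par[1]); b <- exp(par[2])               # keep alpha, beta positive
  sum(lchoose(n, y) + lbeta(y + a, n - y + b) - lbeta(a, b))
}

n <- c(10, 12, 8, 11, 9, 13, 10, 7)                # litter sizes
y <- c( 1,  6, 0,  2, 1,  9,  3, 0)                # affected fetuses per litter

fit   <- optim(c(0, 0), loglik_bb, y = y, n = n, control = list(fnscale = -1))
alpha <- exp(fit$par[1]); beta <- exp(fit$par[2])
c(mean_response     = alpha / (alpha + beta),      # mean response probability
  intra_litter_corr = 1 / (alpha + beta + 1))      # overdispersion parameter
```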
2013-01-01
Introduction Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials and reporting mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) and small (<100 patients per arm) according to their sample sizes. The ratio of odds ratios (ROR) was calculated for each meta-analysis and then RORs were combined using a meta-analytic approach. ROR<1 indicated a larger beneficial effect in small trials. Small and large trials were compared in methodological qualities including sequence generation, blinding, allocation concealment, intention to treat and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of them, five meta-analyses showed statistically significant RORs <1, and the other meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate with an I2 of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality in small trials. Caution should be exercised in the interpretation of meta-analyses involving small trials. PMID:23302257
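As a numerical illustration of the ROR approach (with invented summary numbers, not the study's data), the sketch below computes a ratio of odds ratios per meta-analysis and pools the log RORs; a fixed-effect inverse-variance average is shown for brevity, whereas the study pooled RORs with a meta-analytic model and assessed heterogeneity (I2).

```r
# ROR = OR_small / OR_large per meta-analysis, pooled on the log scale.
ror_one <- function(logor_small, se_small, logor_large, se_large) {
  est <- logor_small - logor_large            # log ROR
  se  <- sqrt(se_small^2 + se_large^2)
  c(est = est, se = se)
}

# toy summaries for three meta-analyses (illustrative values)
ma <- rbind(ror_one(-0.50, 0.20, -0.10, 0.10),
            ror_one(-0.35, 0.25, -0.20, 0.12),
            ror_one(-0.60, 0.30, -0.05, 0.15))

w      <- 1 / ma[, "se"]^2                    # inverse-variance weights
pooled <- sum(w * ma[, "est"]) / sum(w)
exp(pooled + c(ROR = 0, lo = -1.96, hi = 1.96) * sqrt(1 / sum(w)))
```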
Ratio index variables or ANCOVA? Fisher's cats revisited.
Tu, Yu-Kang; Law, Graham R; Ellison, George T H; Gilthorpe, Mark S
2010-01-01
Over 60 years ago Ronald Fisher demonstrated a number of potential pitfalls with statistical analyses using ratio variables. Nonetheless, these pitfalls are largely overlooked in contemporary clinical and epidemiological research, which routinely uses ratio variables in statistical analyses. This article aims to demonstrate how very different findings can be generated as a result of less than perfect correlations among the data used to generate ratio variables. These imperfect correlations result from measurement error and random biological variation. While the former can often be reduced by improvements in measurement, random biological variation is difficult to estimate and eliminate in observational studies. Moreover, wherever the underlying biological relationships among epidemiological variables are unclear, and hence the choice of statistical model is also unclear, the different findings generated by different analytical strategies can lead to contradictory conclusions. Caution is therefore required when interpreting analyses of ratio variables whenever the underlying biological relationships among the variables involved are unspecified or unclear. (c) 2009 John Wiley & Sons, Ltd.
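The simulation below (invented data, not Fisher's cat data) illustrates the pitfall described: an outcome that depends on a size-like denominator, with the denominator measured with error, can lead the ratio-index analysis and the covariate-adjusted (ANCOVA-style) analysis to different conclusions about a group effect that is not actually there.

```r
# Ratio index vs ANCOVA under measurement error (illustrative simulation).
set.seed(7)
n      <- 200
group  <- rep(0:1, each = n / 2)
size   <- rnorm(n, 100, 10) + 10 * group        # group 1 happens to be larger
organ  <- 20 + 0.20 * size + rnorm(n, 0, 3)     # outcome depends on size only

size_obs <- size + rnorm(n, 0, 8)               # imperfectly measured denominator

ratio_fit  <- lm(I(organ / size_obs) ~ group)   # ratio-index strategy
ancova_fit <- lm(organ ~ group + size_obs)      # covariate-adjustment strategy

summary(ratio_fit)$coefficients["group", ]      # can suggest a spurious group effect
summary(ancova_fit)$coefficients["group", ]
```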
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the percentage of significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and the involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to the other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Australian Curriculum Linked Lessons: Statistics
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Students recognise and analyse data and draw inferences. They represent, summarise and interpret data and undertake purposeful investigations involving the collection and interpretation of data… They develop an increasingly sophisticated ability to critically evaluate chance and data concepts and make reasoned judgments and decisions, as well as…
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
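The base-R calls below show, on one simulated scatterplot of an input x against an output y, which standard test corresponds to each of the five kinds of statistic listed; the equal-count binning of x into five classes and the use of a Fligner-Killeen test as the variability check are illustrative choices, not the exact constructions used in the report.

```r
# Five kinds of pattern detection on a single (x, y) scatterplot (illustrative).
set.seed(3)
x  <- runif(200)
y  <- x^2 + rnorm(200, 0, 0.1)
xb <- cut(x, quantile(x, 0:5 / 5), include.lowest = TRUE)   # 5 classes of x

cor(x, y)                                  # (1) linear relationship
cor(x, y, method = "spearman")             # (2) monotonic relationship
kruskal.test(y ~ xb)                       # (3) trend in central tendency
fligner.test(y ~ xb)                       # (4) trend in variability (scale test)
yb <- cut(y, quantile(y, 0:5 / 5), include.lowest = TRUE)
chisq.test(table(xb, yb))                  # (5) deviation from randomness
```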
Shi, Weiwei; Bugrim, Andrej; Nikolsky, Yuri; Nikolskya, Tatiana; Brennan, Richard J
2008-01-01
The ideal toxicity biomarker is composed of the properties of prediction (is detected prior to traditional pathological signs of injury), accuracy (high sensitivity and specificity), and mechanistic relationships to the endpoint measured (biological relevance). Gene expression-based toxicity biomarkers ("signatures") have shown good predictive power and accuracy, but are difficult to interpret biologically. We have compared different statistical methods of feature selection with knowledge-based approaches, using GeneGo's database of canonical pathway maps, to generate gene sets for the classification of renal tubule toxicity. The gene set selection algorithms include four univariate analyses: t-statistics, fold-change, B-statistics, and RankProd, and their combination and overlap for the identification of differentially expressed probes. Enrichment analysis following the results of the four univariate analyses, the Hotelling T-square test, and, finally, out-of-bag selection, a variant of cross-validation, were used to identify canonical pathway maps (sets of genes coordinately involved in key biological processes) with classification power. Differentially expressed genes identified by the different statistical univariate analyses all generated reasonably performing classifiers of tubule toxicity. Maps identified by enrichment analysis or Hotelling T-square had lower classification power, but highlighted perturbed lipid homeostasis as a common discriminator of nephrotoxic treatments. The out-of-bag method yielded the best functionally integrated classifier. The map "ephrins signaling" performed comparably to a classifier derived using sparse linear programming, a machine learning algorithm, and represents a signaling network specifically involved in renal tubule development and integrity. Such functional descriptors of toxicity promise to better integrate predictive toxicogenomics with mechanistic analysis, facilitating the interpretation and risk assessment of predictive genomic investigations.
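The GeneGo pathway tools and the study's full pipeline are not reproduced here; the snippet below is only a toy illustration of two of the univariate filters named (the t statistic and fold change) applied to a simulated log2 expression matrix.

```r
# Toy univariate gene filtering: genes in rows, samples in columns,
# first 5 samples treated, last 5 controls (illustrative only).
set.seed(11)
expr <- matrix(rnorm(1000 * 10), nrow = 1000,
               dimnames = list(paste0("g", 1:1000), NULL))
expr[1:20, 1:5] <- expr[1:20, 1:5] + 2       # 20 truly responsive genes
treated <- 1:5; control <- 6:10

tstat <- apply(expr, 1, function(z) t.test(z[treated], z[control])$statistic)
lfc   <- rowMeans(expr[, treated]) - rowMeans(expr[, control])

selected <- names(which(abs(tstat) > 3 & abs(lfc) > 1))
head(selected)        # candidate signature genes to feed into a classifier
```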
Tomlinson, Alan; Hair, Mario; McFadyen, Angus
2013-10-01
Dry eye is a multifactorial disease which would require a broad spectrum of test measures in the monitoring of its treatment and diagnosis. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.
Su, J; Zhu, S; Liu, Z; Zhao, Y; Song, C
2017-02-01
To compare the prognosis of elderly patients with early oesophageal carcinoma between radical elective nodal prophylactic irradiation and involved-field irradiation, to estimate the failure modes and adverse effects, and thus to provide patients with safe, individualized therapeutic regimens. The charts of 96 patients aged 65 and over with early stage oesophageal carcinoma receiving radical radiotherapy in our department were retrospectively analysed. Of all the patients, 49 received elective nodal prophylactic irradiation and the other 47 received involved-field irradiation. After completion of the whole treatment, we analysed short-term effects, tumour local control, overall survival, failure modes and adverse effects. The 1-, 3-, and 5-year local control rates in the elective nodal irradiation and involved-field irradiation groups were 80.6%, 57.4%, 54.0% and 65.4%, 46.5%, 30.5%, respectively, and the difference was statistically significant (χ2 = 4.478, P = 0.03). The differences in overall survival and progression-free survival were not significant (P > 0.05). The difference in 1-, 3-, and 5-year locoregional failure rates between the elective nodal prophylactic irradiation and involved-field irradiation groups was statistically significant, whereas the differences in overall failure and distant metastasis rates were not. The overall incidence of radiation-induced oesophagitis after elective nodal irradiation or involved-field irradiation was 79.6% and 59.6%, respectively, and the difference was statistically significant (χ2 = 4.559, P = 0.03). The difference in radiation pneumonitis between elective nodal prophylactic irradiation and involved-field irradiation was not significant (12.2% vs 14.9%; χ2 = 0.144, P = 0.7). For elderly patients with early stage oesophageal carcinoma receiving radical radiotherapy, although elective nodal prophylactic irradiation could increase the incidence of radiation-induced oesophagitis, patients could tolerate the treatment and benefit from improved local control. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
permGPU: Using graphics processing units in RNA microarray association studies.
Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros
2010-06-16
Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
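permGPU's own interface is not shown in the abstract; the plain-R fragment below only illustrates the permutation-resampling principle that the package accelerates, for a single marker and a simple difference-in-means statistic (the trait, genotype coding and number of permutations are illustrative).

```r
# CPU-only illustration of permutation resampling for one marker
# (permGPU runs this kind of computation on the GPU for many markers
#  and several test statistics at once).
set.seed(5)
trait <- c(rnorm(30, 0), rnorm(30, 0.6))   # quantitative trait
geno  <- rep(0:1, each = 30)               # one binary marker

stat <- function(y, g) abs(mean(y[g == 1]) - mean(y[g == 0]))
obs  <- stat(trait, geno)

B    <- 5000
perm <- replicate(B, stat(trait, sample(geno)))
mean(perm >= obs)                          # permutation p-value
```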
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Challenges and solutions to pre- and post-randomization subgroup analyses.
Desai, Manisha; Pieper, Karen S; Mahaffey, Ken
2014-01-01
Subgroup analyses are commonly performed in the clinical trial setting with the purpose of illustrating that the treatment effect was consistent across different patient characteristics or identifying characteristics that should be targeted for treatment. There are statistical issues involved in performing subgroup analyses, however. These have been given considerable attention in the literature for analyses where subgroups are defined by a pre-randomization feature. Although subgroup analyses are often performed with subgroups defined by a post-randomization feature--including analyses that estimate the treatment effect among compliers--discussion of these analyses has been neglected in the clinical literature. Such analyses pose a high risk of presenting biased descriptions of treatment effects. We summarize the challenges of doing all types of subgroup analyses described in the literature. In particular, we emphasize issues with post-randomization subgroup analyses. Finally, we provide guidelines on how to proceed across the spectrum of subgroup analyses.
Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q
2015-11-01
To screen and identify safe micro-organisms used during Douchi fermentation, and to verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and an automatic amino-acid analyser were used to investigate the microbial diversity and free amino acid (FAA) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlations were identified. A Douchi fermentation experiment with the identified strains was then carried out, and the nutritional composition of the Douchi was analysed. Results showed that FAA content and the relative content of isoflavone aglycones in the verification Douchi samples were generally higher than those in the commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and that with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.
Dishon-Brown, Amanda; Golder, Seana; Renn, Tanya; Winham, Katherine; Higgins, George E; Logan, T K
2017-06-01
Justice-involved women report high rates of victimization across their life span, and these experiences contribute to their involvement in the criminal justice (CJ) system. Within this population, research has identified an overlap among victimization and substance use, a high-risk coping mechanism. Furthermore, research indicates attachment style is related to coping and high-risk behaviors. Research is needed to understand the relationship among these mechanisms as they relate to intimate partner violence (IPV). To address this gap, this study investigated the relationship between attachment, coping, childhood victimization, substance use, and IPV among 406 victimized women on probation/parole. Results of 6 multivariate regression analyses were statistically significant, accounting for 8%-13% of the variance in IPV. Particularly, childhood sexual victimization and negative coping were significant in all analyses. Findings provide practitioners, administrators, and policymakers information about the specific needs of justice-involved women.
Histometric analyses of cancellous and cortical interface in autogenous bone grafting
Netto, Henrique Duque; Olate, Sergio; Klüppel, Leandro; do Carmo, Antonio Marcio Resende; Vásquez, Bélgica; Albergaria-Barbosa, Jose
2013-01-01
Surgical procedures involving the rehabilitation of the maxillofacial region frequently require bone grafts; the aim of this research was to evaluate the interface between recipient and graft with cortical or cancellous contact. Six adult beagle dogs weighing 15 kg were included in the study. Under general anesthesia, an 8 mm diameter block was obtained from the parietal bone of each animal and fixed to the frontal bone with a 12 mm 1.5 screw, using the lag screw technique to obtain better contact between recipient and graft. Euthanasia periods of 3 and 6 weeks were chosen for histometric evaluation. Hematoxylin-eosin staining was used in a routine histologic technique, and histomorphometry was performed with ImageJ software. The t test was used for data analysis, with p<0.05 for statistical significance. The results showed some differences in descriptive histology but no statistically significant differences at the interface between cortical and cancellous bone at 3 or 6 weeks; as expected, bone integration at 6 weeks was better and statistically superior to that at 3 weeks. We conclude that either cortical or cancellous contact can be used successfully, without differences in integration. PMID:23923071
Brannigan, V M; Bier, V M; Berg, C
1992-09-01
Toxic torts are product liability cases dealing with alleged injuries due to chemical or biological hazards such as radiation, thalidomide, or Agent Orange. Toxic tort cases typically rely more heavily than other product liability cases on indirect or statistical proof of injury. There have been numerous theoretical analyses of statistical proof of injury in toxic tort cases. However, there have been only a handful of actual legal decisions regarding the use of such statistical evidence, and most of those decisions have been inconclusive. Recently, a major case from the Fifth Circuit, involving allegations that Bendectin (a morning sickness drug) caused birth defects, was decided entirely on the basis of statistical inference. This paper examines both the conceptual basis of that decision and the relationships among statistical inference, scientific evidence, and the rules of product liability in general.
Statistical issues in quality control of proteomic analyses: good experimental design and planning.
Cairns, David A
2011-03-01
Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
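As a small, concrete example of the sample-size step described above (with illustrative numbers, not values from the 2-D DIGE pilot), base R's power.t.test gives the required number of biological replicates per group:

```r
# How many biological replicates per group are needed to detect a 1.5-fold
# change (0.58 on the log2 scale) with 80% power, given a between-sample SD
# estimated from a pilot experiment? All numbers here are illustrative.
power.t.test(delta = 0.58, sd = 0.35, power = 0.80, sig.level = 0.05)
```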
Borrowing of strength and study weights in multivariate and network meta-analysis.
Jackson, Dan; White, Ian R; Price, Malcolm; Copas, John; Riley, Richard D
2017-12-01
Multivariate and network meta-analysis have the potential for the estimated mean of one effect to borrow strength from the data on other effects of interest. The extent of this borrowing of strength is usually assessed informally. We present new mathematical definitions of 'borrowing of strength'. Our main proposal is based on a decomposition of the score statistic, which we show can be interpreted as comparing the precision of estimates from the multivariate and univariate models. Our definition of borrowing of strength therefore emulates the usual informal assessment. We also derive a method for calculating study weights, which we embed into the same framework as our borrowing of strength statistics, so that percentage study weights can accompany the results from multivariate and network meta-analyses as they do in conventional univariate meta-analyses. Our proposals are illustrated using three meta-analyses involving correlated effects for multiple outcomes, multiple risk factor associations and multiple treatments (network meta-analysis).
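The abstract characterises borrowing of strength as a comparison of the precision of multivariate and univariate estimates; one schematic way to write such a comparison (our reading for illustration, not the authors' exact score-statistic decomposition) is:

```latex
% Schematic 'borrowing of strength' for effect j, expressed as the percentage
% reduction in variance when moving from the univariate (UV) to the
% multivariate or network (MV) meta-analysis estimate of the same effect.
\[
\mathrm{BoS}_j = 100\% \times
  \left(1 - \frac{\mathrm{Var}\left(\hat{\beta}_j^{\mathrm{MV}}\right)}
                 {\mathrm{Var}\left(\hat{\beta}_j^{\mathrm{UV}}\right)}\right)
\]
```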
Illustrating the practice of statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamada, Christina A; Hamada, Michael S
2009-01-01
The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics; the input data of this group are the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample; the input data of this group are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; the input data of this group are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation; the input data of this group are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
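ParallABEL's Rmpi-based interface is not reproduced here; the sketch below only illustrates the first group of computations (a per-SNP association statistic) parallelized with base R's parallel package on simulated genotypes, to show why this workload is embarrassingly parallel.

```r
# Per-SNP association p-values computed in parallel (illustrative sketch,
# using base R's 'parallel' package rather than Rmpi/ParallABEL).
library(parallel)

set.seed(9)
n_ind <- 500; n_snp <- 2000
geno  <- matrix(rbinom(n_ind * n_snp, 2, 0.3), nrow = n_ind)   # 0/1/2 dosages
trait <- rnorm(n_ind) + 0.2 * geno[, 1]                        # SNP 1 is causal

snp_test <- function(j) summary(lm(trait ~ geno[, j]))$coefficients[2, 4]

cl <- makeCluster(2)
clusterExport(cl, c("trait", "geno", "snp_test"))
pvals <- parSapply(cl, seq_len(n_snp), snp_test)
stopCluster(cl)

head(order(pvals), 5)    # SNPs with the smallest association p-values
```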
Chouteau, Mathieu; Whibley, Annabel; Joron, Mathieu; Llaurens, Violaine
2016-01-01
Identifying the genetic basis of adaptive variation is challenging in non-model organisms, and quantitative real-time PCR is a useful tool for validating predictions regarding the expression of candidate genes. However, comparing expression levels in different conditions requires rigorous experimental design and statistical analyses. Here, we focused on the neotropical passion-vine butterflies Heliconius, non-model species studied in evolutionary biology for their adaptive variation in wing color patterns involved in mimicry and in the signaling of their toxicity to predators. We aimed at selecting stable reference genes to be used for normalization of gene expression data in RT-qPCR analyses from developing wing discs, according to the minimal guidelines described in the Minimum Information for publication of Quantitative Real-Time PCR Experiments (MIQE). To design internal RT-qPCR controls, we studied the stability of expression of nine candidate reference genes (actin, annexin, eF1α, FK506BP, PolyABP, PolyUBQ, RpL3, RPS3A, and tubulin) at two developmental stages (prepupal and pupal) using three widely used programs (GeNorm, NormFinder and BestKeeper). Results showed that, despite differences in statistical methods, the genes RpL3, eF1α, polyABP, and annexin were stably expressed in wing discs in late larval and pupal stages of Heliconius numata. This combination of genes may be used as a reference for a reliable study of differential expression in wings, for instance for genes involved in important phenotypic variation such as wing color pattern variation. Through this example, we provide generally useful technical recommendations as well as relevant statistical strategies for evolutionary biologists aiming to identify candidate genes involved in adaptive variation in non-model organisms. PMID:27271971
Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives
NASA Technical Reports Server (NTRS)
Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.
2002-01-01
An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for quasi 3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
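The CFD code and its sensitivity derivatives are not available from the abstract; the toy R fragment below illustrates the first-order moment approximation itself (output mean and standard deviation from input means, standard deviations and sensitivity derivatives) and checks it against Monte Carlo, using an arbitrary analytic function in place of the CFD output.

```r
# First-order statistical moment propagation vs Monte Carlo (toy function).
f      <- function(x) x[1]^2 * sin(x[2]) + 3 * x[1]
grad_f <- function(x) c(2 * x[1] * sin(x[2]) + 3, x[1]^2 * cos(x[2]))

mu    <- c(2.0, 0.5)           # means of the independent normal inputs
sigma <- c(0.05, 0.02)         # input standard deviations

mean_1st <- f(mu)                               # first-order output mean
sd_1st   <- sqrt(sum((grad_f(mu) * sigma)^2))   # first-order output SD

set.seed(1)
X  <- cbind(rnorm(5e4, mu[1], sigma[1]), rnorm(5e4, mu[2], sigma[2]))
fx <- apply(X, 1, f)                            # Monte Carlo check
c(mean_1st = mean_1st, mean_MC = mean(fx), sd_1st = sd_1st, sd_MC = sd(fx))
```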
Xia, Yinglin; Morrison-Beedy, Dianne; Ma, Jingming; Feng, Changyong; Cross, Wendi; Tu, Xin
2012-01-01
Modeling count data from sexual behavioral outcomes involves many challenges, especially when the data exhibit a preponderance of zeros and overdispersion. In particular, the popular Poisson log-linear model is not appropriate for modeling such outcomes. Although alternatives exist for addressing both issues, they are not widely and effectively used in sex health research, especially in HIV prevention intervention and related studies. In this paper, we discuss how to analyze count outcomes distributed with excess of zeros and overdispersion and introduce appropriate model-fit indices for comparing the performance of competing models, using data from a real study on HIV prevention intervention. The in-depth look at these common issues arising from studies involving behavioral outcomes will promote sound statistical analyses and facilitate research in this and other related areas. PMID:22536496
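As a minimal illustration (on simulated counts, not the study's data) of why the Poisson log-linear model is inappropriate for such outcomes, the sketch below compares a Poisson fit with a negative binomial fit via AIC; zero-inflated or hurdle models (for example, those in the pscl package) would be the next step for handling the structural zeros.

```r
# Poisson vs negative binomial for overdispersed, zero-heavy counts.
library(MASS)

set.seed(8)
n      <- 300
arm    <- rep(0:1, each = n / 2)
lambda <- exp(1.0 - 0.4 * arm)
counts <- rnbinom(n, mu = lambda, size = 0.8)        # overdispersed counts
counts[rbinom(n, 1, 0.3) == 1] <- 0                  # extra structural zeros

pois <- glm(counts ~ arm, family = poisson)
nb   <- glm.nb(counts ~ arm)

c(AIC_poisson = AIC(pois), AIC_negbin = AIC(nb))     # model-fit comparison
mean(counts == 0)                                    # observed proportion of zeros
```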
NASA Astrophysics Data System (ADS)
Saini, K. K.; Sehgal, R. K.; Sethi, B. L.
2008-10-01
In this paper the major reliability estimators are analyzed and their results are compared; their strengths and weaknesses are evaluated in this case study. Each of the reliability estimators has certain advantages and disadvantages. Inter-rater reliability is one of the best ways to estimate reliability when the measure is an observation; however, it requires multiple raters or observers. As an alternative, one could look at the correlation of ratings by the same single observer repeated on two different occasions. Each of the reliability estimators will give a different value for reliability. In general, the test-retest and inter-rater reliability estimates will be lower in value than the parallel-forms and internal-consistency estimates because they involve measuring at different times or with different raters. Reliability estimates are also often used in statistical analyses of quasi-experimental designs.
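As a small numerical illustration of two of the estimators discussed (with simulated ratings, not data from the case study), test-retest reliability can be estimated by a correlation across occasions, and internal consistency by Cronbach's alpha computed from its definition:

```r
# Test-retest reliability and Cronbach's alpha on simulated ratings.
set.seed(2)
true      <- rnorm(100)
occasion1 <- true + rnorm(100, 0, 0.5)
occasion2 <- true + rnorm(100, 0, 0.5)
cor(occasion1, occasion2)                        # test-retest estimate

items <- sapply(1:6, function(i) true + rnorm(100, 0, 0.8))   # 6-item scale
k     <- ncol(items)
alpha <- k / (k - 1) * (1 - sum(apply(items, 2, var)) / var(rowSums(items)))
alpha                                            # Cronbach's alpha
```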
Psychopathological Symptoms and Psychological Wellbeing in Mexican Undergraduate Students
Contreras, Mariel; de León, Ana Mariela; Martínez, Estela; Peña, Elsa Melissa; Marques, Luana; Gallegos, Julia
2017-01-01
College life involves a process of adaptation to changes that have an impact on the psycho-emotional development of students. Successful adaptation to this stage involves the balance between managing personal resources and potential stressors that generate distress. This descriptive, cross-sectional epidemiological study estimates the prevalence of psychopathological symptomatology and psychological well-being among 516 college students, 378 (73.26%) women and 138 (26.74%) men, aged between 17 and 24, from the city of Monterrey in Mexico. It describes the relationship between psychopathological symptomatology and psychological well-being, and explores gender differences. For data collection, two measures were used: the Symptom Checklist Revised and the Scale of Psychological Well-being. Statistical analyses used were the t test for independent samples, Pearson's r, and regression analysis, performed with the Statistical Package for the Social Sciences (SPSS v21.0). Statistical analyses showed that the prevalence of psychopathological symptoms was 10–13%, with Aggression being the highest. The dimension of psychological well-being with the lowest scores was Environmental Mastery. Participants with a higher level of psychological well-being had a lower level of psychopathological symptoms, which shows the importance of early identification and prevention. Gender differences were found on some subscales of the psychopathological symptomatology and of the psychological well-being measures. This study provides a basis for future research and development of resources to promote the psychological well-being and quality of life of university students. PMID:29104876
Detecting most influencing courses on students grades using block PCA
NASA Astrophysics Data System (ADS)
Othman, Osama H.; Gebril, Rami Salah
2014-12-01
One of the modern solutions for dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is a multistage adaptation of the original PCA. It involves the application of Cluster Analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis) instead of to the whole large set of variables, which has proved unreliable. In this work, Block PCA is used to reduce the size of a large data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPA) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of the students' GPAs with fewer variables, containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future statistical exploration and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
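The sketch below is a compact illustration of the two stages described (cluster the variables, then run PCA within each block and keep the variable most aligned with the leading component); it is not the exact selection rule of Liu et al (2002), and the toy matrix stands in for the GPA data.

```r
# Block-PCA-style variable selection on a toy matrix (illustrative only).
set.seed(4)
X <- matrix(rnorm(41 * 20), nrow = 41,
            dimnames = list(NULL, paste0("course", 1:20)))   # toy GPA matrix

blocks <- cutree(hclust(dist(t(scale(X)))), k = 4)           # cluster the variables

pick_from_block <- function(vars) {
  pc1 <- prcomp(X[, vars, drop = FALSE], scale. = TRUE)$rotation[, 1]
  names(which.max(abs(pc1)))                                 # most loaded variable
}

selected <- unlist(lapply(split(names(blocks), blocks), pick_from_block))
selected                                                     # one variable kept per block
```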
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.
Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N
2016-01-01
Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
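The SAS procedures themselves are not reproduced here; the R sketch below (using survival::survreg as a Weibull AFT model analogous in role to PROC LIFEREG, on simulated data) shows the product-of-coefficients form such a mediation analysis takes. The simulated effect sizes and censoring scheme are illustrative only.

```r
# Product-of-coefficients mediation with a Weibull AFT outcome model.
library(survival)

set.seed(6)
n  <- 500
tx <- rbinom(n, 1, 0.5)                                  # randomized treatment
m  <- 0.5 * tx + rnorm(n)                                # mediator
t_true <- rweibull(n, shape = 1.5, scale = exp(2 + 0.3 * m + 0.1 * tx))
cens   <- runif(n, 0, quantile(t_true, 0.9))             # administrative censoring
time   <- pmin(t_true, cens)
event  <- as.numeric(t_true <= cens)

a_path <- unname(coef(lm(m ~ tx))["tx"])                 # treatment -> mediator
b_path <- unname(coef(survreg(Surv(time, event) ~ tx + m,
                              dist = "weibull"))["m"])   # mediator -> log(time)
c(a = a_path, b = b_path, mediated = a_path * b_path)    # product of coefficients
```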
Introduction to bioinformatics.
Can, Tolga
2014-01-01
Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: Collect statistics from biological data. Build a computational model. Solve a computational modeling problem. Test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices and analysis of microarray data mostly involves statistics analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs and graph theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
[A spatially explicit analysis of traffic accidents involving pedestrians and cyclists in Berlin].
Lakes, Tobia
2017-12-01
In many German cities and counties, sustainable mobility concepts that strengthen pedestrian and cyclist traffic are promoted. From the perspectives of urban development, traffic planning and public healthcare, a spatially differentiated analysis of traffic accident data is essential. The objectives were: 1) the identification of spatial and temporal patterns of the distribution of accidents involving cyclists and pedestrians, 2) the identification of hotspots and exploration of possible underlying causes and 3) the critical discussion of benefits and challenges of the results and the derivation of conclusions. Spatio-temporal distributions of data from accident statistics in Berlin involving pedestrians and cyclists from 2011 to 2015 were analysed with geographic information systems (GIS). While the total number of pedestrian and cyclist accidents remains relatively stable, the spatial distribution analysis shows that there are significant spatial clusters (hotspots) of traffic accidents with a strong concentration in the inner city area. In a critical discussion, the benefits of geographic concepts are identified, such as spatially explicit health data (in this case traffic accident data), the importance of the integration of other data sources for the evaluation of the health impact of areas (traffic accident statistics of the police), and the possibilities and limitations of spatial-temporal data analysis (spatial point-density analyses) for the derivation of decision-supported recommendations and for the evaluation of policy measures of health prevention and of health-relevant urban development.
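The Berlin data and GIS workflow are not available here; the fragment below only sketches the idea of a spatial point-density (hotspot) analysis with a 2-D kernel density estimate on simulated accident coordinates.

```r
# Point-density hotspot sketch on simulated accident locations.
library(MASS)

set.seed(10)
# two clusters of accident locations plus scattered background points
xy <- rbind(cbind(rnorm(150, 0, 0.3), rnorm(150, 0, 0.3)),
            cbind(rnorm(100, 2, 0.2), rnorm(100, 1, 0.2)),
            cbind(runif(100, -1, 3),  runif(100, -1, 2)))

dens <- kde2d(xy[, 1], xy[, 2], n = 100)          # 2-D kernel density estimate
peak <- which(dens$z == max(dens$z), arr.ind = TRUE)
c(hotspot_x = dens$x[peak[1]], hotspot_y = dens$y[peak[2]])
# image(dens) would display the density surface with its hotspots
```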
Relating engagement to outcomes in prevention: the case of a parenting program for couples.
Brown, Louis D; Goslin, Megan C; Feinberg, Mark E
2012-09-01
Analyses of program engagement can provide critical insight into how program involvement leads to outcomes. This study examines the relation between participant engagement and program outcomes in Family Foundations (FF), a universal preventive intervention designed to help couples manage the transition to parenthood by improving coparenting relationship quality. Previous intent-to-treat outcome analyses from a randomized trial indicate FF improves parental adjustment, interparental relationships, and parenting. Analyses for the current study use the same sample, and yield statistically reliable relations between participant engagement and interparental relationships but not parental adjustment or parenting. Discussion considers implications for FF and the difficulties researchers face when examining the relation between engagement and outcomes in preventive interventions.
ParallABEL: an R library for generalized parallelization of genome-wide association studies
2010-01-01
Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs/traits. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914
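The Rmpi/GenABEL machinery itself is not reproduced here; the idea behind the first computation group (one statistic per SNP, split across workers) can be sketched with base R's parallel package. The genotype matrix, trait, and per-SNP test below are hypothetical stand-ins:

```r
# Parallel per-SNP association tests using base R's 'parallel' package (a simplification
# of the Rmpi-based approach); data are simulated, not from NARAC.
library(parallel)

set.seed(42)
n_ind <- 200; n_snp <- 1000                                   # hypothetical sizes
geno  <- matrix(rbinom(n_ind * n_snp, 2, 0.3), n_ind, n_snp)  # 0/1/2 genotype dosages
pheno <- rbinom(n_ind, 1, 0.5)                                # binary trait

snp_test <- function(g, y) {
  # score one SNP with a simple logistic-regression association test
  fit <- glm(y ~ g, family = binomial)
  summary(fit)$coefficients["g", "Pr(>|z|)"]
}

cl <- makeCluster(4)                                          # e.g. 4 worker processes
pvals <- parApply(cl, geno, 2, snp_test, y = pheno)           # one test per SNP column, split over workers
stopCluster(cl)

head(sort(pvals))
```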
An audit of the statistics and the comparison with the parameter in the population
NASA Astrophysics Data System (ADS)
Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad
2015-10-01
Determining the sample size needed for sample statistics to closely estimate the corresponding population parameters is a long-standing issue. Although the sample size may have been calculated with reference to the study objective, it is difficult to confirm whether the resulting statistics are close to the parameters for a particular population. A p-value of less than 0.05 is widely used as inferential evidence. Therefore, this study audited statistics computed from various subsamples and statistical analyses and compared them with the parameters of three different populations. Eight types of statistical analysis, with eight subsamples for each analysis, were examined. The statistics were consistent and close to the parameters when the sample covered at least 15% to 35% of the population. Larger sample sizes are needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters for a medium-sized population.
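A hedged sketch of the auditing idea, with a simulated population standing in for the real ones used in the study: draw subsamples covering increasing fractions of the population and compare the sample statistic with the known parameter.

```r
# Audit sketch: how closely does a subsample statistic track the population parameter?
set.seed(123)
population <- rnorm(10000, mean = 50, sd = 10)      # hypothetical population; parameter mu = 50

fractions <- c(0.05, 0.15, 0.25, 0.35, 0.50)
audit <- sapply(fractions, function(f) {
  s <- sample(population, size = round(f * length(population)))
  c(fraction = f, sample_mean = mean(s), abs_error = abs(mean(s) - mean(population)))
})
t(audit)   # the error typically stabilises once the subsample covers a sizeable share of the population
```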
MOLSIM: A modular molecular simulation software
Reščič, Jurij
2015-01-01
The modular software MOLSIM for all-atom molecular and coarse-grained simulations is presented with a focus on the underlying concepts used. The software possesses four unique features: (1) it is an integrated software for molecular dynamics, Monte Carlo, and Brownian dynamics simulations; (2) simulated objects are constructed in a hierarchical fashion representing atoms, rigid molecules and colloids, flexible chains, hierarchical polymers, and cross-linked networks; (3) long-range interactions involving charges, dipoles and/or anisotropic dipole polarizabilities are handled either with the standard Ewald sum, the smooth particle mesh Ewald sum, or the reaction-field technique; (4) statistical uncertainties are provided for all calculated observables. In addition, MOLSIM supports various statistical ensembles, and several types of simulation cells and boundary conditions are available. Intermolecular interactions comprise tabulated pairwise potentials for speed and uniformity, and many-body interactions involve anisotropic polarizabilities. Intramolecular interactions include bond, angle, and crosslink potentials. A very large set of analyses of static and dynamic properties is provided. The capability of MOLSIM can be extended by user-provided routines controlling, for example, start conditions, intermolecular potentials, and analyses. An extensive set of case studies in the field of soft matter is presented covering colloids, polymers, and crosslinked networks. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25994597
Siddall, James; Huebner, E Scott; Jiang, Xu
2013-01-01
This study examined the cross-sectional and prospective relationships between three sources of school-related social support (parent involvement, peer support for learning, and teacher-student relationships) and early adolescents' global life satisfaction. The participants were 597 middle school students from 1 large school in the southeastern United States who completed measures of school social climate and life satisfaction on 2 occasions, 5 months apart. The results revealed that school-related experiences in terms of social support for learning contributed substantial amounts of variance to individual differences in adolescents' satisfaction with their lives as a whole. Cross-sectional multiple regression analyses of the differential contributions of the sources of support demonstrated that family and peer support for learning contributed statistically significant, unique variance to global life satisfaction reports. Prospective multiple regression analyses demonstrated that only family support for learning continued to contribute statistically significant, unique variance to the global life satisfaction reports at Time 2. The results suggest that school-related experiences, especially family-school interactions, spill over into adolescents' overall evaluations of their lives at a time when direct parental involvement in schooling and adolescents' global life satisfaction are generally declining. Recommendations for future research and educational policies and practices are discussed. © 2013 American Orthopsychiatric Association.
Gundogdu, Ahmet Gokhan; Onder, Sevgen; Firat, Pinar; Dogan, Riza
2014-06-01
The impacts of epidermal growth factor receptor (EGFR) immunoexpression and RAS immunoexpression on the survival and prognosis of lung adenocarcinoma patients are debated in the literature. Twenty-six patients, who underwent pulmonary resections between 2002 and 2007 in our clinic, and whose pathologic examinations yielded adenocarcinoma, were included in the study. EGFR and RAS expression levels were examined by immunohistochemical methods. The results were compared with the survival, stage of the disease, nodal involvement, lymphovascular invasion, and pleural invasion. Nonparametric bivariate analyses were used for statistical analyses. A significant link between EGFR immunoexpression and survival was identified, while no significant association was found between RAS immunoexpression and survival. Neither EGFR nor RAS displayed a significant link with the stage of the disease, nodal involvement, lymphovascular invasion, or pleural invasion. Positive EGFR immunoexpression affects survival negatively, while RAS immunoexpression has no effect on survival in lung adenocarcinoma patients.
Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives
NASA Technical Reports Server (NTRS)
Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.
2001-01-01
This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
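The quasi 1-D Euler code itself is not reproduced here; the sketch below illustrates first-order moment matching for a toy function with analytic sensitivity derivatives, checked against Monte Carlo propagation. The function, means, and standard deviations are all assumptions:

```r
# First-order statistical moment matching for a toy model f(x1, x2), compared with Monte Carlo.
f     <- function(x1, x2) x1^2 * exp(-x2)        # toy model, not the quasi 1-D Euler code
mu    <- c(1.0, 0.5)                             # means of independent, normal inputs
sigma <- c(0.05, 0.10)                           # input standard deviations

# analytic first-order sensitivity derivatives of the toy model at the mean
df_dx1 <- 2 * mu[1] * exp(-mu[2])
df_dx2 <- -mu[1]^2 * exp(-mu[2])

mean_approx <- f(mu[1], mu[2])                                # first-order mean estimate
var_approx  <- (df_dx1 * sigma[1])^2 + (df_dx2 * sigma[2])^2  # first-order variance estimate

# Monte Carlo check of the approximate moments
set.seed(1)
x1 <- rnorm(1e5, mu[1], sigma[1]); x2 <- rnorm(1e5, mu[2], sigma[2])
c(mean_approx = mean_approx, mean_mc = mean(f(x1, x2)),
  sd_approx = sqrt(var_approx), sd_mc = sd(f(x1, x2)))
```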
Seay, Kristen D.; Kohl, Patricia
2012-01-01
Using data from the National Survey of Child and Adolescent Well-Being II (NSCAW II), this article examines the impact of caregiver substance abuse on children’s exposure to violence in the home in a nationally representative sample of families involved with child protective services (CPS). Logistic regression analyses indicate an increased risk of witnessing mild and severe violence in the home for children whose primary caregiver was abusing alcohol or drugs. However, analyses did not find statistically significant relationships between child report of direct victimization in the home by mild or severe violence and caregiver alcohol or drug abuse. PMID:23440502
Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B
2017-08-01
After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.
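As one concrete alternative to complete case analysis, here is a hedged sketch of multiple imputation with the R package mice on a hypothetical extended-follow-up dataset; the analysis model, missingness rate, and variable names are assumptions, not drawn from the reviewed studies:

```r
# Multiple imputation of a partially missing long-term outcome, then pooling with Rubin's rules.
library(mice)

set.seed(7)
n  <- 300
df <- data.frame(tx = rbinom(n, 1, 0.5), base = rnorm(n))      # hypothetical trial data
df$y <- 0.4 * df$tx + 0.6 * df$base + rnorm(n)                 # long-term outcome
df$y[runif(n) < 0.3] <- NA                                     # ~30% missing at extended follow-up

imp  <- mice(df, m = 20, printFlag = FALSE)                    # 20 imputed data sets
fits <- with(imp, lm(y ~ tx + base))                           # analysis model fitted per imputation
summary(pool(fits))                                            # pooled treatment-effect estimate
```

A sensitivity analysis would then vary the assumed missing data mechanism, for example by shifting imputed values in one arm, and check whether the pooled treatment effect is robust.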
Recent evaluations of crack-opening-area in circumferentially cracked pipes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, S.; Brust, F.; Ghadiali, N.
1997-04-01
Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields which are present because of the expected dynamic effects from pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to assess quantitatively the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models
Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.
2016-01-01
Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
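The study used SAS procedures PHREG and LIFEREG; a rough R analogue of the two survival regressions entering such a mediation model uses coxph and survreg (Weibull AFT). The simulated data, effect sizes, and product-of-coefficients summary below are illustrative assumptions, not the authors' simulation design:

```r
# Two mediation regressions with R analogues of SAS LIFEREG/PHREG, on simulated Weibull data.
library(survival)

set.seed(11)
n    <- 400
tx   <- rbinom(n, 1, 0.5)
med  <- 0.5 * tx + rnorm(n)                                    # mediator affected by treatment
time <- rweibull(n, shape = 1.5, scale = exp(1 + 0.3 * tx + 0.4 * med))
cens <- rexp(n, rate = 0.05)
obs  <- pmin(time, cens); status <- as.numeric(time <= cens)   # right censoring

fit_a   <- lm(med ~ tx)                                             # a path: treatment -> mediator
fit_aft <- survreg(Surv(obs, status) ~ tx + med, dist = "weibull")  # AFT model (LIFEREG analogue)
fit_ph  <- coxph(Surv(obs, status) ~ tx + med)                      # PH model (PHREG analogue)

# product-of-coefficients mediated effect on the AFT (log-time) scale
coef(fit_a)["tx"] * coef(fit_aft)["med"]
```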
A statistical anomaly indicates symbiotic origins of eukaryotic membranes
Bansal, Suneyna; Mittal, Aditya
2015-01-01
Compositional analyses of nucleic acids and proteins have shed light on possible origins of living cells. In this work, rigorous compositional analyses of ∼5000 plasma membrane lipid constituents of 273 species in the three life domains (archaea, eubacteria, and eukaryotes) revealed a remarkable statistical paradox, indicating symbiotic origins of eukaryotic cells involving eubacteria. For lipids common to plasma membranes of the three domains, the number of carbon atoms in eubacteria was found to be similar to that in eukaryotes. However, mutually exclusive subsets of same data show exactly the opposite—the number of carbon atoms in lipids of eukaryotes was higher than in eubacteria. This statistical paradox, called Simpson's paradox, was absent for lipids in archaea and for lipids not common to plasma membranes of the three domains. This indicates the presence of interaction(s) and/or association(s) in lipids forming plasma membranes of eubacteria and eukaryotes but not for those in archaea. Further inspection of membrane lipid structures affecting physicochemical properties of plasma membranes provides the first evidence (to our knowledge) on the symbiotic origins of eukaryotic cells based on the “third front” (i.e., lipids) in addition to the growing compositional data from nucleic acids and proteins. PMID:25631820
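A toy numerical illustration of the reported reversal (Simpson's paradox), with made-up carbon counts and group sizes rather than the actual lipid data:

```r
# Simpson's paradox in group means: the ordering of two domains reverses between
# subset-wise and pooled comparisons because of unequal subset sizes. Hypothetical numbers.
set.seed(3)
d <- data.frame(
  domain = rep(c("A", "B", "A", "B"), times = c(100, 10, 10, 100)),
  subset = rep(c("common", "common", "exclusive", "exclusive"), times = c(100, 10, 10, 100)),
  carbon = c(rnorm(100, 20), rnorm(10, 18), rnorm(10, 40), rnorm(100, 38))
)
aggregate(carbon ~ domain + subset, data = d, FUN = mean)  # A > B within each subset
aggregate(carbon ~ domain, data = d, FUN = mean)           # but B > A when the subsets are pooled
```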
Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.
2009-01-01
In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212
STRengthening analytical thinking for observational studies: the STRATOS initiative.
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-12-30
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-01-01
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
APOD Data Release of Social Network Footprint for 2015
NASA Astrophysics Data System (ADS)
Nemiroff, Robert J.; Russell, David; Allen, Alice; Connelly, Paul; Lowe, Stuart R.; Petz, Sydney; Haring, Ralf; Bonnell, Jerry T.; APOD Team
2017-01-01
APOD data for 2015 are being made freely available for download and analysis. The data include page view statistics for the main NASA APOD website at https://apod.nasa.gov, as well as for APOD's social media sites on Facebook, Instagram, Google Plus, and Twitter. General APOD-specific demographic information for each site is included. Archived popularity statistics include Page Views, Likes, Shares, Hearts, and Retweets. The downloadable Excel-type spreadsheet also includes the APOD title and (unlinked) explanation. These data are released not to highlight APOD's popularity but to encourage analyses, with potential examples involving which astronomy topics trend the best and whether popularity is social group dependent.
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
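A sketch of the two complementary analyses described above, on simulated data: a cause-specific Cox model plus Gray's test and cumulative incidence via the cmprsk package, and a Fine-Gray subdistribution hazard model. Event rates and effect sizes are assumptions, not the SQNP01 data:

```r
# Cause-specific vs. subdistribution hazard analyses for a simulated competing risks trial.
library(survival)
library(cmprsk)

set.seed(5)
n     <- 300
arm   <- rbinom(n, 1, 0.5)
t1    <- rexp(n, rate = 0.10 * exp(-0.8 * arm))   # event of interest, reduced by treatment
t2    <- rexp(n, rate = 0.05)                     # competing event, unaffected by treatment
cens  <- rexp(n, rate = 0.02)
ftime <- pmin(t1, t2, cens)
fstat <- ifelse(ftime == cens, 0, ifelse(ftime == t1, 1, 2))  # 0 = censored, 1 = main, 2 = competing

# cause-specific hazard of the main event (competing events treated as censored)
coxph(Surv(ftime, fstat == 1) ~ arm)

# cumulative incidence functions and Gray's test
ci <- cuminc(ftime, fstat, group = arm)
ci$Tests

# Fine-Gray subdistribution hazard model
crr(ftime, fstat, cov1 = cbind(arm = arm))
```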
Australian Oceanographic Data Centre Bulletin 16.
1983-05-01
... it is inevitable that, with the quantities of data involved, some bad data will be archived. In order to exclude this, various filtering techniques will be employed. ... analysed for statistical properties (e.g. burst mean, variance, exceedance and spectral properties) and certain values are correlated with relevant forcing ...
The MDI Method as a Generalization of Logit, Probit and Hendry Analyses in Marketing.
1980-04-01
... model involves nothing more than fitting a normal distribution function (Hanushek and Jackson (1977)). For a given value of x, the probit model ... preference shifts within the soft drink category. For applications of probit models relevant to marketing, see Hausman and Wise (1978) and Hanushek and ... "... Marketing Research," JMR XIV, Feb. (1977). Hanushek, E.A., and J.E. Jackson, Statistical Methods for Social Scientists, Academic Press, New York (1977).
Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H
2003-02-01
The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were utilized to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in ADEA disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.
[Morphometric evaluation of the lateral fossa during the pre-gyrus period].
Varlam, H; Macovei, G N; Antohe, D St
2002-09-01
During the formation of the neocortex, the lateral fossa is involved in the development of the cerebral hemispheres. It changes its shape and, from a shallow depression at the end of the 3rd month, becomes a triangular surface with marked borders. Finally, at the same time as the gyri appear, the opercula that border it come closer together and give rise to the lateral sulcus. The evolution of the lateral fossa can be analysed by linear and surface parameters. Morphometric and statistical analysis of these parameters, compared with those of the cerebral hemisphere, allowed us to establish original criteria for assessing the growth of the foetal brain.
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Kratochwill, Thomas R; Levin, Joel R
2014-04-01
In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
42 CFR 405.1064 - ALJ decisions involving statistical samples.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 42, Public Health; Medicare Coverage Policies. § 405.1064 ALJ decisions involving statistical samples. When an appeal from the QIC involves an overpayment issue and the QIC used a statistical sample in reaching its...
42 CFR 405.1064 - ALJ decisions involving statistical samples.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 42, Public Health; Medicare Coverage Policies. § 405.1064 ALJ decisions involving statistical samples. When an appeal from the QIC involves an overpayment issue and the QIC used a statistical sample in reaching its...
Onder, Sevgen; Firat, Pinar; Dogan, Riza
2014-01-01
Background The impacts of epidermal growth factor receptor (EGFR) immunoexpression and RAS immunoexpression on the survival and prognosis of lung adenocarcinoma patients are debated in the literature. Methods Twenty-six patients, who underwent pulmonary resections between 2002 and 2007 in our clinic, and whose pathologic examinations yielded adenocarcinoma, were included in the study. EGFR and RAS expression levels were examined by immunohistochemical methods. The results were compared with the survival, stage of the disease, nodal involvement, lymphovascular invasion, and pleural invasion. Nonparametric bivariate analyses were used for statistical analyses. Results A significant link between EGFR immunoexpression and survival was identified, while no significant association was found between RAS immunoexpression and survival. Neither EGFR nor RAS displayed a significant link with the stage of the disease, nodal involvement, lymphovascular invasion, or pleural invasion. Conclusions Positive EGFR immunoexpression affects survival negatively, while RAS immunoexpression has no effect on survival in lung adenocarcinoma patients. PMID:24977003
General cognitive principles for learning structure in time and space.
Goldstein, Michael H; Waterfall, Heidi R; Lotem, Arnon; Halpern, Joseph Y; Schwade, Jennifer A; Onnis, Luca; Edelman, Shimon
2010-06-01
How are hierarchically structured sequences of objects, events or actions learned from experience and represented in the brain? When several streams of regularities present themselves, which will be learned and which ignored? Can statistical regularities take effect on their own, or are additional factors such as behavioral outcomes expected to influence statistical learning? Answers to these questions are starting to emerge through a convergence of findings from naturalistic observations, behavioral experiments, neurobiological studies, and computational analyses and simulations. We propose that a small set of principles are at work in every situation that involves learning of structure from patterns of experience and outline a general framework that accounts for such learning. (c) 2010 Elsevier Ltd. All rights reserved.
Cormack Research Project: Glasgow University
NASA Technical Reports Server (NTRS)
Skinner, Susan; Ryan, James M.
1998-01-01
The aim of this project was to investigate and improve upon existing methods of analysing data from COMPTEL on the Gamma Ray Observatory for neutrons emitted during solar flares. In particular, a strategy has been developed for placing confidence intervals on neutron energy distributions arising from uncertainties in the response matrix. We have also been able to demonstrate the superior performance of one of a range of possible statistical regularization strategies. A method of generating likely models of neutron energy distributions has also been developed as a tool to this end. The project involved solving an inverse problem with noise added to the data in various ways. To achieve this, pre-existing C code was used to run Fortran subroutines that performed statistical regularization on the data.
2017-12-01
... The visits were typically scheduled one week apart. Each visit involved a psychological evaluation ... Partial correlation analyses were conducted using the MATLAB Statistics Toolbox.
Five-year follow-up of Community Pediatrics Training Initiative.
Minkovitz, Cynthia S; Goldshore, Matt; Solomon, Barry S; Guyer, Bernard; Grason, Holly
2014-07-01
To compare community involvement of pediatricians exposed to enhanced residency training as part of the Dyson Community Pediatrics Training Initiative (CPTI) with involvement reported by a national sample of pediatricians. Cross-sectional analyses compared 2008-2010 mailed surveys of CPTI graduates 5 years after residency graduation with comparably aged respondents in a 2010 mailed national American Academy of Pediatrics survey of US pediatricians (CPTI: n = 234, response = 56.0%; national sample: n = 243, response = 59.9%). Respondents reported demographic characteristics, practice characteristics (setting, time spent in general pediatrics), involvement in community child health activities in the past 12 months, use of ≥1 strategies to influence community child health (eg, educate legislators), and being moderately/very versus not at all/minimally skilled in 6 such activities (eg, identify community needs). χ² statistics assessed differences between groups; logistic regression modeled the independent association of CPTI with community involvement adjusting for personal and practice characteristics and perspectives regarding involvement. Compared with the national sample, more CPTI graduates reported involvement in community pediatrics (43.6% vs 31.1%, P < .01) and being moderately/very skilled in 4 of 6 community activities (P < .05). Comparable percentages used ≥1 strategies (52.2% vs 47.3%, P > .05). Differences in involvement remained in adjusted analyses with greater involvement by CPTI graduates (adjusted odds ratio 2.4, 95% confidence interval 1.5-3.7). Five years after residency, compared with their peers, more CPTI graduates report having skills and greater community pediatrics involvement. Enhanced residency training in community pediatrics may lead to a more engaged pediatrician workforce. Copyright © 2014 by the American Academy of Pediatrics.
Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos
2018-01-03
Statistical power assessment is an important component of hypothesis-driven research but until relatively recently (mid-1990s) no methods were available for assessing power in experiments involving continuum data and in particular those involving one-dimensional (1D) time series. The purpose of this study was to describe how continuum-level power analyses can be used to plan hypothesis-driven biomechanics experiments involving 1D data. In particular, we demonstrate how theory- and pilot-driven 1D effect modeling can be used for sample-size calculations for both single- and multi-subject experiments. For theory-driven power analysis we use the minimum jerk hypothesis and single-subject experiments involving straight-line, planar reaching. For pilot-driven power analysis we use a previously published knee kinematics dataset. Results show that powers on the order of 0.8 can be achieved with relatively small sample sizes, five and ten for within-subject minimum jerk analysis and between-subject knee kinematics, respectively. However, the appropriate sample size depends on a priori justifications of biomechanical meaning and effect size. The main advantage of the proposed technique is that it encourages a priori justification regarding the clinical and/or scientific meaning of particular 1D effects, thereby robustly structuring subsequent experimental inquiry. In short, it shifts focus from a search for significance to a search for non-rejectable hypotheses. Copyright © 2017 Elsevier Ltd. All rights reserved.
Interim analyses in 2 x 2 crossover trials.
Cook, R J
1995-09-01
A method is presented for performing interim analyses in long term 2 x 2 crossover trials with serial patient entry. The analyses are based on a linear statistic that combines data from individuals observed for one treatment period with data from individuals observed for both periods. The coefficients in this linear combination can be chosen quite arbitrarily, but we focus on variance-based weights to maximize power for tests regarding direct treatment effects. The type I error rate of this procedure is controlled by utilizing the joint distribution of the linear statistics over analysis stages. Methods for performing power and sample size calculations are indicated. A two-stage sequential design involving simultaneous patient entry and a single between-period interim analysis is considered in detail. The power and average number of measurements required for this design are compared to those of the usual crossover trial. The results indicate that, while there is minimal loss in power relative to the usual crossover design in the absence of differential carry-over effects, the proposed design can have substantially greater power when differential carry-over effects are present. The two-stage crossover design can also lead to more economical studies in terms of the expected number of measurements required, due to the potential for early stopping. Attention is directed toward normally distributed responses.
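A hedged sketch of the underlying idea, not the paper's exact statistic: combine a between-subject estimate from period-1-only subjects with a within-subject estimate from completers, using inverse-variance ("variance-based") weights. All sample sizes and variances are invented:

```r
# Variance-weighted combination of two treatment-effect estimates at an interim look.
set.seed(21)
delta <- 1.0                                        # true direct treatment effect (simulation)

# completers: observed under both treatments, use period-adjusted within-subject differences
n2    <- 30
diffs <- delta + rnorm(n2, sd = 1)
est2  <- mean(diffs);          var2 <- var(diffs) / n2

# period-1-only subjects: parallel-group comparison
nA <- 25; nB <- 25
yA <- delta + rnorm(nA, sd = 2); yB <- rnorm(nB, sd = 2)
est1  <- mean(yA) - mean(yB);  var1 <- var(yA) / nA + var(yB) / nB

w <- (1 / c(var1, var2)) / sum(1 / c(var1, var2))   # variance-based weights
combined <- sum(w * c(est1, est2))
z <- combined / sqrt(1 / sum(1 / c(var1, var2)))    # standardized statistic at the interim analysis
c(estimate = combined, z = z)
```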
Periodontal disease and carotid atherosclerosis: A meta-analysis of 17,330 participants.
Zeng, Xian-Tao; Leng, Wei-Dong; Lam, Yat-Yin; Yan, Bryan P; Wei, Xue-Mei; Weng, Hong; Kwong, Joey S W
2016-01-15
The association between periodontal disease and carotid atherosclerosis has been evaluated primarily in single-center studies, and whether periodontal disease is an independent risk factor for carotid atherosclerosis remains uncertain. This meta-analysis aimed to evaluate the association between periodontal disease and carotid atherosclerosis. We searched PubMed and Embase for relevant observational studies up to February 20, 2015. Two authors independently extracted data from included studies, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was assessed by the chi-squared test (P<0.1 for statistical significance) and quantified by the I² statistic. Data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software. Fifteen observational studies involving 17,330 participants were included in the meta-analysis. The overall pooled result showed that periodontal disease was associated with carotid atherosclerosis (OR: 1.27, 95% CI: 1.14-1.41; P<0.001) but statistical heterogeneity was substantial (I² = 78.90%). Subgroup analysis of studies adjusted for smoking and diabetes mellitus showed borderline significance (OR: 1.08; 95% CI: 1.00-1.18; P=0.05). Sensitivity and cumulative analyses both indicated that our results were robust. Findings of our meta-analysis indicated that the presence of periodontal disease was associated with carotid atherosclerosis; however, further large-scale, well-conducted clinical studies are needed to explore the precise risk of developing carotid atherosclerosis in patients with periodontal disease. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
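The authors used the Comprehensive Meta-Analysis software; an equivalent pooled-OR and heterogeneity calculation can be sketched in R with the metafor package, here on invented study-level odds ratios rather than the 15 included studies:

```r
# Random-effects pooling of study-level odds ratios with metafor (hypothetical inputs).
library(metafor)

or  <- c(1.10, 1.45, 1.20, 0.95, 1.60)                  # hypothetical study ORs
lo  <- c(0.90, 1.05, 0.98, 0.70, 1.10)                  # lower 95% CI limits
hi  <- c(1.35, 2.00, 1.47, 1.29, 2.33)                  # upper 95% CI limits
yi  <- log(or)
sei <- (log(hi) - log(lo)) / (2 * 1.96)                 # SE of log OR recovered from the CI

res <- rma(yi = yi, sei = sei, method = "DL")           # random-effects model
res                                                     # pooled log OR, Q test, I² statistic
exp(c(res$b, res$ci.lb, res$ci.ub))                     # pooled OR with 95% CI
```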
Experimental design matters for statistical analysis: how to handle blocking.
Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian
2018-03-01
Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type 1 error rates, confirming that misleading conclusions can easily happen when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately, avoiding misleading conclusions, the experimental design should duly be reflected in the choice of statistical approaches and models. We recommend that author guidelines should explicitly point out that authors need to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
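A minimal sketch of the contrast discussed above, on a simulated randomized complete block design: a naive ANOVA that ignores blocking versus a linear mixed model with block as a random effect (lme4). Block count, treatment effects, and variances are assumptions:

```r
# Naive ANOVA vs. linear mixed model respecting the block structure of an RCBD.
library(lme4)

set.seed(9)
blocks <- factor(rep(1:6, each = 4))                          # 6 blocks x 4 treatments
trt    <- factor(rep(c("A", "B", "C", "D"), times = 6))
y      <- 10 + rnorm(6, sd = 2)[as.integer(blocks)] +         # random block effects
          c(A = 0, B = 0.5, C = 1, D = 1.5)[as.character(trt)] +
          rnorm(24, sd = 1)

anova(lm(y ~ trt))                          # ignores blocking (assumes independent observations)
fit <- lmer(y ~ trt + (1 | blocks))         # blocks modelled as random effects
summary(fit)
```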
Statistical modelling for recurrent events: an application to sports injuries
Ullah, Shahid; Gabbett, Tim J; Finch, Caroline F
2014-01-01
Background Injuries are often recurrent, with subsequent injuries influenced by previous occurrences and hence correlation between events needs to be taken into account when analysing such data. Objective This paper compares five different survival models (Cox proportional hazards (CoxPH) model and the following generalisations to recurrent event data: Andersen-Gill (A-G), frailty, Wei-Lin-Weissfeld total time (WLW-TT) marginal, Prentice-Williams-Peterson gap time (PWP-GT) conditional models) for the analysis of recurrent injury data. Methods Empirical evaluation and comparison of different models were performed using model selection criteria and goodness-of-fit statistics. Simulation studies assessed the size and power of each model fit. Results The modelling approach is demonstrated through direct application to Australian National Rugby League recurrent injury data collected over the 2008 playing season. Of the 35 players analysed, 14 (40%) players had more than 1 injury and 47 contact injuries were sustained over 29 matches. The CoxPH model provided the poorest fit to the recurrent sports injury data. The fit was improved with the A-G and frailty models, compared to WLW-TT and PWP-GT models. Conclusions Despite little difference in model fit between the A-G and frailty models, in the interest of fewer statistical assumptions it is recommended that, where relevant, future studies involving modelling of recurrent sports injury data use the frailty model in preference to the CoxPH model or its other generalisations. The paper provides a rationale for future statistical modelling approaches for recurrent sports injury. PMID:22872683
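A sketch of two of the compared models (Andersen-Gill and shared frailty) fitted with the R survival package on hypothetical counting-process injury data, not the NRL dataset; player counts, rates, and the covariate are assumptions:

```r
# Andersen-Gill and shared-frailty Cox models for recurrent events in (start, stop] format.
library(survival)

set.seed(13)
n_events <- sample(1:3, 35, replace = TRUE)                  # 1-3 observed intervals per player
rec      <- data.frame(id = rep(1:35, times = n_events))
gaps     <- rexp(nrow(rec), rate = 0.2)                      # lengths of successive risk intervals
rec$stop  <- ave(gaps, rec$id, FUN = cumsum)                 # cumulative time at end of each interval
rec$start <- rec$stop - gaps
rec$event <- rbinom(nrow(rec), 1, 0.7)                       # injury indicator per interval
rec$load  <- rnorm(nrow(rec))                                # e.g. a match-load covariate

ag  <- coxph(Surv(start, stop, event) ~ load + cluster(id), data = rec)   # Andersen-Gill, robust SE
fr  <- coxph(Surv(start, stop, event) ~ load + frailty(id), data = rec)   # shared gamma frailty
summary(ag); summary(fr)
```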
Baek, Hye Jin; Lee, Jeong Hyun; Lim, Hyun Kyung; Lee, Ha Young; Baek, Jung Hwan
2014-11-01
To determine the optimal clinical and CT findings for differentiating Kikuchi's disease (KD) and tuberculous lymphadenitis (TB) in patients presenting with cervical lymphadenopathy. From 2006 to 2010, 87 consecutive patients who were finally diagnosed with KD or TB were enrolled. Two radiologists performed independent analysis of contrast-enhanced neck CT images with regard to the involvement pattern, nodal or perinodal changes, and evidence of previous infection. Significant clinical and CT findings of KD were determined by statistical analyses. Of the 87 patients, 27 (31%) were classified as having KD and 60 (69%) as having TB. Statistically significant findings of KD patients were younger age, presence of fever, involvement of ≥5 nodal levels or the bilateral neck, no or minimal nodal necrosis, marked perinodal infiltration, and no evidence of upper lung lesion or mediastinal lymphadenopathy. The presence of four or more statistically significant clinical and CT findings of KD had the largest area under the receiver-operating characteristic curve (Az = 0.861; 95% confidence interval 0.801-0.909), with a sensitivity of 89% and specificity of 83%. CT can be a helpful tool for differentiating KD from TB, especially when it is combined with the clinical findings.
Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses
Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.
2014-01-01
Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294
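A simulation-based sketch of unadjusted conditional power at an interim look (a generic illustration, not the paper's analytic CP calculations): simulate the remaining enrolment under an assumed effect and record how often the final unadjusted test rejects. Sample sizes and the assumed effect are invented:

```r
# Conditional power by simulation: complete the trial many times given the observed interim data.
set.seed(17)
n_final <- 200; n_interim <- 100; assumed_delta <- 0.3

interim <- data.frame(arm = rep(0:1, each = n_interim / 2))
interim$y <- 0.3 * interim$arm + rnorm(n_interim)            # observed interim outcomes

cond_power <- mean(replicate(2000, {
  future <- data.frame(arm = rep(0:1, each = (n_final - n_interim) / 2))
  future$y <- assumed_delta * future$arm + rnorm(nrow(future))
  full <- rbind(interim, future)
  t.test(y ~ arm, data = full)$p.value < 0.05                # unadjusted final analysis
}))
cond_power   # compared against a futility boundary, e.g. stop if below 0.2
```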
Harrison, Jay M; Breeze, Matthew L; Harrigan, George G
2011-08-01
Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Pedretti, Daniele; Beckie, Roger Daniel
2014-05-01
Missing data are ubiquitous in hydrological time-series databases, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications directly involving the ratio between precipitation and some other quantity, the lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies for reconstructing missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological pairing approach between rainfall stations and a probability-based approach, which accounts for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on discussing the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were the parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions. The methods were first tested on synthetic examples, to have complete control over the impact of several variables, such as the minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. Then, we applied the methodology to precipitation datasets collected in the Vancouver area and at a mining site in Peru.
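A small sketch of the distribution-fitting step for two of the candidate families (gamma and log-normal) on hypothetical wet-day rainfall depths; the Generalized Pareto and Generalized Extreme Value fits would require an extreme-value package and are not shown:

```r
# Fit two candidate rainfall distributions and compare fit quality and upper-tail quantiles.
library(MASS)

set.seed(4)
rain <- rgamma(500, shape = 0.8, rate = 0.1)        # hypothetical wet-day rainfall depths (mm)

fit_gamma <- fitdistr(rain, "gamma")
fit_lnorm <- fitdistr(rain, "lognormal")

c(AIC_gamma = AIC(fit_gamma), AIC_lnorm = AIC(fit_lnorm))

# upper-tail (99th percentile) rainfall implied by each fitted distribution
qgamma(0.99, fit_gamma$estimate["shape"], fit_gamma$estimate["rate"])
qlnorm(0.99, fit_lnorm$estimate["meanlog"], fit_lnorm$estimate["sdlog"])
```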
Östberg, Viveca; Låftman, Sara B; Modin, Bitte; Lindfors, Petra
2018-02-20
Bullying involves repeated exposure to negative actions while also invoking a power asymmetry between the involved parties. From a stress perspective, being bullied can be seen as a severe and chronic stressor, and an everyday social-evaluative threat, coupled with a shortage of effective social resources for dealing with this particular stressor. The aim of this study was to investigate whether exposure to bullying among mid-adolescent girls and boys is associated with subjective and objective stress-related outcomes in terms of perceived stress, recurrent pain, and salivary cortisol. The data came from the School Stress and Support Study (TriSSS) including students in grades 8-9 in two schools in Stockholm, Sweden, in 2010 (study sample n = 392; cortisol subsample n = 198). Bullying was self-reported and measured by multiple items. The statistical analyses included binary logistic and linear (OLS) regression. Being bullied was associated with greater perceived stress and an increased risk of recurrent pain, among both boys and girls. Also, bullied students had lower cortisol output (AUC_G) and lower cortisol awakening response (CAR_G) as compared to those who were not bullied. Gender-stratified analyses demonstrated that these associations were statistically significant for boys but not for girls. In conclusion, this study demonstrated that being bullied was related to both subjective and objective stress markers among mid-adolescent girls and boys, pointing to the necessity of continuously working against bullying.
NASA Astrophysics Data System (ADS)
Saez, Núria; Ruiz, Xavier; Pallarés, Jordi; Shevtsova, Valentina
2013-04-01
An accelerometric record from the IVIDIL experiment (ESA Columbus module) has been exhaustively studied. The analysis involved the determination of basic statistical properties such as the auto-correlation and the power spectrum (second-order statistical analyses). Taking into account the shape of the associated histograms, we also address another important question, the non-Gaussian nature of the time series, using the bispectrum and the bicoherence of the signals. Building on the above-mentioned results, a computational model of a high-temperature shear cell has been developed. A scalar indicator has been used to quantify the accuracy of diffusion coefficient measurements in the case of binary mixtures involving photovoltaic silicon or liquid Al-Cu binary alloys. Three different initial arrangements have been considered: the so-called interdiffusion, centred thick layer and lateral thick layer configurations. The results allow us to conclude that, under the conditions of the present work, the diffusion coefficient is insensitive to the environmental conditions, that is to say, to the accelerometric disturbances and the initial shear cell arrangement.
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable a relevant biological signal to be distinguished from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
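A minimal R sketch of the kind of PCA overview described, run on a hypothetical matrix of log-abundances (samples in rows, protein spots in columns); the data and the injected "treatment" signal are fabricated purely for illustration.

```r
# Minimal sketch (R): PCA as a global view of experimental variation.
# 'spots' is a hypothetical matrix of log-abundances with samples in rows
# (biological replicates from two conditions) and protein spots in columns.
set.seed(3)
spots <- matrix(rnorm(12 * 500), nrow = 12,
                dimnames = list(paste0(rep(c("ctrl", "treat"), each = 6), 1:6), NULL))
spots[7:12, 1:50] <- spots[7:12, 1:50] + 1      # inject a treatment signal

pca <- prcomp(spots, center = TRUE, scale. = TRUE)
summary(pca)$importance[, 1:3]   # variance explained by the first PCs
pca$x[, 1:2]                     # sample scores; outliers and condition
                                 # separation show up in this score plot
```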
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
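For orientation, the sketch below (R, metafor package) shows the standard single-mechanism sensitivity checks that the multiple-dissemination-bias models described here generalize; it is not the authors' model, and the effect estimates and variances are hypothetical.

```r
# Minimal sketch (R, metafor package): a standard funnel-asymmetry check.
# This is NOT the multiple-selection-mechanism model of the paper; it is the
# simpler single-mechanism sensitivity analysis such models generalize.
# 'yi' (effect estimates) and 'vi' (sampling variances) are hypothetical.
library(metafor)

set.seed(4)
k  <- 30
vi <- runif(k, 0.01, 0.2)
yi <- rnorm(k, mean = 0.3, sd = sqrt(vi))      # synthetic study effects

res <- rma(yi = yi, vi = vi, method = "REML")  # random-effects meta-analysis
regtest(res)                                   # Egger-type asymmetry test
trimfill(res)                                  # trim-and-fill adjustment
```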
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
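A minimal R sketch of such a "what if" exercise, holding a hypothetical effect size fixed and varying only the per-group sample size; the chosen effect size (Cohen's d = 0.3) is an assumption made for illustration.

```r
# Minimal sketch (R): a "what if" look at how sample size alone changes the
# verdict of a significance test, holding the observed effect size fixed.
d  <- 0.3                                    # hypothetical standardized effect
ns <- c(10, 20, 50, 100, 200)
p  <- sapply(ns, function(n) {
  # two-sample t statistic implied by d with n per group
  t <- d * sqrt(n / 2)
  2 * pt(-abs(t), df = 2 * n - 2)
})
data.frame(n_per_group = ns, p_value = round(p, 4))

# Prospective view: statistical power at the same effect size
power.t.test(delta = d, sd = 1, n = ns, sig.level = 0.05)$power
```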
Krief, Peggy; Zellweger, Alessia; Politis Mercier, Maria-Pia; Danuser, Brigitta; Wild, Pascal; Zenoni, Michela; Probst, Isabelle
2018-06-14
Like most industrialised countries, Switzerland has introduced legislation to protect the health of pregnant workers and their unborn children from workplace exposure. This legislation provides for a risk assessment, adaptations to workplaces and, if the danger is not eliminated, preventive leave (prescribed by a gynaecologist). This study's first objective is to analyse the degree to which companies, gynaecologists and midwives implement the law. Its second objective is to understand the obstacles and resources of this implementation, with a focus on how relevant stakeholders perceive protective measures and their involvement with them. Data will be collected using mixed methods: (1) online questionnaires for gynaecologists and midwives; telephone questionnaires with company human resources (HR) managers in the healthcare and food production sectors; (2a) case studies of 6-8 companies in each sector, including interviews with stakeholders such as women workers, HR managers and occupational health physicians; (2b) two focus groups, one involving occupational physicians and hygienists, one involving labour inspectors.Quantitative data will be analysed statistically using STATA software V.15. Qualitative data will be transcribed and thematically analysed using MaxQDA software. The Human Research Ethics Committee of the Canton Vaud (CER-VD) has certified that this research study protocol falls outside of the field of application of the Swiss Federal Act on Research Involving Humans.The publications and recommendations resulting from this study will form the starting point for future improvements to the protection of pregnant women at work and their unborn children.This study started in February 2017 and will continue until January 2020. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Singh, J; Thornton, J M
1990-02-05
Automated methods have been developed to determine the preferred packing arrangement between interacting protein groups. A suite of FORTRAN programs, SIRIUS, is described for calculating and analysing the geometries of interacting protein groups using crystallographically derived atomic co-ordinates. The programs involved in calculating the geometries search for interacting pairs of protein groups using a distance criterion, and then calculate the spatial disposition and orientation of the pair. The second set of programs is devoted to analysis. This involves calculating the observed and expected distributions of the angles and assessing the statistical significance of the difference between the two. A database of the geometries of the 400 combinations of side-chain to side-chain interaction has been created. The approach used in analysing the geometrical information is illustrated here with specific examples of interactions between side-chains, peptide groups and particular types of atom. At the side-chain level, an analysis of aromatic-amino interactions, and the interactions of peptide carbonyl groups with arginine residues is presented. At the atomic level the analyses include the spatial disposition of oxygen atoms around tyrosine residues, and the frequency and type of contact between carbon, nitrogen and oxygen atoms. This information is currently being applied to the modelling of protein interactions.
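The distance-criterion step can be illustrated with a short R sketch on hypothetical atomic coordinates; the original SIRIUS suite is written in FORTRAN and operates on crystallographically derived co-ordinates, so this is a generic stand-in rather than the SIRIUS code.

```r
# Minimal sketch (R): the distance-criterion step used to find interacting
# pairs of groups, here on hypothetical atomic coordinates.
set.seed(5)
coords <- matrix(runif(3 * 40, 0, 30), ncol = 3)   # 40 hypothetical atoms
rownames(coords) <- paste0("atom", 1:40)

cutoff <- 4.0                                      # contact distance in Angstroms
d <- as.matrix(dist(coords))                       # all pairwise distances
pairs <- which(d < cutoff & upper.tri(d), arr.ind = TRUE)
contacts <- data.frame(a = rownames(coords)[pairs[, 1]],
                       b = rownames(coords)[pairs[, 2]],
                       distance = d[pairs])
head(contacts)
# Observed angle/orientation distributions for such pairs can then be
# compared with expected ones, e.g. by a chi-squared or permutation test.
```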
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
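As an illustration of the annotation-query layer that GenomeGraphs relies on, the sketch below uses biomaRt directly; GenomeGraphs' own plotting functions are deliberately omitted to avoid misstating its API, and the dataset and attribute names are assumptions that require a live Ensembl connection.

```r
# Minimal sketch (R): the kind of biomaRt annotation query that GenomeGraphs
# builds on. Dataset and attribute names are assumptions and a network
# connection to Ensembl is required.
library(biomaRt)

mart <- useMart("ensembl", dataset = "hsapiens_gene_ensembl")
genes <- getBM(attributes = c("ensembl_gene_id", "external_gene_name",
                              "chromosome_name", "start_position", "end_position"),
               filters = "chromosome_name", values = "21", mart = mart)
head(genes)
# GenomeGraphs translates such gene/transcript structures into grid viewports
# and draws them alongside experimental data on the same coordinate system.
```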
diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.
Lun, Aaron T L; Smyth, Gordon K
2015-08-19
Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.
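The statistical core described here is edgeR's quasi-likelihood framework; the sketch below applies it to a hypothetical bin-pair count matrix as a stand-in, and deliberately omits diffHic's own read-processing and counting functions rather than guess at their interfaces.

```r
# Minimal sketch (R): the edgeR testing framework that diffHic builds on,
# applied to a hypothetical matrix of bin-pair counts (rows = bin pairs,
# columns = Hi-C libraries, two per condition).
library(edgeR)

set.seed(6)
counts <- matrix(rnbinom(4000, mu = 20, size = 10), ncol = 4,
                 dimnames = list(NULL, c("A1", "A2", "B1", "B2")))
group  <- factor(c("A", "A", "B", "B"))
design <- model.matrix(~ group)

y   <- DGEList(counts = counts, group = group)
y   <- calcNormFactors(y)            # library-size/composition normalization
y   <- estimateDisp(y, design)       # model biological variability
fit <- glmQLFit(y, design)
res <- glmQLFTest(fit, coef = 2)     # test condition B vs A per bin pair
topTags(res)
```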
D'Addabbo, Annarita; Palmieri, Orazio; Maglietta, Rosalia; Latiano, Anna; Mukherjee, Sayan; Annese, Vito; Ancona, Nicola
2011-08-01
A meta-analysis has re-analysed previous genome-wide association scans, definitively confirming eleven genes and identifying a further 21 new loci. However, the identified genes/loci still explain only a minority of the genetic predisposition to Crohn's disease. Our aim was to identify genes weakly involved in disease predisposition by analysing chromosomal regions enriched in single nucleotide polymorphisms with modest statistical association. We utilized the WTCCC data set, evaluating 1748 CD cases and 2938 controls. The identification of candidate genes/loci was performed by a two-step procedure: first, chromosomal regions enriched in weak association signals were localized; subsequently, weak signals clustered in gene regions were identified. Statistical significance was assessed by nonparametric permutation tests. The cytoband enrichment analysis highlighted 44 regions (P≤0.05) enriched in single nucleotide polymorphisms significantly associated with the trait, including 23 out of 31 previously confirmed and replicated genes. Importantly, we highlight a further 20 novel chromosomal regions carrying approximately one hundred genes/loci with modest association. Amongst these we find compelling functional candidate genes such as MAPT, GRB2, CREM, LCT, and IL12RB2. Our study suggests a different statistical perspective for discovering genes weakly associated with a given trait, although further confirmatory functional studies are needed. Copyright © 2011 Editrice Gastroenterologica Italiana S.r.l. All rights reserved.
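A minimal R sketch of a nonparametric permutation test for regional enrichment of modest association signals, in the spirit of the two-step procedure described; the p-values, region size and enrichment threshold are hypothetical.

```r
# Minimal sketch (R): permutation test of whether a chromosomal region
# (cytoband) is enriched in modest association signals.
# 'p_all' are hypothetical per-SNP association p-values; 'in_band' flags the
# SNPs that fall inside the candidate region.
set.seed(7)
p_all   <- runif(5000)                       # genome-wide SNP p-values
in_band <- c(rep(TRUE, 60), rep(FALSE, 4940))
p_all[in_band] <- rbeta(60, 0.7, 1)          # inject mild enrichment

obs <- mean(p_all[in_band] < 0.05)           # observed fraction of modest hits
perm <- replicate(10000, {
  idx <- sample(length(p_all), sum(in_band)) # random SNP sets of equal size
  mean(p_all[idx] < 0.05)
})
p_perm <- (sum(perm >= obs) + 1) / (length(perm) + 1)
p_perm   # empirical enrichment p-value for the cytoband
```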
Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana
2015-11-01
The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme employed here and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformation. © 2015 Wiley Periodicals, Inc.
Predicting early adolescent gang involvement from middle school adaptation.
Dishion, Thomas J; Nelson, Sarah E; Yasui, Miwa
2005-03-01
This study examined the role of adaptation in the first year of middle school (Grade 6, age 11) in predicting affiliation with gangs by the last year of middle school (Grade 8, age 13). The sample consisted of 714 European American (EA) and African American (AA) boys and girls. Specifically, academic grades, reports of antisocial behavior, and peer relations in 6th grade were used to predict multiple measures of gang involvement by 8th grade. The multiple measures of gang involvement included self-, peer, teacher, and counselor reports. Unexpectedly, self-report measures of gang involvement did not correlate highly with peer and school staff reports. The results, however, were similar for other-report and self-report measures of gang involvement. Mean-level analyses revealed statistically reliable differences in 8th-grade gang involvement as a function of youth gender and ethnicity. Structural equation prediction models revealed that peer nominations of rejection, acceptance, academic failure, and antisocial behavior were predictive of gang involvement for most youth. These findings suggest that the youth's level of problem behavior and the school ecology (e.g., peer rejection, school failure) require attention in the design of interventions to prevent the formation of gangs among high-risk young adolescents.
Long QT syndrome in African-Americans.
Fugate, Thomas; Moss, Arthur J; Jons, Christian; McNitt, Scott; Mullally, Jamie; Ouellet, Gregory; Goldenberg, Ilan; Zareba, Wojciech; Robinson, Jennifer L
2010-01-01
We evaluated the risk factors and clinical course of Long QT syndrome (LQTS) in African-American patients. The study involved 41 African-Americans and 3456 Caucasians with a QTc ≥ 450 ms from the U.S. portion of the International LQTS Registry. Data included information about the medical history and clinical course of the LQTS patients, with end points relating to the occurrence of syncope, aborted cardiac arrest, or LQTS-related sudden cardiac death from birth through age 40 years. The statistical analyses involved Kaplan-Meier time-to-event graphs and Cox regression models for multivariable risk factor evaluation. The QTc was 29 ms longer in African-Americans than in Caucasians. Multivariate Cox analyses with adjustment for decade of birth revealed that the cardiac event rate was similar in African-Americans and Caucasians with LQTS and that beta-blockers were equally effective in reducing cardiac events in the two racial groups. The clinical course of LQTS in African-Americans is similar to that of Caucasians, with comparable risk factors and benefit from beta-blocker therapy in the two racial groups.
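The time-to-event machinery described is illustrated below with the R survival package on hypothetical registry-style data; the variable names, event rates and covariates are assumptions, not the registry data.

```r
# Minimal sketch (R, survival package): Kaplan-Meier curves and a
# multivariable Cox model on hypothetical registry-style data ('time' in
# years to first cardiac event, 'event' indicator, 'race' and 'qtc').
library(survival)

set.seed(8)
n <- 300
d <- data.frame(
  time  = rexp(n, rate = 0.05),
  event = rbinom(n, 1, 0.4),
  race  = factor(sample(c("AfricanAmerican", "Caucasian"), n,
                        replace = TRUE, prob = c(0.1, 0.9))),
  qtc   = rnorm(n, mean = 470, sd = 30)
)

km <- survfit(Surv(time, event) ~ race, data = d)       # Kaplan-Meier curves
summary(km, times = c(10, 20, 30))

cox <- coxph(Surv(time, event) ~ race + qtc, data = d)  # multivariable Cox
summary(cox)
```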
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
Moisture Forecast Bias Correction in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D.
1999-01-01
Data assimilation methods rely on numerous assumptions about the errors involved in measuring and forecasting atmospheric fields. One of the more disturbing of these is that short-term model forecasts are assumed to be unbiased. In the case of atmospheric moisture, for example, observational evidence shows that the systematic component of errors in forecasts and analyses is often of the same order of magnitude as the random component. We have implemented a sequential algorithm for estimating forecast moisture bias from rawinsonde data in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The algorithm is designed to remove the systematic component of analysis errors and can be easily incorporated in an existing statistical data assimilation system. We will present results of initial experiments that show a significant reduction of bias in the GEOS DAS moisture analyses.
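A generic sequential bias estimator of the kind alluded to can be sketched in a few lines of R; this is an illustrative exponential-smoothing scheme on synthetic innovations, not the actual GEOS DAS algorithm or its tuning.

```r
# Minimal sketch (R): a generic sequential bias estimator, not the actual
# GEOS DAS implementation. At each cycle the bias estimate is nudged toward
# the mean observed-minus-forecast difference at the rawinsonde locations;
# 'gamma' is a hypothetical tuning constant.
estimate_bias <- function(innovations, gamma = 0.1) {
  # 'innovations' is a list of numeric vectors (obs - forecast), one per cycle
  b <- 0
  trace <- numeric(length(innovations))
  for (k in seq_along(innovations)) {
    b <- (1 - gamma) * b + gamma * mean(innovations[[k]])
    trace[k] <- b               # running estimate of the forecast bias
  }
  trace
}

set.seed(9)
cycles <- replicate(50, rnorm(40, mean = 0.8, sd = 2), simplify = FALSE)
tail(estimate_bias(cycles), 5)  # converges toward the true bias (~0.8)
```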
Wavelet Transform Based Higher Order Statistical Analysis of Wind and Wave Time Histories
NASA Astrophysics Data System (ADS)
Habib Huseni, Gulamhusenwala; Balaji, Ramakrishnan
2017-10-01
Wind, blowing over the surface of the ocean, imparts the energy that generates waves. Understanding wind-wave interactions is essential for an oceanographer. This study involves higher-order spectral analyses of wind speed and significant wave height time histories, extracted from the European Centre for Medium-Range Weather Forecasts database for an offshore location off the Mumbai coast, through the continuous wavelet transform. The time histories were divided by season (pre-monsoon, monsoon, post-monsoon and winter) and the analyses were carried out on the individual data sets to assess the effect of the seasons on wind-wave interactions. The analysis revealed the frequency coupling of wind speeds and wave heights across the various seasons. The details of the data, the analysis technique and the results are presented in this paper.
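As a simpler stand-in for the wavelet-based coupling analysis, the R sketch below computes Fourier cross-spectral coherence between synthetic wind and wave series; a continuous wavelet transform would require a dedicated package, and the series here are placeholders for the ECMWF records.

```r
# Minimal sketch (R): Fourier cross-spectral coherence between wind speed and
# significant wave height, used as a simpler stand-in for wavelet coherence.
# Both series are synthetic placeholders with a shared 128-step cycle.
set.seed(10)
t     <- 1:2048
wind  <- sin(2 * pi * t / 128) + rnorm(2048, sd = 0.5)
waves <- 0.7 * sin(2 * pi * t / 128 + 0.4) + rnorm(2048, sd = 0.5)

cs <- spec.pgram(cbind(wind, waves), spans = c(9, 9), taper = 0.1, plot = FALSE)
band <- which.max(cs$coh)            # frequency of strongest coupling
c(frequency = cs$freq[band], squared_coherence = cs$coh[band])
```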
Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bujak, Renata; Yumba Mpanga, Arlette; Markuszewski, Marcin; Jacyna, Julia; Matuszewski, Marcin; Kaliszan, Roman; Markuszewski, Michał J
2015-01-01
Prostate cancer (CaP) is a leading cause of cancer deaths in men worldwide. Despite these alarming statistics, the currently applied biomarkers are still not sufficiently specific and selective. In addition, the pathogenesis of CaP development is not fully understood. Therefore, in the present work, a metabolomics study based on urinary metabolic fingerprinting analyses has been performed in order to scrutinize potential biomarkers that could help in explaining the pathomechanism of the disease and be potentially useful in its diagnosis and prognosis. Urine samples from CaP patients and healthy volunteers were analyzed using high performance liquid chromatography coupled with time-of-flight mass spectrometry detection (HPLC-TOF/MS) in positive and negative polarity, as well as gas chromatography hyphenated with triple quadrupole mass spectrometry detection (GC-QqQ/MS) in scan mode. The obtained data sets were statistically analyzed using univariate and multivariate statistical analyses. Principal Component Analysis (PCA) was used to check system stability and identify possible outliers, whereas Partial Least Squares Discriminant Analysis (PLS-DA) was performed to evaluate the quality of the model as well as its predictive ability using statistically significant metabolites. The subsequent identification of selected metabolites using the NIST library and commonly available databases allowed the creation of a list of putative biomarkers and the related biochemical pathways they are involved in. The selected pathways, such as the urea and tricarboxylic acid cycles and amino acid and purine metabolism, can play a crucial role in the pathogenesis of prostate cancer. Copyright © 2014 Elsevier B.V. All rights reserved.
Pichon, Christophe; du Merle, Laurence; Caliot, Marie Elise; Trieu-Cuot, Patrick; Le Bouguénec, Chantal
2012-04-01
Characterization of small non-coding ribonucleic acids (sRNA) among the large volume of data generated by high-throughput RNA-seq or tiling microarray analyses remains a challenge. Thus, there is still a need for accurate in silico prediction methods to identify sRNAs within a given bacterial species. After years of effort, dedicated software tools were developed based on comparative genomic analyses or mathematical/statistical models. Although these genomic analyses enabled sRNAs in intergenic regions to be efficiently identified, they all failed to predict antisense sRNA genes (asRNA), i.e. RNA genes located on the DNA strand complementary to that which encodes the protein. The statistical models enabled any genomic region to be analyzed theoretically, but not efficiently. We present a new model for in silico identification of sRNA and asRNA candidates within an entire bacterial genome. This model was successfully used to analyze the Gram-negative Escherichia coli and Gram-positive Streptococcus agalactiae. In both bacteria, numerous asRNAs are transcribed from the complementary strand of genes located in pathogenicity islands, strongly suggesting that these asRNAs are regulators of virulence expression. In particular, we characterized an asRNA that acted as an enhancer-like regulator of the type 1 fimbriae production involved in the virulence of extra-intestinal pathogenic E. coli.
An updated and expanded meta-analysis of nonresident fathering and child well-being.
Adamsons, Kari; Johnson, Sara K
2013-08-01
Since Amato and Gilbreth's (1999) meta-analysis of nonresident father involvement and child well-being, nonmarital childbirths and nonresident father involvement both have increased. The unknown implications of such changes motivated the present study, a meta-analytic review of 52 studies of nonresident father involvement and child well-being. Consistent with Amato and Gilbreth, we found that positive forms of involvement were associated with benefits for children, with a small but statistically significant effect size. Amounts of father-child contact and financial provision, however, were not associated with child well-being. Going beyond Amato and Gilbreth, we analyzed the associations between different types of fathering and overall child well-being, and between overall father involvement and different types of child well-being. We found that nonresident father involvement was most strongly associated with children's social well-being and also was associated with children's emotional well-being, academic achievement, and behavioral adjustment. The forms of father involvement most strongly associated with child well-being were involvement in child-related activities, having positive father-child relationships, and engaging in multiple forms of involvement. Moderator analyses demonstrated variation in effect sizes based on both study characteristics and demographic variables. We discuss the implications of these findings for policy and practice. © 2013 American Psychological Association
Gómez, Daviel; Hernández, L Ázaro; Yabor, Lourdes; Beemster, Gerrit T S; Tebbe, Christoph C; Papenbrock, Jutta; Lorenzo, José Carlos
2018-03-15
Plant scientists usually record several indicators in their abiotic factor experiments. The common statistical treatment involves univariate analyses. Such analyses generally create a split picture of the effects of experimental treatments, since each indicator is addressed independently. The Euclidean distance, combined with the information from the control treatment, could have potential as an integrating indicator. The Euclidean distance has demonstrated its usefulness in many scientific fields but, as far as we know, it has not yet been employed for the analysis of plant experiments. To exemplify the use of the Euclidean distance in this field, we performed an experiment focused on the effects of mannitol on sugarcane micropropagation in temporary immersion bioreactors. Five mannitol concentrations were compared: 0, 50, 100, 150 and 200 mM. As dependent variables we recorded shoot multiplication rate, fresh weight, and levels of aldehydes, chlorophylls, carotenoids and phenolics. The statistical protocol which we then carried out integrated all dependent variables to easily identify the mannitol concentration that produced the most remarkable integral effect. The results provided by the Euclidean distance demonstrate a gradually increasing distance from the control as a function of increasing mannitol concentration. 200 mM mannitol caused the most significant alteration of sugarcane biochemistry and physiology under the experimental conditions described here. This treatment showed the longest statistically significant Euclidean distance to the control treatment (2.38). In contrast, 50 and 100 mM mannitol showed the lowest Euclidean distances (0.61 and 0.84, respectively) and thus weak integrated effects of mannitol. The analysis shown here indicates that the use of the Euclidean distance can contribute to establishing a more integrated evaluation of the contrasting mannitol treatments.
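The integrated indicator can be reproduced in a few lines of R: standardize the indicator means and compute each treatment's Euclidean distance from the control. The numbers below are hypothetical, not the study's measurements.

```r
# Minimal sketch (R): an integrated treatment effect expressed as the
# Euclidean distance of each mannitol treatment from the control, computed
# on standardized indicator means. Values are hypothetical.
means <- rbind(
  control = c(mult = 6.1, fw = 9.0, aldehydes = 1.0, chlorophyll = 1.8, phenolics = 0.9),
  m50     = c(5.8, 8.7, 1.1, 1.7, 1.0),
  m100    = c(5.5, 8.2, 1.3, 1.5, 1.2),
  m150    = c(4.6, 7.1, 1.7, 1.2, 1.6),
  m200    = c(3.2, 5.5, 2.4, 0.8, 2.1)
)
z <- scale(means)                                    # put indicators on one scale
d_from_control <- sqrt(rowSums(sweep(z, 2, z["control", ])^2))
round(d_from_control, 2)                             # 0 for control, increasing with dose
```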
Predictors of persistent pain after total knee arthroplasty: a systematic review and meta-analysis.
Lewis, G N; Rice, D A; McNair, P J; Kluger, M
2015-04-01
Several studies have identified clinical, psychosocial, patient characteristic, and perioperative variables that are associated with persistent postsurgical pain; however, the relative effect of these variables has yet to be quantified. The aim of the study was to provide a systematic review and meta-analysis of predictor variables associated with persistent pain after total knee arthroplasty (TKA). Included studies were required to measure predictor variables prior to or at the time of surgery, include a pain outcome measure at least 3 months post-TKA, and include a statistical analysis of the effect of the predictor variable(s) on the outcome measure. Counts were undertaken of the number of times each predictor was analysed and the number of times it was found to have a significant relationship with persistent pain. Separate meta-analyses were performed to determine the effect size of each predictor on persistent pain. Outcomes from studies implementing uni- and multivariable statistical models were analysed separately. Thirty-two studies involving almost 30 000 patients were included in the review. Preoperative pain was the predictor that most commonly demonstrated a significant relationship with persistent pain across uni- and multivariable analyses. In the meta-analyses of data from univariate models, the largest effect sizes were found for: other pain sites, catastrophizing, and depression. For data from multivariate models, significant effects were evident for: catastrophizing, preoperative pain, mental health, and comorbidities. Catastrophizing, mental health, preoperative knee pain, and pain at other sites are the strongest independent predictors of persistent pain after TKA. © The Author 2014. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Stahmer, Aubyn C; Hurlburt, Michael; Horwitz, Sarah McCue; Landsverk, John; Zhang, Jinjin; Leslie, Laurel K
2009-09-01
To examine developmental and behavioral status of children in child welfare (CW) over time, by intensity of CW involvement using a national probability sample. As part of the National Survey of Child and Adolescent Well-being (NSCAW), data were collected on 1,049 children 12-47 months old investigated by CW agencies for possible abuse or neglect. Analyses used descriptive statistics to characterize developmental and behavioral status across four domains (developmental/cognitive, language, adaptive functioning, and behavior) by intensity of CW involvement (in-home with CW services, in-home with no CW services or out-of-home care) over time. Multivariate analyses were used to examine the relationship between independent variables (age, gender, home environment, race/ethnicity, maltreatment history, intensity of CW involvement) and follow-up domain scores. On average, children improved in developmental/cognitive, communication/language status over time, but these improvements did not differ by intensity of CW involvement. Analyses revealed a positive relationship between the home environment and change in language and adaptive behavior standard scores over time, and few predictors of change in behavioral status. An interaction between intensity of CW involvement and initial developmental/cognitive status was present. Across domains, intensity of CW involvement does not appear to have a significant effect on change in developmental and behavioral status, although out-of-home care does have differential relationships with children's developmental/cognitive status for those with very low initial cognitive/developmental status. Facilitating development in children in CW may require supportive, enriched care environments both for children remaining at home and those in foster care. Toddler and preschool age children known to child welfare are likely to have difficulties with development whether they are removed from their homes or not. It would be helpful if child welfare workers were trained to screen for developmental, language, adaptive behavior and behavioral difficulties in children in foster care, and those remaining at home. Additional support for biological, foster, and kinship caregivers in encouraging development is important for the attainment of critical developmental skills, especially for children with developmental difficulties.
Applying the compound Poisson process model to the reporting of injury-related mortality rates.
Kegler, Scott R
2007-02-16
Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
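A simplified, normal-approximation version of the adjustment can be sketched in R: the compound Poisson variance of the death count is estimated from the sum of squared per-incident counts rather than from the count itself. The incident counts and person-time below are hypothetical.

```r
# Minimal sketch (R): widening a rate interval when incidents can involve
# multiple fatalities. This is a simplified normal-approximation version of
# the adjustment, on hypothetical data.
deaths_per_incident <- c(rep(1, 180), rep(2, 12), rep(3, 4), 5)  # one 5-death incident
person_time <- 2.5e6                                             # person-years

D    <- sum(deaths_per_incident)
rate <- D / person_time * 1e5                   # deaths per 100,000 person-years

se_poisson  <- sqrt(D) / person_time * 1e5                       # simple Poisson
se_compound <- sqrt(sum(deaths_per_incident^2)) / person_time * 1e5

rbind(poisson  = rate + c(lower = -1.96, upper = 1.96) * se_poisson,
      compound = rate + c(-1.96, 1.96) * se_compound)
```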
[Organizational climate and burnout syndrome].
Lubrańska, Anna
2011-01-01
The paper addresses the issue of organizational climate and burnout syndrome. It has been assumed that burnout syndrome is dependent on the work climate (organizational climate); therefore, two concepts were analyzed: D. Kolb's (organizational climate) and Ch. Maslach's (burnout syndrome). The research involved 239 persons (122 women, 117 men), aged 21-66. The Maslach Burnout Inventory (MBI) and the Inventory of Organizational Climate were used in the study. The results of the statistical methods (correlation analysis, one-way analysis of variance and regression analysis) evidenced a strong relationship between organizational climate and the burnout dimensions. As depicted by the results, there are important differences in the level of burnout between study participants who work in different types of organizational climate. The results of the statistical analyses indicate that the organizational climate determines burnout syndrome. Therefore, creating supportive conditions at the workplace might reduce the risk of burnout.
Marzulli, F; Maguire, H C
1982-02-01
Several guinea-pig predictive test methods were evaluated by comparison of results with those obtained with human predictive tests, using ten compounds that have been used in cosmetics. The method involves the statistical analysis of the frequency with which guinea-pig tests agree with the findings of tests in humans. In addition, the frequencies of false positive and false negative predictive findings are considered and statistically analysed. The results clearly demonstrate the superiority of adjuvant tests (complete Freund's adjuvant) in determining skin sensitizers and the overall superiority of the guinea-pig maximization test in providing results similar to those obtained by human testing. A procedure is suggested for utilizing adjuvant and non-adjuvant test methods for characterizing compounds as of weak, moderate or strong sensitizing potential.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Micromechanical investigation of sand migration in gas hydrate-bearing sediments
NASA Astrophysics Data System (ADS)
Uchida, S.; Klar, A.; Cohen, E.
2017-12-01
Past field gas production tests from hydrate-bearing sediments have indicated that sand migration is an important phenomenon that needs to be considered for successful long-term gas production. The authors previously developed a continuum-based analytical thermo-hydro-mechanical sand migration model that can be applied to predict wellbore responses during gas production. However, the parameters involved in the model still need to be calibrated and studied thoroughly, and it remains a challenge to conduct well-defined laboratory experiments of sand migration, especially in hydrate-bearing sediments. Taking advantage of the micromechanical modelling capability of the discrete element method (DEM), this work presents a first step towards quantifying one of the model parameters, namely the one that governs stress reduction due to grain detachment. Grains represented by DEM particles are randomly removed from an isotropically loaded DEM specimen, and statistical analyses reveal a linear proportionality between the normalized volume of detached solids and the normalized stress reduction. DEM specimens with different porosities (different packing densities) are also considered, and statistical analyses show a clear transition between loose-sand and dense-sand behavior, characterized by the relative density.
Cunningham, Michael R.; Baumeister, Roy F.
2016-01-01
The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272
Bitik, Berivan; Tufan, Abdurrahman; Sahin, Kubilay; Sucullu Karadag, Yesim; Can Sandikci, Sevinc; Mercan, Ridvan; Ak, Fikri; Karaaslan, Yasar; Ozturk, Mehmet Akif; Goker, Berna; Haznedaroglu, Seminur
2016-01-01
Behçet's syndrome (BS) is a systemic vasculitis, which may involve multiple organ systems simultaneously. Clinical findings in BS often fit into well-recognized patterns, such as the association between papulo-pustular skin lesions and arthritis. We have recently observed a distinct pattern, in which a subtype of neuro-Behçet's syndrome (NBS) is often preceded by specific ophthalmic manifestations of the disease process. The purpose of this study is to evaluate the association between the parenchymal subtype of NBS and posterior uveitis (PU). We have retrospectively reviewed the clinical records of 295 patients with BS, who met the international classification criteria for BS, diagnosed at two major rheumatology clinics from 2010 to 2014. Patient demographics, ophthalmic examinations, clinical and radiologic patterns of neurological involvement were recorded. Manifestations of BS were classified as PU, NBS, vascular involvement, and arthritis. The association between clinical findings was analysed for statistical significance. Of the 295 patients, 100 had PU and 44 had NBS. 30 patients had parenchymal NBS and 14 had vascular NBS. Patients with PU were significantly more likely to have neurological involvement compared to those without PU (p<0.001; Odds Ratio: 3.924; 95% CI: 1.786-8.621). Rate of posterior uveitis was higher in patients with parenchymal NBS when compared to patients with vascular NBS, vascular BS or arthritis (63.3%, 21.4%, 22% and 4.2% respectively, p<0.001). Our findings suggest a clinically and statistically significant association between posterior uveitis and parenchymal type of neurologic involvement in BS. The development of posterior uveitis in a patient with previously diagnosed BS should be recognized as a "warning sign" for predisposition to neurologic involvement. These patients should be informed about the possible signs and symptoms of neurological involvement, which can cause very rapid and irreversible damage unless recognized and treated immediately.
Gandy, M; Karin, E; Jones, M P; McDonald, S; Sharpe, L; Titov, N; Dear, B F
2018-05-13
The evidence for Internet-delivered pain management programs for chronic pain is growing, but there is little empirical understanding of how they effect change. Understanding mechanisms of clinical response to these programs could inform their effective development and delivery. A large sample (n = 396) from a previous randomized controlled trial of a validated internet-delivered psychological pain management program, the Pain Course, was used to examine the influence of three potential psychological mechanisms (pain acceptance, pain self-efficacy, fear of movement/re-injury) on treatment-related change in disability, depression, anxiety and average pain. Analyses involved generalized estimating equation models for clinical outcomes that adjusted for co-occurring change in psychological variables. This was paired with cross-lagged analysis to assess for evidence of causality. Analyses involved two time points, pre-treatment and post-treatment. Changes in pain-acceptance were strongly associated with changes in three (depression, anxiety and average pain) of the four clinical outcomes. Changes in self-efficacy were also strongly associated with two (anxiety and average pain) clinical outcomes. These findings suggest that participants were unlikely to improve in these clinical outcomes without also experiencing increases in their pain self-efficacy and pain acceptance. However, there was no clear evidence from cross-lagged analyses to currently support these psychological variables as direct mechanisms of clinical improvements. There was only statistical evidence to suggest higher levels of self-efficacy moderated improvements in depression. The findings suggest that, while clinical improvements are closely associated with improvements in pain acceptance and self-efficacy, these psychological variables may not drive the treatment effects observed. This study employed robust statistical techniques to assess the psychological mechanisms of an established internet-delivered pain management program. While clinical improvements (e.g. depression, anxiety, pain) were closely associated with improvements in psychological variables (e.g. pain self-efficacy and pain acceptance), these variables do not appear to be treatment mechanisms. © 2018 European Pain Federation - EFIC®.
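The GEE step described can be illustrated with the R geepack package on hypothetical two-time-point data; the variable names, the built-in association and the exchangeable working correlation are assumptions for the example, not the study's model specification.

```r
# Minimal sketch (R, geepack package): a generalized estimating equation model
# relating change in a clinical outcome to change in a psychological process
# variable across two time points. Data are hypothetical.
library(geepack)

set.seed(11)
n <- 200
d <- data.frame(
  id         = rep(1:n, each = 2),
  time       = rep(c(0, 1), n),                     # pre / post treatment
  acceptance = rnorm(2 * n),                        # pain acceptance score
  disability = rnorm(2 * n)
)
d$disability <- d$disability - 0.4 * d$acceptance - 0.3 * d$time  # built-in effect

fit <- geeglm(disability ~ time + acceptance, id = id, data = d,
              family = gaussian, corstr = "exchangeable")
summary(fit)
```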
Matiatos, Ioannis; Alexopoulos, Apostolos; Godelitsas, Athanasios
2014-04-01
The present study involves an integration of the hydrogeological, hydrochemical and isotopic (both stable and radiogenic) data of the groundwater samples taken from aquifers occurring in the region of northeastern Peloponnesus. Special emphasis has been given to health-related ions and isotopes in relation to the WHO and USEPA guidelines, to highlight the concentrations of compounds (e.g., As and Ba) exceeding the drinking water thresholds. Multivariate statistical analyses, i.e. two principal component analyses (PCA) and one discriminant analysis (DA), combined with conventional hydrochemical methodologies, were applied, with the aim to interpret the spatial variations in the groundwater quality and to identify the main hydrogeochemical factors and human activities responsible for the high ion concentrations and isotopic content in the groundwater analysed. The first PCA resulted in a three component model, which explained approximately 82% of the total variance of the data sets and enabled the identification of the hydrogeological processes responsible for the isotopic content i.e., δ(18)Ο, tritium and (222)Rn. The second PCA, involving the trace element presence in the water samples, revealed a four component model, which explained approximately 89% of the total variance of the data sets, giving more insight into the geochemical and anthropogenic controls on the groundwater composition (e.g., water-rock interaction, hydrothermal activity and agricultural activities). Using discriminant analysis, a four parameter (δ(18)O, (Ca+Mg)/(HCO3+SO4), EC and Cl) discriminant function concerning the (222)Rn content was derived, which favoured a classification of the samples according to the concentration of (222)Rn as (222)Rn-safe (<11 Bq·L(-1)) and (222)Rn-contaminated (>11 Bq·L(-1)). The selection of radon builds on the fact that this radiogenic isotope has been generally related to increased health risk when consumed. Copyright © 2014 Elsevier B.V. All rights reserved.
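The two multivariate steps (a PCA of the measured variables, then a discriminant function classifying samples against the 11 Bq/L radon threshold) can be sketched in R on a hypothetical hydrochemical table; the variables mirror those named above but the values are fabricated.

```r
# Minimal sketch (R): PCA of hydrochemical variables followed by a linear
# discriminant analysis separating Rn-safe from Rn-contaminated samples at
# the 11 Bq/L threshold. 'chem' is a hypothetical data table.
library(MASS)

set.seed(12)
n <- 80
chem <- data.frame(
  d18O  = rnorm(n, -7, 1),
  ca_mg = rnorm(n, 1.2, 0.3),     # (Ca+Mg)/(HCO3+SO4) ratio
  EC    = rnorm(n, 700, 150),
  Cl    = rnorm(n, 40, 15),
  radon = rlnorm(n, log(10), 0.6)
)

pca <- prcomp(chem[, 1:4], center = TRUE, scale. = TRUE)
summary(pca)                                       # variance explained

chem$rn_class <- factor(ifelse(chem$radon > 11, "contaminated", "safe"))
da <- lda(rn_class ~ d18O + ca_mg + EC + Cl, data = chem)
mean(predict(da)$class == chem$rn_class)           # resubstitution accuracy
```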
Analysis of longitudinal data from animals where some data are missing in SPSS
Duricki, DA; Soleman, S; Moon, LDF
2017-01-01
Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing) yet are not used widely by pre-clinical researchers. We provide here an easy to use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out) whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723
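Although the protocol itself is a click-by-click SPSS guide, the same kind of model, a linear mixed model fitted by REML so that animals with missing time points still contribute, can be expressed in R with the nlme package; the data below are hypothetical, not the stroke data set used in the protocol.

```r
# Minimal sketch (R, nlme package): an R analogue of the repeated-measures
# linear model fitted by Restricted Maximum Likelihood, which uses all
# available data even when some time points are missing.
library(nlme)

set.seed(13)
d <- expand.grid(animal = factor(1:20), week = 1:6)
d$group <- factor(ifelse(as.integer(d$animal) <= 10, "treated", "control"))
d$score <- 50 - 2 * d$week + 3 * (d$group == "treated") * d$week +
  rnorm(nrow(d), sd = 4)
d$score[sample(nrow(d), 10)] <- NA          # a few missing observations

fit <- lme(score ~ group * week, random = ~ 1 | animal, data = d,
           method = "REML", na.action = na.omit)
anova(fit)                                   # test for the group-by-time effect
```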
Neal, B; MacMahon, S
1999-01-01
Overviews (meta-analyses) of the major ongoing randomized trials of blood pressure lowering drugs will be conducted to determine the effects of: first, newer versus older classes of blood pressure lowering drugs in patients with hypertension; and second, blood pressure lowering treatments versus untreated or less treated control conditions in patient groups at high risk of cardiovascular events. The principal study outcomes are stroke, coronary heart disease, total cardiovascular events and total cardiovascular deaths. The overviews have been prospectively designed and will be conducted on individual patient data. The analyses will be conducted as a collaboration between the principal investigators of participating trials involving about 270,000 patients. Full data should be available in 2003, with the first round of analyses performed in 1999-2000. The combination of trial results should provide good statistical power to detect even modest differences between the effects on the main study outcomes.
Remote sensing data acquisition, analysis and archival. Volume 1. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stringer, W.J.; Dean, K.G.; Groves, J.E.
1993-03-25
The project specialized in the acquisition and dissemination of satellite imagery and its utilization for case-specific and statistical analyses of offshore environmental conditions, particularly those involving sea ice. During this contract, 854 Landsat Multispectral Scanner and 2 Landsat Thematic Mapper scenes, 8,576 Advanced Very High Resolution Radiometer images, and 31,000 European Earth Resources Satellite Synthetic Aperture Radar images were archived. Direct assistance was provided to eight Minerals Management Service (MMS)-sponsored studies, including analyses of Port Moller circulation, Bowhead whale migration, distribution, population and behavioral studies, Beaufort Sea fisheries, oil spill trajectory model development, and Kasegaluk Lagoon environmental assessments. In addition, under this Cooperative Agreement several complete studies were undertaken based on analysis of satellite imagery. The topics included: Kasegaluk Lagoon transport, the effect of winter storms on arctic ice, the relationship between ice surface temperatures as measured by buoys and passive microwave imagery, unusual cloud forms following lead-openings, and analyses of Chukchi and Bering Sea polynyas.
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or... indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization .
Rojkova, K; Volle, E; Urbanski, M; Humbert, F; Dell'Acqua, F; Thiebaut de Schotten, M
2016-04-01
In neuroscience, there is a growing consensus that higher cognitive functions may be supported by distributed networks involving different cerebral regions, rather than by single brain areas. Communication within these networks is mediated by white matter tracts and is particularly prominent in the frontal lobes for the control and integration of information. However, the detailed mapping of frontal connections remains incomplete, albeit crucial to an increased understanding of these cognitive functions. Based on 47 high-resolution diffusion-weighted imaging datasets (age range 22-71 years), we built a statistical normative atlas of the frontal lobe connections in stereotaxic space, using state-of-the-art spherical deconvolution tractography. We dissected 55 tracts including U-shaped fibers. We further characterized these tracts by measuring their correlation with age and education level. We reported age-related differences in the microstructural organization of several, specific frontal fiber tracts, but found no correlation with education level. Future voxel-based analyses, such as voxel-based morphometry or tract-based spatial statistics studies, may benefit from our atlas by identifying the tracts and networks involved in frontal functions. Our atlas will also build the capacity of clinicians to further understand the mechanisms involved in brain recovery and plasticity, as well as assist clinicians in the diagnosis of disconnection or abnormality within specific tracts of individual patients with various brain diseases.
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates), by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.
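A minimal sketch of obtaining EZR from CRAN, where it is distributed as an R commander plug-in; the package names are given to the best of our knowledge, and EZR can also be installed from the website cited above.

```r
# Minimal sketch (R): installing EZR from CRAN as an R commander plug-in.
# Package names are assumptions to the best of our knowledge; the author's
# website named in the abstract offers an alternative installer.
install.packages("Rcmdr")             # the R commander GUI
install.packages("RcmdrPlugin.EZR")   # EZR, distributed as an Rcmdr plug-in
library(RcmdrPlugin.EZR)              # load the plug-in; the EZR menus then
                                      # appear in the R commander GUI
# Survival and competing-risk analyses are run by point-and-click, and EZR
# records the generated R script for traceability.
```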
Characteristics of worker accidents on NYSDOT construction projects.
Mohan, Satish; Zech, Wesley C
2005-01-01
This paper aims at providing cost-effective safety measures to protect construction workers in highway work zones, based on real data. Two types of accidents occur in work zones: (a) construction work area accidents, and (b) traffic accidents involving construction worker(s). A detailed analysis of work zone accidents involving 36 fatalities and 3,055 severe injuries to construction workers on New York State Department of Transportation (NYSDOT) construction projects from 1990 to 2001 established that five accident types: (a) Struck/Pinned by Large Equipment, (b) Trip or Fall (elevated), (c) Contact w/Electrical or Gas Utility, (d) Struck-by Moving/Falling Load, and (e) Crane/Lift Device Failure accounted for nearly 96% of the fatal accidents, nearly 63% of the hospital-level injury accidents, and nearly 91% of the total costs. These construction work area accidents had a total cost of $133.8 million. Traffic accidents that involve contractors' employees were also examined. Statistical analyses of the traffic accidents established that five traffic accident types: (a) Work Space Intrusion, (b) Worker Struck-by Vehicle Inside Work Space, (c) Flagger Struck-by Vehicle, (d) Worker Struck-by Vehicle Entering/Exiting Work Space, and (e) Construction Equipment Struck-by Vehicle Inside Work Space accounted for nearly 86% of the fatal and nearly 70% of the hospital-level injury and minor injury traffic accidents, and $45.4 million (79.4%) of the total traffic accident costs. The results of this paper provide real statistics on construction worker-related accidents reported in construction work zones. Potential preventive measures based on these statistics have also been suggested. The ranking of accident types, both within the work area and in traffic, will guide heavy highway contractors and owner agencies in identifying the most cost-effective safety preventions.
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analyses, but lack a graphical user interface that would allow anyone other than a professional statistician to utilise them effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
Geyer, Nelouise-Marié; Coetzee, Siedine K; Ellis, Suria M; Uys, Leana R
2018-02-28
This study aimed to describe intrapersonal characteristics (professional values, personality, empathy, and job involvement), work performance as perceived by nurses, and caring behaviors as perceived by patients, and to examine the relationships among these variables. A cross-sectional design was employed. A sample of 218 nurses and 116 patients was recruited in four private hospitals and four public hospitals. Data were collected using self-report measures. Data analysis included descriptive statistics, exploratory and confirmatory factor analyses, hierarchical linear modeling, correlations, and structural equation modeling. Nurses perceived their work performance to be of high quality. Among the intrapersonal characteristics, nurses had high scores for professional values, and moderately high scores for personality, empathy and job involvement. Patients perceived nurses' caring behaviors as moderately high. Professional values of nurses were the only selected intrapersonal characteristic with a statistically significant positive relationship, of practical importance, with work performance as perceived by nurses and with caring behaviors as perceived by patients at ward level. Managers can enhance nurses' work performance and caring behaviors through provision of in-service training that focuses on development of professional values. © 2018 John Wiley & Sons Australia, Ltd.
Process-related factors associated with disciplinary board decisions
2013-01-01
Background In most health care systems disciplinary boards have been organised in order to process patients’ complaints about health professionals. Although the safeguarding of the legal rights of the involved parties is a crucial concern, there is limited knowledge about what role the complaint process plays with regard to board decision outcomes. Using complaint cases concerning general practitioners, the aim of this study was to identify what process factors are statistically associated with disciplinary actions as seen from the party of the complainant and the defendant general practitioner, respectively. Methods Danish Patient Complaints Board decisions concerning general practitioners completed in 2007 were examined. Information on process factors was extracted from the case files and included complaint delay, complainant’s lawyer involvement, the number of general practitioners involved, event duration, expert witness involvement, case management duration and decision outcome (discipline or no discipline). Multiple logistic regression analyses were performed on compound case decisions eventually involving more general practitioners (as seen from the complainant’s side) and on separated decisions (as seen from the defendant general practitioner’s side). Results From the general practitioner’s side, when the number of general practitioners involved in a complaint case increased, the odds of being disciplined significantly decreased (OR=0.661 per additional general practitioner involved, p<0.001). Contrarily, from the complainant’s side, no association could be detected between complaining against a plurality of general practitioners and the odds of at least one general practitioner being disciplined. From both sides, longer case management duration was associated with higher odds of discipline (OR=1.038 per additional month, p=0.010). No association could be demonstrated with regard to complaint delay, lawyer involvement, event duration, or expert witness involvement. There was lawyer involvement in 5% of cases and expert witness involvement in 92% of cases. The mean complaint delay was 3 months and 18 days and the mean case management duration was 14 months and 7 days. Conclusions Certain complaint process factors might be statistically associated with decision outcomes. However, the impact diverges as seen from the different parties. Future studies are merited in order to uncover the judicial mechanisms lying behind these associations. PMID:23294599
Process-related factors associated with disciplinary board decisions.
Birkeland, Søren; Christensen, Rene dePont; Damsbo, Niels; Kragstrup, Jakob
2013-01-07
In most health care systems disciplinary boards have been organised in order to process patients' complaints about health professionals. Although the safeguarding of the legal rights of the involved parties is a crucial concern, there is limited knowledge about what role the complaint process plays with regard to board decision outcomes. Using complaint cases concerning general practitioners, the aim of this study was to identify what process factors are statistically associated with disciplinary actions as seen from the party of the complainant and the defendant general practitioner, respectively. Danish Patient Complaints Board decisions concerning general practitioners completed in 2007 were examined. Information on process factors was extracted from the case files and included complaint delay, complainant's lawyer involvement, the number of general practitioners involved, event duration, expert witness involvement, case management duration and decision outcome (discipline or no discipline). Multiple logistic regression analyses were performed on compound case decisions eventually involving more general practitioners (as seen from the complainant's side) and on separated decisions (as seen from the defendant general practitioner's side). From the general practitioner's side, when the number of general practitioners involved in a complaint case increased, the odds of being disciplined significantly decreased (OR=0.661 per additional general practitioner involved, p<0.001). Contrarily, from the complainant's side, no association could be detected between complaining against a plurality of general practitioners and the odds of at least one general practitioner being disciplined. From both sides, longer case management duration was associated with higher odds of discipline (OR=1.038 per additional month, p=0.010). No association could be demonstrated with regard to complaint delay, lawyer involvement, event duration, or expert witness involvement. There was lawyer involvement in 5% of cases and expert witness involvement in 92% of cases. The mean complaint delay was 3 months and 18 days and the mean case management duration was 14 months and 7 days. Certain complaint process factors might be statistically associated with decision outcomes. However, the impact diverges as seen from the different parties. Future studies are merited in order to uncover the judicial mechanisms lying behind these associations.
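The odds ratios reported above come from logistic regression, so each one multiplies the odds, not the probability, of discipline. The hedged sketch below shows how such ORs translate onto the probability scale; the 20% baseline risk and the apply_or helper are illustrative assumptions, not figures from the study.

```python
def apply_or(baseline_prob, odds_ratio, units=1):
    """Probability implied by applying an odds ratio `units` times
    (e.g. per extra GP involved or per extra month of case management)."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio ** units
    return new_odds / (1 + new_odds)

# ORs reported in the abstract; the 20% baseline risk is a made-up illustration.
baseline = 0.20
print(apply_or(baseline, 0.661, units=1))   # one additional GP involved
print(apply_or(baseline, 0.661, units=3))   # three additional GPs
print(apply_or(baseline, 1.038, units=6))   # six extra months of case management
```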
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at ocular level, one study (1%) analysed the overall summary of ocular finding per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
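One standard way to use data from both eyes without ignoring the intereye correlation is to treat the two eyes of a patient as a cluster, for example with generalized estimating equations. The sketch below simulates paired-eye data and fits a GEE with an exchangeable working correlation in Python's statsmodels; the variable names and simulated effect sizes are assumptions for illustration, not the methods of the reviewed papers.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients = 100

# Simulate two eyes per patient with a shared patient-level effect,
# so measurements from fellow eyes are correlated.
patient = np.repeat(np.arange(n_patients), 2)
eye = np.tile(["OD", "OS"], n_patients)
treated = rng.integers(0, 2, n_patients)[patient]            # patient-level exposure
patient_effect = rng.normal(0, 1.0, n_patients)[patient]     # induces intereye correlation
outcome = 15 + 1.5 * treated + patient_effect + rng.normal(0, 1.0, 2 * n_patients)

df = pd.DataFrame({"patient": patient, "eye": eye,
                   "treated": treated, "outcome": outcome})

# GEE with an exchangeable working correlation treats the two eyes of a
# patient as a cluster instead of as independent observations.
model = smf.gee("outcome ~ treated", groups="patient", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
result = model.fit()
print(result.summary())
```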
Prospective associations between peer victimization and aggression.
Ostrov, Jamie M
2010-01-01
The current study was a short-term longitudinal investigation of young children (M = 44.56 months, SD = 11.88, N = 103) testing the prospective associations between peer victimization and aggression subtypes. Path analyses documented that teacher-reported physical victimization was uniquely associated with increases in observed physical aggression over time. The path model also revealed that teacher-reported relational victimization was uniquely associated with statistically significant increases in observed relational aggression over time. Ways in which these findings extend the extant developmental literature are discussed. © 2010 The Author. Child Development © 2010 Society for Research in Child Development, Inc.
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Zamani, Omid; Böttcher, Elke; Rieger, Jörg D; Mitterhuber, Johann; Hawel, Reinhold; Stallinger, Sylvia; Eller, Norbert
2014-06-01
In this observer-blinded, multicenter, non-inferiority study, 489 patients suffering from painful osteoarthritis of the hip or knee were included to investigate the safety and tolerability of Dexibuprofen vs. Ibuprofen powder for oral suspension. Only patients who had everyday joint pain for the past 3 months and "moderate" to "severe" global pain intensity in the involved hip/knee within the last 48 h were enrolled. The treatment period was up to 14 days with a control visit after 3 days. The test product was Dexibuprofen 400 mg powder for oral suspension (daily dose 800 mg) compared to Ibuprofen 400 mg powder for oral suspension (daily dose 1,600 mg). Gastrointestinal adverse drug reactions were reported in 8 patients (3.3 %) in the Dexibuprofen group and in 19 patients (7.8 %) in the Ibuprofen group. Statistically significant non-inferiority was shown for Dexibuprofen. Comparing both groups by a Chi square test showed a statistically significant lower proportion of related gastrointestinal events in the Dexibuprofen group. All analyses of secondary tolerability parameters showed the same result of a significantly better safety profile in this therapy setting for Dexibuprofen compared to Ibuprofen. The sum of pain intensity, pain relief and global assessments showed no significant difference between treatment groups. In summary, analyses revealed at least non-inferiority in terms of efficacy and a statistically significantly better safety profile for the Dexibuprofen treatment.
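A chi-square comparison of the gastrointestinal adverse reaction rates can be reproduced approximately as below. The arm sizes (about 242 and 244 patients) are back-calculated from the reported counts and percentages and are therefore only an assumption; the published analysis may have used slightly different denominators or a different test variant.

```python
import numpy as np
from scipy.stats import chi2_contingency

# 2x2 table: rows = treatment arm, columns = (GI reaction, no GI reaction).
# Arm sizes of ~242 and ~244 are back-calculated from 8/3.3% and 19/7.8%
# and are only approximate.
table = np.array([[8, 242 - 8],      # Dexibuprofen
                  [19, 244 - 19]])   # Ibuprofen

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```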
Swahn, Monica H; Culbreth, Rachel; Tumwesigye, Nazarius Mbona; Topalli, Volkan; Wright, Eric; Kasirye, Rogers
2018-05-24
This paper examines problem drinking, alcohol-related violence, and homelessness among youth living in the slums of Kampala—an understudied population at high risk for both alcohol use and violence. This study is based on a cross-sectional survey conducted in 2014 with youth living in the slums and streets of Kampala, Uganda (n = 1134), who were attending Uganda Youth Development Link drop-in centers. The analyses for this paper were restricted to youth who reported current alcohol consumption (n = 346). Problem drinking patterns were assessed among youth involved in alcohol-related violence. Mediation analyses were conducted to examine the impact of homelessness on alcohol-related violence through different measures of problem drinking. Nearly 46% of youth who consumed alcohol were involved in alcohol-related violence. Problem drinkers were more likely to report getting in an accident (χ² = 6.8, df = 1, p = 0.009), having serious problems with parents (χ² = 21.1, df = 1, p < 0.0001) and friends (χ² = 18.2, df = 1, p < 0.0001), being a victim of robbery (χ² = 8.8, df = 1, p = 0.003), and going to a hospital (χ² = 15.6, df = 1, p < 0.0001). For the mediation analyses, statistically significant models were observed for frequent drinking, heavy drinking, and drunkenness. Interventions should focus on delaying and reducing alcohol use in this high-risk population.
The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data are obtained from the model to show the relationship graphically has not been undertaken. It is under this motivation that the author presents this problem. A second problem, concerning the standard errors in estimating global temperatures, is purely statistical in nature, although very little material exists for sampling on such a frame. This problem not only has climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.
Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.
Festing, M F
2001-01-01
In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
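For a completely randomised design with a single treatment factor, the parametric analysis recommended above is a one-way analysis of variance. A minimal sketch with simulated plate data is shown below; the group means, spread, and sample sizes are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Simulated completely randomised design: three treatments, six wells each.
control = rng.normal(100, 10, 6)
low_dose = rng.normal(95, 10, 6)
high_dose = rng.normal(80, 10, 6)

f_stat, p_value = f_oneway(control, low_dose, high_dose)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```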
So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori
2017-04-01
The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.
Tompkins, Adrian M; McCreesh, Nicky
2016-03-31
One year of mobile phone location data from Senegal is analysed to determine the characteristics of journeys that result in an overnight stay, and are thus relevant for malaria transmission. Defining the home location of each person as the place of most frequent calls, it is found that approximately 60% of people who spend nights away from home have regular destinations that are repeatedly visited, although only 10% have 3 or more regular destinations. The number of journeys involving overnight stays peaks at a distance of 50 km, although roughly half of such journeys exceed 100 km. Most visits only involve a stay of one or two nights away from home, with just 4% exceeding one week. A new agent-based migration model is introduced, based on a gravity model adapted to represent overnight journeys. Each agent makes journeys involving overnight stays to either regular or random locations, with journey and destination probabilities taken from the mobile phone dataset. Preliminary simulations show that the agent-based model can approximately reproduce the patterns of migration involving overnight stays.
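A gravity-style destination choice of the kind described can be sketched in a few lines: the probability of an overnight journey to a location grows with its population and decays with distance. The populations, distances, and distance-decay exponent below are invented placeholders, not parameters estimated from the Senegal dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical destinations with populations and distances (km) from the
# agent's home location; parameter values are illustrative only.
population = np.array([900_000, 250_000, 60_000, 15_000])
distance_km = np.array([50.0, 120.0, 200.0, 350.0])
gamma = 2.0   # assumed distance-decay exponent

# Gravity weights: attractiveness grows with population, decays with distance.
weights = population / distance_km ** gamma
prob = weights / weights.sum()

# Sample destinations for 10 overnight journeys made by one agent.
journeys = rng.choice(len(population), size=10, p=prob)
print("destination probabilities:", np.round(prob, 3))
print("sampled destinations:", journeys)
```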
Toptaş, Tayfun; Peştereli, Elif; Bozkurt, Selen; Erdoğan, Gülgün; Şimşek, Tayup
2018-03-01
To examine correlations among nuclear, architectural, and International Federation of Gynecology and Obstetrics (FIGO) grading systems, and their relationships with lymph node (LN) involvement in endometrioid endometrial cancer. Histopathology slides of 135 consecutive patients were reviewed with respect to tumor grade and LN metastasis. Notable nuclear atypia was defined as grade 3 nuclei. FIGO grade was established by raising the architectural grade (AG) by one grade when the tumor was composed of cells with nuclear grade (NG) 3. Correlations between the grading systems were analyzed using Spearman's rank correlation coefficients, and relationships of grading systems with LN involvement were assessed using logistic regression analysis. Correlation analysis revealed a significant and strongly positive relationship between FIGO and architectural grading systems (r=0.885, p=0.001); however, correlations of nuclear grading with the architectural (r=0.535, p=0.165) and FIGO grading systems (r=0.589, p=0.082) were moderate and statistically non-significant. Twenty-five (18.5%) patients had LN metastasis. LN involvement rates differed significantly between tumors with AG 1 and those with AG 2, and tumors with FIGO grade 1 and those with FIGO grade 2. In contrast, although the difference in LN involvement rates failed to reach statistical significance between tumors with NG 1 and those with NG 2, it was significant between NG 2 and NG 3 (p=0.042). Although all three grading systems were associated with LN involvement in univariate analyses, an independent relationship could not be established after adjustment for other confounders in multivariate analysis. Nuclear grading is significantly correlated with neither architectural nor FIGO grading systems. The differences in LN involvement rates in the nuclear grading system reach significance only in the setting of tumor cells with NG 3; however, none of the grading systems was an independent predictor of LN involvement.
Wegner, S.J.
1989-01-01
Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)
Increasing Transparency Through a Multiverse Analysis.
Steegen, Sara; Tuerlinckx, Francis; Gelman, Andrew; Vanpaemel, Wolf
2016-09-01
Empirical research inevitably includes constructing a data set by processing raw data into a form ready for statistical analysis. Data processing often involves choices among several reasonable options for excluding, transforming, and coding data. We suggest that instead of performing only one analysis, researchers could perform a multiverse analysis, which involves performing all analyses across the whole set of alternatively processed data sets corresponding to a large set of reasonable scenarios. Using an example focusing on the effect of fertility on religiosity and political attitudes, we show that analyzing a single data set can be misleading and propose a multiverse analysis as an alternative practice. A multiverse analysis offers an idea of how much the conclusions change because of arbitrary choices in data construction and gives pointers as to which choices are most consequential in the fragility of the result. © The Author(s) 2016.
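The mechanics of a multiverse analysis reduce to a loop: enumerate the reasonable processing choices, construct each resulting data set, and run the identical analysis on every one. The skeleton below illustrates this with fabricated data, two exclusion rules, and two codings; all of these choices are placeholders rather than the fertility example used by the authors.

```python
import itertools
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
raw = pd.DataFrame({"x": rng.normal(size=200),
                    "y": rng.normal(size=200),
                    "age": rng.integers(18, 60, size=200)})
raw["y"] += 0.15 * raw["x"]   # small true effect, for illustration

# Reasonable-but-arbitrary processing choices (placeholders for real ones,
# e.g. exclusion rules or variable codings).
exclusion_rules = {"none": lambda d: d,
                   "under_40_only": lambda d: d[d["age"] < 40],
                   "trim_outliers": lambda d: d[d["x"].abs() < 2]}
codings = {"raw_x": lambda d: d["x"],
           "binary_x": lambda d: (d["x"] > 0).astype(float)}

results = []
for (ex_name, exclude), (code_name, code_x) in itertools.product(
        exclusion_rules.items(), codings.items()):
    d = exclude(raw)
    r, p = pearsonr(code_x(d), d["y"])     # the same analysis in every universe
    results.append({"exclusion": ex_name, "coding": code_name,
                    "r": round(r, 3), "p": round(p, 4)})

print(pd.DataFrame(results))
```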
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics, and so the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
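A minimal version of the recommended approach is a random-intercept mixed model, which stops treating repeated measurements on the same host as independent. The sketch below simulates longitudinal parasitaemia-like data and fits such a model with statsmodels; the variable names, effect sizes, and sampling scheme are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_hosts, n_days = 20, 8

host = np.repeat(np.arange(n_hosts), n_days)
day = np.tile(np.arange(n_days), n_hosts)
host_effect = rng.normal(0, 1.0, n_hosts)[host]      # host-level variation
log_parasitaemia = 5 + 0.3 * day + host_effect + rng.normal(0, 0.5, n_hosts * n_days)

df = pd.DataFrame({"host": host, "day": day,
                   "log_parasitaemia": log_parasitaemia})

# Random intercept per host: residuals within a host are no longer assumed
# independent, unlike in a simple linear regression on the pooled data.
model = smf.mixedlm("log_parasitaemia ~ day", df, groups=df["host"])
fit = model.fit()
print(fit.summary())
```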
Johnatty, Sharon E.; Beesley, Jonathan; Chen, Xiaoqing; Spurdle, Amanda B.; deFazio, Anna; Webb, Penelope M; Goode, Ellen L.; Rider, David N.; Vierkant, Robert A.; Anderson, Stephanie; Wu, Anna H.; Pike, Malcolm; Van Den Berg, David; Moysich, Kirsten; Ness, Roberta; Doherty, Jennifer; Rossing, Mary-Anne; Pearce, Celeste Leigh; Chenevix-Trench, Georgia
2009-01-01
Fibroblast growth factor (FGF)-2 (basic) is a potent angiogenic molecule involved in tumour progression, and is one of several growth factors with a central role in ovarian carcinogenesis. We hypothesised that common single nucleotide polymorphisms (SNPs) in the FGF2 gene may alter angiogenic potential and thereby susceptibility to ovarian cancer. We analysed 25 FGF2 tgSNPs using five independent study populations from the United States and Australia. Analysis was restricted to non-Hispanic White women with serous ovarian carcinoma (1269 cases and 2829 controls). There were no statistically significant associations between any FGF2 SNPs and ovarian cancer risk. There were two nominally statistically significant associations between heterozygosity for two FGF2 SNPs (rs308379 and rs308447; p<0.05) and serous ovarian cancer risk in the combined dataset, but rare homozygous estimates did not achieve statistical significance, nor were they consistent with the log additive model of inheritance. Overall genetic variation in FGF2 does not appear to play a role in susceptibility to ovarian cancer. PMID:19456219
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
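The core of the method, fitting a cubic smoothing spline and taking its inflection points as window boundaries, can be sketched as follows. The toy signal, the smoothing factor, and the grid resolution are ad hoc choices; the authors' implementation may select the smoothing level differently.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(11)

# Toy genome-wide signal: marker positions with a few smooth "bumps" plus noise.
position = np.linspace(0, 100, 500)
signal = (np.exp(-((position - 25) ** 2) / 30)
          + 0.7 * np.exp(-((position - 70) ** 2) / 50)
          + rng.normal(0, 0.08, position.size))

# Cubic smoothing spline; the smoothing factor s is chosen ad hoc here.
spline = UnivariateSpline(position, signal, k=3, s=5.0)

# Inflection points = sign changes of the second derivative of the fit.
second_deriv = spline.derivative(n=2)
grid = np.linspace(position.min(), position.max(), 5000)
d2 = second_deriv(grid)
sign_change = np.where(np.diff(np.sign(d2)) != 0)[0]
boundaries = grid[sign_change]

print("window boundaries at positions:", np.round(boundaries, 1))
```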
Jacob, Laurent; Combes, Florence; Burger, Thomas
2018-06-18
We propose a new hypothesis test for the differential abundance of proteins in mass-spectrometry based relative quantification. An important feature of this type of high-throughput analysis is that it involves an enzymatic digestion of the sample proteins into peptides prior to identification and quantification. Due to numerous homologous sequences, different proteins can lead to peptides with identical amino acid chains, so that their parent protein is ambiguous. These so-called shared peptides make the protein-level statistical analysis a challenge and are often not accounted for. In this article, we use a linear model describing peptide-protein relationships to build a likelihood ratio test of differential abundance for proteins. We show that the likelihood ratio statistic can be computed in linear time with the number of peptides. We also provide the asymptotic null distribution of a regularized version of our statistic. Experiments on both real and simulated datasets show that our procedures outperform state-of-the-art methods. The procedures are available via the pepa.test function of the DAPAR Bioconductor R package.
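As a generic reminder of how a likelihood ratio test of differential abundance works (not the authors' pepa.test implementation, which handles shared peptides through a peptide-protein linear model), the sketch below compares nested Gaussian linear models and refers the statistic to its asymptotic chi-squared null distribution; the simulated data and variable names are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(5)
n = 120
condition = np.repeat([0, 1], n // 2)               # two experimental conditions
y = 10 + 0.8 * condition + rng.normal(0, 1.0, n)    # e.g. a log-abundance summary

X_null = np.ones((n, 1))                            # intercept only
X_alt = sm.add_constant(condition.astype(float))    # intercept + condition effect

ll_null = sm.OLS(y, X_null).fit().llf
ll_alt = sm.OLS(y, X_alt).fit().llf

lr_stat = 2 * (ll_alt - ll_null)
df_diff = X_alt.shape[1] - X_null.shape[1]
p_value = chi2.sf(lr_stat, df_diff)                 # asymptotic null distribution
print(f"LR statistic = {lr_stat:.2f}, df = {df_diff}, p = {p_value:.4g}")
```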
Cavaco, Carina; Pereira, Jorge A M; Taunk, Khushman; Taware, Ravindra; Rapole, Srikanth; Nagarajaram, Hampapathalu; Câmara, José S
2018-05-07
Saliva is possibly the easiest biofluid to analyse and, despite its simple composition, contains relevant metabolic information. In this work, we explored the potential of the volatile composition of saliva samples as biosignatures for breast cancer (BC) non-invasive diagnosis. To achieve this, 106 saliva samples of BC patients and controls in two distinct geographic regions in Portugal and India were extracted and analysed using optimised headspace solid-phase microextraction gas chromatography mass spectrometry (HS-SPME/GC-MS, 2 mL acidified saliva containing 10% NaCl, stirred (800 rpm) for 45 min at 38 °C and using the CAR/PDMS SPME fibre) followed by multivariate statistical analysis (MVSA). Over 120 volatiles from distinct chemical classes, with significant variations among the groups, were identified. MVSA retrieved a limited number of volatiles, viz. 3-methyl-pentanoic acid, 4-methyl-pentanoic acid, phenol and p-tert-butyl-phenol (Portuguese samples) and acetic, propanoic, benzoic acids, 1,2-decanediol, 2-decanone, and decanal (Indian samples), statistically relevant for the discrimination of BC patients in the populations analysed. This work defines an experimental layout, HS-SPME/GC-MS followed by MVSA, suitable to characterise volatile fingerprints for saliva as putative biosignatures for BC non-invasive diagnosis. Here, it was applied to BC samples from geographically distant populations and good disease separation was obtained. Further studies using larger cohorts are therefore very pertinent to challenge and strengthen this proof-of-concept study.
Research: increasing value, reducing waste 2
Ioannidis, John P A; Greenland, Sander; Hlatky, Mark A; Khoury, Muin J; Macleod, Malcolm R; Moher, David; Schulz, Kenneth F; Tibshirani, Robert
2015-01-01
Correctable weaknesses in the design, conduct, and analysis of biomedical and public health research studies can produce misleading results and waste valuable resources. Small effects can be difficult to distinguish from bias introduced by study design and analyses. An absence of detailed written protocols and poor documentation of research is common. Information obtained might not be useful or important, and statistical precision or power is often too low or used in a misleading way. Insufficient consideration might be given to both previous and continuing studies. Arbitrary choice of analyses and an overemphasis on random extremes might affect the reported findings. Several problems relate to the research workforce, including failure to involve experienced statisticians and methodologists, failure to train clinical researchers and laboratory scientists in research methods and design, and the involvement of stakeholders with conflicts of interest. Inadequate emphasis is placed on recording of research decisions and on reproducibility of research. Finally, reward systems incentivise quantity more than quality, and novelty more than reliability. We propose potential solutions for these problems, including improvements in protocols and documentation, consideration of evidence from studies in progress, standardisation of research efforts, optimisation and training of an experienced and non-conflicted scientific workforce, and reconsideration of scientific reward systems. PMID:24411645
Kriging analysis of mean annual precipitation, Powder River Basin, Montana and Wyoming
Karlinger, M.R.; Skrivan, James A.
1981-01-01
Kriging is a statistical estimation technique for regionalized variables which exhibit an autocorrelation structure. Such structure can be described by a semi-variogram of the observed data. The kriging estimate at any point is a weighted average of the data, where the weights are determined using the semi-variogram and an assumed drift, or lack of drift, in the data. Block, or areal, estimates can also be calculated. The kriging algorithm, based on unbiased and minimum-variance estimates, involves a linear system of equations to calculate the weights. Kriging variances can then be used to give confidence intervals of the resulting estimates. Mean annual precipitation in the Powder River basin, Montana and Wyoming, is an important variable when considering restoration of coal-strip-mining lands of the region. Two kriging analyses involving data at 60 stations were made--one assuming no drift in precipitation, and one assuming a partial quadratic drift simulating orographic effects. Contour maps of estimates of mean annual precipitation were similar for both analyses, as were the corresponding contours of kriging variances. Block estimates of mean annual precipitation were made for two subbasins. Runoff estimates were 1-2 percent of the kriged block estimates. (USGS)
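The linear system mentioned above is the ordinary kriging system: semi-variances between stations on the left-hand side, semi-variances to the prediction point on the right, plus a Lagrange multiplier forcing the weights to sum to one. The sketch below solves it for a single point with an assumed exponential semi-variogram; the station coordinates, values, and variogram parameters are invented, not the Powder River data.

```python
import numpy as np

def semivariogram(h, nugget=0.0, sill=1.0, range_param=40.0):
    """Assumed exponential semi-variogram model."""
    return nugget + sill * (1.0 - np.exp(-h / range_param))

# Made-up station coordinates (km) and mean annual precipitation values.
stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [50.0, 45.0], [25.0, 20.0]])
precip = np.array([12.0, 14.5, 16.0, 18.5, 15.0])
target = np.array([20.0, 25.0])                     # prediction location

n = len(stations)
dists = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=2)
gamma = semivariogram(dists)

# Ordinary kriging system: semi-variances plus a Lagrange multiplier row/column
# enforcing that the weights sum to one (unbiasedness).
A = np.ones((n + 1, n + 1))
A[:n, :n] = gamma
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = semivariogram(np.linalg.norm(stations - target, axis=1))

solution = np.linalg.solve(A, b)
weights, lagrange = solution[:n], solution[n]

estimate = weights @ precip
kriging_variance = weights @ b[:n] + lagrange
print(f"estimate = {estimate:.2f}, kriging variance = {kriging_variance:.3f}")
```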
Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi
2017-01-01
Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered with 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no observed adverse effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.
Evolution of the Max and Mlx networks in animals.
McFerrin, Lisa G; Atchley, William R
2011-01-01
Transcription factors (TFs) are essential for the regulation of gene expression and often form emergent complexes to perform vital roles in cellular processes. In this paper, we focus on the parallel Max and Mlx networks of TFs because of their critical involvement in cell cycle regulation, proliferation, growth, metabolism, and apoptosis. A basic-helix-loop-helix-zipper (bHLHZ) domain mediates the competitive protein dimerization and DNA binding among Max and Mlx network members to form a complex system of cell regulation. To understand the importance of these network interactions, we identified the bHLHZ domain of Max and Mlx network proteins across the animal kingdom and carried out several multivariate statistical analyses. The presence and conservation of Max and Mlx network proteins in animal lineages stemming from the divergence of Metazoa indicate that these networks have ancient and essential functions. Phylogenetic analysis of the bHLHZ domain identified clear relationships among protein families with distinct points of radiation and divergence. Multivariate discriminant analysis further isolated specific amino acid changes within the bHLHZ domain that classify proteins, families, and network configurations. These analyses on Max and Mlx network members provide a model for characterizing the evolution of TFs involved in essential networks.
2012-01-01
Background Elucidating the selective and neutral forces underlying molecular evolution is fundamental to understanding the genetic basis of adaptation. Plants have evolved a suite of adaptive responses to cope with variable environmental conditions, but relatively little is known about which genes are involved in such responses. Here we studied molecular evolution on a genome-wide scale in two species of Cardamine with distinct habitat preferences: C. resedifolia, found at high altitudes, and C. impatiens, found at low altitudes. Our analyses focussed on genes that are involved in stress responses to two factors that differentiate the high- and low-altitude habitats, namely temperature and irradiation. Results High-throughput sequencing was used to obtain gene sequences from C. resedifolia and C. impatiens. Using the available A. thaliana gene sequences and annotation, we identified nearly 3,000 triplets of putative orthologues, including genes involved in cold response, photosynthesis or in general stress responses. By comparing estimated rates of molecular substitution, codon usage, and gene expression in these species with those of Arabidopsis, we were able to evaluate the role of positive and relaxed selection in driving the evolution of Cardamine genes. Our analyses revealed a statistically significant higher rate of molecular substitution in C. resedifolia than in C. impatiens, compatible with more efficient positive selection in the former. Conversely, the genome-wide level of selective pressure is compatible with more relaxed selection in C. impatiens. Moreover, levels of selective pressure were heterogeneous between functional classes and between species, with cold responsive genes evolving particularly fast in C. resedifolia, but not in C. impatiens. Conclusions Overall, our comparative genomic analyses revealed that differences in effective population size might contribute to the differences in the rate of protein evolution and in the levels of selective pressure between the C. impatiens and C. resedifolia lineages. The within-species analyses also revealed evolutionary patterns associated with habitat preference of two Cardamine species. We conclude that the selective pressures associated with the habitats typical of C. resedifolia may have caused the rapid evolution of genes involved in cold response. PMID:22257588
Trucks involved in fatal accidents factbook 2007.
DOT National Transportation Integrated Search
2010-01-01
This document presents aggregate statistics on trucks involved in traffic accidents in 2007. The : statistics are derived from the Trucks Involved in Fatal Accidents (TIFA) file, compiled by the : University of Michigan Transportation Research Instit...
Buses involved in fatal accidents factbook 2007
DOT National Transportation Integrated Search
2010-03-01
This document presents aggregate statistics on buses involved in traffic accidents in 2007. The : statistics are derived from the Buses Involved in Fatal Accidents (BIFA) file, compiled by the : University of Michigan Transportation Research Institut...
Trucks involved in fatal accidents factbook 2008.
DOT National Transportation Integrated Search
2011-03-01
This document presents aggregate statistics on trucks involved in traffic accidents in 2008. The : statistics are derived from the Trucks Involved in Fatal Accidents (TIFA) file, compiled by the : University of Michigan Transportation Research Instit...
1992-10-01
Summary statistics (N=8 and N=4) and results of 44 and 76 statistical analyses for impact tests performed on the forefoot of unworn and worn footwear (fragments from the report's list of tables). The report used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was
2011-01-01
Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
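For reference, the standard Q and I2 statistics and the DerSimonian-Laird random-effects estimate discussed above can be computed from trial-level estimates and variances as follows; the five effects below are fabricated for illustration and are unrelated to the 18 IPD meta-analyses.

```python
import numpy as np

# Illustrative trial-level effects (e.g. log hazard ratios) and their variances.
effects = np.array([-0.30, -0.10, -0.45, 0.05, -0.20])
variances = np.array([0.04, 0.02, 0.09, 0.03, 0.05])

w = 1.0 / variances                       # fixed-effect (inverse-variance) weights
fixed = np.sum(w * effects) / np.sum(w)

Q = np.sum(w * (effects - fixed) ** 2)    # Cochran's Q
k = len(effects)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100    # percentage of variation beyond chance

# DerSimonian-Laird estimate of the between-trial variance tau^2,
# and the resulting random-effects pooled estimate.
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
w_re = 1.0 / (variances + tau2)
random_effects = np.sum(w_re * effects) / np.sum(w_re)

print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%, tau^2 = {tau2:.4f}")
print(f"fixed effect = {fixed:.3f}, random effects = {random_effects:.3f}")
```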
Maximizing sensitivity of the psychomotor vigilance test (PVT) to sleep loss.
Basner, Mathias; Dinges, David F
2011-05-01
The psychomotor vigilance test (PVT) is among the most widely used measures of behavioral alertness, but there is large variation among published studies in PVT performance outcomes and test durations. To promote standardization of the PVT and increase its sensitivity and specificity to sleep loss, we determined PVT metrics and task durations that optimally discriminated sleep deprived subjects from alert subjects. Repeated-measures experiments involving 10-min PVT assessments every 2 h across both acute total sleep deprivation (TSD) and 5 days of chronic partial sleep deprivation (PSD). Controlled laboratory environment. 74 healthy subjects (34 female), aged 22-45 years. TSD experiment involving 33 h awake (N = 31 subjects) and a PSD experiment involving 5 nights of 4 h time in bed (N = 43 subjects). In a paired t-test paradigm and for both TSD and PSD, effect sizes of 10 different PVT performance outcomes were calculated. Effect sizes were high for both TSD (1.59-1.94) and PSD (0.88-1.21) for PVT metrics related to lapses and to measures of psychomotor speed, i.e., mean 1/RT (response time) and mean slowest 10% 1/RT. In contrast, PVT mean and median RT outcomes scored low to moderate effect sizes influenced by extreme values. Analyses facilitating only portions of the full 10-min PVT indicated that for some outcomes, high effect sizes could be achieved with PVT durations considerably shorter than 10 min, although metrics involving lapses seemed to profit from longer test durations in TSD. Due to their superior conceptual and statistical properties and high sensitivity to sleep deprivation, metrics involving response speed and lapses should be considered primary outcomes for the 10-min PVT. In contrast, PVT mean and median metrics, which are among the most widely used outcomes, should be avoided as primary measures of alertness. Our analyses also suggest that some shorter-duration PVT versions may be sensitive to sleep loss, depending on the outcome variable selected, although this will need to be confirmed in comparative analyses of separate duration versions of the PVT. Using both sensitive PVT metrics and optimal test durations maximizes the sensitivity of the PVT to sleep loss and therefore potentially decreases the sample size needed to detect the same neurobehavioral deficit. We propose criteria to better standardize the 10-min PVT and facilitate between-study comparisons and meta-analyses.
IJmker, Stefan; Blatter, Birgitte M.; de Korte, Elsbeth M.
2007-01-01
Introduction The objective of the present study is to describe the extent of productivity loss among computer workers with neck/shoulder symptoms and hand/arm symptoms, and to examine associations between pain intensity, various physical and psychosocial factors and productivity loss in computer workers with neck/shoulder and hand/arm symptoms. Methods A cross-sectional design was used. The study population consisted of 654 computer workers with neck/shoulder or hand/arm symptoms from five different companies. Descriptive statistics were used to describe the occurrence of self-reported productivity loss. Logistic regression analyses were used to examine the associations. Results In 26% of all the cases reporting symptoms, productivity loss was involved, most often in cases reporting both symptoms (36%). Productivity loss involved sickness absence in 11% of the arm/hand cases, 32% of the neck/shoulder cases and 43% of the cases reporting both symptoms. The multivariate analyses showed statistically significant odds ratios for pain intensity (OR: 1.26; CI: 1.12–1.41), for high effort/no low reward (OR: 2.26; CI: 1.24–4.12), for high effort/low reward (OR: 1.95; CI: 1.09–3.50), and for low job satisfaction (OR: 3.10; CI: 1.44–6.67). Physical activity in leisure time, full-time work and overcommitment were not associated with productivity loss. Conclusion In most computer workers with neck/shoulder symptoms or hand/arm symptoms, productivity loss derives from a decreased performance at work and not from sickness absence. Favorable psychosocial work characteristics might prevent productivity loss in symptomatic workers. PMID:17636455
Energy requirements in preschool-age children with cerebral palsy.
Walker, Jacqueline L; Bell, Kristie L; Boyd, Roslyn N; Davies, Peter S W
2012-12-01
There is a paucity of data concerning the energy requirements (ERs) of preschool-age children with cerebral palsy (CP), the knowledge of which is essential for early nutritional management. We aimed to determine the ERs for preschool-age children with CP in relation to functional ability, motor type, and distribution and compared with typically developing children (TDC) and published estimation equations. Thirty-two children with CP (63% male) of all functional abilities, motor types, and distributions and 16 TDC (63% male) aged 2.9-4.4 y participated in this study. The doubly labeled water method was used to determine ERs. Statistical analyses were conducted by 1-factor ANOVA and post hoc Tukey honestly significant difference tests, independent and paired t tests, Bland and Altman analyses, correlations, and multivariable regressions. As a population, children with CP had significantly lower ERs than did TDC (P < 0.05). No significant difference in ERs was found between ambulant children and TDC. Marginally ambulant and nonambulant children had ERs that were ∼18% lower than those of ambulant children and 31% lower than those of TDC. A trend toward lower ERs with greater numbers of limbs involved was observed. The influence of motor type could not be determined statistically. Published equations substantially underestimated ERs in the nonambulant children by ∼22%. In preschool-age children with CP, ERs decreased as ambulatory status declined and more limbs were involved. The greatest predictor of ERs was fat-free mass, then ambulatory status. Future research should build on the information presented to expand the knowledge base regarding ERs in children with CP. This trial was registered with the Australian New Zealand Clinical Trials Registry as ACTRN 12612000686808.
Investigation of serum biomarkers in primary gout patients using iTRAQ-based screening.
Ying, Ying; Chen, Yong; Zhang, Shun; Huang, Haiyan; Zou, Rouxin; Li, Xiaoke; Chu, Zanbo; Huang, Xianqian; Peng, Yong; Gan, Minzhi; Geng, Baoqing; Zhu, Mengya; Ying, Yinyan; Huang, Zuoan
2018-03-21
Primary gout is a major disease that affects human health; however, its pathogenesis is not well known. The purpose of this study was to identify biomarkers to explore the underlying mechanisms of primary gout. We used the isobaric tags for relative and absolute quantitation (iTRAQ) technique combined with liquid chromatography-tandem mass spectrometry to screen differentially expressed proteins between gout patients and controls. We also identified proteins potentially involved in gout pathogenesis by analysing biological processes, cellular components, molecular functions, Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and protein-protein interactions. We further verified some samples using enzyme-linked immunosorbent assay (ELISA). Statistical analyses were carried out using SPSS v. 20.0 and ROC (receiver operating characteristic) curve analyses were carried out using Medcalc software. Two-sided p-values <0.05 were deemed to be statistically significant for all analyses. We identified 95 differentially expressed proteins (50 up-regulated and 45 down-regulated), and selected nine proteins (α-enolase (ENOA), glyceraldehyde-3-phosphate dehydrogenase (G3P), complement component C9 (CO9), profilin-1 (PROF1), lipopolysaccharide-binding protein (LBP), tubulin beta-4A chain (TBB4A), phosphoglycerate kinase (PGK1), glucose-6-phosphate isomerase (G6PI), and transketolase (TKT)) for verification. This showed that the level of TBB4A was significantly higher in primary gout than in controls (p=0.023). iTRAQ technology was useful in the selection of differentially expressed proteins from proteomes, and provides a strong theoretical basis for the study of biomarkers and mechanisms in primary gout. In addition, TBB4A protein may be associated with primary gout.
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
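Two of the quantities estimated in this review are easy to reproduce in principle: the experiment-wise Type I error rate implied by running many unadjusted tests, and the power of a test for Cohen's small, medium, and large effects. The sketch below uses an assumed per-group sample size of 64 and 15 tests purely for illustration.

```python
from statsmodels.stats.power import TTestIndPower

# Experiment-wise Type I error when m independent tests are each run at alpha = .05.
alpha, m = 0.05, 15
experimentwise_error = 1 - (1 - alpha) ** m
print(f"experiment-wise Type I error for {m} tests: {experimentwise_error:.2f}")

# Power of an independent-samples t test (n = 64 per group, an assumed size)
# for Cohen's small, medium, and large effects.
analysis = TTestIndPower()
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    power = analysis.solve_power(effect_size=d, nobs1=64, alpha=alpha)
    print(f"power to detect a {label} effect (d = {d}): {power:.2f}")
```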
Spatial panel analyses of alcohol outlets and motor vehicle crashes in California: 1999–2008
Ponicki, William R.; Gruenewald, Paul J.; Remer, Lillian G.
2014-01-01
Although past research has linked alcohol outlet density to higher rates of drinking and many related social problems, there is conflicting evidence of density’s association with traffic crashes. An abundance of local alcohol outlets simultaneously encourages drinking and reduces driving distances required to obtain alcohol, leading to an indeterminate expected impact on alcohol-involved crash risk. This study separately investigates the effects of outlet density on (1) the risk of injury crashes relative to population and (2) the likelihood that any given crash is alcohol-involved, as indicated by police reports and single-vehicle nighttime status of crashes. Alcohol outlet density effects are estimated using Bayesian misalignment Poisson analyses of all California ZIP codes over the years 1999–2008. These misalignment models allow panel analysis of ZIP-code data despite frequent redefinition of postal-code boundaries, while also controlling for overdispersion and the effects of spatial autocorrelation. Because models control for overall retail density, estimated alcohol-outlet associations represent the extra effect of retail establishments selling alcohol. The results indicate a number of statistically well-supported associations between retail density and crash behavior, but the implied effects on crash risks are relatively small. Alcohol-serving restaurants have a greater impact on overall crash risks than on the likelihood that those crashes involve alcohol, whereas bars primarily affect the odds that crashes are alcohol-involved. Off-premise outlet density is negatively associated with risks of both crashes and alcohol involvement, while the presence of a tribal casino in a ZIP code is linked to higher odds of police-reported drinking involvement. Alcohol outlets in a given area are found to influence crash risks both locally and in adjacent ZIP codes, and significant spatial autocorrelation also suggests important relationships across geographical units. These results suggest that each type of alcohol outlet can have differing impacts on risks of crashing as well as the alcohol involvement of those crashes. PMID:23537623
Job involvement of primary healthcare employees: does a service provision model play a role?
Koponen, Anne M; Laamanen, Ritva; Simonsen-Rehn, Nina; Sundell, Jari; Brommels, Mats; Suominen, Sakari
2010-05-01
To investigate whether the development of job involvement of primary healthcare (PHC) employees in Southern Municipality (SM), where PHC services were outsourced to an independent non-profit organisation, differed from that in the three comparison municipalities (M1, M2, M3) with municipal service providers. Also, the associations of job involvement with factors describing the psychosocial work environment were investigated. A panel mail survey 2000-02 in Finland (n=369, response rates 73% and 60%). The data were analysed by descriptive statistics and multivariate linear regression analysis. Despite the favourable development in the psychosocial work environment, job involvement decreased most in SM, which faced the biggest organisational changes. Job involvement decreased also in M3, where the psychosocial work environment deteriorated most. Job involvement in 2002 was best predicted by high baseline level of interactional justice and work control, positive change in interactional justice, and higher age. Also other factors, such as organisational stability, seemed to play a role; after controlling for the effect of the psychosocial work characteristics, job involvement was higher in M3 than in SM. Outsourcing of PHC services may decrease job involvement at least during the first years. A particular service provision model is better than the others only if it is superior in providing a favourable and stable psychosocial work environment.
Disentangling patient and public involvement in healthcare decisions: why the difference matters.
Fredriksson, Mio; Tritter, Jonathan Q
2017-01-01
Patient and public involvement has become an integral aspect of many developed health systems and is judged to be an essential driver for reform. However, little attention has been paid to the distinctions between patients and the public, and the views of patients are often taken to encompass those of the general public. Using an ideal-type approach, we analyse crucial distinctions between patient involvement and public involvement using examples from Sweden and England. We highlight that patients have sectional interests as health service users, in contrast to citizens who engage as public policy agents reflecting societal interests. Patients draw on experiential knowledge, focus on output legitimacy and performance accountability, aim at typical representativeness, and seek direct responsiveness to individual needs and preferences. In contrast, the public contributes collective perspectives generated from diversity and centres on input legitimacy achieved through statistical representativeness, democratic accountability, and indirect responsiveness to general citizen preferences. Thus, using patients as proxies for the public fails to achieve the intended goals and benefits of involvement. We conclude that understanding and measuring the impact of patient and public involvement can only develop with the application of a clearer comprehension of the differences. © 2016 Foundation for the Sociology of Health & Illness.
Li, Bo; Guo, Kenan; Zeng, Li; Zeng, Benhua; Huo, Ran; Luo, Yuanyuan; Wang, Haiyang; Dong, Meixue; Zheng, Peng; Zhou, Chanjuan; Chen, Jianjun; Liu, Yiyun; Liu, Zhao; Fang, Liang; Wei, Hong; Xie, Peng
2018-01-31
Major depressive disorder (MDD) is a common mood disorder. Gut microbiota may be involved in the pathogenesis of depression via the microbe-gut-brain axis. The liver is vulnerable to exposure to bacterial products translocated from the gut via the portal vein and may be involved in the axis. In this study, germ-free mice underwent fecal microbiota transplantation from MDD patients and healthy controls. Behavioral tests verified the depression model. Metabolomics using gas chromatography-mass spectrometry, nuclear magnetic resonance, and liquid chromatography-mass spectrometry determined the influence of microbes on liver metabolism. With multivariate statistical analysis, 191 metabolites were distinguishable in MDD mice from control (CON) mice. Compared with CON mice, MDD mice showed lower levels for 106 metabolites and higher levels for 85 metabolites. These metabolites are associated with lipid and energy metabolism and oxidative stress. Combined analyses of significantly changed proteins in livers from another depression model induced by chronic unpredictable mild stress returned a high score for the Lipid Metabolism, Free Radical Scavenging, and Molecule Transports network, and canonical pathways were involved in energy metabolism and tryptophan degradation. The two mouse models of depression suggest that changes in liver metabolism might be involved in the pathogenesis of MDD. Conjoint analyses of fecal, serum, liver, and hippocampal metabolites from fecal microbiota transplantation mice suggested that aminoacyl-tRNA biosynthesis significantly changed and fecal metabolites showed a close relationship with the liver. These findings may help determine the biological mechanisms of depression and provide evidence about "depression microbes" impacting on liver metabolism.
Jones, Adriane Clark; Hambright, K David; Caron, David A
2018-05-01
Microbial communities are composed of complex assemblages of highly interactive taxa. We employed network analyses to identify and describe microbial interactions and co-occurrence patterns between microbial eukaryotes and bacteria at two locations within a low salinity (0.5-3.5 ppt) lake over an annual cycle. We previously documented that the microbial diversity and community composition within Lake Texoma, southwest USA, were significantly affected by both seasonal forces and a site-specific bloom of the harmful alga, Prymnesium parvum. We used network analyses to answer ecological questions involving both the bacterial and microbial eukaryotic datasets and to infer ecological relationships within the microbial communities. Patterns of connectivity at both locations reflected the seasonality of the lake, including a large rain disturbance in May, while a comparison of the communities between locations revealed a localized response to the algal bloom. A network built from shared nodes (microbial operational taxonomic units and environmental variables) and correlations identified conserved associations at both locations within the lake. Using network analyses, we were able to detect disturbance events, characterize the ecological extent of a harmful algal bloom, and infer ecological relationships not apparent from diversity statistics alone.
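A minimal sketch of the kind of co-occurrence network construction described above: pairwise rank correlations between simulated taxon abundance profiles, with edges drawn only for strong, low-p-value correlations. The thresholds and data are illustrative assumptions, not those of the study.

```python
# Sketch only: building a co-occurrence network from an OTU abundance table by
# linking taxa whose abundances are strongly rank-correlated across samples.
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_samples, n_taxa = 40, 15
abundance = rng.poisson(lam=rng.uniform(1, 20, n_taxa), size=(n_samples, n_taxa))

rho, pval = spearmanr(abundance)          # pairwise correlations across taxa (columns)
G = nx.Graph()
G.add_nodes_from(range(n_taxa))
for i in range(n_taxa):
    for j in range(i + 1, n_taxa):
        if abs(rho[i, j]) > 0.6 and pval[i, j] < 0.01:   # arbitrary cut-offs
            G.add_edge(i, j, weight=rho[i, j])

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```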
Libiger, Ondrej; Schork, Nicholas J.
2015-01-01
It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061
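The simulation-based power comparison described above can be sketched in miniature: spike the abundance of a feature in one group, apply competing tests, and record how often each rejects the null. The tests, effect size, and count distribution below are illustrative assumptions, not the study's actual procedure.

```python
# Sketch only: estimating statistical power by simulation for two competing tests
# applied to skewed count data with a known abundance shift.
import numpy as np
from scipy.stats import mannwhitneyu, ttest_ind

rng = np.random.default_rng(2)
n_sim, n_per_group, alpha, effect = 1000, 20, 0.05, 2.0
hits_mwu = hits_t = 0
for _ in range(n_sim):
    control = rng.negative_binomial(5, 0.3, n_per_group)          # skewed counts
    treated = rng.negative_binomial(5, 0.3, n_per_group) * effect  # simulated shift
    hits_mwu += mannwhitneyu(control, treated).pvalue < alpha
    hits_t += ttest_ind(control, treated, equal_var=False).pvalue < alpha

print(f"power (Mann-Whitney): {hits_mwu / n_sim:.2f}")
print(f"power (Welch t-test): {hits_t / n_sim:.2f}")
```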
Urbanowicz, Ryan J.; Granizo-Mackenzie, Ambrose; Moore, Jason H.
2014-01-01
Michigan-style learning classifier systems (M-LCSs) represent an adaptive and powerful class of evolutionary algorithms which distribute the learned solution over a sizable population of rules. However their application to complex real world data mining problems, such as genetic association studies, has been limited. Traditional knowledge discovery strategies for M-LCS rule populations involve sorting and manual rule inspection. While this approach may be sufficient for simpler problems, the confounding influence of noise and the need to discriminate between predictive and non-predictive attributes calls for additional strategies. Additionally, tests of significance must be adapted to M-LCS analyses in order to make them a viable option within fields that require such analyses to assess confidence. In this work we introduce an M-LCS analysis pipeline that combines uniquely applied visualizations with objective statistical evaluation for the identification of predictive attributes, and reliable rule generalizations in noisy single-step data mining problems. This work considers an alternative paradigm for knowledge discovery in M-LCSs, shifting the focus from individual rules to a global, population-wide perspective. We demonstrate the efficacy of this pipeline applied to the identification of epistasis (i.e., attribute interaction) and heterogeneity in noisy simulated genetic association data. PMID:25431544
What do results from coordinate-based meta-analyses tell us?
Albajes-Eizagirre, Anton; Radua, Joaquim
2018-08-01
Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and have the same probability to have a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods consider these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, whereas lower in presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.
Anholt, R M; Berezowski, J; Robertson, C; Stephen, C
2015-09-01
There is interest in the potential of companion animal surveillance to provide data to improve pet health and to provide early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern of reports of occurrences of enteric syndromes in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined and information about the animals involved (species, age, sex), their vaccination history, possible exposure or risk behaviour history, information about disease severity, and the aetiological diagnosis was collected. We then assessed whether the cases within the cluster were unusual and if they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.
Is there a genetic cause for cancer cachexia? – a clinical validation study in 1797 patients
Solheim, T S; Fayers, P M; Fladvad, T; Tan, B; Skorpen, F; Fearon, K; Baracos, V E; Klepstad, P; Strasser, F; Kaasa, S
2011-01-01
Background: Cachexia has major impact on cancer patients' morbidity and mortality. Future development of cachexia treatment needs methods for early identification of patients at risk. The aim of the study was to validate nine single-nucleotide polymorphisms (SNPs) previously associated with cachexia, and to explore 182 other candidate SNPs with the potential to be involved in the pathophysiology. Method: A total of 1797 cancer patients, classified as either having severe cachexia, mild cachexia or no cachexia, were genotyped. Results: After allowing for multiple testing, there was no statistically significant association between any of the SNPs analysed and the cachexia groups. However, consistent with prior reports, two SNPs from the acylpeptide hydrolase (APEH) gene showed suggestive statistical significance (P=0.02; OR, 0.78). Conclusion: This study failed to detect any significant association between any of the SNPs analysed and cachexia; although two SNPs from the APEH gene had a trend towards significance. The APEH gene encodes the enzyme APEH, postulated to be important in the endpoint of the ubiquitin system and thus the breakdown of proteins into free amino acids. In cachexia, there is an extensive breakdown of muscle proteins and an increase in the production of acute phase proteins in the liver. PMID:21934689
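A hedged sketch of the screening step such a candidate-SNP study involves: a per-SNP contingency-table test followed by adjustment for multiple testing. The genotype data are simulated and the Holm correction is chosen only for illustration.

```python
# Sketch only: chi-square genotype-phenotype association tests across many SNPs,
# with multiple-testing correction. Data are simulated, not the study's genotypes.
import numpy as np
from scipy.stats import chi2_contingency
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_snps, n_cases, n_controls = 191, 900, 900
pvals = []
for _ in range(n_snps):
    p_allele = rng.uniform(0.1, 0.5)
    probs = [p_allele**2, 2 * p_allele * (1 - p_allele), (1 - p_allele)**2]
    case_counts = rng.multinomial(n_cases, probs)
    ctrl_counts = rng.multinomial(n_controls, probs)
    table = np.vstack([case_counts, ctrl_counts])      # 2 x 3 genotype table
    chi2, p, dof, expected = chi2_contingency(table)
    pvals.append(p)

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
print("significant after Holm correction:", reject.sum())
```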
Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.
Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip
2018-02-01
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao
2009-01-01
Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
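As an illustration of the power analysis and sample-size calculation mentioned above, the snippet below solves for the per-group sample size of a two-sample t-test; the effect size, alpha, and power are hypothetical planning values.

```python
# Sketch only: a basic power / sample-size calculation for a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,   # Cohen's d assumed from pilot data
                                    alpha=0.05,        # type I error rate
                                    power=0.8,         # 1 - type II error rate
                                    ratio=1.0)         # equal group sizes
print(f"required sample size per group: {n_per_group:.0f}")
```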
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China in the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1,023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted. Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
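A chi-square test comparing defect proportions between two publication years, as reported above, reduces to a 2x2 contingency-table test. The counts below are hypothetical and are not intended to reproduce the reported statistics.

```python
# Sketch only: a chi-square comparison of defect proportions in two publication
# years, reduced to a 2x2 contingency table. Counts are hypothetical.
from scipy.stats import chi2_contingency

#              with defects, without defects
table = [[600, 735],    # hypothetical 1998 counts
         [660, 918]]    # hypothetical 2008 counts
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.4g}")
```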
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
Hayhoe, Richard P G; Lentjes, Marleen A H; Luben, Robert N; Khaw, Kay-Tee; Welch, Ailsa A
2015-08-01
In our aging population, maintenance of bone health is critical to reduce the risk of osteoporosis and potentially debilitating consequences of fractures in older individuals. Among modifiable lifestyle and dietary factors, dietary magnesium and potassium intakes are postulated to influence bone quality and osteoporosis, principally via calcium-dependent alteration of bone structure and turnover. We investigated the influence of dietary magnesium and potassium intakes, as well as circulating magnesium, on bone density status and fracture risk in an adult population in the United Kingdom. A random subset of 4000 individuals from the European Prospective Investigation into Cancer and Nutrition-Norfolk cohort of 25,639 men and women with baseline data was used for bone density cross-sectional analyses and combined with fracture cases (n = 1502) for fracture case-cohort longitudinal analyses (mean follow-up 13.4 y). Relevant biological, lifestyle, and dietary covariates were used in multivariate regression analyses to determine associations between dietary magnesium and potassium intakes and calcaneal broadband ultrasound attenuation (BUA), as well as in Prentice-weighted Cox regression to determine associated risk of fracture. Separate analyses, excluding dietary covariates, investigated associations of BUA and fractures with serum magnesium concentration. Statistically significant positive trends in calcaneal BUA for women (n = 1360) but not men (n = 968) were apparent across increasing quintiles of magnesium plus potassium (Mg+K) z score intake (P = 0.03) or potassium intake alone (P = 0.04). Reduced hip fracture risk in both men (n = 1958) and women (n = 2755) was evident for individuals in specific Mg+K z score intake quintiles compared with the lowest. Statistically significant trends in fracture risk in men across serum magnesium concentration groups were apparent for spine fractures (P = 0.02) and total hip, spine, and wrist fractures (P = 0.02). None of these individual statistically significant associations remained after adjustment for multiple testing. These findings enhance the limited literature studying the association of magnesium and potassium with bone density and demonstrate that further investigation is warranted into the mechanisms involved and the potential protective role against osteoporosis. © 2015 American Society for Nutrition.
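A hedged sketch of the survival-analysis step: a standard Cox proportional hazards model relating an intake quintile to fracture risk over follow-up. The study used Prentice-weighted Cox regression for its case-cohort design; the unweighted model and hypothetical columns below are simplifications for illustration.

```python
# Sketch only: an unweighted Cox proportional hazards model of fracture risk.
# Data and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "mg_k_quintile": rng.integers(1, 6, n),          # hypothetical Mg+K intake quintile
    "age": rng.normal(60, 8, n),
    "follow_up_years": rng.uniform(1, 14, n),
    "fracture": rng.binomial(1, 0.1, n),             # 1 = incident fracture
})
cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="fracture")
cph.print_summary()
```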
Triković-Janjić, Olivera; Apostolović, Mirjana; Janosević, Mirjana; Filipović, Gordana
2008-02-01
Anthropometric methods of measuring the whole body and body parts are the most commonly applied methods of analysing the growth and development of children. Anthropometric measures are interconnected, so that with growth and development a change in one parameter causes a change in the others. The aim of the paper was to analyse whether dental development follows overall growth and development and what the ratio of this interdependence is. The research involved a sample of 134 participants, aged between 6 and 8 years. Dental age was determined as the average of the sum of existing permanent teeth for the participants aged 6, 7 and 8. With the aim of analysing physical growth and development, commonly accepted anthropometric indexes were applied: height, weight, circumference of the head, the chest cavity at its widest point, the upper arm, the abdomen, the thigh, and thickness of the epidermis. The dimensions were measured according to the methodology of the International Biological Programme. The influence of the pertinent variables on the analysed variable was determined by the statistical method of multivariable regression. The average values of all the anthropometric parameters, except for the thickness of the epidermis, were slightly larger in male participants, and the circumference of the chest cavity was statistically significantly larger (p < 0.05). The results of anthropometric measurement showed, in general, a distinct homogeneity not only of the sample group but also within gender, in relation to all the dimensions, except for the thickness of the epidermis. The average dental age of the participants was 10.36 (10.42 and 10.31 for females and males, respectively). A considerable correlation (R = 0.59) with high statistical significance (p < 0.001) was determined between dental age and the set of anthropometric parameters of general growth and development. There is a considerable positive correlation (R = 0.59) between dental age and anthropometric parameters of general growth and development, which confirms that dental development follows the overall growth and development of children aged between 6 and 8 years.
The analysis of influence of individual and environmental factors on 2-wheeled users' injuries.
Marković, Nenad; Pešić, Dalibor R; Antić, Boris; Vujanić, Milan
2016-08-17
Powered 2-wheeled motor vehicles (PTWs) are one of the most vulnerable categories of road users. Bearing that fact in mind, we have researched the effects of individual and environmental factors on the severity and type of injuries of PTW users. The aim was to recognize the circumstances that cause these accidents and take some preventive actions that would improve the level of road safety for PTWs. In the period from 2001 to 2010, an analysis of 139 road accidents involving PTWs was made by the Faculty of Transport and Traffic Engineering in Belgrade. The effects of both individual (age, gender, etc.) and environmental factors (place of an accident, time of day, etc.) on the cause of accidents and severity and type of injuries of PTWs are reported in this article. Analyses of these effects were conducted using logistic regression, chi-square tests, and Pearson's correlation. Factors such as categories of road users, pavement conditions, place of accident, age, and time of day have a statistically significant effect on PTW injuries, whereas other factors (gender, road type; that is, straight or curvy) do not. The article also defines the interdependence of the occurrence of particular injuries at certain speeds. The results show that if PTW users died of a head injury, these were usually concurrent with chest injuries, injuries to internal organs, and limb injuries. It has been shown that there is a high degree of influence of individual factors on the occurrence of accidents involving 2-wheelers (PTWs/bicycles) but with no statistically significant relation. Establishing the existence of such conditionalities enables identifying and defining factors that have an impact on the occurrence of traffic accidents involving bicyclists or PTWs. Such a link between individual factors and the occurrence of accidents makes it possible for system managers to take appropriate actions aimed at certain categories of 2-wheelers in order to reduce casualties in a particular area. The analysis showed that most of the road factors do not have a statistically significant effect on either category of 2-wheeler. Namely, the logistic regression analysis showed that there is a statistically significant effect of the place of accident on the occurrence of accidents involving bicyclists.
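The logistic-regression component of such an analysis can be sketched as a binary model of injury severity on individual and environmental factors. The variables and data below are hypothetical placeholders, not the study's dataset.

```python
# Sketch only: logistic regression of injury severity on individual and
# environmental factors. Variables and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
crashes = pd.DataFrame({
    "severe": rng.binomial(1, 0.3, n),          # 1 = severe injury
    "night": rng.binomial(1, 0.4, n),           # time of day
    "wet_pavement": rng.binomial(1, 0.25, n),   # pavement condition
    "age": rng.integers(16, 75, n),
})
fit = smf.logit("severe ~ night + wet_pavement + age", data=crashes).fit()
print(fit.summary())
```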
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
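The pooling step at the heart of these meta-analyses is inverse-variance weighting of study effect estimates; a minimal fixed-effect sketch with invented per-study log odds ratios is shown below.

```python
# Sketch only: fixed-effect, inverse-variance pooling of log odds ratios.
# Study estimates and standard errors are invented for illustration.
import numpy as np

log_or = np.array([-0.45, -0.30, -0.55, -0.20])   # hypothetical per-study log ORs
se = np.array([0.20, 0.15, 0.25, 0.18])           # their standard errors

w = 1.0 / se**2                                    # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * pooled_se
print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f})")
```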
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
NASA Astrophysics Data System (ADS)
Toth-Tascau, Mirela; Balanean, Flavia; Krepelka, Mircea
2013-10-01
Musculoskeletal impairment of the upper limb can cause difficulties in performing basic daily activities. Three-dimensional motion analyses can provide valuable data for precisely determining arm movement and inter-joint coordination. The purpose of this study was to develop a method to evaluate the degree of impairment based on the influence of shoulder movements on the amplitude of elbow flexion and extension, based on the assumption that a lack of motion of the elbow joint will be compensated by increased shoulder activity. In order to develop and validate a statistical model, one healthy young volunteer was involved in the study. The activity of choice simulated blowing the nose, starting from a slight flexion of the elbow, raising the hand until the middle finger touched the tip of the nose, and returning to the start position. Inter-joint coordination between the elbow and shoulder movements showed significant correlation. Statistical regression was used to fit an equation model describing the influence of shoulder movements on elbow mobility. The study provides a brief description of the kinematic analysis protocol and statistical models that may be useful in describing the relation between inter-joint movements in daily activities.
Gadomski, Adam; Ausloos, Marcel; Casey, Tahlia
2017-04-01
This article addresses a set of observations framed in both deterministic and statistical formal guidelines. It operates within the framework of nonlinear dynamical systems theory (NDS). It is argued that statistical approaches can manifest themselves ambiguously, creating practical discrepancies in psychological and cognitive data analyses both quantitatively and qualitatively; this is sometimes termed in the literature 'questionable research practices.' This communication points to the demand for a deeper awareness of the data's initial conditions, allowing researchers to focus on pertinent evolution constraints in such systems. It also considers whether the exponential (Malthus-type) or the algebraic (Pareto-type) statistical distribution ought to be considered in practical interpretations. The role of repetitive specific behaviors by patients seeking treatment is examined within the NDS frame. The significance of these behaviors, involving a certain memory effect, seems crucial in determining a patient's progression or regression. With this perspective, it is discussed how a sensitively applied hazardous or triggering factor can be helpful for well-controlled psychological strategic treatments; those attributable to obsessive-compulsive disorders or self-injurious behaviors are recalled in particular. There are both inherent criticality- and complexity-exploiting (reduced-variance based) relations between a therapist and a patient that can be intrinsically included in NDS theory.
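The question of whether an exponential (Malthus-type) or an algebraic (Pareto-type) distribution better describes a data set can be approached by fitting both by maximum likelihood and comparing log-likelihoods; the sketch below uses a simulated heavy-tailed sample and is illustrative only.

```python
# Sketch only: comparing exponential (Malthus-type) and Pareto-type maximum
# likelihood fits to the same positive-valued sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
data = rng.pareto(2.5, 500) + 1.0                  # heavy-tailed sample, support >= 1

# Fit both candidate distributions (location fixed at 0 for comparability)
exp_params = stats.expon.fit(data, floc=0)
par_params = stats.pareto.fit(data, floc=0)

loglik_exp = np.sum(stats.expon.logpdf(data, *exp_params))
loglik_par = np.sum(stats.pareto.logpdf(data, *par_params))
print(f"log-likelihood exponential: {loglik_exp:.1f}")
print(f"log-likelihood Pareto:      {loglik_par:.1f}")
```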
SimHap GUI: An intuitive graphical user interface for genetic association analysis
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-01-01
Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone other than a professional statistician to effectively utilise them. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Exploring Indigenous Identities of Urban American Indian Youth of the Southwest
Kulis, Stephen; Wagaman, M. Alex; Tso, Crescentia; Brown, Eddie F.
2013-01-01
This study examined the indigenous identities of urban American Indian youth using measures related to three theoretical dimensions of Markstrom's identity model: identification (tribal and ethnic heritage), connection (reservation ties), and involvement in traditional cultural practices and spirituality. Data came from self-administered questionnaires completed by 142 urban American Indian middle school students in a southwestern metropolitan area with the largest urban American Indian population in the United States. Using both quantitative and qualitative measures, descriptive statistics showed most youth were connected to all three dimensions of indigenous identity. Hierarchical regression analyses showed that youth with the strongest sense of American Indian ethnic identity had native fathers and were heavily involved in traditional cultural practices and spirituality. Although urban American Indians may face challenges in maintaining their tribal identities, the youth in this study appeared strongly moored to their native indigenous heritage. Implications for future research are discussed. PMID:23766553
How can the struggling research community adapt to the information age?
NASA Astrophysics Data System (ADS)
Bouma, Johan
2017-04-01
The widespread use of the internet and social media has fundamentally changed the relationship of research with society, culminating in "fact-free politics". Rather than operate from the position of distant experts who are graciously willing to serve mankind, expecting gratitude and admiration in return, scientists encounter knowledgeable stakeholders realizing "citizen science". Some see science as just producing "yet another opinion". It is time now to re-establish and advocate the basic power of the scientific effort, involving stakeholders systematically, by analysing a problem, shaping it into a researchable item, applying scientifically sound data and methods, testing results statistically, and presenting results, realizing that "the" truth does not exist. The seventeen UN Sustainable Development Goals (SDGs) provide an attractive focus for inter- and transdisciplinary research approaches, defining a series of options covering several SDGs in a systems analysis. Involved stakeholders and policy makers remain responsible for selecting their favorite option.
HTR1B and HTR2C in autism spectrum disorders in Brazilian families.
Orabona, G M; Griesi-Oliveira, K; Vadasz, E; Bulcão, V L S; Takahashi, V N V O; Moreira, E S; Furia-Silva, M; Ros-Melo, A M S; Dourado, F; Matioli, S R; Matioli, R; Otto, P; Passos-Bueno, M R
2009-01-23
Autism spectrum disorders (ASD) are a group of behaviorally defined neurodevelopmental disabilities characterized by multiple genetic etiologies and a complex presentation. Several studies suggest the involvement of the serotonin system in the development of ASD, but only a few have investigated serotonin receptors. We have performed a case-control and a family-based study with 9 polymorphisms mapped to two serotonin receptor genes (HTR1B and HTR2C) in 252 Brazilian male ASD patients of European ancestry. These analyses showed evidence of undertransmission to ASD of HTR1B haplotypes containing alleles -161G and -261A at the HTR1B gene (P=0.003), but no involvement of HTR2C in the predisposition to this disease. Considering the relatively low level of statistical significance and the power of our sample, further studies are required to confirm the association of these serotonin-related genes with ASD.
TERT rs2736098 polymorphism and cancer risk: results of a meta-analysis.
Qi, Hao-Yu; Zou, Peng; Zhao, Lin; Zhu, Jue; Gu, Ai-Hua
2012-01-01
Several studies have demonstrated associations between the TERT rs2736098 single nucleotide polymorphisms (SNPs) and susceptibility to cancer development. However, there are conflicting results. A systematic meta-analysis was therefore performed to establish the cancer risk associated with the polymorphism. In this meta-analysis, a total of 6 case-control studies, including 5,567 cases and 6,191 controls, were included. Crude odds ratios with 95% confidence intervals were used to assess the strength of associations in several genetic models. Our results showed no association reaching the level of statistical significance for overall risk. Interestingly, in the stratified analyses (subdivided by ethnicity), significantly increased risks were found in the Asian subgroup which indicates the TERT rs2736098 polymorphism may have controversial involvement in cancer susceptibility. Overall, this meta-analysis indicates that the TERT rs2736098 polymorphism may have little involvement in cancer susceptibility.
Analyzing user behavior of the micro-blogging website Sina Weibo during hot social events
NASA Astrophysics Data System (ADS)
Guan, Wanqiu; Gao, Haoyu; Yang, Mingmin; Li, Yuan; Ma, Haixin; Qian, Weining; Cao, Zhigang; Yang, Xiaoguang
2014-02-01
The spread and resonance of users’ opinions on Sina Weibo, the most popular micro-blogging website in China, are tremendously influential, having significantly affected the processes of many real-world hot social events. We select 21 hot events that were widely discussed on Sina Weibo in 2011, and do some statistical analyses. Our main findings are that (i) male users are more likely to be involved, (ii) messages that contain pictures and those posted by verified users are more likely to be reposted, while those with URLs are less likely, (iii) the gender factor, for most events, presents no significant difference in reposting likelihood.
Chang, Lin-Chau; Mahmood, Riaz; Qureshi, Samina; Breder, Christopher D
2017-01-01
Standardised MedDRA Queries (SMQs) have been developed since the early 2000's and used by academia, industry, public health, and government sectors for detecting safety signals in adverse event safety databases. The purpose of the present study is to characterize how SMQs are used and the impact in safety analyses for New Drug Application (NDA) and Biologics License Application (BLA) submissions to the United States Food and Drug Administration (USFDA). We used the PharmaPendium database to capture SMQ use in Summary Basis of Approvals (SBoAs) of drugs and biologics approved by the USFDA. Characteristics of the drugs and the SMQ use were employed to evaluate the role of SMQ safety analyses in regulatory decisions and the veracity of signals they revealed. A comprehensive search of the SBoAs yielded 184 regulatory submissions approved from 2006 to 2015. Search strategies more frequently utilized restrictive searches with "narrow terms" to enhance specificity over strategies using "broad terms" to increase sensitivity, while some involved modification of search terms. A majority (59%) of 1290 searches used descriptive statistics, however inferential statistics were utilized in 35% of them. Commentary from reviewers and supervisory staff suggested that a small, yet notable percentage (18%) of 1290 searches supported regulatory decisions. The searches with regulatory impact were found in 73 submissions (40% of the submissions investigated). Most searches (75% of 227 searches) with regulatory implications described how the searches were confirmed, indicating prudence in the decision-making process. SMQs have an increasing role in the presentation and review of safety analysis for NDAs/BLAs and their regulatory reviews. This study suggests that SMQs are best used for screening process, with descriptive statistics, description of SMQ modifications, and systematic verification of cases which is crucial for drawing regulatory conclusions.
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), Data (Database)...organization in each period. This data will be collected for analysis. i) Analyses and Validation: We will do a statistics test on this data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will do a statistics test on this data, Pareto analyses and confirmation
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three...2.2. Vibration Diagnostics (VD) Parallel to the NDT measurements, the Statistical Energy Analysis (SEA) was applied as a vibration diagnostic tool...noises were analysed with a dual-channel real-time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small
Reif, David M.; Israel, Mark A.; Moore, Jason H.
2007-01-01
The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
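A minimal sketch of the type I error adjustment most of the reviewed trials omitted: correcting a family of p-values across sub-dimensions or assessment times. The p-values below are invented and Holm's method is chosen only as an example.

```python
# Sketch only: adjusting a family of p-values (e.g., across HRQoL sub-dimensions)
# to control the overall type I error rate. P-values are invented.
from statsmodels.stats.multitest import multipletests

raw_pvals = [0.012, 0.030, 0.048, 0.004, 0.250, 0.070, 0.001, 0.190]
reject, p_adj, _, _ = multipletests(raw_pvals, alpha=0.05, method="holm")
for p, pa, r in zip(raw_pvals, p_adj, reject):
    print(f"raw p = {p:.3f} -> adjusted p = {pa:.3f}  significant: {bool(r)}")
```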
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.
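An un-binned analysis of this kind can be sketched as a permutation test: compare the mean sunspot number in pandemic years against the distribution obtained by repeatedly drawing random sets of years. The series and event years below are simulated placeholders, not the historical data.

```python
# Sketch only: an un-binned permutation test of whether event years coincide with
# unusually high values of a cyclic series. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1700, 2017)
sunspots = 80 + 60 * np.sin(2 * np.pi * (years - 1700) / 11) + rng.normal(0, 20, years.size)
pandemic_years = rng.choice(years, size=10, replace=False)     # placeholder events

observed = sunspots[np.isin(years, pandemic_years)].mean()
null = np.array([sunspots[rng.choice(years.size, 10, replace=False)].mean()
                 for _ in range(10_000)])
p_value = np.mean(null >= observed)       # one-sided: are event-year sunspots high?
print(f"observed mean = {observed:.1f}, permutation p = {p_value:.3f}")
```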
Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B
2008-08-07
There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.
Changing response of the North Atlantic/European winter climate to the 11 year solar cycle
NASA Astrophysics Data System (ADS)
Ma, Hedi; Chen, Haishan; Gray, Lesley; Zhou, Liming; Li, Xing; Wang, Ruili; Zhu, Siguang
2018-03-01
Recent studies have presented conflicting results regarding the 11 year solar cycle (SC) influences on winter climate over the North Atlantic/European region. Analyses of only the most recent decades suggest a synchronized North Atlantic Oscillation (NAO)-like response pattern to the SC. Analyses of long-term climate data sets dating back to the late 19th century, however, suggest a mean sea level pressure (mslp) response that lags the SC by 2-4 years in the southern node of the NAO (i.e. Azores region). To understand the conflicting nature and cause of these time dependencies in the SC surface response, the present study employs a lead/lag multi-linear regression technique with a sliding window of 44 years over the period 1751-2016. Results confirm previous analyses, in which the average response for the whole time period features a statistically significant 2-4 year lagged mslp response centered over the Azores region. Overall, the lagged nature of Azores mslp response is generally consistent in time. Stronger and statistically significant SC signals tend to appear in the periods when the SC forcing amplitudes are relatively larger. Individual month analysis indicates the consistent lagged response in December-January-February average arises primarily from early winter months (i.e. December and January), which has been associated with ocean feedback processes that involve reinforcement by anomalies from the previous winter. Additional analysis suggests that the synchronous NAO-like response in recent decades arises primarily from late winter (February), possibly reflecting a result of strong internal noise.
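The study's lead/lag multi-linear regression with a 44-year sliding window is not reproduced here; the sketch below uses synthetic yearly series and simple lagged correlation in place of the full regression, purely to illustrate the sliding-window lag-scanning idea.

```python
# Sketch of scanning lags within a sliding 44-year window, a simplified stand-in for
# the lead/lag multi-linear regression described above. The yearly series are synthetic.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1751, 2017)
solar = np.sin(2 * np.pi * (years - 1751) / 11.0)              # idealised 11 year cycle
mslp = 0.4 * np.roll(solar, 3) + rng.normal(0, 1, years.size)  # toy response lagged ~3 years

window, max_lag = 44, 6
for start in range(0, years.size - window + 1, 22):
    best = max(
        range(max_lag + 1),
        key=lambda lag: abs(np.corrcoef(solar[start:start + window - lag],
                                        mslp[start + lag:start + window])[0, 1]),
    )
    print(f"{years[start]}-{years[start + window - 1]}: strongest response at lag {best} yr")
```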
Teenage pregnancy and long-term mental health outcomes among Indigenous women in Canada.
Xavier, Chloé G; Brown, Hilary K; Benoit, Anita C
2018-06-01
Our objectives were to (1) compare the risks for poor long-term mental health outcomes among indigenous women with and without a teenage pregnancy and (2) determine if community and cultural factors modify this risk. We conducted a secondary analysis of the 2012 Aboriginal Peoples Survey. Respondents were women aged 25 to 49 years who had given birth to at least one child. Teenage mothers (age at first birth 13 to 19 years; n = 1330) were compared to adult mothers (age at first birth 20 years or older; n = 2630). Mental health outcomes were psychological distress, mental health status, suicide ideation/attempt, and alcohol consumption. To address objective 1, we used binary logistic regression analyses before and after controlling for covariates. To address objective 2, we tested the significance of interaction terms between teenage pregnancy status and effect measure modifiers. In unadjusted analyses, teenage pregnancy was associated with increased risk for poor/fair mental health [odds ratio (OR) 1.77, 95% confidence interval (CI) 1.24-2.53] and suicide attempt/ideation (OR 1.95, 95% CI 1.07-3.54). However, the associations were not statistically significant after adjusting for demographic, socioeconomic, environmental, and health covariates. Teenage pregnancy was not associated with increased risk for high psychological distress or heavy alcohol consumption in unadjusted or adjusted analyses. The interaction term for involvement in cultural activities was statistically significant for poor/fair mental health; however, after stratification, ORs were non-significant. Among indigenous mothers, teenage pregnancy was less important than broader social and health circumstances in predicting long-term mental health.
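The unadjusted-versus-adjusted odds-ratio comparison described above is a standard logistic-regression workflow; a minimal Python sketch with invented data and hypothetical variable names (teen_pregnancy, low_income, poor_mental_health) might look like this.

```python
# Sketch of unadjusted versus covariate-adjusted logistic regression (odds ratios),
# mirroring the two-step approach described above. Data and variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "teen_pregnancy": rng.integers(0, 2, n),
    "low_income": rng.integers(0, 2, n),
})
logit_p = -1.5 + 0.3 * df["teen_pregnancy"] + 0.8 * df["low_income"]
df["poor_mental_health"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

unadj = smf.logit("poor_mental_health ~ teen_pregnancy", data=df).fit(disp=0)
adj = smf.logit("poor_mental_health ~ teen_pregnancy + low_income", data=df).fit(disp=0)
print("unadjusted OR:", np.exp(unadj.params["teen_pregnancy"]))
print("adjusted OR:  ", np.exp(adj.params["teen_pregnancy"]))
print("95% CI (adjusted):", np.exp(adj.conf_int().loc["teen_pregnancy"]).values)
```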
Canturk, Kemal Murat; Emre, Ramazan; Gurkan, Cemal; Komur, Ilhami; Muslumanoglu, Omer; Dogan, Muhammed
2016-07-01
Here, we report an incest paternity case involving three biological brothers as alleged fathers (AFs), their biological sister and her child that was investigated using the Investigator ESSplex Plus, AmpFLSTR Identifiler Plus/Investigator IDplex Plus and PowerPlex 16 kits. Initial duo paternity investigations using 15-loci autosomal short tandem repeat (STR) analyses failed to exclude any of the AFs. Despite the fact that one of the brothers, AF1, had a mismatch with the child at a single locus (D2S1338), the possibility of a single-step mutation could not be ruled out. When the number of autosomal STR loci analysed was increased to 22 without the inclusion of the mother, AF2 and AF3 still could not be excluded, since both of them again had no mismatches with the child. A breakthrough was possible only upon inclusion of the mother so that trio paternity investigations were carried out. This time AF1 and AF2 could be excluded at two loci (D2S1338 and D1S1656) and six loci (vWa, D1S1656, D12S391, FGA, PENTA E and PENTA D), respectively, and AF3 was then the only brother who could not be excluded from paternity. Subsequent statistical analyses suggested that AF3 could be the biological father of the child with a combined paternity index >100 billion and a probability of paternity >99.99999999%. These findings consolidate the fact that complex paternity cases such as those involving incest could benefit more from the inclusion of the mother than simply increasing the number of STR loci analysed. © The Author(s) 2015.
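For readers unfamiliar with the combined paternity index (CPI), it is the product of per-locus paternity indices, and the probability of paternity follows from Bayes' rule with equal prior odds; the per-locus values in the sketch below are invented and do not come from the case above.

```python
# Sketch: the combined paternity index (CPI) is the product of per-locus paternity
# indices, and the probability of paternity follows from a 1:1 prior odds.
# The per-locus values are hypothetical, chosen only for illustration.
import math

locus_pi = [4.2, 7.9, 2.3, 11.5, 3.8, 6.1, 9.4, 5.0, 2.7, 8.8,
            4.4, 3.1, 12.6, 7.2, 6.6, 5.9, 3.3, 10.1, 4.8, 2.9, 7.7, 6.3]
cpi = math.prod(locus_pi)
prob_paternity = cpi / (cpi + 1)          # posterior probability with prior odds of 1:1
print(f"CPI = {cpi:.3e}, probability of paternity = {prob_paternity:.10%}")
```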
Gorczynski, Paul; Burnell, Karen; Dewey, Ann; Costello, Joseph T
2017-02-01
Evidence based practice (EBP) is a process that involves making conscientious decisions that take into account the best available information, clinical expertise, and values and experiences of the patient. EBP helps empower health care professionals to establish service provisions that are clinically excellent, cost-effective, and culturally sensitive to the wishes of their patients. With a need for rapid integration of new evidence into EBP, systematic reviews and meta-analyses have become important tools for health care professionals. Systematic reviews and meta-analyses are conducted in a conscientious manner, following an established set of rules in which individuals identify studies that address a particular question based on clearly defined inclusion and exclusion criteria along with a predetermined method of analysis. Conducting systematic reviews and meta-analyses is neither easy nor quick and requires knowledge of a particular subject area, research methods, and statistics. Teaching health care professionals, including undergraduate and graduate students, the processes and skills necessary to carry out systematic reviews and meta-analyses is essential, yet few teaching resources exist for academic staff to facilitate this endeavor. The purpose of this article is to present two strategies taken by academic staff in the Faculty of Science at the University of Portsmouth, UK, to teach evidence synthesis and processes to enhance EBP. One case involves a pedagogical approach used with exercise science master's students, while the other details the work of an on-line postgraduate certificate program that has been developed in collaboration with Cochrane UK. © 2016 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
Jouet, Agathe; McMullan, Mark; van Oosterhout, Cock
2015-06-01
Plant immune genes, or resistance genes, are involved in a co-evolutionary arms race with a diverse range of pathogens. In agronomically important grasses, such R genes have been extensively studied because of their role in pathogen resistance and in the breeding of resistant cultivars. In this study, we evaluate the importance of recombination, mutation and selection on the evolution of the R gene complex Rp1 of Sorghum, Triticum, Brachypodium, Oryza and Zea. Analyses show that recombination is widespread, and we detected 73 independent instances of sequence exchange, involving on average 1567 of 4692 nucleotides analysed (33.4%). We were able to date 24 interspecific recombination events and found that four occurred postspeciation, which suggests that genetic introgression took place between different grass species. Other interspecific events seemed to have been maintained over long evolutionary time, suggesting the presence of balancing selection. Significant positive selection (i.e. a relative excess of nonsynonymous substitutions, dN/dS > 1) was detected in 17-95 codons (0.42-2.02%). Recombination was significantly associated with areas with high levels of polymorphism but not with an elevated dN/dS ratio. Finally, phylogenetic analyses show that recombination results in a general overestimation of the divergence time (mean = 14.3%) and an alteration of the gene tree topology if the tree is not calibrated. Given that the statistical power to detect recombination is determined by the level of polymorphism of the amplicon as well as the number of sequences analysed, it is likely that many studies have underestimated the importance of recombination relative to the mutation rate. © 2015 John Wiley & Sons Ltd.
Is more better than less? An analysis of children's mental health services.
Foster, E M
2000-01-01
OBJECTIVE: To assess the dose-response relationship for outpatient therapy received by children and adolescents-that is, to determine the impact of added outpatient visits on key mental health outcomes (functioning and symptomatology). DATA SOURCES/STUDY SETTING: The results presented involve analyses of data from the Fort Bragg Demonstration and are based on a sample of 301 individuals using outpatient services. STUDY DESIGN: This article provides estimates of the impact of outpatient therapy based on comparisons of individuals receiving differing treatment doses. Those comparisons involve standard multiple regression analyses as well as instrumental variables estimation. The latter provides a means of adjusting comparisons for unobserved or unmeasured differences among individuals receiving differing doses, differences that would otherwise be confounded with the impact of treatment dose. DATA COLLECTION/EXTRACTION METHODS: Using structured diagnostic interviews and behavior checklists completed by the child and his or her caretaker, detailed data on psychopathology, symptomatology, and psychosocial functioning were collected on individuals included in these analyses. Information on the use of mental health services was taken from insurance claims and a management information system. Services data were used to describe the use of outpatient therapy within the year following entry into the study. PRINCIPAL FINDINGS/CONCLUSIONS: Instrumental variables estimation indicates that added outpatient therapy improves functioning among children and adolescents. The effect is statistically significant and of moderate practical magnitude. These results imply that conventional analyses of the dose-response relationship may understate the impact of additional treatment on functioning. This finding is robust to choice of functional form, length of time over which outcomes are measured, and model specification. Dose does not appear to influence symptomatology. PMID:11130814
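Instrumental variables estimation, as used above, can be illustrated with a hand-rolled two-stage least squares on simulated data in which treatment dose is confounded with unobserved severity; this is only a sketch of the general technique, not the study's model.

```python
# Sketch of two-stage least squares (2SLS), the instrumental-variables idea used above
# to adjust dose-response estimates for unmeasured confounding. Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
confounder = rng.normal(size=n)                 # unobserved severity
instrument = rng.normal(size=n)                 # affects dose, not outcome directly
dose = 1.0 * instrument - 1.0 * confounder + rng.normal(size=n)
outcome = 0.5 * dose + 2.0 * confounder + rng.normal(size=n)

X = np.column_stack([np.ones(n), dose])
ols_beta = np.linalg.lstsq(X, outcome, rcond=None)[0][1]

Z = np.column_stack([np.ones(n), instrument])
dose_hat = Z @ np.linalg.lstsq(Z, dose, rcond=None)[0]       # first stage
X2 = np.column_stack([np.ones(n), dose_hat])
iv_beta = np.linalg.lstsq(X2, outcome, rcond=None)[0][1]     # second stage

print(f"true effect 0.5 | OLS estimate {ols_beta:.2f} | 2SLS estimate {iv_beta:.2f}")
```

Because the instrument shifts dose but not outcome except through dose, the second-stage slope recovers the causal effect that naive OLS understates in this simulation.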
Using statistical deformable models to reconstruct vocal tract shape from magnetic resonance images.
Vasconcelos, M J M; Rua Ventura, S M; Freitas, D R S; Tavares, J M R S
2010-10-01
The mechanisms involved in speech production are complex and have thus been subject to growing attention by the scientific community. It has been demonstrated that magnetic resonance imaging (MRI) is a powerful means in the understanding of the morphology of the vocal tract. Over the last few years, statistical deformable models have been successfully used to identify and characterize bones and organs in medical images, and point distribution models (PDMs) have gained particular relevance. In this work, the suitability of these models has been studied to characterize and further reconstruct the shape of the vocal tract in the articulation of European Portuguese (EP) speech sounds, Portuguese being one of the most widely spoken languages worldwide, with the aid of MR images. Therefore, a PDM has been built from a set of MR images acquired during the artificially sustained articulation of 25 EP speech sounds. Following this, the capacity of this statistical model to characterize the shape deformation of the vocal tract during the production of sounds was analysed. Next, the model was used to reconstruct five EP oral vowels and the EP fricative consonants. As far as a study on speech production is concerned, this study is considered to be the first approach to characterize and reconstruct the vocal tract shape from MR images by using PDMs. In addition, the findings achieved permit one to conclude that this modelling technique affords an enhanced understanding of the dynamic speech events involved in sustained articulations based on MRI, which are of particular interest for speech rehabilitation and simulation.
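A point distribution model is, at its core, a principal component analysis of aligned landmark coordinates; the sketch below builds one from simulated landmarks and reconstructs a shape from its first few mode weights, purely to illustrate the technique described above.

```python
# Sketch of a point distribution model: PCA of aligned landmark coordinates, as used
# above to model vocal tract shape. The landmark data below are simulated.
import numpy as np

rng = np.random.default_rng(4)
n_shapes, n_landmarks = 25, 40
mean_shape = rng.normal(size=2 * n_landmarks)
shapes = mean_shape + rng.normal(scale=0.1, size=(n_shapes, 2 * n_landmarks))

centred = shapes - shapes.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
var_explained = s**2 / np.sum(s**2)
modes = vt                                       # rows are modes of shape variation

# Reconstruct one shape from its first k mode weights (the usual PDM formulation)
k = 5
weights = centred[0] @ modes[:k].T
reconstruction = shapes.mean(axis=0) + weights @ modes[:k]
print("variance explained by first 5 modes:", var_explained[:5].sum().round(3))
```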
Frew, Paula M; Williams, Victoria A; Shapiro, Eve T; Sanchez, Travis; Rosenberg, Eli S; Fenimore, Vincent L; Sullivan, Patrick S
2013-11-21
HIV continues to be a major concern among MSM, yet Black MSM have not been enrolled in HIV research studies in proportionate numbers to White MSM. We developed an HIV prevention research brand strategy for MSM. Questionnaires and focus groups were conducted with 54 participants. Descriptive statistics and chi-square analyses were performed and qualitative data were transcribed and content analyzed to identify common themes. Formative research results indicated that younger Black MSM (18-29 years) were less likely to think about joining prevention studies compared to older (≥30 years) Black MSM (χ² = 5.92, P = 0.015). Qualitative and quantitative results indicate four prominent themes related to brand development: (1) communication sources (message deliverer), (2) message (impact of public health messaging on perceptions of HIV research), (3) intended audience (underlying issues that influence personal relevance of HIV research), and (4) communication channels (reaching intended audiences). The findings highlight the importance of behavioral communication translational research to effectively engage hard-to-reach populations. Despite reservations, MSM in our formative study expressed a need for active involvement and greater education to facilitate their engagement in HIV prevention research. Thus, the brand concept of "InvolveMENt" emerged.
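The chi-square comparison quoted above can be reproduced in form (though not with the study's actual counts, which are not given) from a 2x2 table; the cell counts below are hypothetical.

```python
# Sketch of the chi-square comparison reported above (younger vs older Black MSM,
# willingness to join prevention studies). The 2x2 counts are hypothetical.
from scipy.stats import chi2_contingency

#              would consider joining   would not
table = [[12, 18],    # 18-29 years
         [20,  8]]    # 30+ years
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```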
Veilleux, Sophie; Noiseux, Isabelle; Lachapelle, Nathalie; Kohen, Rita; Vachon, Luc; Guay, Brian White; Bitton, Alain; Rioux, John D
2018-02-01
This study aims to characterize the relationships between the quality of the information given by the physician, the involvement of the patient in shared decision making (SDM), and outcomes in terms of satisfaction and anxiety pertaining to the treatment of inflammatory bowel disease (IBD). A Web survey was conducted among 200 Canadian patients affected with IBD. The theoretical model of SDM was adjusted using path analysis. SAS software was used for all statistical analyses. The quality of the knowledge transfer between the physician and the patient is significantly associated with the components of SDM: information comprehension, patient involvement and decision certainty about the chosen treatment. In return, patient involvement in SDM is significantly associated with higher satisfaction and, as a result, lower anxiety as regards treatment selection. This study demonstrates the importance of involving patients in shared treatment decision making in the context of IBD. Understanding shared decision making may motivate patients to be more active in understanding the relevant information for treatment selection, as it is related to their level of satisfaction, anxiety and adherence to treatment. This relationship should encourage physicians to promote shared decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
Adrion, Christine; Mansmann, Ulrich
2012-09-10
A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
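Two of the Bayesian model-assessment tools named above, the logarithmic score and the probability integral transform (PIT), can be illustrated for a Poisson predictive distribution; the counts and predictive mean below are invented, and a real leave-one-out analysis would use a different predictive distribution for each observation.

```python
# Sketch of the logarithmic score and the randomised PIT for count data under a Poisson
# predictive distribution. Observed counts and the predictive mean are invented.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(5)
observed = rng.poisson(3.0, size=200)          # e.g. monthly attack counts
predictive_mean = 3.0                          # fixed here; leave-one-out in practice

log_score = -np.mean(poisson.logpmf(observed, predictive_mean))

# Randomised (non-degenerate) PIT for discrete data: uniform on [F(y-1), F(y)]
u = rng.uniform(poisson.cdf(observed - 1, predictive_mean),
                poisson.cdf(observed, predictive_mean))
print(f"mean log score = {log_score:.3f}; PIT roughly uniform if calibrated "
      f"(mean {u.mean():.2f}, sd {u.std():.2f})")
```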
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China for the first decade of the new millennium. Methodology/Principal Findings Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted. Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
Clinical Research Methodology 2: Observational Clinical Research.
Sessler, Daniel I; Imrey, Peter B
2015-10-01
Case-control and cohort studies are invaluable research tools and provide the strongest feasible research designs for addressing some questions. Case-control studies usually involve retrospective data collection. Cohort studies can involve retrospective, ambidirectional, or prospective data collection. Observational studies are subject to errors attributable to selection bias, confounding, measurement bias, and reverse causation-in addition to errors of chance. Confounding can be statistically controlled to the extent that potential factors are known and accurately measured, but, in practice, bias and unknown confounders usually remain additional potential sources of error, often of unknown magnitude and clinical impact. Causality-the most clinically useful relation between exposure and outcome-can rarely be definitively determined from observational studies because intentional, controlled manipulations of exposures are not involved. In this article, we review several types of observational clinical research: case series, comparative case-control and cohort studies, and hybrid designs in which case-control analyses are performed on selected members of cohorts. We also discuss the analytic issues that arise when groups to be compared in an observational study, such as patients receiving different therapies, are not comparable in other respects.
Proteomic analysis of early phase of conidia germination in Aspergillus nidulans.
Oh, Young Taek; Ahn, Chun-Seob; Kim, Jeong Geun; Ro, Hyeon-Su; Lee, Chang-Won; Kim, Jae Won
2010-03-01
In order to investigate proteins involved in the early phase of conidia germination, proteomic analysis was performed using two-dimensional gel electrophoresis (2D-GE) in conjunction with MALDI-TOF mass spectrometry (MS). The expression levels of 241 proteins varied quantitatively with statistical significance (P<0.05) at the early phase of the germination stage. Out of these, 57 were identified by MALDI-TOF MS. Classification of physiological functions based on Conserved Domain Database analysis showed that 21, 13, and 6 of the identified proteins were associated with energy metabolism, protein synthesis, and the protein folding process, respectively. Interestingly, eight proteins involved in detoxification of reactive oxygen species (ROS), including catalase A, thioredoxin reductase, and mitochondrial peroxiredoxin, were also identified. The expression levels of the corresponding genes were further confirmed using Northern blot and reverse transcriptase (RT)-PCR analyses. This study represents the first proteomic analysis of the early phase of conidia germination and will contribute to a better understanding of the molecular events involved in the conidia germination process. Copyright (c) 2009 Elsevier Inc. All rights reserved.
In silico prediction of protein-protein interactions in human macrophages
2014-01-01
Background Protein-protein interaction (PPI) network analyses are highly valuable in deciphering and understanding the intricate organisation of cellular functions. Nevertheless, the majority of available protein-protein interaction networks are context-less, i.e. without any reference to the spatial, temporal or physiological conditions in which the interactions may occur. In this work, we are proposing a protocol to infer the most likely protein-protein interaction (PPI) network in human macrophages. Results We integrated the PPI dataset from the Agile Protein Interaction DataAnalyzer (APID) with different meta-data to infer a contextualized macrophage-specific interactome using a combination of statistical methods. The obtained interactome is enriched in experimentally verified interactions and in proteins involved in macrophage-related biological processes (i.e. immune response activation, regulation of apoptosis). As a case study, we used the contextualized interactome to highlight the cellular processes induced upon Mycobacterium tuberculosis infection. Conclusion Our work confirms that contextualizing interactomes improves the biological significance of bioinformatic analyses. More specifically, studying such inferred network rather than focusing at the gene expression level only, is informative on the processes involved in the host response. Indeed, important immune features such as apoptosis are solely highlighted when the spotlight is on the protein interaction level. PMID:24636261
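Enrichment of an inferred network in proteins from a given biological process is commonly assessed with a hypergeometric test; the sketch below uses hypothetical counts and is not the statistical pipeline of the study above.

```python
# Sketch of the kind of enrichment test used to check that an inferred interactome is
# enriched in proteins from a biological process of interest. Counts are hypothetical.
from scipy.stats import hypergeom

M = 12000   # proteins in the background interactome
n = 400     # proteins annotated to "immune response activation"
N = 1500    # proteins in the inferred macrophage-specific network
k = 95      # annotated proteins found in that network

p_enrichment = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
print(f"hypergeometric enrichment p-value = {p_enrichment:.2e}")
```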
Shanley, Thomas P; Cvijanovich, Natalie; Lin, Richard; Allen, Geoffrey L; Thomas, Neal J; Doctor, Allan; Kalyanaraman, Meena; Tofil, Nancy M; Penfil, Scott; Monaco, Marie; Odoms, Kelli; Barnes, Michael; Sakthivel, Bhuvaneswari; Aronow, Bruce J; Wong, Hector R
2007-01-01
We have conducted longitudinal studies focused on the expression profiles of signaling pathways and gene networks in children with septic shock. Genome-level expression profiles were generated from whole blood-derived RNA of children with septic shock (n = 30) corresponding to day one and day three of septic shock, respectively. Based on sequential statistical and expression filters, day one and day three of septic shock were characterized by differential regulation of 2,142 and 2,504 gene probes, respectively, relative to controls (n = 15). Venn analysis demonstrated 239 unique genes in the day one dataset, 598 unique genes in the day three dataset, and 1,906 genes common to both datasets. Functional analyses demonstrated time-dependent, differential regulation of genes involved in multiple signaling pathways and gene networks primarily related to immunity and inflammation. Notably, multiple and distinct gene networks involving T cell- and MHC antigen-related biology were persistently downregulated on both day one and day three. Further analyses demonstrated large scale, persistent downregulation of genes corresponding to functional annotations related to zinc homeostasis. These data represent the largest reported cohort of patients with septic shock subjected to longitudinal genome-level expression profiling. The data further advance our genome-level understanding of pediatric septic shock and support novel hypotheses. PMID:17932561
Increasing value and reducing waste in research design, conduct, and analysis.
Ioannidis, John P A; Greenland, Sander; Hlatky, Mark A; Khoury, Muin J; Macleod, Malcolm R; Moher, David; Schulz, Kenneth F; Tibshirani, Robert
2014-01-11
Correctable weaknesses in the design, conduct, and analysis of biomedical and public health research studies can produce misleading results and waste valuable resources. Small effects can be difficult to distinguish from bias introduced by study design and analyses. An absence of detailed written protocols and poor documentation of research is common. Information obtained might not be useful or important, and statistical precision or power is often too low or used in a misleading way. Insufficient consideration might be given to both previous and continuing studies. Arbitrary choice of analyses and an overemphasis on random extremes might affect the reported findings. Several problems relate to the research workforce, including failure to involve experienced statisticians and methodologists, failure to train clinical researchers and laboratory scientists in research methods and design, and the involvement of stakeholders with conflicts of interest. Inadequate emphasis is placed on recording of research decisions and on reproducibility of research. Finally, reward systems incentivise quantity more than quality, and novelty more than reliability. We propose potential solutions for these problems, including improvements in protocols and documentation, consideration of evidence from studies in progress, standardisation of research efforts, optimisation and training of an experienced and non-conflicted scientific workforce, and reconsideration of scientific reward systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
Network meta-analyses could be improved by searching more sources and by involving a librarian.
Li, Lun; Tian, Jinhui; Tian, Hongliang; Moher, David; Liang, Fuxiang; Jiang, Tongxiao; Yao, Liang; Yang, Kehu
2014-09-01
Network meta-analyses (NMAs) aim to rank the benefits (or harms) of interventions, based on all available randomized controlled trials. Thus, the identification of relevant data is critical. We assessed the conduct of the literature searches in NMAs. Published NMAs were retrieved by searching electronic bibliographic databases and other sources. Two independent reviewers selected studies and five trained reviewers abstracted data regarding literature searches, in duplicate. Search method details were examined using descriptive statistics. Two hundred forty-nine NMAs were included. Eight used previous systematic reviews to identify primary studies without further searching, and five did not report any literature searches. In the 236 studies that used electronic databases to identify primary studies, the median number of databases was 3 (interquartile range: 3-5). MEDLINE, EMBASE, and Cochrane Central Register of Controlled Trials were the most commonly used databases. The most common supplemental search methods included reference lists of included studies (48%), reference lists of previous systematic reviews (40%), and clinical trial registries (32%). None of these supplemental methods was conducted in more than 50% of the NMAs. Literature searches in NMAs could be improved by searching more sources, and by involving a librarian or information specialist. Copyright © 2014 Elsevier Inc. All rights reserved.
Searching Choices: Quantifying Decision-Making Processes Using Search Engine Data.
Moat, Helen Susannah; Olivola, Christopher Y; Chater, Nick; Preis, Tobias
2016-07-01
When making a decision, humans consider two types of information: information they have acquired through their prior experience of the world, and further information they gather to support the decision in question. Here, we present evidence that data from search engines such as Google can help us model both sources of information. We show that statistics from search engines on the frequency of content on the Internet can help us estimate the statistical structure of prior experience; and, specifically, we outline how such statistics can inform psychological theories concerning the valuation of human lives, or choices involving delayed outcomes. Turning to information gathering, we show that search query data might help measure human information gathering, and it may predict subsequent decisions. Such data enable us to compare information gathered across nations, where analyses suggest, for example, a greater focus on the future in countries with a higher per capita GDP. We conclude that search engine data constitute a valuable new resource for cognitive scientists, offering a fascinating new tool for understanding the human decision-making process. Copyright © 2016 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
How big should a mammal be? A macroecological look at mammalian body size over space and time
Smith, Felisa A.; Lyons, S. Kathleen
2011-01-01
Macroecology was developed as a big picture statistical approach to the study of ecology and evolution. By focusing on broadly occurring patterns and processes operating at large spatial and temporal scales rather than on localized and/or fine-scaled details, macroecology aims to uncover general mechanisms operating at organism, population, and ecosystem levels of organization. Macroecological studies typically involve the statistical analysis of fundamental species-level traits, such as body size, area of geographical range, and average density and/or abundance. Here, we briefly review the history of macroecology and use the body size of mammals as a case study to highlight current developments in the field, including the increasing linkage with biogeography and other disciplines. Characterizing the factors underlying the spatial and temporal patterns of body size variation in mammals is a daunting task and moreover, one not readily amenable to traditional statistical analyses. Our results clearly illustrate remarkable regularities in the distribution and variation of mammalian body size across both geographical space and evolutionary time that are related to ecology and trophic dynamics and that would not be apparent without a broader perspective. PMID:21768152
Using venlafaxine to treat behavioral disorders in patients with autism spectrum disorder.
Carminati, Giuliana Galli; Gerber, Fabienne; Darbellay, Barbara; Kosel, Markus Mathaus; Deriaz, Nicolas; Chabert, Jocelyne; Fathi, Marc; Bertschy, Gilles; Ferrero, François; Carminati, Federico
2016-02-04
To test the efficacy of venlafaxine at a dose of 18.75 mg/day on the reduction of behavioral problems such as irritability and hyperactivity/noncompliance in patients with intellectual disabilities and autism spectrum disorder (ASD). Our secondary hypothesis was that the usual doses of zuclopenthixol and/or clonazepam would decrease in the venlafaxine-treated group. In a randomized double-blind study, we compared six patients who received venlafaxine along with their usual treatment (zuclopenthixol and/or clonazepam) with seven patients who received placebo plus usual care. Irritability, hyperactivity/noncompliance, and overall clinical improvement were measured after 2 and 8 weeks, using validated clinical scales. Univariate analyses showed that the symptom of irritability improved in the entire sample (p = 0.023 after 2 weeks, p = 0.061 at study endpoint), although no difference was observed between the venlafaxine and placebo groups. No significant decrease in hyperactivity/noncompliance was observed during the study. At the end of the study, global improvement was observed in 33% of participants treated with venlafaxine and in 71% of participants in the placebo group (p = 0.29). The study found that decreased cumulative doses of clonazepam and zuclopenthixol were required for the venlafaxine group. Multivariate analyses (principal component analyses) with at least three combinations of variables showed that the two populations could be clearly separated (p < 0.05). Moreover, in all cases, the venlafaxine population had lower values for the Aberrant Behavior Checklist (ABC), Behavior Problems Inventory (BPI), and levels of urea with respect to the placebo group. In one case, a reduction in the dosage of clonazepam was also suggested. For an additional set of variables (ABC factor 2, BPI frequency of aggressive behaviors, hematic ammonia at Day 28, and zuclopenthixol and clonazepam intake), the separation between the two samples was statistically significant as was Bartlett's test, but the Kaiser–Meyer–Olkin Measure of Sampling Adequacy was below the accepted threshold. This set of variables showed a reduction in the cumulative intake of both zuclopenthixol and clonazepam. Despite the small sample sizes, this study documented a statistically significant effect of venlafaxine. Moreover, we showed that lower doses of zuclopenthixol and clonazepam were needed in the venlafaxine group, although this difference was not statistically significant. This was confirmed by multivariate analyses, where this difference reached statistical significance when using a combination of variables involving zuclopenthixol. Larger-scale studies are recommended to better investigate the effectiveness of venlafaxine treatment in patients with intellectual disabilities and ASD.
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
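As a simplified illustration of deriving winds from geopotential fields, the sketch below computes geostrophic winds from a synthetic 500 hPa height field; the atlas itself derives balance winds from its analysed geopotential, so this only conveys the basic idea.

```python
# Sketch of deriving horizontal winds from a geopotential height field using the simple
# geostrophic relation u = -(g/f) dZ/dy, v = (g/f) dZ/dx on a synthetic lat-lon grid.
# (The atlas uses a more general balance relation; this is only the basic idea.)
import numpy as np

g, omega, a = 9.81, 7.292e-5, 6.371e6
lat = np.deg2rad(np.arange(20.0, 70.1, 2.5))          # avoid the equator where f ~ 0
lon = np.deg2rad(np.arange(0.0, 360.0, 2.5))
LAT, LON = np.meshgrid(lat, lon, indexing="ij")

Z = 5500 + 120 * np.cos(3 * LON) * np.sin(2 * LAT)    # synthetic 500 hPa height field (m)
f = 2 * omega * np.sin(LAT)

dZ_dlat = np.gradient(Z, lat, axis=0)
dZ_dlon = np.gradient(Z, lon, axis=1)
u_g = -(g / f) * dZ_dlat / a                          # dy = a * dlat
v_g = (g / f) * dZ_dlon / (a * np.cos(LAT))           # dx = a * cos(lat) * dlon
print("max |u_g|, |v_g| (m/s):", np.abs(u_g).max().round(1), np.abs(v_g).max().round(1))
```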
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non normal distributions violate the assumptions of analyses in which test statistics are...
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
Injuries to Cyclists due to a Dog-Bicycle Interaction.
Loder, Randall T; Yaacoub, Alan P
2018-05-01
Both dogs and bicycles are common in our society and thus a dog-bicycle interaction resulting in an injury to a cyclist is possible. It was the purpose of this study to investigate such injuries. The National Electronic Injury Surveillance System (NEISS) data for the 10-year period from 2006 through 2015 associated with bicycles were accessed. Injuries involving dogs were identified and the mechanism of injury determined. Due to the stratified and weighted nature of the NEISS data, statistical analyses were performed with SUDAAN 10 software (RTI International, Research Triangle Park, North Carolina, United States). A p < 0.05 was considered statistically significant. There were 5,184,057 emergency department visits for bicycle-associated injuries; dogs were involved in 35,254 (0.67%) cases. The average age for those involved with a dog was 33.2 years and it was 25.5 years for those in which dogs were not involved. There were more females in the dog group (34.1 vs. 27.6%). Dog involvement increased from ages 0 to 14 years, then decreased until the age of 20 years and then progressively increased. Dog-associated injuries most frequently occurred away from home, involved the knee and distal lower extremity, 49.1% sustaining dog bites. Dog bites were more common in younger individuals. Four injury mechanisms (chased by a dog, hit/collided with a dog, swerved/tried to avoid a dog or riding with a dog) accounted for 97.5% of the injuries. Those chased by a dog were younger, more commonly released from the emergency department, had an injury involving the lower extremity and frequently sustained a bite. The most severe injuries were in those who swerved/tried to avoid a dog or hit a dog. Approximately 1% of injuries to bicyclists are associated with dogs; one-half sustained a bite. Potential/proposed prevention strategies could be educational materials regarding bicycles and dogs to owners, dog restraint, student/parent education and educational materials in waiting rooms of veterinarians, paediatricians, family practice physicians and emergency rooms. Schattauer GmbH Stuttgart.
NASA Astrophysics Data System (ADS)
Cederquist, D. P.; Mac Niocaill, C.; Van der Voo, R.
1997-01-01
Bingham statistical analyses were applied to paleomagnetic data from 50 published studies from North America, of Carboniferous through Early Jurassic age, in an attempt to test whether the azimuths of the long axes of the Bingham ellipses lie tangent to the apparent polar wander path. The underlying assumption is that paleomagnetic directions will form a Fisherian (circular) distribution if no apparent polar wander has taken place during magnetization acquisition. However, the distribution should appear elongated (elliptical) if magnetization acquisition occurred over a significant amount of time involving apparent polar wander. The long axes in direction space yield corresponding azimuths in paleopole space, which can be compared to the North American APWP. We find that, generally, these azimuths are indeed sub-parallel to the APWP, validating the methods and the hypothesis. Plotting a pole as an azimuthal chord, representing the long axis of the ellipse, will provide additional robustness or definition to an APWP based upon temporally sparse paleomagnetic studies.
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
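One of the analysis types reviewed above, a negative binomial regression of neighbourhood food-outlet counts on an area-level deprivation score, can be sketched as follows with simulated data and invented variable names.

```python
# Sketch of a negative binomial regression of neighbourhood food-outlet counts on an
# area-level deprivation score, one analysis type reviewed above. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 400
deprivation = rng.normal(size=n)                       # standardised deprivation index
mu = np.exp(1.0 + 0.25 * deprivation)                  # more outlets in deprived areas
outlets = rng.negative_binomial(n=5, p=5 / (5 + mu))   # overdispersed counts

X = sm.add_constant(deprivation)
model = sm.GLM(outlets, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
print(model.summary().tables[1])
```

As the review notes, a residual check for spatial autocorrelation would be a sensible next step, since neighbouring areas are unlikely to be independent.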
Education on invasive mechanical ventilation involving intensive care nurses: a systematic review.
Guilhermino, Michelle C; Inder, Kerry J; Sundin, Deborah
2018-03-26
Intensive care unit nurses are critical for managing mechanical ventilation. Continuing education is essential in building and maintaining nurses' knowledge and skills, potentially improving patient outcomes. The aim of this study was to determine whether continuing education programmes on invasive mechanical ventilation involving intensive care unit nurses are effective in improving patient outcomes. Five electronic databases were searched from 2001 to 2016 using keywords such as mechanical ventilation, nursing and education. Inclusion criteria were invasive mechanical ventilation continuing education programmes that involved nurses and measured patient outcomes. Primary outcomes were intensive care unit mortality and in-hospital mortality. Secondary outcomes included hospital and intensive care unit length of stay, length of intubation, failed weaning trials, re-intubation incidence, ventilator-associated pneumonia rate and lung-protective ventilator strategies. Studies were excluded if they excluded nurses, patients were ventilated for less than 24 h, the education content focused on protocol implementation or oral care exclusively or the outcomes were participant satisfaction. Quality was assessed by two reviewers using an education intervention critical appraisal worksheet and a risk of bias assessment tool. Data were extracted independently by two reviewers and analysed narratively due to heterogeneity. Twelve studies met the inclusion criteria for full review: 11 pre- and post-intervention observational and 1 quasi-experimental design. Studies reported statistically significant reductions in hospital length of stay, length of intubation, ventilator-associated pneumonia rates, failed weaning trials and improvements in lung-protective ventilation compliance. Non-statistically significant results were reported for in-hospital and intensive care unit mortality, re-intubation and intensive care unit length of stay. There is limited evidence that continuing education programmes on mechanical ventilation involving nurses improve patient outcomes. Comprehensive continuing education is required. Well-designed trials are needed to confirm that comprehensive continuing education about mechanical ventilation involving intensive care nurses improves patient outcomes. © 2018 British Association of Critical Care Nurses.
1993-08-01
subtitled "Simulation Data," consists of detailed infonrnation on the design parmneter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or nmaximurn pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
ERIC Educational Resources Information Center
Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie
2017-01-01
To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical…
Maximizing Sensitivity of the Psychomotor Vigilance Test (PVT) to Sleep Loss
Basner, Mathias; Dinges, David F.
2011-01-01
Study Objectives: The psychomotor vigilance test (PVT) is among the most widely used measures of behavioral alertness, but there is large variation among published studies in PVT performance outcomes and test durations. To promote standardization of the PVT and increase its sensitivity and specificity to sleep loss, we determined PVT metrics and task durations that optimally discriminated sleep deprived subjects from alert subjects. Design: Repeated-measures experiments involving 10-min PVT assessments every 2 h across both acute total sleep deprivation (TSD) and 5 days of chronic partial sleep deprivation (PSD). Setting: Controlled laboratory environment. Participants: 74 healthy subjects (34 female), aged 22–45 years. Interventions: TSD experiment involving 33 h awake (N = 31 subjects) and a PSD experiment involving 5 nights of 4 h time in bed (N = 43 subjects). Measurements and Results: In a paired t-test paradigm and for both TSD and PSD, effect sizes of 10 different PVT performance outcomes were calculated. Effect sizes were high for both TSD (1.59–1.94) and PSD (0.88–1.21) for PVT metrics related to lapses and to measures of psychomotor speed, i.e., mean 1/RT (response time) and mean slowest 10% 1/RT. In contrast, PVT mean and median RT outcomes scored low to moderate effect sizes influenced by extreme values. Analyses facilitating only portions of the full 10-min PVT indicated that for some outcomes, high effect sizes could be achieved with PVT durations considerably shorter than 10 min, although metrics involving lapses seemed to profit from longer test durations in TSD. Conclusions: Due to their superior conceptual and statistical properties and high sensitivity to sleep deprivation, metrics involving response speed and lapses should be considered primary outcomes for the 10-min PVT. In contrast, PVT mean and median metrics, which are among the most widely used outcomes, should be avoided as primary measures of alertness. Our analyses also suggest that some shorter-duration PVT versions may be sensitive to sleep loss, depending on the outcome variable selected, although this will need to be confirmed in comparative analyses of separate duration versions of the PVT. Using both sensitive PVT metrics and optimal test durations maximizes the sensitivity of the PVT to sleep loss and therefore potentially decreases the sample size needed to detect the same neurobehavioral deficit. We propose criteria to better standardize the 10-min PVT and facilitate between-study comparisons and meta-analyses. Citation: Basner M; Dinges DF. Maximizing sensitivity of the psychomotor vigilance test (PVT) to sleep loss. SLEEP 2011;34(5):581-591. PMID:21532951
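The paired-design effect sizes reported above are Cohen's d values computed from within-subject differences; a minimal sketch with simulated lapse counts (not study data) is shown below.

```python
# Sketch of a paired effect-size calculation (Cohen's d on within-subject differences),
# the kind of metric used above to rank PVT outcomes. Lapse counts are simulated.
import numpy as np

rng = np.random.default_rng(7)
n_subjects = 31
baseline_lapses = rng.poisson(2.0, n_subjects)
deprived_lapses = baseline_lapses + rng.poisson(6.0, n_subjects)

diff = deprived_lapses - baseline_lapses
cohen_d_paired = diff.mean() / diff.std(ddof=1)
print(f"paired effect size d = {cohen_d_paired:.2f}")
```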
Corzo, P; Salman-Monte, T C; Torrente-Segarra, V; Polino, L; Mojal, S; Carbonell-Abelló, J
2017-06-01
Objective To describe long-term clinical and serological outcome in all systemic lupus erythematosus (SLE) domains in SLE patients with hand arthralgia (HA) and joint ultrasound (JUS) inflammatory abnormalities, and to compare them with asymptomatic SLE patients with normal JUS. Methods SLE patients with HA who presented JUS inflammatory abnormalities ('cases') and SLE patients without HA who did not exhibit JUS abnormalities at baseline ('controls') were included. All SLE clinical and serological domain involvement data were collected. End follow-up clinical activity and damage scores (systemic lupus erythematosus disease activity index (SLEDAI), Systemic Lupus International Collaborating Clinics/American College of Rheumatology (SLICC/ACR)) were recorded. JUS inflammatory abnormalities were defined based on the Proceedings of the Seventh International Consensus Conference on Outcome Measures in Rheumatology Clinical Trials (OMERACT-7) definitions. Statistical analyses were carried out to compare 'cases' and 'controls'. Results A total of 35 patients were recruited. The 'cases', n = 18/35, had a higher incidence of musculoskeletal involvement (arthralgia and/or arthritis) through the follow-up period (38.9% vs 0%, p = 0.008) and received more hydroxychloroquine (61.1% vs 25.0%, p = 0.034) and methotrexate (27.8% vs 0%, p = 0.046) compared to 'controls', n = 17/35. Other comparisons did not reveal any statistical differences. Conclusions We found SLE patients with arthralgia who presented JUS inflammatory abnormalities received more hydroxychloroquine and methotrexate, mainly due to persistent musculoskeletal involvement over time. JUS appears to be a useful technique for predicting worse musculoskeletal outcome in SLE patients.
Lachowiec, Jennifer; Shen, Xia; Queitsch, Christine; Carlborg, Örjan
2015-01-01
Efforts to identify loci underlying complex traits generally assume that most genetic variance is additive. Here, we examined the genetics of Arabidopsis thaliana root length and found that the genomic narrow-sense heritability for this trait in the examined population was statistically zero. The low amount of additive genetic variance that could be captured by the genome-wide genotypes likely explains why no associations to root length could be found using standard additive-model-based genome-wide association (GWA) approaches. However, as the broad-sense heritability for root length was significantly larger, and primarily due to epistasis, we also performed an epistatic GWA analysis to map loci contributing to the epistatic genetic variance. Four interacting pairs of loci were revealed, involving seven chromosomal loci that passed a standard multiple-testing corrected significance threshold. The genotype-phenotype maps for these pairs revealed epistasis that cancelled out the additive genetic variance, explaining why these loci were not detected in the additive GWA analysis. Small population sizes, such as in our experiment, increase the risk of identifying false epistatic interactions due to testing for associations with very large numbers of multi-marker genotypes in few phenotyped individuals. Therefore, we estimated the false-positive risk using a new statistical approach that suggested half of the associated pairs to be true positive associations. Our experimental evaluation of candidate genes within the seven associated loci suggests that this estimate is conservative; we identified functional candidate genes that affected root development in four loci that were part of three of the pairs. The statistical epistatic analyses were thus indispensable for confirming known, and identifying new, candidate genes for root length in this population of wild-collected A. thaliana accessions. We also illustrate how epistatic cancellation of the additive genetic variance explains the insignificant narrow-sense and significant broad-sense heritability by using a combination of careful statistical epistatic analyses and functional genetic experiments.
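The epistatic GWA described here tests pairs of loci for interaction effects; the sketch below illustrates the general idea with a two-locus linear model that includes an interaction term. The genotype coding, simulated phenotype and use of statsmodels are assumptions for illustration, not the authors' software or model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "snp_a": rng.integers(0, 3, n),   # genotypes coded 0/1/2 (illustrative)
    "snp_b": rng.integers(0, 3, n),
})
# Simulate a root-length phenotype whose additive effects cancel but whose
# two-locus interaction does not (schematic epistatic cancellation).
df["root_length"] = 5 + 0.4 * (df.snp_a - 1) * (df.snp_b - 1) + rng.normal(0, 0.5, n)

additive = smf.ols("root_length ~ snp_a + snp_b", df).fit()
epistatic = smf.ols("root_length ~ snp_a * snp_b", df).fit()
print("additive model overall p:", additive.f_pvalue)
print("interaction term p:", epistatic.pvalues["snp_a:snp_b"])
```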
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the renal function decline rate per year with age in adults varies based on two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression model and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rate per year with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the renal function decline rate with aging because its estimate depends strongly on the statistical analysis used. From our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during chronic therapy.
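A minimal sketch of the two analysis frameworks compared in this study is given below, using simulated data: a cross-sectional simple linear regression on one observation per subject versus a longitudinal mixed model on repeated observations. For simplicity the mixed model here uses only a random intercept as a stand-in for the random coefficient model, and all values are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subj in range(300):
    base_age = rng.uniform(30, 80)
    u0 = rng.normal(0, 10)                      # subject-level deviation
    for visit in range(4):
        age = base_age + 0.7 * visit
        crcl = 130 + u0 - 1.0 * age + rng.normal(0, 6)
        rows.append({"subject": subj, "age": age, "crcl": crcl})
df = pd.DataFrame(rows)

# Cross-sectional (CS): one observation per subject, simple linear regression.
cs = smf.ols("crcl ~ age", df.groupby("subject").first()).fit()
# Longitudinal (LT): all observations, random intercept per subject.
lt = smf.mixedlm("crcl ~ age", df, groups=df["subject"]).fit()
print("CS decline rate:", -cs.params["age"], "LT decline rate:", -lt.params["age"])
```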
Swahn, Monica H; Bossarte, Robert M
2007-08-01
To examine the cross-sectional associations between preteen alcohol use initiation and subsequent suicide ideation and attempts for boys and girls in a nationally representative sample of high school students. Analyses are computed using data from the 2005 national Youth Risk Behavior Survey, which includes a representative sample (n = 13,639) of high-school students in grades 9-12 in the United States. Cross-sectional logistic regression analyses were conducted to determine the associations between early alcohol use and reports of suicide ideation and suicide attempts for boys and girls while controlling for demographic characteristics, substance use, involvement in physical fights, weapon carrying, physical abuse by dating partner, sexual assault, and sadness. Among study participants, 25.4% reported drinking before age 13 years. Preteen alcohol use initiation was statistically significantly associated with suicidal ideation (adjusted OR = 1.89, 95% CI =1.46-2.44) and suicide attempts (adjusted OR = 2.71, 95% CI =1.82-4.02) relative to nondrinkers. Preteen alcohol use initiation was statistically significantly associated with suicidal ideation and attempts relative to nondrinkers for both boys and girls. Alcohol use among adolescents, particularly preteen alcohol use initiation, is an important risk factor for both suicide ideation and suicide attempts among boys and girls. Increased efforts to delay and reduce early alcohol use are needed, and may reduce suicide attempts.
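The adjusted odds ratios reported above come from multivariable logistic regression; a hedged sketch of that kind of computation on simulated data follows. The variable names, covariates and effect sizes are illustrative assumptions, not the YRBS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "early_alcohol": rng.integers(0, 2, n),   # drank before age 13 (0/1)
    "male": rng.integers(0, 2, n),
    "sadness": rng.integers(0, 2, n),
})
logit_p = -2.5 + 0.9 * df.early_alcohol + 0.8 * df.sadness
df["ideation"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("ideation ~ early_alcohol + male + sadness", df).fit(disp=0)
or_ci = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_ci.columns = ["adjusted OR", "2.5%", "97.5%"]
print(or_ci.loc["early_alcohol"])
```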
Operational satellites and the global monitoring of snow and ice
NASA Technical Reports Server (NTRS)
Walsh, John E.
1991-01-01
The altitudinal dependence of the global warming projected by global climate models is at least partially attributable to the albedo-temperature feedback involving snow and ice, which must be regarded as key variables in the monitoring for global change. Statistical analyses of data from IR and microwave sensors monitoring the areal coverage and extent of sea ice have led to mixed conclusions about recent trends of hemisphere sea ice coverage. Seasonal snow cover has been mapped for over 20 years by NOAA/NESDIS on the basis of imagery from a variety of satellite sensors. Multichannel passive microwave data show some promise for the routine monitoring of snow depth over unforested land areas.
Current Status and Challenges of Atmospheric Data Assimilation
NASA Astrophysics Data System (ADS)
Atlas, R. M.; Gelaro, R.
2016-12-01
The issues of modern atmospheric data assimilation are fairly simple to comprehend but difficult to address, involving the combination of literally billions of model variables and tens of millions of observations daily. In addition to traditional meteorological variables such as wind, temperature, pressure and humidity, model state vectors are being expanded to include explicit representation of precipitation, clouds, aerosols and atmospheric trace gases. At the same time, model resolutions are approaching single-kilometer scales globally and new observation types have error characteristics that are increasingly non-Gaussian. This talk describes the current status and challenges of atmospheric data assimilation, including an overview of current methodologies, the difficulty of estimating error statistics, and progress toward coupled earth system analyses.
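As a toy illustration of the statistical combination at the heart of data assimilation, the scalar update below weights a model background against an observation according to their error variances. Real systems involve enormous state vectors, observation operators and non-Gaussian errors; the numbers here are purely illustrative.

```python
# Toy scalar analysis update: combine a model background xb with an
# observation y, weighted by background (B) and observation (R) error variances.
xb, B = 287.0, 1.0 ** 2      # background temperature (K) and its error variance
y,  R = 288.5, 0.5 ** 2      # observation and its error variance

K = B / (B + R)              # gain for a direct, scalar observation
xa = xb + K * (y - xb)       # analysis value
A = (1 - K) * B              # analysis error variance
print(xa, A)
```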
Comparison of open versus closed group interventions for sexually abused adolescent girls.
Tourigny, Marc; Hébert, Martine
2007-01-01
A first aim of this study was to evaluate the efficacy of an open group therapy for sexually abused teenagers using a quasi-experimental pretest/posttest treatment design. A second aim was to explore whether differential gains were linked to an open versus a closed group format. Results indicate that sexually abused girls involved in the open group therapy showed significant gains relative to girls in the control group for the majority of the variables considered. Analyses contrasting the two formats of group therapy failed to identify statistically significant differences, suggesting that both open and closed group formats are likely to be associated with the same significant gains for sexually abused teenagers.
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
Modeling Cross-Situational Word–Referent Learning: Prior Questions
Yu, Chen; Smith, Linda B.
2013-01-01
Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
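A minimal sketch of the "dumb" associative mechanism discussed above is shown below: the learner simply accumulates word-object co-occurrence counts across ambiguous trials and maps each word to its most strongly associated referent. The vocabulary and trials are invented for illustration and are not the simulations reported in the article.

```python
import numpy as np

# Toy cross-situational corpus: each trial pairs spoken words with visible
# referents, and no single trial disambiguates the mapping.
words = ["ball", "dog", "cup"]
objects = ["BALL", "DOG", "CUP"]
trials = [(["ball", "dog"], ["BALL", "DOG"]),
          (["ball", "cup"], ["BALL", "CUP"]),
          (["dog", "cup"], ["DOG", "CUP"])]

counts = np.zeros((len(words), len(objects)))
for spoken, visible in trials:
    for w in spoken:
        for o in visible:
            counts[words.index(w), objects.index(o)] += 1

# Aggregated co-occurrence resolves the ambiguity: each word is mapped to its
# most strongly associated referent.
for i, w in enumerate(words):
    print(w, "->", objects[int(np.argmax(counts[i]))])
```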
Assessing groundwater vulnerability to agrichemical contamination in the Midwest US
Burkart, M.R.; Kolpin, D.W.; James, D.E.
1999-01-01
Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
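The overlay and index approach described here combines mapped factors with weights; a minimal sketch on toy rasters follows. The factor layers, scaling and weights are illustrative assumptions, not a validated vulnerability index.

```python
import numpy as np

# Three intrinsic-factor rasters (already scaled 0-10) combined with assumed
# weights into a groundwater vulnerability index.
soil     = np.array([[3, 7], [8, 2]], dtype=float)   # soil leaching potential
recharge = np.array([[5, 6], [9, 1]], dtype=float)   # recharge score
depth    = np.array([[2, 8], [7, 3]], dtype=float)   # depth-to-aquifer score

weights = {"soil": 0.5, "recharge": 0.3, "depth": 0.2}   # assumed weights
index = (weights["soil"] * soil
         + weights["recharge"] * recharge
         + weights["depth"] * depth)
print(index)        # higher values flag areas more vulnerable to leaching
```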
Maćków, Anna; Małachowska-Sobieska, Monika; Demczuk-Włodarczyk, Ewa; Sidorowska, Marta; Szklarska, Alicja; Lipowicz, Anna
2014-01-01
The aim of the study was to present the influence of neurophysiological hippotherapy on the transference of the centre of gravity (COG) among children with cerebral palsy (CP). The study involved 19 children aged 4-13 years suffering from CP who demonstrated an asymmetric (A/P) model of compensation. Body balance was studied with the Cosmogamma Balance Platform. An examination on this platform was performed before and after a session of neurophysiological hippotherapy. In order to compare the correlations and differences between the examinations, the results were analysed using Student's t-test for dependent samples, with p ≤ 0.05 as the level of statistical significance, and descriptive statistics were calculated. The mean value of the body's centre of gravity in the frontal plane (COG X) was 18.33 mm during the first examination, changing by 21.84 mm after neurophysiological hippotherapy towards unloading of the antigravity lower limb (p ≤ 0.0001). The other stabilographic parameters increased; however, only the change in average speed of antero-posterior COG oscillation was statistically significant (p = 0.0354). One session of neurophysiological hippotherapy induced statistically significant changes in the position of the body's centre of gravity in the frontal plane and in the average speed of COG oscillation in the sagittal plane among CP children demonstrating an asymmetric (A/P) model of compensation.
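A minimal sketch of the Student's t-test for dependent samples used in this study is shown below; the COG values are invented for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical COG X positions (mm) for the same children before and after a
# hippotherapy session.
before = np.array([18.1, 22.4, 15.9, 19.7, 17.3, 20.5])
after  = np.array([-2.8,  1.5, -4.1,  0.9, -1.7,  2.2])

t, p = stats.ttest_rel(before, after)     # t-test for dependent samples
print(f"t = {t:.2f}, p = {p:.4f}, significant at p <= 0.05: {p <= 0.05}")
```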
Taylor, Anne W; Dal Grande, Eleonora; Grant, Janet; Appleton, Sarah; Gill, Tiffany K; Shi, Zumin; Adams, Robert J
2013-04-01
Attrition in cohort studies can cause the data to be nonreflective of the original population. Although attrition is of little concern when intragroup comparisons are made or cause and effect is assessed, bias was assessed in this study so that intergroup or descriptive analyses could be undertaken. The North West Adelaide Health Study is a chronic disease and risk factor cohort study undertaken in Adelaide, South Australia. In the original wave (1999), clinical and self-report data were collected from 4,056 adults. In the third wave (2008-2010), 2,710 adults were still actively involved. Comparisons were made against two other data sources: the Australian Bureau of Statistics Estimated Residential Population and a regularly conducted chronic disease and risk factor surveillance system. Demographic comparisons (age, sex, area, education, work status, and income) revealed statistically significant differences. In addition, smoking status, body mass index, and general health status differed statistically significantly from the comparison group. No statistically significant differences were found for alcohol risk. Although the third wave of this cohort study is not representative of the broader population on the variables assessed, weighting of the data and analytical approaches can account for the differences.
NASA Astrophysics Data System (ADS)
Bansah, S.; Ali, G.; Haque, M. A.; Tang, V.
2017-12-01
The proportion of precipitation that becomes streamflow is a function of internal catchment characteristics - which include geology, landscape characteristics and vegetation - that influence overall storage dynamics. The timing and quantity of water discharged by a catchment are indeed embedded in event hydrographs. Event hydrograph timing parameters, such as the response lag and time of concentration, are important descriptors of how long it takes the catchment to respond to input precipitation and how long it takes the latter to filter through the catchment. However, the extent to which hydrograph timing parameters relate to average response times derived from fitting transfer functions to annual hydrographs is unknown. In this study, we used a gamma transfer function to determine catchment average response times as well as event-specific hydrograph parameters across a network of eight nested prairie catchments, ranging from 0.19 km2 to 74.6 km2, located in south-central Manitoba (Canada). Various statistical analyses were then performed to correlate average response times - estimated using the parameters of the fitted gamma transfer function - with event-specific hydrograph parameters. Preliminary results show significant interannual variations in response times and hydrograph timing parameters: the former were on the order of a few hours to days, while the latter ranged from a few days to weeks. Some statistically significant relationships were detected between response times and event-specific hydrograph parameters. Future analyses will involve the comparison of statistical distributions of event-specific hydrograph parameters with those of runoff response times and baseflow transit times in order to quantify catchment storage dynamics across a range of temporal scales.
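A hedged sketch of fitting a gamma transfer function to a rainfall-runoff series is shown below: streamflow is modelled as precipitation convolved with a gamma impulse response, and the fitted shape and scale give an average response time. The synthetic data, runoff coefficient and fitting choices are assumptions for illustration, not the study's procedure.

```python
import numpy as np
from scipy import stats, optimize, signal

rng = np.random.default_rng(3)
t = np.arange(120)
# Synthetic intermittent precipitation series (arbitrary units).
precip = rng.gamma(0.4, 5.0, t.size) * (rng.random(t.size) < 0.3)

def gamma_response(p, shape, scale, coeff):
    irf = stats.gamma.pdf(t, a=shape, scale=scale)      # gamma impulse response
    return coeff * signal.convolve(p, irf, mode="full")[: t.size]

# "Observed" flow generated from known parameters plus noise.
flow = gamma_response(precip, 2.0, 4.0, 0.6) + rng.normal(0, 0.05, t.size)

(shape, scale, coeff), _ = optimize.curve_fit(
    gamma_response, precip, flow, p0=[1.5, 3.0, 0.5],
    bounds=([1.0, 0.1, 0.01], [20.0, 50.0, 5.0]))
print(f"average response time ~ shape*scale = {shape * scale:.1f} time steps")
```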
MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giger, M; Petrick, N; Obuchowski, N
The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.
ERIC Educational Resources Information Center
Ali, Usama S.; Walker, Michael E.
2014-01-01
Two methods are currently in use at Educational Testing Service (ETS) for equating observed item difficulty statistics. The first method involves the linear equating of item statistics in an observed sample to reference statistics on the same items. The second method, or the item response curve (IRC) method, involves the summation of conditional…
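The first (linear) equating method can be illustrated with a mean-sigma sketch: observed item statistics in the new sample are placed on the reference scale through a slope and intercept derived from the common items. The values below are invented, and the operational ETS procedure may differ in detail.

```python
import numpy as np

# Observed difficulty statistics for the same (common) items in a new sample
# and on the reference scale; values are illustrative.
new_sample = np.array([10.2, 11.5,  9.8, 12.1, 10.9])
reference  = np.array([11.0, 12.4, 10.5, 13.0, 11.8])

slope = reference.std(ddof=1) / new_sample.std(ddof=1)
intercept = reference.mean() - slope * new_sample.mean()
equated = slope * new_sample + intercept          # new-sample stats on reference scale
print(slope, intercept, equated)
```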
Kassubek, Jan; Müller, Hans-Peter; Del Tredici, Kelly; Brettschneider, Johannes; Pinkhardt, Elmar H; Lulé, Dorothée; Böhm, Sarah; Braak, Heiko; Ludolph, Albert C
2014-06-01
Diffusion tensor imaging can identify amyotrophic lateral sclerosis-associated patterns of brain alterations at the group level. Recently, a neuropathological staging system for amyotrophic lateral sclerosis has shown that amyotrophic lateral sclerosis may disseminate in a sequential regional pattern during four disease stages. The objective of the present study was to apply a new methodological diffusion tensor imaging-based approach to automatically analyse in vivo the fibre tracts that are prone to be involved at each neuropathological stage of amyotrophic lateral sclerosis. Two data samples, consisting of 130 diffusion tensor imaging data sets acquired at 1.5 T from 78 patients with amyotrophic lateral sclerosis and 52 control subjects; and 55 diffusion-tensor imaging data sets at 3.0 T from 33 patients with amyotrophic lateral sclerosis and 22 control subjects, were analysed by a tract of interest-based fibre tracking approach to analyse five tracts that become involved during the course of amyotrophic lateral sclerosis: the corticospinal tract (stage 1); the corticorubral and the corticopontine tracts (stage 2); the corticostriatal pathway (stage 3); the proximal portion of the perforant path (stage 4); and two reference pathways. The statistical analyses of tracts of interest showed differences between patients with amyotrophic lateral sclerosis and control subjects for all tracts. The significance level of the comparisons at the group level was lower, the higher the disease stage with corresponding involved fibre tracts. Both the clinical phenotype as assessed by the amyotrophic lateral sclerosis functional rating scale-revised and disease duration correlated significantly with the resulting staging scheme. In summary, the tract of interest-based technique allowed for individual analysis of predefined tract structures, thus making it possible to image in vivo the disease stages in amyotrophic lateral sclerosis. This approach can be used not only for individual clinical work-up purposes, but enlarges the spectrum of potential non-invasive surrogate markers as a neuroimaging-based read-out for amyotrophic lateral sclerosis studies within a clinical context. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test.The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website.In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
Analysis of Trace Siderophile Elements at High Spatial Resolution Using Laser Ablation ICP-MS
NASA Astrophysics Data System (ADS)
Campbell, A. J.; Humayun, M.
2006-05-01
Laser ablation inductively coupled plasma mass spectrometry is an increasingly important method of performing spatially resolved trace element analyses. Over the last several years we have applied this technique to measure siderophile element distributions at the ppm level in a variety of natural and synthetic samples, especially metallic phases in meteorites and experimental run products intended for trace element partitioning studies. These samples frequently require trace element analyses to be made at a finer spatial resolution (25 microns or better) than is typically attained using LA-ICP-MS. In this presentation we review analytical protocols that were developed to optimize the LA-ICP-MS measurements for high spatial resolution. Particular attention is paid to the trade-offs involving sensitivity, ablation pit depth and diameter, background levels, and number of elements measured. To maximize signal/background ratios and avoid difficulties associated with ablating to depths greater than the ablation pit diameter, measurements involved integration of rapidly varying, transient but well-behaved signals. The abundances of platinum group elements and other siderophile elements in ferrous metals were calibrated against well-characterized standards, including iron meteorites and NIST certified steels. The calibrations can be set against the known abundance of an independently determined element, but normalization to 100 percent can also be employed, and was more useful in many circumstances. Evaluation of uncertainties incorporated counting statistics as well as a measure of instrumental uncertainty, determined by replicate analyses of the standards. These methods have led to a number of insights into the formation and chemical processing of metal in the early solar system.
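A rough sketch of the normalization-to-100-percent quantification mentioned above is given below: background-corrected count rates are converted with relative sensitivity factors derived from a standard and then scaled so that the analysed elements sum to 100 wt.%. The element list, count rates and sensitivity factors are illustrative assumptions.

```python
import numpy as np

elements = ["Fe", "Ni", "Ir", "Pt"]
counts   = np.array([9.0e6, 6.5e5, 1.2e3, 2.4e3])   # background-corrected counts
rsf      = np.array([1.00, 0.92, 1.10, 1.05])        # relative sensitivity factors

conc = counts / rsf
conc = 100.0 * conc / conc.sum()                      # normalize to 100 wt.%
for el, c in zip(elements, conc):
    print(f"{el}: {c:.4g} wt.%")
```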
Liu, Guorui; Cai, Zongwei; Zheng, Minghui; Jiang, Xiaoxu; Nie, Zhiqiang; Wang, Mei
2015-01-01
Identifying marker congeners of unintentionally produced polychlorinated naphthalenes (PCNs) from industrial thermal sources might be useful for predicting total PCN (∑2-8PCN) emissions by the determination of only indicator congeners. In this study, potential indicator congeners were identified based on the PCN data in 122 stack gas samples from over 60 plants involved in more than ten industrial thermal sources reported in our previous case studies. Linear regression analyses identified that the concentrations of CN27/30, CN52/60, and CN66/67 correlated significantly with ∑2-8PCN (R2 = 0.77, 0.80, and 0.58, respectively; n = 122, p < 0.05), indicating that these congeners might be good candidates for indicator congeners. Equations describing the relationships between the indicators and ∑2-8PCN were established. The linear regression analyses involving the 122 samples showed that the relationships between the indicator congeners and ∑2-8PCN were not significantly affected by factors such as industry type, raw materials used, or operating conditions. Hierarchical cluster analysis and similarity calculations for the 122 stack gas samples were used to group the samples and to evaluate their similarities and differences based on the PCN homolog distributions from the different industrial thermal sources. Generally, the fractions of less chlorinated homologs (di-, tri-, and tetra-homologs) were much higher than those of more chlorinated homologs for up to 111 stack gas samples contained in groups 1 and 2, indicating the dominance of lower chlorinated homologs in stack gas from industrial thermal sources.
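The indicator-congener relationships were established by simple linear regression; a minimal sketch of such a fit, with invented concentrations rather than the stack gas data, is shown below.

```python
import numpy as np
from scipy import stats

# Illustrative concentrations: an indicator congener (e.g. CN52/60) versus the
# total PCN concentration in a set of stack gas samples.
indicator = np.array([0.8, 1.5, 2.2, 3.1, 4.0, 5.3, 6.1])
total_pcn = np.array([4.1, 7.9, 11.2, 15.8, 20.5, 26.7, 30.2])

res = stats.linregress(indicator, total_pcn)
print(f"total PCN ~ {res.slope:.2f} * indicator + {res.intercept:.2f}, "
      f"R^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.3g}")
```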
Recio-Rodriguez, Jose I; Gomez-Marcos, Manuel A; Patino Alonso, Maria C; Martin-Cantera, Carlos; Ibañez-Jalon, Elisa; Melguizo-Bejar, Amor; Garcia-Ortiz, Luis
2013-12-01
The present study analyses the relation between smoking status and the parameters used to assess vascular structure and function. This cross-sectional, multi-centre study involved a random sample of 1553 participants from the EVIDENT study. The smoking status, peripheral augmentation index and ankle-brachial index were measured in all participants. In a small subset of the main population (265 participants), the carotid intima-media thickness and pulse wave velocity were also measured. After controlling for the effect of age, sex and other risk factors, present smokers have higher values of carotid intima-media thickness (p = 0.011). Along the same lines, current smokers have higher values of pulse wave velocity and lower mean values of ankle-brachial index but without statistical significance in both cases. Among the parameters of vascular structure and function analysed, only the IMT shows association with the smoking status, after adjusting for confounders.
Statistical methods for incomplete data: Some results on model misspecification.
McIsaac, Michael; Cook, R J
2017-02-01
Inverse probability weighted estimating equations and multiple imputation are two of the most studied frameworks for dealing with incomplete data in clinical and epidemiological research. We examine the limiting behaviour of estimators arising from inverse probability weighted estimating equations, augmented inverse probability weighted estimating equations and multiple imputation when the requisite auxiliary models are misspecified. We compute limiting values for settings involving binary responses and covariates and illustrate the effects of model misspecification using simulations based on data from a breast cancer clinical trial. We demonstrate that, even when both auxiliary models are misspecified, the asymptotic biases of double-robust augmented inverse probability weighted estimators are often smaller than the asymptotic biases of estimators arising from complete-case analyses, inverse probability weighting or multiple imputation. We further demonstrate that use of inverse probability weighting or multiple imputation with slightly misspecified auxiliary models can actually result in greater asymptotic bias than the use of naïve, complete case analyses. These asymptotic results are shown to be consistent with empirical results from simulation studies.
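A minimal sketch of inverse probability weighting for a partially observed outcome is shown below: a model for the probability of being observed is fitted, and complete cases are weighted by the inverse of that probability. The simulated data and the simple mean estimand are illustrative; the paper's estimating-equation setting is more general.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5000
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)
p_obs = 1 / (1 + np.exp(-(0.3 + 1.2 * x)))       # missingness depends on x
observed = rng.random(n) < p_obs

# Fit the (auxiliary) missingness model and weight complete cases by 1/p.
fit = sm.GLM(observed.astype(float), sm.add_constant(x),
             family=sm.families.Binomial()).fit()
w = 1 / fit.predict(sm.add_constant(x))[observed]

print(f"true mean {y.mean():.3f}, "
      f"complete-case {y[observed].mean():.3f}, "          # biased
      f"IPW {np.average(y[observed], weights=w):.3f}")      # approximately unbiased
```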
NASA Technical Reports Server (NTRS)
Fuelberg, H. E.; Meyer, P. J.
1984-01-01
Structure and correlation functions are used to describe atmospheric variability during the 10-11 April day of AVE-SESAME 1979 that coincided with the Red River Valley tornado outbreak. The special mesoscale rawinsonde data are employed in calculations involving temperature, geopotential height, horizontal wind speed and mixing ratio. Functional analyses are performed in both the lower and upper troposphere for the composite 24 h experiment period and at individual 3 h observation times. Results show that mesoscale features are prominent during the composite period. Fields of mixing ratio and horizontal wind speed exhibit the greatest amounts of small-scale variance, whereas temperature and geopotential height contain the least. Results for the nine individual times show that small-scale variance is greatest during the convective outbreak. The functions also are used to estimate random errors in the rawinsonde data. Finally, sensitivity analyses are presented to quantify confidence limits of the structure functions.
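A hedged sketch of a second-order structure function, of the kind used to describe spatial variability (and, at small lags, random measurement error), is shown below for a synthetic 1-D transect; the real analysis used 2-D mesoscale rawinsonde fields.

```python
import numpy as np

def structure_function(values, positions, lags, tol):
    """Second-order structure function D(h) = mean[(x(s+h) - x(s))**2]."""
    d = np.abs(positions[:, None] - positions[None, :])
    sq = (values[:, None] - values[None, :]) ** 2
    return np.array([sq[np.abs(d - h) < tol].mean() for h in lags])

# Illustrative transect of mixing ratio observations (g/kg) at positions in km.
pos = np.linspace(0, 500, 26)
vals = 8 + 2 * np.sin(pos / 80) + np.random.default_rng(5).normal(0, 0.4, pos.size)
print(structure_function(vals, pos, lags=[40, 80, 160, 320], tol=15))
```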
Review of Research Reporting Guidelines for Radiology Researchers.
Cronin, Paul; Rawson, James V
2016-05-01
Prior articles have reviewed reporting guidelines and study evaluation tools for clinical research. However, only some of the many available accepted reporting guidelines at the Enhancing the QUAlity and Transparency Of health Research Network have been discussed in previous reports. In this paper, we review the key Enhancing the QUAlity and Transparency Of health Research reporting guidelines that have not been previously discussed. The study types include diagnostic and prognostic studies, reliability and agreement studies, observational studies, analytical and descriptive, experimental studies, quality improvement studies, qualitative research, health informatics, systematic reviews and meta-analyses, economic evaluations, and mixed methods studies. There are also sections on study protocols, and statistical analyses and methods. In each section, there is a brief overview of the study type, and then the reporting guideline(s) that are most applicable to radiology researchers including radiologists involved in health services research are discussed. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Chireshe, Regis; Rutondoki, Edward Ntare; Ojwang, Paul
2010-12-01
The study investigated perceptions of the availability and effectiveness of HIV/AIDS awareness and intervention programmes by people with disabilities in Uganda. Participants (N=95) were made up of 15 leaders of disabled people's organisations (DPOs) and 80 people with disabilities (PWDs). A survey design which used both quantitative and qualitative research methods was adopted. A questionnaire was used for leaders of DPOs while focus group discussions (FGDs) were held with the rest of the participants. Descriptive statistics were used to analyse the quantitative data. The qualitative data were analysed by means of a content analysis. The study found that although PWDs were aware of the HIV/AIDS pandemic, they felt discriminated against on HIV/AIDS issues. The PWDs had difficulties in accessing HIV/AIDS services, mainly because of communication problems. Results further revealed that the HIV/AIDS policy on disability was not very clear. The PWDs requested full involvement in HIV/AIDS advocacy and training programmes. Recommendations were made.
Between Order and Disorder: A ‘Weak Law’ on Recent Electoral Behavior among Urban Voters?
Borghesi, Christian; Chiche, Jean; Nadal, Jean-Pierre
2012-01-01
A new viewpoint on electoral involvement is proposed from the study of the statistics of the proportions of abstentionists, blank and null, and votes according to list of choices, in a large number of national elections in different countries. Considering 11 countries without compulsory voting (Austria, Canada, Czech Republic, France, Germany, Italy, Mexico, Poland, Romania, Spain, and Switzerland), a stylized fact emerges for the most populated cities when one computes the entropy associated to the three ratios, which we call the entropy of civic involvement of the electorate. The distribution of this entropy (over all elections and countries) appears to be sharply peaked near a common value. This almost common value is typically shared since the 1970s by electorates of the most populated municipalities, and this despite the wide disparities between voting systems and types of elections. Performing different statistical analyses, we notably show that this stylized fact reveals particular correlations between the blank/null votes and abstentionists ratios. We suggest that the existence of this hidden regularity, which we propose to coin as a ‘weak law on recent electoral behavior among urban voters’, reveals an emerging collective behavioral norm characteristic of urban citizen voting behavior in modern democracies. Analyzing exceptions to the rule provides insights into the conditions under which this normative behavior can be expected to occur. PMID:22848365
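A minimal sketch of the entropy computed from the three participation ratios is shown below; the normalization and the example shares are illustrative assumptions, and the paper's exact definition of the 'entropy of civic involvement' may differ in detail.

```python
import numpy as np

def civic_entropy(abstention, blank_null, expressed):
    """Shannon entropy of the three electoral participation ratios."""
    p = np.array([abstention, blank_null, expressed], dtype=float)
    p = p / p.sum()
    return float(-(p * np.log(p)).sum())

# Illustrative municipality-level shares of registered voters:
print(civic_entropy(abstention=0.35, blank_null=0.04, expressed=0.61))
```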
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
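Once single-case results are expressed on the between-groups d scale, standard power machinery can be applied when planning new studies; the sketch below assumes an illustrative d of 0.6 and uses statsmodels rather than the macros distributed by the authors.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

d = 0.6   # illustrative effect size from a hypothetical meta-analysis
n_per_group = TTestIndPower().solve_power(effect_size=d, alpha=0.05, power=0.80)
print(f"~{int(np.ceil(n_per_group))} cases per group for 80% power")
```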
2013-01-01
Background The availability of gene expression data that corresponds to pig immune response challenges provides compelling material for the understanding of the host immune system. Meta-analysis offers the opportunity to confirm and expand our knowledge by combining and studying at one time a vast set of independent studies creating large datasets with increased statistical power. In this study, we performed two meta-analyses of porcine transcriptomic data: i) scrutinized the global immune response to different challenges, and ii) determined the specific response to Porcine Reproductive and Respiratory Syndrome Virus (PRRSV) infection. To gain an in-depth knowledge of the pig response to PRRSV infection, we used an original approach comparing and eliminating the common genes from both meta-analyses in order to identify genes and pathways specifically involved in the PRRSV immune response. The software Pointillist was used to cope with the highly disparate data, circumventing the biases generated by the specific responses linked to single studies. Next, we used the Ingenuity Pathways Analysis (IPA) software to survey the canonical pathways, biological functions and transcription factors found to be significantly involved in the pig immune response. We used 779 chips corresponding to 29 datasets for the pig global immune response and 279 chips obtained from 6 datasets for the pig response to PRRSV infection, respectively. Results The pig global immune response analysis showed interconnected canonical pathways involved in the regulation of translation and mitochondrial energy metabolism. Biological functions revealed in this meta-analysis were centred around translation regulation, which included protein synthesis, RNA-post transcriptional gene expression and cellular growth and proliferation. Furthermore, the oxidative phosphorylation and mitochondria dysfunctions, associated with stress signalling, were highly regulated. Transcription factors such as MYCN, MYC and NFE2L2 were found in this analysis to be potentially involved in the regulation of the immune response. The host specific response to PRRSV infection engendered the activation of well-defined canonical pathways in response to pathogen challenge such as TREM1, toll-like receptor and hyper-cytokinemia/ hyper-chemokinemia signalling. Furthermore, this analysis brought forth the central role of the crosstalk between innate and adaptive immune response and the regulation of anti-inflammatory response. The most significant transcription factor potentially involved in this analysis was HMGB1, which is required for the innate recognition of viral nucleic acids. Other transcription factors like interferon regulatory factors IRF1, IRF3, IRF5 and IRF8 were also involved in the pig specific response to PRRSV infection. Conclusions This work reveals key genes, canonical pathways and biological functions involved in the pig global immune response to diverse challenges, including PRRSV infection. The powerful statistical approach led us to consolidate previous findings as well as to gain new insights into the pig immune response either to common stimuli or specifically to PRRSV infection. PMID:23552196
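As a loose illustration of combining evidence across independent studies, the sketch below pools p-values for a single gene with Fisher's and Stouffer's methods; the actual meta-analysis used the Pointillist framework, which is considerably more elaborate, and the p-values are invented.

```python
from scipy import stats

p_values = [0.04, 0.20, 0.01, 0.08]    # one gene, four independent studies
_, p_fisher = stats.combine_pvalues(p_values, method="fisher")
_, p_stouffer = stats.combine_pvalues(p_values, method="stouffer")
print(p_fisher, p_stouffer)
```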
Beretzky, Zsuzsanna; Péntek, Márta
2017-12-01
Informal care plays an important role in ageing societies. Our aim was to analyse informal care use and its determinants among patients with chronic diseases in Hungary. Patient-level data from previous studies in 14 diagnoses were analysed, including patients' EQ-5D-3L health status. Descriptive statistics were performed and a linear regression model was built to analyse determinants of informal care time. 2047 patients (female: 58%) with a mean age of 58.9 (SD = 16.3) years and an EQ-5D-3L index score of 0.64 (SD = 0.33) were involved. 27% received informal care; the average time of care was 7.54 (SD = 26.36) hours/week. Both the rate of informal care use and its time differed significantly between the diagnoses (p<0.05), with the highest values in dementia, Parkinson's disease and chronic inflammatory immunological diseases. Significant determinants were age, EQ-5D-3L scores, gender and certain diagnosis dummies (R2 = 0.111). Informal care use is significant in chronic debilitating conditions. Future studies are encouraged to reveal unmet needs, preferences and further explanatory factors. Orv Hetil. 2017; 158(52): 2068-2078.
Social Mediation of Persuasive Media in Adolescent Substance Prevention
Crano, William D.; Alvaro, Eusebio M.; Tan, Cara N.; Siegel, Jason T.
2017-01-01
Social commentary about prevention messages may affect their likelihood of acceptance. To investigate this possibility, student participants (N = 663) viewed three anti-marijuana advertisements, each followed immediately by videotaped discussions involving four adults or four adolescents using either extreme or moderate language in their positive commentaries. The commentaries were expected to affect participants' perceptions of the extent to which the ads were designed to control their behavior (perceived control), which was hypothesized to inhibit persuasion. Two indirect effects analyses were conducted. Marijuana attitudes and usage intentions were the outcome variables. Both analyses revealed statistically significant source by language interactions on participants' perceived control (both p < .02). Further analyses revealed significant indirect effects of language extremity on attitudes and intentions through perceived control with adult, but not peer, sources (both p < .05). These perceptions were associated with more negative marijuana attitudes and diminished usage intentions when adults used moderate (vs. extreme) language in their favorable ad commentaries (both p < .05). The findings may facilitate development of more effective prevention methods that emphasize the importance of the role of perceived control in persuasion, and the impact of interpersonal communication variations on acceptance of media-transmitted prevention messages. PMID:28301181
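A hedged sketch of a simple indirect-effect (mediation) analysis with a percentile bootstrap is shown below; the simulated variables and effect sizes are assumptions, and the published analyses used a dedicated indirect-effects procedure rather than this hand-rolled version.

```python
import numpy as np
import statsmodels.api as sm

# Simulated mediation: language extremity -> perceived control -> attitudes.
rng = np.random.default_rng(6)
n = 400
extreme = rng.integers(0, 2, n)                       # 0 = moderate, 1 = extreme
control = 0.5 * extreme + rng.normal(size=n)          # perceived control
attitude = -0.4 * control + rng.normal(size=n)        # anti-drug attitude

def indirect(idx):
    # a path: extremity -> perceived control
    a = sm.OLS(control[idx], sm.add_constant(extreme[idx])).fit().params[1]
    # b path: perceived control -> attitude, adjusting for extremity
    b = sm.OLS(attitude[idx], sm.add_constant(
        np.column_stack([control[idx], extreme[idx]]))).fit().params[1]
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
print("indirect effect:", indirect(np.arange(n)),
      "95% CI:", np.percentile(boot, [2.5, 97.5]))
```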
Akiki, Teddy J; Averill, Christopher L; Wrocklage, Kristen M; Scott, J Cobb; Averill, Lynnette A; Schweinsburg, Brian; Alexander-Bloch, Aaron; Martini, Brenda; Southwick, Steven M; Krystal, John H; Abdallah, Chadi G
2018-08-01
Disruption in the default mode network (DMN) has been implicated in numerous neuropsychiatric disorders, including posttraumatic stress disorder (PTSD). However, studies have largely been limited to seed-based methods and involved inconsistent definitions of the DMN. Recent advances in neuroimaging and graph theory now permit the systematic exploration of intrinsic brain networks. In this study, we used resting-state functional magnetic resonance imaging (fMRI), diffusion MRI, and graph theoretical analyses to systematically examine the DMN connectivity and its relationship with PTSD symptom severity in a cohort of 65 combat-exposed US Veterans. We employed metrics that index overall connectivity strength, network integration (global efficiency), and network segregation (clustering coefficient). Then, we conducted a modularity and network-based statistical analysis to identify DMN regions of particular importance in PTSD. Finally, structural connectivity analyses were used to probe whether white matter abnormalities are associated with the identified functional DMN changes. We found decreased DMN functional connectivity strength to be associated with increased PTSD symptom severity. Further topological characterization suggests decreased functional integration and increased segregation in subjects with severe PTSD. Modularity analyses suggest a spared connectivity in the posterior DMN community (posterior cingulate, precuneus, angular gyrus) despite overall DMN weakened connections with increasing PTSD severity. Edge-wise network-based statistical analyses revealed a prefrontal dysconnectivity. Analysis of the diffusion networks revealed no alterations in overall strength or prefrontal structural connectivity. DMN abnormalities in patients with severe PTSD symptoms are characterized by decreased overall interconnections. On a finer scale, we found a pattern of prefrontal dysconnectivity, but increased cohesiveness in the posterior DMN community and relative sparing of connectivity in this region. The DMN measures established in this study may serve as a biomarker of disease severity and could have potential utility in developing circuit-based therapeutics. Published by Elsevier Inc.
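A minimal sketch of the graph-theoretical metrics named above (connectivity strength, global efficiency, clustering) is shown below, computed on a random symmetric matrix thresholded at an assumed value; the study derived these from fMRI time series and also used modularity and network-based statistics not shown here.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
n_nodes = 20
a = rng.uniform(-0.2, 0.8, (n_nodes, n_nodes))
corr = (a + a.T) / 2                           # stand-in for a correlation matrix
np.fill_diagonal(corr, 0)

adj = (corr > 0.4).astype(int)                 # binarize at an assumed threshold
g = nx.from_numpy_array(adj)
strength = corr[corr > 0.4].sum() / n_nodes    # mean nodal connectivity strength
print("strength:", round(strength, 3),
      "global efficiency:", round(nx.global_efficiency(g), 3),
      "mean clustering:", round(nx.average_clustering(g), 3))
```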
Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun
2014-01-01
Differences exist among the results of agricultural monitoring and crop production analyses based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed with the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was applied to correct the multiple surface reflectance datasets based on the physical characteristics, mathematical distribution properties and spatial variations obtained above. The proposed method was verified with two sets of multiple satellite images acquired over two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and for the corresponding consistency analyses and evaluation.
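The correction step can be illustrated by simple Gaussian moment matching: a coarser-scale reflectance band is rescaled so that its mean and variance match the fine-scale baseline. The sketch below is a simplification of the published method, with invented reflectance samples.

```python
import numpy as np

def match_to_baseline(coarse, baseline):
    """Adjust a coarse-scale reflectance band so its mean and variance match
    the fine-scale baseline, assuming both are approximately Gaussian."""
    return (coarse - coarse.mean()) / coarse.std() * baseline.std() + baseline.mean()

rng = np.random.default_rng(8)
fine   = rng.normal(0.23, 0.030, 10000)          # baseline, small spatial scale
coarse = rng.normal(0.26, 0.045, 10000)          # biased coarser-scale product
corrected = match_to_baseline(coarse, fine)
print(round(corrected.mean(), 3), round(corrected.std(), 3))
```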
Agüera-Ortiz, L F; Ramos-García, M; Gobartt, A L
To determine and to compare the tolerability and effectiveness of a slow escalation of the dose of rivastigmine in patients with Alzheimer's disease with respect to using it with a faster escalation. We conducted a multi-centre, naturalistic, open-label, randomised trial with 429 hospital outpatients diagnosed with Alzheimer-type dementia (according to DSM-IV and NINCDS-ADRA criteria) and in whom treatment with rivastigmine was clinically indicated. Two study groups were established: slow escalation and fast escalation (in accordance with usual clinical practice); effectiveness and tolerability variables were analysed in the two groups, as was the proportion of patients who reached therapeutic doses (> 6 mg/day). The scores obtained on the CGI, MMSE, NPI and Barthel index scales were analysed, together with adverse events and reactions concerning spontaneous communication, and scores on the UKU scale. The slow escalation group displayed slightly higher percentages of sub-therapeutic anticipated interruptions than the fast escalation group (chi-square test; p < 0.05). On comparing the two treatment groups, no statistically significant differences were observed for the evolution of the scores on the different scales of effectiveness; no statistically significant differences were found between the two groups in the safety and tolerability analyses (chi-square test, exact test; p > 0.05) for most of the parameters that were studied (adverse reactions in spontaneous communication and the modified UKU scale). Slow escalation of the dose of rivastigmine did not display greater effectiveness or tolerability in comparison to an escalation applied in accordance with usual clinical practice.
Kahl, W-A; Dilissen, N; Hidas, K; Garrido, C J; López-Sánchez-Vizcaíno, V; Román-Alpiste, M J
2017-11-01
We reconstruct the 3-D microstructure of centimetre-sized olivine crystals in rocks from the Almirez ultramafic massif (SE Spain) using combined X-ray micro computed tomography (μ-CT) and electron backscatter diffraction (EBSD). The semidestructive sample treatment involves geographically oriented drill pressing of rocks and preparation of oriented thin sections for EBSD from the μ-CT scanned cores. The μ-CT results show that the mean intercept length (MIL) analyses provide reliable information on the shape preferred orientation (SPO) of texturally different olivine groups. We show that statistical interpretation of crystal preferred orientation (CPO) and SPO of olivine becomes feasible because the highest densities of the distribution of main olivine crystal axes from EBSD are aligned with the three axes of the 3-D ellipsoid calculated from the MIL analyses from μ-CT. From EBSD data we distinguish multiple CPO groups and by locating the thin sections within the μ-CT volume, we assign SPO to the corresponding olivine crystal aggregates, which confirm the results of statistical comparison. We demonstrate that the limitations of both methods (i.e. no crystal orientation data in μ-CT and no spatial information in EBSD) can be overcome, and the 3-D orientation of the crystallographic axes of olivines from different orientation groups can be successfully correlated with the crystal shapes of representative olivine grains. Through this approach one can establish the link among geological structures, macrostructure, fabric and 3-D SPO-CPO relationship at the hand specimen scale even in complex, coarse-grained geomaterials. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
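One way to summarise a preferred orientation as an ellipsoid, loosely analogous to the MIL/SPO analysis described above, is to take the eigenvectors and eigenvalues of the second-order orientation tensor of the measured axis directions; the sketch below uses synthetic vectors and is not the authors' workflow.

```python
import numpy as np

rng = np.random.default_rng(9)
# Synthetic unit vectors with a preferred alignment along x.
v = rng.normal(size=(500, 3)) * np.array([3.0, 1.5, 1.0])
v /= np.linalg.norm(v, axis=1, keepdims=True)

tensor = (v[:, :, None] * v[:, None, :]).mean(axis=0)   # 3x3 orientation tensor
eigval, eigvec = np.linalg.eigh(tensor)                 # ascending eigenvalues
print("normalized axis lengths:", np.sqrt(eigval / eigval.max()).round(2))
print("long-axis direction:", eigvec[:, -1].round(2))
```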
Brinton, Louise A; Cook, Michael B; McCormack, Valerie; Johnson, Kenneth C; Olsson, Håkan; Casagrande, John T; Cooke, Rosie; Falk, Roni T; Gapstur, Susan M; Gaudet, Mia M; Gaziano, J Michael; Gkiokas, Georgios; Guénel, Pascal; Henderson, Brian E; Hollenbeck, Albert; Hsing, Ann W; Kolonel, Laurence N; Isaacs, Claudine; Lubin, Jay H; Michels, Karin B; Negri, Eva; Parisi, Dominick; Petridou, Eleni Th; Pike, Malcolm C; Riboli, Elio; Sesso, Howard D; Snyder, Kirk; Swerdlow, Anthony J; Trichopoulos, Dimitrios; Ursin, Giske; van den Brandt, Piet A; Van Den Eeden, Stephen K; Weiderpass, Elisabete; Willett, Walter C; Ewertz, Marianne; Thomas, David B
2014-03-01
The etiology of male breast cancer is poorly understood, partly because of its relative rarity. Although genetic factors are involved, less is known regarding the role of anthropometric and hormonally related risk factors. In the Male Breast Cancer Pooling Project, a consortium of 11 case-control and 10 cohort investigations involving 2405 case patients (n = 1190 from case-control and n = 1215 from cohort studies) and 52013 control subjects, individual participant data were harmonized and pooled. Unconditional logistic regression generated study design-specific (case-control/cohort) odds ratios (ORs) and 95% confidence intervals (CIs), with exposure estimates combined using fixed effects meta-analysis. All statistical tests were two-sided. Risk was statistically significantly associated with weight (highest/lowest tertile: OR = 1.36; 95% CI = 1.18 to 1.57), height (OR = 1.18; 95% CI = 1.01 to 1.38), and body mass index (BMI; OR = 1.30; 95% CI = 1.12 to 1.51), with evidence that recent rather than distant BMI was the strongest predictor. Klinefelter syndrome (OR = 24.7; 95% CI = 8.94 to 68.4) and gynecomastia (OR = 9.78; 95% CI = 7.52 to 12.7) were also statistically significantly associated with risk, relations that were independent of BMI. Diabetes also emerged as an independent risk factor (OR = 1.19; 95% CI = 1.04 to 1.37). There were also suggestive relations with cryptorchidism (OR = 2.18; 95% CI = 0.96 to 4.94) and orchitis (OR = 1.43; 95% CI = 1.02 to 1.99). Although age at onset of puberty and histories of infertility were unrelated to risk, never having had children was statistically significantly related (OR = 1.29; 95% CI = 1.01 to 1.66). Among individuals diagnosed at older ages, a history of fractures was statistically significantly related (OR = 1.41; 95% CI = 1.07 to 1.86). Consistent findings across case-control and cohort investigations, complemented by pooled analyses, indicated important roles for anthropometric and hormonal risk factors in the etiology of male breast cancer. Further investigation should focus on potential roles of endogenous hormones.
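A sketch of the two analytic steps described above: a study-specific unconditional logistic regression yielding an odds ratio, followed by fixed-effects (inverse-variance) pooling of the log odds ratios across studies. The data frame `d`, the variable names, and the numeric inputs are hypothetical assumptions, not the pooled-project data.

```r
# Step 1: study-specific unconditional logistic regression (hypothetical data
# frame 'd' with case status, a BMI-tertile factor, and adjustment covariates).
fit <- glm(case ~ bmi_tertile + age, family = binomial, data = d)
est <- summary(fit)$coefficients   # log odds ratios and their standard errors

# Step 2: fixed-effect (inverse-variance) pooling of study-specific log odds ratios.
pool_fixed <- function(logor, se) {
  w <- 1 / se^2
  pooled <- sum(w * logor) / sum(w)
  se_pooled <- sqrt(1 / sum(w))
  exp(c(OR = pooled,
        lower = pooled - 1.96 * se_pooled,
        upper = pooled + 1.96 * se_pooled))
}

# Illustrative call with made-up log ORs and SEs from three studies:
pool_fixed(logor = c(0.31, 0.25, 0.40), se = c(0.10, 0.14, 0.12))
```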
Aurora, R. Nisha; Putcha, Nirupama; Swartz, Rachel; Punjabi, Naresh M.
2016-01-01
Background Obstructive sleep apnea is a prevalent yet underdiagnosed condition associated with cardiovascular morbidity and mortality. Home sleep testing offers an efficient means for diagnosing obstructive sleep apnea but has primarily been deployed in clinical samples with a high pretest probability. The current study sought to assess if obstructive sleep apnea can be diagnosed with home sleep testing in a non-referred sample without involvement of a sleep medicine specialist. Methods A study of community-based adults with untreated obstructive sleep apnea was undertaken. Misclassification of disease severity based on home sleep testing with and without involvement of a sleep medicine specialist was assessed, and agreement was characterized using scatter plots, Pearson's correlation coefficient, Bland-Altman analysis, and the kappa statistic. Analyses were also conducted to assess whether any observed differences varied as a function of pretest probability of obstructive sleep apnea or subjective sleepiness. Results The sample consisted of 191 subjects with over half (56.5%) having obstructive sleep apnea. Without involvement of a sleep medicine specialist, obstructive sleep apnea was not identified in only 5.8% of the sample. Analyses comparing the categorical assessment of disease severity with and without a sleep medicine specialist showed that in total, 32 subjects (16.8%) were misclassified. Agreement in the disease severity with and without a sleep medicine specialist was not influenced by the pretest probability or daytime sleep tendency. Conclusion Obstructive sleep apnea can be reliably identified with home sleep testing in a non-referred sample irrespective of the pretest probability of the disease. PMID:26968467
Aurora, R Nisha; Putcha, Nirupama; Swartz, Rachel; Punjabi, Naresh M
2016-07-01
Obstructive sleep apnea is a prevalent yet underdiagnosed condition associated with cardiovascular morbidity and mortality. Home sleep testing offers an efficient means for diagnosing obstructive sleep apnea but has been deployed primarily in clinical samples with a high pretest probability. The present study sought to assess whether obstructive sleep apnea can be diagnosed with home sleep testing in a nonreferred sample without involvement of a sleep medicine specialist. A study of community-based adults with untreated obstructive sleep apnea was undertaken. Misclassification of disease severity according to home sleep testing with and without involvement of a sleep medicine specialist was assessed, and agreement was characterized using scatter plots, Pearson's correlation coefficient, Bland-Altman analysis, and the κ statistic. Analyses were also conducted to assess whether any observed differences varied as a function of pretest probability of obstructive sleep apnea or subjective sleepiness. The sample consisted of 191 subjects, with more than half (56.5%) having obstructive sleep apnea. Without involvement of a sleep medicine specialist, obstructive sleep apnea was not identified in only 5.8% of the sample. Analyses comparing the categorical assessment of disease severity with and without a sleep medicine specialist showed that in total, 32 subjects (16.8%) were misclassified. Agreement in the disease severity with and without a sleep medicine specialist was not influenced by the pretest probability or daytime sleep tendency. Obstructive sleep apnea can be reliably identified with home sleep testing in a nonreferred sample, irrespective of the pretest probability of the disease. Copyright © 2016 Elsevier Inc. All rights reserved.
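A minimal base-R sketch of two of the agreement analyses named in the abstracts above: Bland-Altman limits of agreement for continuous apnea-hypopnea indices and an unweighted Cohen's kappa for categorical severity agreement. The AHI values and the severity cut points are hypothetical and illustrative only.

```r
# Hypothetical apnea-hypopnea indices scored with and without specialist review.
ahi_specialist <- c(4, 12, 35, 8, 22, 51, 17, 3)
ahi_auto       <- c(5, 10, 33, 9, 25, 48, 20, 2)

# Bland-Altman: mean difference (bias) and 95% limits of agreement.
diffs <- ahi_auto - ahi_specialist
bias  <- mean(diffs)
loa   <- bias + c(-1.96, 1.96) * sd(diffs)

# Cohen's kappa for agreement on severity categories (computed directly).
severity <- function(x) cut(x, c(-Inf, 5, 15, 30, Inf),
                            labels = c("none", "mild", "moderate", "severe"))
tab <- table(severity(ahi_specialist), severity(ahi_auto))
po  <- sum(diag(tab)) / sum(tab)                       # observed agreement
pe  <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2   # chance agreement
kappa <- (po - pe) / (1 - pe)
```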
King, Keith A; Vidourek, Rebecca A; Merianos, Ashley L; Bartsch, Lauren A
2015-01-01
Marijuana use rates remain higher among Hispanic youth compared to youth from other ethnic groups. The purpose of the study was to examine if sex, age, authoritarian parenting, perceived school experiences, lifetime depression, legal involvement, and perceived social norms of marijuana use predicted recent marijuana use and past year marijuana use among Hispanic youth. The participants of this study were a nationwide sample of Hispanic youth (n = 3,457) in the United States. A secondary data analysis of the 2012 National Survey on Drug Use and Health was performed. Unadjusted odds ratios were computed via univariate logistic regression analyses and all statistically significant variables were retained and included in the final multiple logistic regression analyses. Recent marijuana use was operationally defined as use within the past 30 days, and marijuana use in the past year was defined as use within the past year. Results indicated that 7.5% of Hispanic youth used within the past month and 14.5% of Hispanic youth used within the past year. Results revealed that significant predictors for recent use were age, authoritarian parenting, perceived school experiences, legal involvement, and perceived social norms of youth marijuana use. Predictors for past year were age, perceived school experiences, legal involvement, and perceived social norms of youth marijuana use. Findings from this study can be used to address the public health problem of marijuana use among Hispanic youth that is ultimately contributing to health disparities among this ethnic group nationwide. Recommendations for future studies are included.
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
The misleading aspect of the statistical analyses used by Dommenget and Latif, which raises concerns about some of the reported climate modes, is demonstrated. Using simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown, and the limitations of varimax and regression analyses in capturing this climate mode are then discussed.
2014-01-01
Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
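A hedged sketch of the frequentist approach described above using the metafor package, whose `confint()` method for fitted random-effects meta-regressions reports a Q-profile-type confidence interval for the residual between-study variance. The estimates, sampling variances, and binary covariate below are hypothetical.

```r
library(metafor)

# Hypothetical study-level effect estimates (yi), sampling variances (vi),
# and a binary covariate (xi) for a random-effects meta-regression.
yi <- c(0.10, 0.35, 0.22, 0.55, 0.41, 0.18)
vi <- c(0.020, 0.015, 0.030, 0.025, 0.018, 0.022)
xi <- c(0, 0, 0, 1, 1, 1)

res <- rma(yi, vi, mods = ~ xi, method = "REML")  # mixed-effects meta-regression
summary(res)   # moderator effect plus residual heterogeneity (tau^2, QE)
confint(res)   # Q-profile-based interval for the residual between-study variance
```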
Pham, Timothy T; Miller, Michael J; Harrison, Donald L; Lloyd, Ann E; Crosby, Kimberly M; Johnson, Jeremy L
2013-12-01
Responding to safety concerns, the American Heart Association (AHA) published guidelines for non-steroidal anti-inflammatory drug (NSAID) use in patients with pre-existing cardiovascular disease (CVD) during 2005 and revised them in 2007. In the revision, a stepped approach to pain management recommended non-selective NSAIDs over highly selective NSAIDs. This research evaluated NSAID prescribing during and after guideline dissemination. A cross-sectional sample of 8666 adult, community-based practice visits with one NSAID prescription representing approximately 305 million visits from the National Ambulatory Medical Care Survey (NAMCS) from 2005 to 2010 was studied. Multivariable logistic regression controlling for patient, provider and visit characteristics assessed the associations between diagnosis of CVD and NSAID type prescribed during each calendar year. Visits were stratified by arthritis diagnosis to model short-term/intermittent and long-term NSAID use. Approximately one-third (36.8%) of visits involving an NSAID prescription included at least one of four diagnoses for CVD (i.e. hypertension, congestive heart failure, ischaemic heart disease or cerebrovascular disease). Visits involving a CVD diagnosis had increased odds of a prescription for celecoxib, a highly selective NSAID, overall [adjusted odds ratio (AOR) = 1.29, 95% confidence interval (CI): 1.06-1.57] and in the subgroup of visits without an arthritis diagnosis (AOR = 1.45, 95% CI: 1.11-1.89). Results were not statistically significant for visits with an arthritis diagnosis (AOR = 1.10, 95% CI: 0.47-2.57). When analysed by year, the relationship was statistically significant in 2005 and 2006, but not statistically significant in each subsequent year. National prescribing trends suggest partial implementation of AHA guidelines for NSAID prescribing in CVD from 2005 to 2010. © 2012 John Wiley & Sons Ltd.
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed - 19 periods of streamflow multiplied by the 3 streamflow statistics plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous to Galveston Bay in the Houston metropolitan area. Lastly, the most western river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
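A minimal sketch of a nonparametric Kendall's tau monotonic trend test of the kind applied above, run on an annual streamflow statistic against time. The synthetic series below is illustrative only and is not gage data.

```r
# Synthetic annual mean daily discharge (cubic feet per second), for illustration.
set.seed(1)
year     <- 1960:2009
q_annual <- 500 - 1.5 * (year - 1960) + rnorm(length(year), sd = 40)

# Kendall's tau between the streamflow statistic and time: a significant negative
# (positive) tau indicates a monotonic downward (upward) trend.
cor.test(q_annual, year, method = "kendall")
```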
Reliability of Computerized Neurocognitive Tests for Concussion Assessment: A Meta-Analysis.
Farnsworth, James L; Dargo, Lucas; Ragan, Brian G; Kang, Minsoo
2017-09-01
Although widely used, computerized neurocognitive tests (CNTs) have been criticized because of low reliability and poor sensitivity. A systematic review was published summarizing the reliability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores; however, this was limited to a single CNT. Expansion of the previous review to include additional CNTs and a meta-analysis is needed. Therefore, our purpose was to analyze reliability data for CNTs using meta-analysis and examine moderating factors that may influence reliability. A systematic literature search (key terms: reliability, computerized neurocognitive test, concussion) of electronic databases (MEDLINE, PubMed, Google Scholar, and SPORTDiscus) was conducted to identify relevant studies. Studies were included if they met all of the following criteria: used a test-retest design, involved at least 1 CNT, provided sufficient statistical data to allow for effect-size calculation, and were published in English. Two independent reviewers investigated each article to assess inclusion criteria. Eighteen studies involving 2674 participants were retained. Intraclass correlation coefficients were extracted to calculate effect sizes and determine overall reliability. The Fisher Z transformation adjusted for sampling error associated with averaging correlations. Moderator analyses were conducted to evaluate the effects of the length of the test-retest interval, intraclass correlation coefficient model selection, participant demographics, and study design on reliability. Heterogeneity was evaluated using the Cochran Q statistic. The proportion of acceptable outcomes was greatest for the Axon Sports CogState Test (75%) and lowest for the ImPACT (25%). Moderator analyses indicated that the type of intraclass correlation coefficient model used significantly influenced effect-size estimates, accounting for 17% of the variation in reliability. The Axon Sports CogState Test, which has a higher proportion of acceptable outcomes and shorter test duration relative to other CNTs, may be a reliable option; however, future studies are needed to compare the diagnostic accuracy of these instruments.
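A hedged sketch of pooling reliability coefficients after a Fisher Z transformation, treating each ICC like a correlation coefficient, which is the simplification implied by the abstract; metafor's `escalc()` supplies the transformed values and sampling variances, and the fitted model reports the Cochran Q heterogeneity statistic. The ICCs and sample sizes are hypothetical.

```r
library(metafor)

# Hypothetical test-retest ICCs and sample sizes from k reliability studies.
icc <- c(0.62, 0.75, 0.48, 0.81, 0.70)
n   <- c(40, 85, 52, 120, 64)

# Fisher z transformation (treating each ICC as a correlation coefficient).
dat <- escalc(measure = "ZCOR", ri = icc, ni = n)

res <- rma(yi, vi, data = dat)        # random-effects pooling; Q statistic in output
predict(res, transf = transf.ztor)    # back-transform pooled estimate to the r scale
```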
Mayhew, Terry M; Lucocq, John M
2011-03-01
Various methods for quantifying cellular immunogold labelling on transmission electron microscope thin sections are currently available. All rely on sound random sampling principles and are applicable to single immunolabelling across compartments within a given cell type or between different experimental groups of cells. Although methods are also available to test for colocalization in double/triple immunogold labelling studies, so far, these have relied on making multiple measurements of gold particle densities in defined areas or of inter-particle nearest neighbour distances. Here, we present alternative two-step approaches to codistribution and colocalization assessment that merely require raw counts of gold particles in distinct cellular compartments. For assessing codistribution over aggregate compartments, initial statistical evaluation involves combining contingency table and chi-squared analyses to provide predicted gold particle distributions. The observed and predicted distributions allow testing of the appropriate null hypothesis, namely, that there is no difference in the distribution patterns of proteins labelled by different sizes of gold particle. In short, the null hypothesis is that of colocalization. The approach for assessing colabelling recognises that, on thin sections, a compartment is made up of a set of sectional images (profiles) of cognate structures. The approach involves identifying two groups of compartmental profiles that are unlabelled and labelled for one gold marker size. The proportions in each group that are also labelled for the second gold marker size are then compared. Statistical analysis now uses a 2 × 2 contingency table combined with the Fisher exact probability test. Having identified double labelling, the profiles can be analysed further in order to identify characteristic features that might account for the double labelling. In each case, the approach is illustrated using synthetic and/or experimental datasets and can be refined to correct observed labelling patterns to specific labelling patterns. These simple and efficient approaches should be of more immediate utility to those interested in codistribution and colocalization in multiple immunogold labelling investigations.
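A base-R sketch of the two steps described above: a chi-squared comparison of observed gold-particle counts against the counts predicted under the null hypothesis of identical distributions (codistribution), and a 2 × 2 Fisher exact test of colabelling proportions. All counts and compartment names are hypothetical.

```r
# Step 1: codistribution. Observed counts of two gold-marker sizes across
# compartments (hypothetical), tested against identical distributions.
counts <- matrix(c(120, 45, 30,    # marker A (e.g. 10 nm gold)
                    95, 60, 20),   # marker B (e.g. 15 nm gold)
                 nrow = 2, byrow = TRUE,
                 dimnames = list(marker = c("A", "B"),
                                 compartment = c("membrane", "cytosol", "nucleus")))
chi <- chisq.test(counts)
chi$expected    # predicted distributions under the null hypothesis (colocalization)

# Step 2: colabelling. Among profiles labelled vs unlabelled for marker A,
# compare the proportions also labelled for marker B (2 x 2 Fisher exact test).
colab <- matrix(c(12, 38,    # A-labelled profiles: B-labelled, B-unlabelled
                   5, 60),   # A-unlabelled profiles: B-labelled, B-unlabelled
                nrow = 2, byrow = TRUE)
fisher.test(colab)
```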
The Effects of Student and Text Characteristics on the Oral Reading Fluency of Middle-Grade Students
Barth, Amy E.; Tolar, Tammy D.; Fletcher, Jack M.; Francis, David
2014-01-01
We evaluated the effects of student characteristics (sight word reading efficiency, phonological decoding, verbal knowledge, level of reading ability, grade, gender) and text features (passage difficulty, length, genre, and language and discourse attributes) on the oral reading fluency of a sample of middle-school students in Grades 6–8 (N = 1,794). Students who were struggling (n = 704) and typically developing readers (n = 1,028) were randomly assigned to read five 1-min passages from each of 5 Lexile bands (within student range of 550 Lexiles). A series of multilevel analyses showed that student and text characteristics contributed uniquely to oral reading fluency rates. Student characteristics involving sight word reading efficiency and level of decoding ability accounted for more variability than reader type and verbal knowledge, with small, but statistically significant effects of grade and gender. The most significant text feature was passage difficulty level. Interactions involving student text characteristics, especially attributes involving overall ability level and difficulty of the text, were also apparent. These results support views of the development of oral reading fluency that involve interactions of student and text characteristics and highlight the importance of scaling for passage difficulty level in assessing individual differences in oral reading fluency. PMID:24567659
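A hedged sketch of a multilevel model of the kind described above, with students crossed with passages as random effects, using lme4. The data frame `orf` and all variable names are hypothetical placeholders, not the study's dataset.

```r
library(lme4)

# Hypothetical data frame 'orf': one row per student-by-passage reading, with
# words correct per minute (wcpm), student predictors, and text features.
fit <- lmer(wcpm ~ sight_word_eff + decoding + verbal_knowledge +
              grade + gender + passage_difficulty +
              (1 | student) + (1 | passage),
            data = orf)
summary(fit)   # fixed effects of student and text characteristics on fluency
```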
Frew, Paula M.; Williams, Victoria A.; Shapiro, Eve T.; Sanchez, Travis; Rosenberg, Eli S.; Fenimore, Vincent L.; Sullivan, Patrick S.
2014-01-01
Background HIV continues to be a major concern among MSM, yet Black MSM have not been enrolled in HIV research studies in proportionate numbers to White MSM. We developed an HIV prevention research brand strategy for MSM. Methods Questionnaires and focus groups were conducted with 54 participants. Descriptive statistics and chi-square analyses were performed and qualitative data were transcribed and content analyzed to identify common themes. Results Formative research results indicated that younger Black MSM (18–29 years) were less likely to think about joining prevention studies compared to older (≥30 years) Black MSM (x2 = 5.92, P = 0.015). Qualitative and quantitative results indicate four prominent themes related to brand development: (1) communication sources (message deliverer), (2) message (impact of public health messaging on perceptions of HIV research), (3) intended audience (underlying issues that influence personal relevance of HIV research), and (4) communication channels (reaching intended audiences). Conclusion The findings highlight the importance of behavioral communication translational research to effectively engage hard-to-reach populations. Despite reservations, MSM in our formative study expressed a need for active involvement and greater education to facilitate their engagement in HIV prevention research. Thus, the brand concept of “InvolveMENt” emerged. PMID:24639900
Factors predicting survival in amyotrophic lateral sclerosis patients on non-invasive ventilation.
Gonzalez Calzada, Nuria; Prats Soro, Enric; Mateu Gomez, Lluis; Giro Bulta, Esther; Cordoba Izquierdo, Ana; Povedano Panades, Monica; Dorca Sargatal, Jordi; Farrero Muñoz, Eva
2016-01-01
Non-invasive ventilation (NIV) improves quality of life and extends survival in amyotrophic lateral sclerosis (ALS) patients. However, few data exist about the factors related to survival. We aimed to assess the predictive factors that influence survival after NIV initiation. Patients who started NIV from 2000 to 2014 and were tolerant (compliance ≥ 4 hours) were included; demographic, disease-related and respiratory variables at NIV initiation were analysed. Statistical analysis was performed using the Kaplan-Meier method and Cox proportional hazards models. 213 patients were included, with a median survival from NIV initiation of 13.5 months. In univariate analysis, the identified risk factors for mortality were severity of bulbar involvement (HR 2), forced vital capacity (FVC) % (HR 0.99) and ALSFRS-R (HR 0.97). Multivariate analysis showed that bulbar involvement (HR 1.92) and ALSFRS-R (HR 0.97) were independent predictive factors of survival in patients on NIV. In our study, the two prognostic factors in ALS patients following NIV were the severity of bulbar involvement and the ALSFRS-R score at the time of NIV initiation. A better assessment of bulbar involvement, including evaluation of the upper airway, and careful titration of NIV are necessary to optimize treatment efficacy.
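A sketch of the survival analyses named above (Kaplan-Meier curves and a Cox proportional hazards model) using the survival package. The data frame `niv` and its variables are hypothetical stand-ins for the cohort described.

```r
library(survival)

# Hypothetical data frame 'niv': months of survival from NIV initiation, death
# indicator, bulbar involvement severity, FVC %, and ALSFRS-R score.
km <- survfit(Surv(time_months, died) ~ bulbar, data = niv)
plot(km)   # Kaplan-Meier survival curves by bulbar involvement

cox <- coxph(Surv(time_months, died) ~ bulbar + fvc_pct + alsfrs_r, data = niv)
summary(cox)   # hazard ratios analogous to those reported in the abstract
```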
Genetics of PCOS: A systematic bioinformatics approach to unveil the proteins responsible for PCOS.
Panda, Pritam Kumar; Rane, Riya; Ravichandran, Rahul; Singh, Shrinkhla; Panchal, Hetalkumar
2016-06-01
Polycystic ovary syndrome (PCOS) is a hormonal imbalance in women which causes problems during the menstrual cycle and in pregnancy that sometimes result in fatality. Though the genetics of PCOS is not fully understood, early diagnosis and treatment can prevent long-term effects. In this study, we have studied the proteins involved in PCOS and their structural aspects using computational tools. The proteins involved are modeled using Modeller 9v14 and Ab-initio programs. All 43 proteins responsible for PCOS were subjected to phylogenetic analysis to identify the relatedness of the proteins. Further, microarray data from PCOS datasets downloaded from GEO were analyzed to identify significant protein-coding genes responsible for PCOS, in addition to those already reported. Various statistical analyses were done using R programming to gain insight into the structural aspects of PCOS proteins that can be used as drug targets to treat PCOS and other related reproductive diseases.
2012-01-01
Background There is a need for more Comparative Effectiveness Research (CER) to strengthen the evidence base for clinical and policy decision-making. Effectiveness Guidance Documents (EGD) are targeted to clinical researchers. The aim of this EGD is to provide specific recommendations for the design of prospective acupuncture studies to support optimal use of resources for generating evidence that will inform stakeholder decision-making. Methods Document development based on multiple systematic consensus procedures (written Delphi rounds, interactive consensus workshop, international expert review). To balance aspects of internal and external validity, multiple stakeholders including patients, clinicians and payers were involved. Results Recommendations focused mainly on randomized studies and were developed for the following areas: overall research strategy, treatment protocol, expertise and setting, outcomes, study design and statistical analyses, economic evaluation, and publication. Conclusion The present EGD, based on an international consensus developed with multiple stakeholder involvement, provides the first systematic methodological guidance for future CER on acupuncture. PMID:22953730
Busch, Hauke; Boerries, Melanie; Bao, Jie; Hanke, Sebastian T; Hiss, Manuel; Tiko, Theodhor; Rensing, Stefan A
2013-01-01
Transcription factors (TFs) often trigger developmental decisions, yet their transcripts are often only moderately regulated and thus not easily detected by conventional statistics on expression data. Here we present a method that allows such genes to be identified based on trajectory analysis of time-resolved transcriptome data. As a proof of principle, we have analysed apical stem cells of filamentous moss (P. patens) protonemata that develop from leaflets upon their detachment from the plant. By our novel correlation analysis of the post-detachment transcriptome kinetics we predict five out of 1,058 TFs to be involved in the signaling leading to the establishment of pluripotency. Among the predicted regulators is the basic helix-loop-helix TF PpRSL1, which we show to be involved in the establishment of apical stem cells in P. patens. Our methodology is expected to aid analysis of key players of developmental decisions in complex plant and animal systems.
Involvement in bullying and suicidal ideation in middle adolescence: a 2-year follow-up study.
Heikkilä, Hanna-Kaisa; Väänänen, Juha; Helminen, Mika; Fröjd, Sari; Marttunen, Mauri; Kaltiala-Heino, Riittakerttu
2013-02-01
The objective of the study was to ascertain whether involvement in bullying increases the risk for subsequent suicidal ideation. A total of 2,070 Finnish girls and boys were surveyed in the ninth grade (age 15) in schools and followed up 2 years later in the Adolescent Mental Health Cohort Study. Involvement in bullying was elicited at age 15 by two questions focusing on being a bully and being a victim of bullying. Suicidal ideation was elicited by one item of the short Beck Depression Inventory at age 17. Baseline depressive symptoms and externalizing symptoms, age and sex were controlled for. Statistical analyses were carried out using cross-tabulations with chi-square/Fisher's exact tests and logistic regression. Suicidal ideation at age 17 was 3-4 times more prevalent among those who had been involved in bullying at age 15 than among those not involved. Suicidal ideation at age 17 was most prevalent among former victims of bullying. Being a victim of bullying at age 15 continued to predict subsequent suicidal ideation when depressive and externalizing symptoms were controlled for. Being a bully at age 15 also remained a borderline significant predictor of suicidal ideation when baseline symptoms were controlled for. Findings indicate that adolescent victims and perpetrators of bullying alike are at long-term risk for suicidal ideation.
Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M
2017-02-01
Background The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test if lying flat head position initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries, with ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective To outline in detail the predetermined statistical analysis plan for HEADPOST Pilot study. Methods All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with comparisons made between randomized groups. For the outcomes, statistical comparisons to be made between groups are planned and described. Results This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
Statistical Inference at Work: Statistical Process Control as an Example
ERIC Educational Resources Information Center
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
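A minimal sketch of the control-chart reasoning that underlies statistical process control as contrasted above with hypothesis testing: estimate the process centre and dispersion from in-control data and flag later observations that fall outside three-sigma limits. The measurements are synthetic and the limits are the conventional Shewhart-style choice, not a prescription.

```r
# Synthetic process measurements (e.g. a filling weight sampled over time);
# the last ten points drift upward to simulate a special cause.
set.seed(1)
x <- c(rnorm(40, mean = 100, sd = 2), rnorm(10, mean = 104, sd = 2))

centre <- mean(x[1:40])      # baseline (in-control) estimate of the process mean
sigma  <- sd(x[1:40])
ucl <- centre + 3 * sigma    # upper control limit
lcl <- centre - 3 * sigma    # lower control limit

out_of_control <- which(x > ucl | x < lcl)   # points signalling special-cause variation
```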
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
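A hedged sketch of the comparison that underlies a small-world claim: compute the average shortest path length and clustering of an observed network and compare them with a random graph of the same size and density. The toy word-association edge list below is invented for illustration and is unrelated to the datasets analysed above; igraph is assumed to be available.

```r
library(igraph)

# Toy word-association network (illustrative edge list only).
edges <- data.frame(from = c("dog", "dog", "dog", "cat", "cat", "mouse", "cheese"),
                    to   = c("cat", "bone", "milk", "mouse", "milk", "cheese", "milk"))
g <- graph_from_data_frame(edges, directed = FALSE)

obs_path  <- mean_distance(g)                  # average shortest path length
obs_clust <- transitivity(g, type = "global")  # clustering (transitivity)

# Random (Erdos-Renyi) benchmark with the same number of nodes and edges.
r <- sample_gnm(vcount(g), ecount(g))
c(obs_path, mean_distance(r), obs_clust, transitivity(r, type = "global"))
```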
Improving suicide mortality statistics in Tarragona (Catalonia, Spain) between 2004-2012.
Barbería, Eneko; Gispert, Rosa; Gallo, Belén; Ribas, Gloria; Puigdefàbregas, Anna; Freitas, Adriana; Segú, Elena; Torralba, Pilar; García-Sayago, Francisco; Estarellas, Aina
2016-07-20
Monitoring and preventing suicidal behaviour requires, among other data, knowing suicide deaths precisely. They often appear under-reported or misclassified in the official mortality statistics. The aim of this study is to analyse the under-reporting found in the suicide mortality statistics of Tarragona (a province of Catalonia, Spain). The analysis takes into account all suicide deaths that occurred in the Tarragona Area of the Catalan Institute of Legal Medicine and Forensic Sciences (TA-CILMFS) between 2004 and 2012. The sources of information were the death data files of the Catalan Mortality Register, as well as the Autopsies Files of the TA-CILMFS. Suicide rates and socio-demographic profiles were statistically compared between the suicide initially reported and the final one. The mean percentage of non-reported cases in the period was 16.2%, with a minimum percentage of 2.2% in 2005 and a maximum of 26.8% in 2009. The crude mortality rate by suicide rose from 6.6 to 7.9 per 100,000 inhabitants once forensic data were incorporated. Small differences were detected between the socio-demographic profile of the suicide initially reported and the final one. Supplementary information was obtained on the suicide method, which revealed a significant increase in poisoning and suicides involving trains. An exhaustive review of suicide deaths data from forensic sources has led to an improvement in the under-reported statistical information. It also improves the knowledge of the method of suicide and personal characteristics. Copyright © 2016 SEP y SEPB. Publicado por Elsevier España, S.L.U. All rights reserved.
Statistical Sources for Health Science Librarians.
ERIC Educational Resources Information Center
Weise, Frieda
This continuing education course syllabus presents information on the collection of vital and health statistics, lists of agencies or organizations involved in statistical collection and/or dissemination, annotated bibliographies of statistical sources, and guidelines for accessing statistical information. Topics covered include: (1) the reporting…
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
NASA Astrophysics Data System (ADS)
Esa, Suraya; Mohamed, Nurul Akmal
2017-05-01
This study aims to identify the relationship between students' learning styles and mathematics anxiety amongst Form Four students in Kerian, Perak. The study involves 175 Form Four students as respondents. The instruments used to assess the students' learning styles and mathematics anxiety are adapted from Grasha's Learning Styles Inventory and the Mathematics Anxiety Scale (MAS), respectively. The learning styles considered are independent, avoidant, collaborative, dependent, competitive and participant. The collected data are processed with SPSS (Statistical Package for the Social Sciences 16.0) and analysed using descriptive statistics and inferential statistics, including t-tests and Pearson correlation. The results show that the majority of the students adopt a collaborative learning style and that the students have a moderate level of mathematics anxiety. Moreover, significant gender differences are found for the avoidant, collaborative, dependent and participant learning styles. Amongst all the learning styles, the avoidant, independent and participant styles show a weak but significant correlation with mathematics anxiety. It is therefore important for teachers to be aware of the effects of learning styles on mathematics anxiety, to understand mathematics anxiety, and to implement suitable learning strategies to help students overcome it.
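A minimal sketch of the inferential steps named above: an independent-samples t-test of a learning-style score by gender and a Pearson correlation between a learning-style score and mathematics anxiety. The data frame `styles_df` and its variable names are hypothetical.

```r
# Hypothetical data frame 'styles_df' with gender, learning-style scores, and a
# mathematics anxiety score (illustrative names only).
t.test(avoidant_score ~ gender, data = styles_df)        # gender difference in a style

cor.test(styles_df$avoidant_score, styles_df$math_anxiety,
         method = "pearson")                             # style-anxiety correlation
```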
Ewertzon, M; Lützén, K; Svensson, E; Andershed, B
2010-06-01
The involvement of family members in psychiatric care is important for the recovery of persons with psychotic disorders and subsequently reduces the burden on the family. Earlier qualitative studies suggest that the participation of family members can be limited by how they experience the professionals' approach, which suggests a connection to the concept of alienation. Thus, the aim of this study was to investigate, in a national sample, family members' experiences of the psychiatric health care professionals' approach. Data were collected by the Family Involvement and Alienation Questionnaire. The median level and quartiles were used to describe the distributions, and data were analysed with non-parametric statistical methods. Seventy family members of persons receiving psychiatric care participated in the study. The results indicate that a majority of the participants reported having experienced a negative approach from the professionals, indicating lack of confirmation and cooperation. The results also indicate that a majority of the participants experienced powerlessness and social isolation in the care being provided, indicating feelings of alienation. A significant but weak association was found between the family members' experiences of the professionals' approach and their feelings of alienation.
Timing in turn-taking and its implications for processing models of language
Levinson, Stephen C.; Torreira, Francisco
2015-01-01
The core niche for language use is in verbal interaction, involving the rapid exchange of turns at talking. This paper reviews the extensive literature about this system, adding new statistical analyses of behavioral data where they have been missing, demonstrating that turn-taking has the systematic properties originally noted by Sacks et al. (1974; hereafter SSJ). This system poses some significant puzzles for current theories of language processing: the gaps between turns are short (of the order of 200 ms), but the latencies involved in language production are much longer (over 600 ms). This seems to imply that participants in conversation must predict (or ‘project’ as SSJ have it) the end of the current speaker’s turn in order to prepare their response in advance. This in turn implies some overlap between production and comprehension despite their use of common processing resources. Collecting together what is known behaviorally and experimentally about the system, the space for systematic explanations of language processing for conversation can be significantly narrowed, and we sketch some first model of the mental processes involved for the participant preparing to speak next. PMID:26124727
Rost, Michael; Wangmo, Tenzin; Niggli, Felix; Hartmann, Karin; Hengartner, Heinz; Ansari, Marc; Brazzola, Pierluigi; Rischewski, Johannes; Beck-Popovic, Maja; Kühne, Thomas; Elger, Bernice S
2017-12-01
The goal is to present how shared decision-making in paediatric oncology occurs from the viewpoints of parents and physicians. Eight Swiss Pediatric Oncology Group centres participated in this prospective study. The sample comprised a parent and physician of the minor patient (<18 years). Surveys were statistically analysed by comparing physicians' and parents' perspectives and by evaluating factors associated with children's actual involvement. Perspectives of ninety-one parents and twenty physicians were obtained for 151 children. Results indicate that for six aspects of information provision examined, parents' and physicians' perceptions differed. Moreover, parents felt that the children were more competent to understand diagnosis and prognosis, assessed the disease of the children as worse, and reported higher satisfaction with decision-making on the part of the children. A patient's age and gender predicted involvement. Older children and girls were more likely to be involved. In the decision-making process, parents held a less active role than they actually wanted. Physicians should take measures to ensure that provided information is understood correctly. Furthermore, they should work towards creating awareness for systematic differences between parents and physicians with respect to the perception of the child, the disease, and shared decision-making.
Guo, Fuchuan; Zi, Tianqi; Liu, Liyan; Feng, Rennan; Sun, Changhao
2017-07-19
It has been demonstrated that mangiferin can ameliorate hypertriglyceridemia by modulating the expression levels of genes involved in lipid metabolism in animal experiments, but its effects on the serum metabolic fingerprint of hyperlipidemic animal models have not been reported. Thus, an NMR-based metabolomics approach was conducted to explore the effects of mangiferin on hyperlipidemic hamsters and to gain a better understanding of the metabolic pathways involved. Hamsters fed a high-fat diet were orally administered mangiferin at 150 mg per kg body weight once a day for 8 weeks. Serum samples were analysed by 1H NMR, and multivariate statistical analysis was applied to the data to identify potential biomarkers. In total, 20 discriminating metabolites were identified. Mangiferin administration partly reversed the metabolic disorders induced by the high-fat diet and exerted a good anti-hypertriglyceridemic effect. Mangiferin ameliorated hyperlipidemia by intervening in some major metabolic pathways, involving glycolysis, the TCA cycle, synthesis of ketone bodies, and BCAAs as well as choline and lipid metabolism. These findings provide new essential information on the effects of mangiferin and demonstrate the great potential of this nutrimetabolomics approach.
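A hedged sketch of an unsupervised multivariate step commonly applied to binned 1H NMR spectra (principal component analysis on autoscaled bins, followed by inspection of group separation). The abstract does not specify the exact multivariate models used, so this is generic; `nmr_bins` and `group` are hypothetical objects.

```r
# Hypothetical matrix 'nmr_bins' (samples x spectral bins) and a vector 'group'
# of diet/treatment labels, one per sample.
scaled <- scale(nmr_bins)              # centre and unit-variance scale each bin
pca    <- prcomp(scaled)

scores <- pca$x[, 1:2]                 # first two principal components
plot(scores, col = as.factor(group))   # look for separation between groups
summary(pca)$importance[2, 1:2]        # proportion of variance explained by PC1-PC2
```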
Profiling drunk driving recidivists in Denmark.
Møller, Mette; Haustein, Sonja; Prato, Carlo Giacomo
2015-10-01
Drunk drivers are a menace to themselves and to other road users, as drunk driving significantly increases the risk of involvement in road accidents and the probability of severe or fatal injuries. Although injuries and fatalities related to road accidents have decreased in recent decades, the prevalence of drunk driving among drivers killed in road accidents has remained stable, at around 25% or more during the past 10 years. Understanding drunk driving, and in particular, recidivism, is essential for designing effective countermeasures, and accordingly, the present study aims at identifying the differences between non-drunk drivers, drunk driving non-recidivists and drunk driving recidivists with respect to their demographic and socio-economic characteristics, road accident involvement and other traffic and non-traffic-related law violations. This study is based on register-data from Statistics Denmark and includes information from 2008 to 2012 for the entire population, aged 18 or older, of Denmark. The results from univariate and multivariate statistical analyses reveal a five year prevalence of 17% for drunk driving recidivism, and a significant relation between recidivism and the drunk drivers' gender, age, income, education, receipt of an early retirement pension, household type, and residential area. Moreover, recidivists are found to have a higher involvement in alcohol-related road accidents, as well as other traffic and, in particular, non-traffic-related offences. These findings indicate that drunk driving recidivism is more likely to occur among persons who are in situations of socio-economic disadvantage and marginalisation. Thus, to increase their effectiveness, preventive measures aiming to reduce drunk driving should also address issues related to the general life situations of the drunk driving recidivists that contribute to an increased risk of drunk driving recidivism. Copyright © 2015 Elsevier Ltd. All rights reserved.
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
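A sketch of the kind of distribution characterization described above, fitting candidate distribution types to a strength property by maximum likelihood and comparing them; this is not the report's procedure or software, and the synthetic strength values are illustrative only. MASS is assumed to be available.

```r
library(MASS)

# Synthetic ultimate-strength measurements (arbitrary units), illustration only.
set.seed(7)
strength <- rweibull(30, shape = 20, scale = 110)

fit_norm <- fitdistr(strength, "normal")    # maximum-likelihood normal fit
fit_weib <- fitdistr(strength, "weibull")   # maximum-likelihood Weibull fit

shapiro.test(strength)    # quick check of the normality assumption
AIC(fit_norm, fit_weib)   # compare candidate distribution types
```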
Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft
NASA Technical Reports Server (NTRS)
Gross, D.; Miller, D. R.; Soland, R. M.
1980-01-01
The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated and statistical procedures are utilized and/or developed to control them wherever possible.
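A purely illustrative sketch of the third uncertainty source named above: Monte Carlo sampling error, quantified as the standard error of a simulated exceedance probability and controlled by the number of replications. The toy `simulate_loss` function is a stand-in, not the NASA risk model.

```r
# Illustrative Monte Carlo estimate of a small exceedance probability, with its
# sampling (standard) error as a function of the number of replications.
set.seed(42)
simulate_loss <- function() sum(rexp(10, rate = 1))   # stand-in for a risk model

n_reps <- 10000
losses <- replicate(n_reps, simulate_loss())
p_hat  <- mean(losses > 20)                    # estimated exceedance probability
mc_se  <- sqrt(p_hat * (1 - p_hat) / n_reps)   # Monte Carlo sampling error
c(estimate = p_hat, mc_se = mc_se)             # increasing n_reps shrinks mc_se
```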
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
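A hedged sketch of the simulation logic described above: when ApoE4 carriers truly progress slightly faster, small post hoc subgroups still occasionally yield "significant" findings in the reversed direction, and the frequency of such erroneous conclusions falls as the subgroup size grows. The means, standard deviation, and sample sizes are arbitrary illustrations, not the meta-database parameters.

```r
# Simulate post hoc ApoE subgroup tests when carriers truly worsen slightly more.
set.seed(123)
reversed_significance <- function(n_per_arm, sims = 5000) {
  mean(replicate(sims, {
    carriers     <- rnorm(n_per_arm, mean = 6, sd = 9)  # ADAS-cog worsening, carriers
    non_carriers <- rnorm(n_per_arm, mean = 5, sd = 9)  # truly smaller worsening
    tt <- t.test(non_carriers, carriers, alternative = "greater")
    tt$p.value < 0.05          # "significant" finding in the wrong direction
  }))
}

reversed_significance(25)    # small post hoc subgroups: erroneous findings more common
reversed_significance(100)   # larger subgroups: such reversals become rare
```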
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
Fine-structural changes in the midgut of old Drosophila melanogaster
NASA Technical Reports Server (NTRS)
Anton-Erxleben, F.; Miquel, J.; Philpott, D. E.
1983-01-01
Senescent fine-structural changes in the midgut of Drosophila melanogaster are investigated. A large number of midgut mitochondria in old flies exhibit nodular cristae and a tubular system located perpendicular to the normal cristae orientation. Anterior intestinal cells show a senescent accumulation of age pigment, either with a surrounding two-unit membrane or without any membrane. The predominant localization of enlarged mitochondria and pigment in the luminal gut region may be related to the polarized metabolism of the intestinal cells. Findings concur with previous observations of dense-body accumulations and support the theory that mitochondria are involved in the aging of fixed post-mitotic cells. Statistical analyses demonstrate that the increase in mitochondrial size is related to an increase in mitochondrial variation.
An Automated Blur Detection Method for Histological Whole Slide Imaging
Moles Lopez, Xavier; D'Andrea, Etienne; Barbot, Paul; Bridoux, Anne-Sophie; Rorive, Sandrine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine
2013-01-01
Whole slide scanners are novel devices that enable high-resolution imaging of an entire histological slide. Furthermore, the imaging is achieved in only a few minutes, which enables image rendering of large-scale studies involving multiple immunohistochemistry biomarkers. Although whole slide imaging has improved considerably, locally poor focusing causes blurred regions of the image. These artifacts may strongly affect the quality of subsequent analyses, making a slide review process mandatory. This tedious and time-consuming task requires the scanner operator to carefully assess the virtual slide and to manually select new focus points. We propose a statistical learning method that provides early image quality feedback and automatically identifies regions of the image that require additional focus points. PMID:24349343
Impact of gender, age and experience of pilots on general aviation accidents.
Bazargan, Massoud; Guzhva, Vitaly S
2011-05-01
General aviation (GA) accounts for more than 82% of all air transport-related accidents and air transport-related fatalities in the U.S. In this study, we conduct a series of statistical analyses to investigate the significance of a pilot's gender, age and experience in influencing the risk for pilot errors and fatalities in GA accidents. There is no evidence from the Chi-square tests and logistic regression models that support the likelihood of an accident caused by pilot error to be related to pilot gender. However, evidence is found that male pilots, those older than 60 years of age, and with more experience, are more likely to be involved in a fatal accident. Copyright © 2010 Elsevier Ltd. All rights reserved.
2014-01-01
Objective: To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method: Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results: Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. Conclusions: This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
Youngstrom, Eric A
2014-03-01
To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
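The appendices' SPSS and R scripts are not reproduced here; as a rough Python analogue, with simulated checklist scores standing in for the Child Behavior Checklist data, the sketch below runs a ROC analysis and converts two cut-offs into diagnostic likelihood ratios.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)
# Simulated checklist raw scores: cases (mood disorder) tend to score higher
controls = rng.normal(14, 8, 400).clip(0)
cases = rng.normal(24, 8, 189).clip(0)
scores = np.concatenate([controls, cases])
truth = np.concatenate([np.zeros(len(controls)), np.ones(len(cases))])

auc = roc_auc_score(truth, scores)
fpr, tpr, thresholds = roc_curve(truth, scores)   # full curve, e.g. for plotting
print(f"Area under the ROC curve: {auc:.2f}")

def diagnostic_lrs(cutoff):
    """Positive and negative likelihood ratios for scores >= cutoff."""
    sens = np.mean(cases >= cutoff)
    spec = np.mean(controls < cutoff)
    return sens / (1 - spec), (1 - sens) / spec

for cut in (8, 30):
    lr_pos, lr_neg = diagnostic_lrs(cut)
    print(f"cutoff {cut}: DLR+ = {lr_pos:.2f}, DLR- = {lr_neg:.2f}")
```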
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
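To make the purely statistical side of mediation concrete, here is a minimal cross-sectional sketch of the product-of-coefficients approach with a bootstrap confidence interval for the indirect effect; the variables and data are simulated. As the article stresses, a non-zero interval from such an atemporal analysis does not by itself establish a causal, temporally ordered process.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)                       # predictor
m = 0.5 * x + rng.normal(size=n)             # putative mediator
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                         # x -> m
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]   # m -> y, adjusting for x
    return a * b

boot = []
idx = np.arange(n)
for _ in range(1000):
    s = rng.choice(idx, size=n, replace=True)
    boot.append(indirect_effect(x[s], m[s], y[s]))
low, high = np.percentile(boot, [2.5, 97.5])
print(f"Indirect effect a*b = {indirect_effect(x, m, y):.3f}, "
      f"95% bootstrap CI [{low:.3f}, {high:.3f}]")
```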
Multimodal Feature Integration in the Angular Gyrus during Episodic and Semantic Retrieval
Bonnici, Heidi M.; Richter, Franziska R.; Yazar, Yasemin
2016-01-01
Much evidence from distinct lines of investigation indicates the involvement of angular gyrus (AnG) in the retrieval of both episodic and semantic information, but the region's precise function and whether that function differs across episodic and semantic retrieval have yet to be determined. We used univariate and multivariate fMRI analysis methods to examine the role of AnG in multimodal feature integration during episodic and semantic retrieval. Human participants completed episodic and semantic memory tasks involving unimodal (auditory or visual) and multimodal (audio-visual) stimuli. Univariate analyses revealed the recruitment of functionally distinct AnG subregions during the retrieval of episodic and semantic information. Consistent with a role in multimodal feature integration during episodic retrieval, significantly greater AnG activity was observed during retrieval of integrated multimodal episodic memories compared with unimodal episodic memories. Multivariate classification analyses revealed that individual multimodal episodic memories could be differentiated in AnG, with classification accuracy tracking the vividness of participants' reported recollections, whereas distinct unimodal memories were represented in sensory association areas only. In contrast to episodic retrieval, AnG was engaged to a statistically equivalent degree during retrieval of unimodal and multimodal semantic memories, suggesting a distinct role for AnG during semantic retrieval. Modality-specific sensory association areas exhibited corresponding activity during both episodic and semantic retrieval, which mirrored the functional specialization of these regions during perception. The results offer new insights into the integrative processes subserved by AnG and its contribution to our subjective experience of remembering. SIGNIFICANCE STATEMENT Using univariate and multivariate fMRI analyses, we provide evidence that functionally distinct subregions of angular gyrus (AnG) contribute to the retrieval of episodic and semantic memories. Our multivariate pattern classifier could distinguish episodic memory representations in AnG according to whether they were multimodal (audio-visual) or unimodal (auditory or visual) in nature, whereas statistically equivalent AnG activity was observed during retrieval of unimodal and multimodal semantic memories. Classification accuracy during episodic retrieval scaled with the trial-by-trial vividness with which participants experienced their recollections. Therefore, the findings offer new insights into the integrative processes subserved by AnG and how its function may contribute to our subjective experience of remembering. PMID:27194327
Multimodal Feature Integration in the Angular Gyrus during Episodic and Semantic Retrieval.
Bonnici, Heidi M; Richter, Franziska R; Yazar, Yasemin; Simons, Jon S
2016-05-18
Much evidence from distinct lines of investigation indicates the involvement of angular gyrus (AnG) in the retrieval of both episodic and semantic information, but the region's precise function and whether that function differs across episodic and semantic retrieval have yet to be determined. We used univariate and multivariate fMRI analysis methods to examine the role of AnG in multimodal feature integration during episodic and semantic retrieval. Human participants completed episodic and semantic memory tasks involving unimodal (auditory or visual) and multimodal (audio-visual) stimuli. Univariate analyses revealed the recruitment of functionally distinct AnG subregions during the retrieval of episodic and semantic information. Consistent with a role in multimodal feature integration during episodic retrieval, significantly greater AnG activity was observed during retrieval of integrated multimodal episodic memories compared with unimodal episodic memories. Multivariate classification analyses revealed that individual multimodal episodic memories could be differentiated in AnG, with classification accuracy tracking the vividness of participants' reported recollections, whereas distinct unimodal memories were represented in sensory association areas only. In contrast to episodic retrieval, AnG was engaged to a statistically equivalent degree during retrieval of unimodal and multimodal semantic memories, suggesting a distinct role for AnG during semantic retrieval. Modality-specific sensory association areas exhibited corresponding activity during both episodic and semantic retrieval, which mirrored the functional specialization of these regions during perception. The results offer new insights into the integrative processes subserved by AnG and its contribution to our subjective experience of remembering. Using univariate and multivariate fMRI analyses, we provide evidence that functionally distinct subregions of angular gyrus (AnG) contribute to the retrieval of episodic and semantic memories. Our multivariate pattern classifier could distinguish episodic memory representations in AnG according to whether they were multimodal (audio-visual) or unimodal (auditory or visual) in nature, whereas statistically equivalent AnG activity was observed during retrieval of unimodal and multimodal semantic memories. Classification accuracy during episodic retrieval scaled with the trial-by-trial vividness with which participants experienced their recollections. Therefore, the findings offer new insights into the integrative processes subserved by AnG and how its function may contribute to our subjective experience of remembering. Copyright © 2016 Bonnici, Richter, et al.
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. It provides access to incidence, mortality, and survival statistics. Users select the type of statistic, variables, and format, and then extract the statistics in a delimited format for further analyses.
Knowledge translation and implementation in spinal cord injury: a systematic review.
Noonan, V K; Wolfe, D L; Thorogood, N P; Park, S E; Hsieh, J T; Eng, J J
2014-08-01
To conduct a systematic review examining the effectiveness of knowledge translation (KT) interventions in changing clinical practice and patient outcomes. MEDLINE/PubMed, CINAHL, EMBASE and PsycINFO were searched for studies published from January 1980 to July 2012 that reported and evaluated an implemented KT intervention in spinal cord injury (SCI) care. We reviewed and summarized results from studies that documented the implemented KT intervention, its impact on changing clinician behavior and patient outcomes as well as the facilitators and barriers encountered during the implementation. A total of 13 articles featuring 10 studies were selected and abstracted from 4650 identified articles. KT interventions included developing and implementing patient care protocols, providing clinician education and incorporating outcome measures into clinical practice. The methods (or drivers) to facilitate the implementation included organizing training sessions for clinical staff, introducing computerized reminders and involving organizational leaders. The methodological quality of studies was mostly poor. Only 3 out of 10 studies evaluated the success of the implementation using statistical analyses, and all 3 reported significant behavior change. Out of the 10 studies, 6 evaluated the effect of the implementation on patient outcomes using statistical analyses, with 4 reporting significant improvements. The commonly cited facilitators and barriers were communication and resources, respectively. The field of KT in SCI is in its infancy with only a few relevant publications. However, there is some evidence that KT interventions may change clinician behavior and improve patient outcomes. Future studies should ensure rigorous study methods are used to evaluate KT interventions.
Neuroanatomical Characterization of Child Offspring of Bipolar Parents
Singh, Manpreet K.; DelBello, Melissa P.; Adler, Caleb M.; Stanford, Kevin E.; Strakowski, Stephen M.
2012-01-01
Objectives: To examine structural differences in selected anterior limbic brain regions between at-risk children of parents with bipolar I disorder and children with healthy parents. We hypothesized that at-risk children would exhibit abnormalities in brain regions that are involved in mood regulation. Methods: Children (8–12 years old) of parents with bipolar I disorder (“at-risk”, AR, N=21) and of parents without any DSM-IV Axis I disorder (healthy controls, HC, N=24) were evaluated using diagnostic assessments and brain magnetic resonance imaging (MRI). Morphometric analyses were used to examine group differences in the prefrontal cortical, thalamic, striatal, and amygdalar volumes. Results: Nine (43%) of the AR children met DSM-IV-TR criteria for a non-bipolar mood disorder at the time of assessment. AR and HC children did not demonstrate statistically significant differences across regions of interest [Wilks Lambda = 0.86, F(4,39)=1.64, p=0.18; effect size, (f)=0.19]. Post-hoc analyses of covariance showed the largest relative effect size was contributed by the prefrontal cortex [(f)=0.26]. Conclusions: 8- to 12-year-old children with a familial risk for mania do not exhibit any statistically significant volumetric differences in the prefrontal cortex, thalamus, striatum, or amygdala as compared to age-matched children of parents without any psychopathology. Longitudinal studies examining whether structural changes over time may be associated with vulnerability for developing subsequent bipolar disorder are needed to clarify the underlying pathophysiology of this disorder. PMID:18356766
WAIS-IV subtest covariance structure: conceptual and statistical considerations.
Ward, L Charles; Bergman, Maria A; Hebert, Katina R
2012-06-01
D. Wechsler (2008b) reported confirmatory factor analyses (CFAs) with standardization data (ages 16-69 years) for 10 core and 5 supplemental subtests from the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV). Analyses of the 15 subtests supported 4 hypothesized oblique factors (Verbal Comprehension, Working Memory, Perceptual Reasoning, and Processing Speed) but also revealed unexplained covariance between Block Design and Visual Puzzles (Perceptual Reasoning subtests). That covariance was not included in the final models. Instead, a path was added from Working Memory to Figure Weights (Perceptual Reasoning subtest) to improve fit and achieve a desired factor pattern. The present research with the same data (N = 1,800) showed that the path from Working Memory to Figure Weights increases the association between Working Memory and Matrix Reasoning. Specifying both paths improves model fit and largely eliminates unexplained covariance between Block Design and Visual Puzzles but with the undesirable consequence that Figure Weights and Matrix Reasoning are equally determined by Perceptual Reasoning and Working Memory. An alternative 4-factor model was proposed that explained theory-implied covariance between Block Design and Visual Puzzles and between Arithmetic and Figure Weights while maintaining compatibility with WAIS-IV Index structure. The proposed model compared favorably with a 5-factor model based on Cattell-Horn-Carroll theory. The present findings emphasize that covariance model comparisons should involve considerations of conceptual coherence and theoretical adherence in addition to statistical fit. (c) 2012 APA, all rights reserved
Institutional racism in public health contracting: Findings of a nationwide survey from New Zealand.
Came, H; Doole, C; McKenna, B; McCreanor, T
2018-02-01
Public institutions within New Zealand have long been accused of mono-culturalism and institutional racism. This study sought to identify inconsistencies and bias by comparing government funded contracting processes for Māori public health providers (n = 60) with those of generic providers (n = 90). Qualitative and quantitative data were collected (November 2014-May 2015), through a nationwide telephone survey of public health providers, achieving a 75% response rate. Descriptive statistical analyses were applied to quantitative responses and an inductive approach was taken to analyse data from open-ended responses in the survey domains of relationships with portfolio contract managers, contracting and funding. The quantitative data showed four sites of statistically significant variation: length of contracts, intensity of monitoring, compliance costs and frequency of auditing. Non-significant data involved access to discretionary funding and cost of living adjustments, the frequency of monitoring, access to Crown (government) funders and representation on advisory groups. The qualitative material showed disparate provider experiences, dependent on individual portfolio managers, with nuanced differences between generic and Māori providers' experiences. This study showed that monitoring government performance through a nationwide survey was an innovative way to identify sites of institutional racism. In a policy context where health equity is a key directive to the health sector, this study suggests there is scope for New Zealand health funders to improve their contracting practices. Copyright © 2017 Elsevier Ltd. All rights reserved.
Behr, Guilherme A; Patel, Jay P; Coote, Marg; Moreira, Jose C F; Gelain, Daniel P; Steiner, Meir; Frey, Benicio N
2017-05-01
Previous studies have reported that salivary concentrations of certain hormones correlate with their respective serum levels. However, most of these studies did not control for potential blood contamination in saliva. In the present study we developed a statistical method to test the amount of blood contamination that needs to be avoided in saliva samples for the following hormones: cortisol, estradiol, progesterone, testosterone and oxytocin. Saliva and serum samples were collected from 38 healthy, medication-free women (mean age = 33.8 ± 7.3 yr; range = 19-45). Serum and salivary hormonal levels and the amount of transferrin in saliva samples were determined using enzyme immunoassays. Salivary transferrin levels did not correlate with salivary cortisol or estradiol (up to 3 mg/dl), but they were positively correlated with salivary testosterone, progesterone and oxytocin (P < 0.05). After controlling for blood contamination, only cortisol (r = 0.65, P < 0.001) and progesterone levels (r = 0.57, P = 0.002) displayed a positive correlation between saliva and serum. Our analyses suggest that transferrin levels higher than 0.80, 0.92 and 0.64 mg/dl should be avoided for testosterone, progesterone and oxytocin salivary analyses, respectively. We recommend that salivary transferrin be measured in research involving salivary hormones in order to determine the level of blood contamination that might affect specific hormonal salivary concentrations. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Schuckit, Marc A; Saunders, John B
2006-09-01
This paper presents the recommendations, developed from a 3-year consultation process, for a program of research to underpin the development of diagnostic concepts and criteria in the Substance Use Disorders section of the Diagnostic and Statistical Manual of Mental Disorders (DSM) and potentially the relevant section of the next revision of the International Classification of Diseases (ICD). A preliminary list of research topics was developed at the DSM-V Launch Conference in 2004. This led to the presentation of articles on these topics at a specific Substance Use Disorders Conference in February 2005, at the end of which a preliminary list of research questions was developed. This was further refined through an iterative process involving conference participants over the following year. Research questions have been placed into four categories: (1) questions that could be addressed immediately through secondary analyses of existing data sets; (2) items likely to require position papers to propose criteria or more focused questions with a view to subsequent analyses of existing data sets; (3) issues that could be proposed for literature reviews, but with a lower probability that these might progress to a data analytic phase; and (4) suggestions or comments that might not require immediate action, but that could be considered by the DSM-V and ICD 11 revision committees as part of their deliberations. A broadly based research agenda for the development of diagnostic concepts and criteria for substance use disorders is presented.
Gender and Publishing in Nursing: a secondary analysis of h-index ranking tables.
Porter, Sam
2018-05-24
To analyse published ranking tables on academics' h-index scores to establish whether male nursing academics are disproportionately represented in these tables compared with their representation across the whole profession. Previous studies have identified a disproportionate representation of UK male nursing academics in publishing in comparison to their US counterparts. Secondary statistical analysis involving the comparison of proportions. Four papers from the UK, Canada and Australia containing h-index ranking tables and published between 2010 and 2017 were re-analysed in June 2017 to identify authors' sex. Pearson's chi-squared test was applied to ascertain whether the number of men included in the tables was statistically proportionate to the number of men on the pertinent national professional register. There was a disproportionate number of men with high h-index scores in the UK and Canadian data sets, compared with the proportion of men on the pertinent national registers. The number of men in the Australian data set was proportionate with the number of men on the nursing register. There was a disproportionate number of male professors in UK universities. The influence of men over nursing publishing in the UK and Canada outweighs their representation across the whole profession. Similarly, in the UK, men's representation in the professoriate is disproportionately great. However, the Australian results suggest that gender inequality is not inevitable and that it is possible to create more egalitarian nursing cultures. This article is protected by copyright. All rights reserved.
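As a sketch of the comparison of proportions described above (with invented counts in place of the published ranking tables), the test below asks whether the number of men in an h-index table is consistent with the proportion of men on the relevant professional register.

```python
from scipy.stats import chisquare

# Hypothetical figures: 28 of 100 ranked academics are men,
# while men make up 11% of the relevant national nursing register.
n_table, men_in_table = 100, 28
p_register = 0.11

observed = [men_in_table, n_table - men_in_table]
expected = [n_table * p_register, n_table * (1 - p_register)]
chi2, p = chisquare(f_obs=observed, f_exp=expected)
print(f"Chi-square = {chi2:.2f}, p = {p:.4f}")
```

With small expected counts, an exact binomial test (scipy.stats.binomtest) would be a common alternative to the chi-squared goodness-of-fit test.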
Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias
2014-07-16
To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients, approved by participating research ethics committees between 2000 and 2003, and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.
Hydrothermal contamination of public supply wells in Napa and Sonoma Valleys, California
Forrest, Matthew J.; Kulongoski, Justin T.; Edwards, Matthew S.; Farrar, Christopher D.; Belitz, Kenneth; Norris, Richard D.
2013-01-01
Groundwater chemistry and isotope data from 44 public supply wells in the Napa and Sonoma Valleys, California were determined to investigate mixing of relatively shallow groundwater with deeper hydrothermal fluids. Multivariate analyses including Cluster Analyses, Multidimensional Scaling (MDS), Principal Components Analyses (PCA), Analysis of Similarities (ANOSIM), and Similarity Percentage Analyses (SIMPER) were used to elucidate constituent distribution patterns, determine which constituents are significantly associated with these hydrothermal systems, and investigate hydrothermal contamination of local groundwater used for drinking water. Multivariate statistical analyses were essential to this study because traditional methods, such as mixing tests involving single species (e.g. Cl or SiO2) were incapable of quantifying component proportions due to mixing of multiple water types. Based on these analyses, water samples collected from the wells were broadly classified as fresh groundwater, saline waters, hydrothermal fluids, or mixed hydrothermal fluids/meteoric water wells. The Multivariate Mixing and Mass-balance (M3) model was applied in order to determine the proportion of hydrothermal fluids, saline water, and fresh groundwater in each sample. Major ions, isotopes, and physical parameters of the waters were used to characterize the hydrothermal fluids as Na–Cl type, with significant enrichment in the trace elements As, B, F and Li. Five of the wells from this study were classified as hydrothermal, 28 as fresh groundwater, two as saline water, and nine as mixed hydrothermal fluids/meteoric water wells. The M3 mixing-model results indicated that the nine mixed wells contained between 14% and 30% hydrothermal fluids. Further, the chemical analyses show that several of these mixed-water wells have concentrations of As, F and B that exceed drinking-water standards or notification levels due to contamination by hydrothermal fluids.
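The full workflow (cluster analysis, MDS, PCA, ANOSIM, SIMPER and the M3 mixing model) is not reproduced here; using synthetic hydrochemical data, the sketch below illustrates the flavour of the first steps: standardising constituent concentrations, then using PCA and hierarchical clustering to separate water types.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
# Synthetic samples: fresh groundwater (low Cl, B, As) vs hydrothermal-type fluids (high Cl, B, As)
fresh = pd.DataFrame({"Cl": rng.normal(30, 10, 30), "B": rng.normal(0.1, 0.05, 30),
                      "As": rng.normal(2, 1, 30), "SiO2": rng.normal(30, 8, 30)})
hydro = pd.DataFrame({"Cl": rng.normal(900, 150, 8), "B": rng.normal(8, 2, 8),
                      "As": rng.normal(60, 15, 8), "SiO2": rng.normal(90, 15, 8)})
data = pd.concat([fresh, hydro], ignore_index=True)

z = StandardScaler().fit_transform(data)              # put constituents on a common scale
pca = PCA(n_components=2).fit(z)
scores_pca = pca.transform(z)                         # ordination coordinates for plotting
print(f"Variance explained by PC1+PC2: {pca.explained_variance_ratio_.sum():.2f}")

clusters = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print(pd.crosstab(clusters, np.array(["fresh"] * 30 + ["hydrothermal"] * 8)))
```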
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
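The protocol itself is written for SPSS; the sketch below shows an analogous mixed-model analysis in Python fitted by restricted maximum likelihood, using a synthetic longitudinal data set with a few missing observations. The variable names and random-intercept structure are illustrative assumptions, not the protocol's prescribed model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
animals, weeks = 20, [1, 2, 4, 8]
rows = []
for a in range(animals):
    group = "treated" if a < animals // 2 else "control"
    intercept = rng.normal(50, 5)
    for w in weeks:
        score = intercept + (2.0 if group == "treated" else 0.5) * w + rng.normal(0, 3)
        rows.append({"animal": a, "group": group, "week": w, "score": score})
data = pd.DataFrame(rows)
data.loc[rng.choice(len(data), size=6, replace=False), "score"] = np.nan  # animal dropout

# Linear mixed model with a random intercept per animal, fitted by REML;
# only the rows with missing scores are excluded, so the remaining observations
# from those animals still contribute to the treatment-effect estimate.
clean = data.dropna()
model = smf.mixedlm("score ~ week * group", clean, groups=clean["animal"])
result = model.fit(reml=True)
print(result.summary())
```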
Hyde, J M; Cerezo, A; Williams, T J
2009-04-01
Statistical analysis of atom probe data has improved dramatically in the last decade and it is now possible to determine the size, the number density and the composition of individual clusters or precipitates such as those formed in reactor pressure vessel (RPV) steels during irradiation. However, the characterisation of the onset of clustering or co-segregation is more difficult and has traditionally focused on the use of composition frequency distributions (for detecting clustering) and contingency tables (for detecting co-segregation). In this work, the authors investigate the possibility of directly examining the neighbourhood of each individual solute atom as a means of identifying the onset of solute clustering and/or co-segregation. The methodology involves comparing the mean observed composition around a particular type of solute with that expected from the overall composition of the material. The methodology has been applied to atom probe data obtained from several irradiated RPV steels. The results show that the new approach is more sensitive to fine scale clustering and co-segregation than that achievable using composition frequency distribution and contingency table analyses.
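The authors' statistic is described only qualitatively above; as a rough sketch of the underlying idea, the code below compares the mean solute fraction observed around atoms of one species with the bulk fraction, using a permutation of atom labels as the reference distribution. The element names, neighbour count and injected clustering are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)
n_atoms = 10000
pos = rng.uniform(0, 40, size=(n_atoms, 3))           # synthetic atom positions (nm)
species = rng.choice(["Fe", "Ni", "Cu"], size=n_atoms, p=[0.96, 0.02, 0.02])
tree = cKDTree(pos)

# Inject weak Ni-Cu co-segregation: relabel the nearest neighbour of some Ni atoms as Cu
for i in np.flatnonzero(species == "Ni")[:100]:
    _, nbrs = tree.query(pos[i], k=2)
    species[nbrs[1]] = "Cu"

def mean_cu_around_ni(labels):
    """Mean Cu fraction among the 10 nearest neighbours of each Ni atom."""
    fractions = []
    for i in np.flatnonzero(labels == "Ni"):
        _, nbrs = tree.query(pos[i], k=11)            # the atom itself plus 10 neighbours
        fractions.append(np.mean(labels[nbrs[1:]] == "Cu"))
    return np.mean(fractions)

observed = mean_cu_around_ni(species)
null = np.array([mean_cu_around_ni(rng.permutation(species)) for _ in range(100)])
print(f"Observed mean Cu fraction around Ni atoms: {observed:.4f}")
print(f"Bulk Cu fraction: {np.mean(species == 'Cu'):.4f}, "
      f"permutation p ~ {np.mean(null >= observed):.2f}")
```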
NASA Astrophysics Data System (ADS)
Hendricks, Lorin; Spencer Guthrie, W.; Mazzeo, Brian
2018-04-01
An automated acoustic impact-echo testing device with seven channels has been developed for faster surveying of bridge decks. Due to potential variations in bridge deck overlay thickness, varying conditions between testing passes, and occasional imprecise equipment calibrations, a method that can account for variations in deck properties and testing conditions was necessary to correctly interpret the acoustic data. A new methodology involving statistical analyses was therefore developed. After acoustic impact-echo data are collected and analyzed, the results are normalized by the median for each channel, a Gaussian distribution is fit to the histogram of the data, and the Kullback-Leibler divergence test or Otsu's method is then used to determine the optimum threshold for differentiating between intact and delaminated concrete. The new methodology was successfully applied to individual channels of previously unusable acoustic impact-echo data obtained from a three-lane interstate bridge deck surfaced with a polymer overlay, and the resulting delamination map compared very favorably with the results of a manual deck sounding survey.
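The thresholding step can be sketched as follows, with a synthetic per-impact response feature standing in for the acoustic data; per-channel median normalisation is followed by Otsu's method to pick a data-driven cut between the intact and delaminated populations (the Kullback-Leibler variant is not shown).

```python
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(7)
# Synthetic impact-echo feature for 3 channels: mostly intact deck, ~10% delaminated
channels = []
for ch in range(3):
    intact = rng.normal(1.0 + 0.1 * ch, 0.15, 900)
    delam = rng.normal(2.2 + 0.1 * ch, 0.35, 100)
    channels.append(np.concatenate([intact, delam]))

# Normalise each channel by its median so channels can be pooled,
# then choose a single threshold with Otsu's method
pooled = np.concatenate([c / np.median(c) for c in channels])
thr = threshold_otsu(pooled)
flagged = np.mean(pooled > thr)
print(f"Otsu threshold (median-normalised units): {thr:.2f}; "
      f"{flagged:.1%} of impacts flagged as possible delaminations")
```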
Image Processing Diagnostics: Emphysema
NASA Astrophysics Data System (ADS)
McKenzie, Alex
2009-10-01
Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images show clearly whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, which appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in to interface with existing open source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
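As a sketch of the kind of statistic described (with simulated attenuation values rather than real CT data), the code below computes the skewness of a lung radiodensity histogram and the fraction of voxels below a low-attenuation cut-off; the -950 HU cut-off is an assumption for illustration, not a value taken from this work.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(8)
# Synthetic lung voxel attenuation values (Hounsfield units): mostly normal lung
# with a tail of very low-attenuation (emphysema-like) voxels
normal_lung = rng.normal(-820, 40, 90000)
emphysema = rng.normal(-960, 20, 10000)
hu = np.concatenate([normal_lung, emphysema])

print(f"Skewness of the attenuation histogram: {skew(hu):.2f}")
print(f"Fraction of voxels below -950 HU: {np.mean(hu < -950):.1%}")
```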
The forgotten survey: social services in the Oxford district: 1935-40.
Peretz, Elizabeth
2011-01-01
This article describes one of the lesser known social surveys of the first half of the twentieth century in Britain and looks at its origins and its outcomes. Funded by the Rockefeller grant to Oxford University to enhance social studies there, the Oxford Survey published in two volumes in 1938 and 1940 engaged Oxford academics from agricultural economics, economics, statistics, and government, as well as Barnett House members involved in voluntary organizations, adult education, settlements, citizenship, and social work. It was a far-reaching study that aimed to analyse all aspects of public services, in the context of a thorough-going description of the geography, industry, and population statistics of the local area. It was also designed to have national relevance, because of the development of the motor industry in Cowley. The Oxford Survey differed from Booth and Rowntree's exploration of the habits and circumstances of the urban poor. Instead, it had more affinity to surveys of industrial and regional planning and work coming from the Le Play school, in which the act of surveying communities was perceived as a way of enhancing citizenship.
Language experience changes subsequent learning
Onnis, Luca; Thiessen, Erik
2013-01-01
What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. PMID:23200510
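The forward and backward probabilities between adjacent elements can be illustrated with a short sketch; the toy syllable stream below is invented and is not the stimulus set used in the study.

```python
from collections import Counter

# Toy syllable stream; forward P(y|x) = freq(xy)/freq(x-), backward P(x|y) = freq(xy)/freq(-y)
stream = "badigo" * 50 + "tupiro" * 50
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pairs = Counter(zip(syllables, syllables[1:]))
first = Counter(x for x, _ in pairs.elements())    # how often each syllable starts a pair
second = Counter(y for _, y in pairs.elements())   # how often each syllable ends a pair

def forward_tp(x, y):
    return pairs[(x, y)] / first[x]

def backward_tp(x, y):
    return pairs[(x, y)] / second[y]

print("P(di|ba) forward :", round(forward_tp("ba", "di"), 2))
print("P(ba|di) backward:", round(backward_tp("ba", "di"), 2))
print("P(tu|go) forward :", round(forward_tp("go", "tu"), 2))
```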
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
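As a minimal sketch of two of the resampling methods mentioned (with simulated test results rather than any real device data), the code below builds bootstrap and jackknife uncertainty estimates for a diagnostic sensitivity.

```python
import numpy as np

rng = np.random.default_rng(9)
# Simulated reference-positive subjects: 1 = device calls positive, 0 = device misses
positives = rng.binomial(1, 0.85, size=120)
sens_hat = positives.mean()

# Bootstrap: resample subjects with replacement and recompute sensitivity
boot = [rng.choice(positives, size=len(positives), replace=True).mean()
        for _ in range(2000)]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

# Jackknife: leave one subject out at a time
jack = np.array([np.delete(positives, i).mean() for i in range(len(positives))])
jack_se = np.sqrt((len(positives) - 1) * np.mean((jack - jack.mean()) ** 2))

print(f"Sensitivity = {sens_hat:.3f}")
print(f"Bootstrap 95% percentile CI: [{ci_low:.3f}, {ci_high:.3f}]")
print(f"Jackknife SE: {jack_se:.4f}")
```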
Implications of caesarean section for children's school achievement: A population-based study.
Smithers, Lisa G; Mol, Ben W; Wilkinson, Chris; Lynch, John W
2016-08-01
Caesarean birth is one of the most frequently performed major obstetrical interventions. Although there is speculation that caesarean at term may have consequences for children's later health and development, longer-term studies are needed. We aimed to evaluate risks of poor school achievement among children born by caesarean section compared with spontaneous vaginal birth. This population-based observational study involved linkage of routinely collected perinatal data with children's school assessments. Perinatal data included all children born in South Australia from 1999 to 2005. Participants were children born by elective caesarean (exposed, n = 650) or vaginal birth (unexposed, n = 2959), to women who previously had a caesarean delivery. School assessments were reported via a standardised national assessment program for children attending grade three (at ~eight years of age). Assessments included reading, writing, spelling, grammar and numeracy and were categorised as performing either above or at/below (≤) National Minimum Standards (NMS). Statistical analyses involved augmented inverse probability weighting (aipw) and accounted for a range of maternal, perinatal and sociodemographic characteristics. Children performing ≤NMS for caesarean section versus vaginal birth were as follows: reading 144/640 (23%) and 688/2921 (24%), writing 69/636 (11%) and 351/2917 (12%), spelling 128/646 (20%) and 684/2937 (23%), grammar 132/646 (20%) and 655/2937 (22%), and numeracy 151/634 (24%) and 729/2922 (25%). Both the raw data and the aipw analyses suggested little difference in school achievement between children born by caesarean versus vaginal birth. Analyses that carefully controlled for a wide range of confounders suggest that caesarean section does not increase the risk of poor school outcomes at age eight. © 2016 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, with advances in computing, geographical information systems and statistical methodology, have opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
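One of the simplest sensitivity analyses in this spirit, re-estimating the pooled effect with each study removed in turn, can be sketched as follows; the effect sizes, variances and study labels are invented, and the clustering of several effect sizes within a study is included only to flag where nonindependence would enter.

```python
import numpy as np

# Hypothetical effect sizes, variances, and the study each estimate comes from;
# several effect sizes share a study, a common source of nonindependence.
es = np.array([0.30, 0.25, 0.10, 0.45, 0.40, 0.05, 0.20])
var = np.array([0.02, 0.02, 0.03, 0.05, 0.05, 0.04, 0.03])
study = np.array(["A", "A", "B", "C", "C", "D", "E"])

def pooled(es, var):
    """Inverse-variance weighted mean effect size."""
    w = 1.0 / var
    return np.sum(w * es) / np.sum(w)

print(f"Overall fixed-effect estimate: {pooled(es, var):.3f}")
for s in np.unique(study):
    keep = study != s
    print(f"  without study {s}: {pooled(es[keep], var[keep]):.3f}")
```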
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2016-06-14
A false positive is the mistake of inferring an effect when none exists, and although α controls the false positive (Type I error) rate in classical hypothesis testing, a given α value is accurate only if the underlying model of randomness appropriately reflects experimentally observed variance. Hypotheses pertaining to one-dimensional (1D) (e.g. time-varying) biomechanical trajectories are most often tested using a traditional zero-dimensional (0D) Gaussian model of randomness, but variance in these datasets is clearly 1D. The purpose of this study was to determine the likelihood that analyzing smooth 1D data with a 0D model of variance will produce false positives. We first used random field theory (RFT) to predict the probability of false positives in 0D analyses. We then validated RFT predictions via numerical simulations of smooth Gaussian 1D trajectories. Results showed that, across a range of public kinematic, force/moment and EMG datasets, the median false positive rate was 0.382 and not the assumed α=0.05, even for a simple two-sample t test involving N=10 trajectories per group. The median false positive rate for experiments involving three-component vector trajectories was p=0.764. This rate increased to p=0.945 for two three-component vector trajectories, and to p=0.999 for six three-component vectors. This implies that experiments involving vector trajectories have a high probability of yielding 0D statistical significance when there is, in fact, no 1D effect. Either (a) explicit a priori identification of 0D variables or (b) adoption of 1D methods can more tightly control α. Copyright © 2016 Elsevier Ltd. All rights reserved.
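The problem the study quantifies can be reproduced in a few lines: simulate smooth one-dimensional Gaussian trajectories for two groups with no true difference, run an ordinary two-sample t test at every node, and count how often at least one node crosses p < 0.05. The trajectory length and smoothing width below are arbitrary choices, not those of the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.stats import ttest_ind

rng = np.random.default_rng(10)
n_per_group, n_time, smooth_sd, n_experiments = 10, 101, 10, 1000

false_positives = 0
for _ in range(n_experiments):
    # Smooth 1D Gaussian noise, identical generative model for both groups (no true effect)
    a = gaussian_filter1d(rng.normal(size=(n_per_group, n_time)), smooth_sd, axis=1)
    b = gaussian_filter1d(rng.normal(size=(n_per_group, n_time)), smooth_sd, axis=1)
    _, p = ttest_ind(a, b, axis=0)        # pointwise 0D t tests along the trajectory
    if np.any(p < 0.05):                  # "significant" anywhere counts as a positive
        false_positives += 1

print(f"Experiment-wise false positive rate with a 0D threshold: "
      f"{false_positives / n_experiments:.3f} (nominal alpha = 0.05)")
```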
Genetic variant rs17225178 in the ARNT2 gene is associated with Asperger Syndrome.
Di Napoli, Agnese; Warrier, Varun; Baron-Cohen, Simon; Chakrabarti, Bhismadev
2015-01-01
Autism Spectrum Conditions (ASC) are neurodevelopmental conditions characterized by difficulties in communication and social interaction, alongside unusually repetitive behaviours and narrow interests. Asperger Syndrome (AS) is one subgroup of ASC and differs from classic autism in that in AS there is no language or general cognitive delay. Genetic, epigenetic and environmental factors are implicated in ASC and genes involved in neural connectivity and neurodevelopment are good candidates for studying the susceptibility to ASC. The aryl-hydrocarbon receptor nuclear translocator 2 (ARNT2) gene encodes a transcription factor involved in neurodevelopmental processes, neuronal connectivity and cellular responses to hypoxia. A mutation in this gene has been identified in individuals with ASC and single nucleotide polymorphisms (SNPs) have been nominally associated with AS and autistic traits in previous studies. In this study, we tested 34 SNPs in ARNT2 for association with AS in 118 cases and 412 controls of Caucasian origin. P values were adjusted for multiple comparisons, and linkage disequilibrium (LD) among the SNPs analysed was calculated in our sample. Finally, SNP annotation allowed functional and structural analyses of the genetic variants in ARNT2. We tested the replicability of our result using the genome-wide association studies (GWAS) database of the Psychiatric Genomics Consortium (PGC). We report statistically significant association of rs17225178 with AS. This SNP modifies transcription factor binding sites and regions that regulate the chromatin state in neural cell lines. It is also included in a LD block in our sample, alongside other genetic variants that alter chromatin regulatory regions in neural cells. These findings demonstrate that rs17225178 in the ARNT2 gene is associated with AS and support previous studies that pointed out an involvement of this gene in the predisposition to ASC.
Groen-Blokhuis, Maria M.; Pourcain, Beate St.; Greven, Corina U.; Pappa, Irene; Tiesler, Carla M.T.; Ang, Wei; Nolte, Ilja M.; Vilor-Tejedor, Natalia; Bacelis, Jonas; Ebejer, Jane L.; Zhao, Huiying; Davies, Gareth E.; Ehli, Erik A.; Evans, David M.; Fedko, Iryna O.; Guxens, Mònica; Hottenga, Jouke-Jan; Hudziak, James J.; Jugessur, Astanand; Kemp, John P.; Krapohl, Eva; Martin, Nicholas G.; Murcia, Mario; Myhre, Ronny; Ormel, Johan; Ring, Susan M.; Standl, Marie; Stergiakouli, Evie; Stoltenberg, Camilla; Thiering, Elisabeth; Timpson, Nicholas J.; Trzaskowski, Maciej; van der Most, Peter J.; Wang, Carol; Nyholt, Dale R.; Medland, Sarah E.; Neale, Benjamin; Jacobsson, Bo; Sunyer, Jordi; Hartman, Catharina A.; Whitehouse, Andrew J.O.; Pennell, Craig E.; Heinrich, Joachim; Plomin, Robert; Smith, George Davey; Tiemeier, Henning; Posthuma, Danielle; Boomsma, Dorret I.
2016-01-01
Objective: To elucidate the influence of common genetic variants on childhood attention-deficit/hyperactivity disorder (ADHD) symptoms, to identify genetic variants that explain its high heritability, and to investigate the genetic overlap of ADHD symptom scores with ADHD diagnosis. Method: Within the EArly Genetics and Lifecourse Epidemiology (EAGLE) consortium, genome-wide single nucleotide polymorphisms (SNPs) and ADHD symptom scores were available for 17,666 children (< 13 years) from nine population-based cohorts. SNP-based heritability was estimated in data from the three largest cohorts. Meta-analysis based on genome-wide association (GWA) analyses with SNPs was followed by gene-based association tests, and the overlap in results with a meta-analysis in the Psychiatric Genomics Consortium (PGC) case-control ADHD study was investigated. Results: SNP-based heritability ranged from 5% to 34%, indicating that variation in common genetic variants influences ADHD symptom scores. The meta-analysis did not detect genome-wide significant SNPs, but three genes, lying close to each other with SNPs in high linkage disequilibrium (LD), showed a gene-wide significant association (p values between 1.46×10⁻⁶ and 2.66×10⁻⁶). One gene, WASL, is involved in neuronal development. Both SNP- and gene-based analyses indicated overlap with the PGC meta-analysis results with the genetic correlation estimated at 0.96. Conclusion: The SNP-based heritability for ADHD symptom scores indicates a polygenic architecture and genes involved in neurite outgrowth are possibly involved. Continuous and dichotomous measures of ADHD appear to assess a genetically common phenotype. A next step is to combine data from population-based and case-control cohorts in genetic association studies to increase sample size and improve statistical power for identifying genetic variants. PMID:27663945
Force system generated by elastic archwires with vertical V bends: a three-dimensional analysis.
Upadhyay, Madhur; Shah, Raja; Peterson, Donald; Asaki, Takafumi; Yadav, Sumit; Agarwal, Sachin
2017-04-01
Our previous understanding of V-bend mechanics is primarily from two-dimensional (2D) analysis of archwire-bracket interactions in the second order. These analyses do not take into consideration the three-dimensional (3D) nature of orthodontic appliances involving the third order. To quantify the force system generated in a 3D two-bracket setup involving the molar and incisors with vertical V-bends. Maxillary molar and incisor brackets were arranged in a dental arch form and attached to load cells capable of measuring forces and moments in all three planes (x, y, and z) of space. Symmetrical V-bends (right and left sides) were placed at 11 different locations along rectangular beta-titanium archwires of various sizes at an angle of 150 degrees. Each wire was evaluated for the 11 bend positions. Specifically, the vertical forces (Fz) and anterio-posterior moments (Mx) were analysed. Descriptive statistics were used to interpret the results. With increasing archwire size, Fz and Mx increased at the two brackets (P < 0.05). The vertical forces were linear and symmetric in nature, increasing in magnitude as the bends moved closer to either bracket. The Mx curves were asymmetric and non-linear, displaying higher magnitudes for the molar bracket. As the bends were moved closer to either bracket, a distinct flattening of the incisor Mx curve was noted, implying no change in its magnitude. This article provides critical information on V-bend mechanics involving second order and third order archwire-bracket interactions. A model for determining this force system is described that might allow for easier translation to actual clinical practice. © The Author 2016. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com
Modelling brain activations and connectivity of pain modulated by having a loved one nearby
NASA Astrophysics Data System (ADS)
Tamam, Sofina; Ahmad, Asma Hayati; Kamil, Wan Ahmad
2018-06-01
This study is to model the connectivity between activated areas in the brain associated with pain responses in the presence and absence of a loved one. We used Th:YAG laser targeted onto the dorsum of the right hand of 17 Malay-female participants (mean age 20.59; SD 2.85 years) in two conditions: (1) in the absence of a loved one in the functional magnetic resonance imaging (fMRI) room (Alone condition), and (2) in the presence of a loved one (Support condition). The laser-induced pain stimuli were delivered according to an fMRI paradigm utilising blocked design comprising 15 blocks of activity and 15 blocks of rest. Brain activations and connectivity were analysed using statistical parametric mapping (SPM), dynamic causal modelling (DCM) and Bayesian model selection (BMS) analyses. Individual responses to pain were found to be divided into two categories: (1) Love Hurts (participants who reported more pain in the presence of a loved one) involved activations in thalamus (THA), parahippocampal gyrus (PHG) and hippocampus (HIP); and (2) Love Heals (participants who reported less pain in the presence of a loved one) involved activations in all parts of cingulate cortex. BMS showed that Love Heals could be represented by a cortical network involving the area of anterior cingulate cortex (ACC), middle cingulate cortex (MCC) and posterior cingulate cortex (PCC) in the intrinsic connectivity of ACC → PCC → MCC and ACC → MCC. There was no optimal model to explain the increase in pain threshold when accompanied by the loved one in Love Hurts. The present study reveals a new possible cortical network for the reduction of pain by having a loved one nearby.
Marino, Patricia; Siani, Carole; Bertucci, François; Roche, Henri; Martin, Anne-Laure; Viens, Patrice; Seror, Valérie
2011-09-01
The use of taxanes to treat node-positive (N+) breast cancer patients is associated with heterogeneous benefits as well as with morbidity and financial costs. This study aimed to assess the economic impact of using gene expression profiling to guide decision-making about chemotherapy, and to discuss the coverage/reimbursement issues involved. Retrospective data on 246 patients included in a randomised trial (PACS01) were analyzed. Tumours were genotyped using DNA microarrays (189-gene signature), and patients were classified depending on whether or not they were likely to benefit from chemotherapy regimens without taxanes. Standard anthracyclines plus taxane chemotherapy (strategy AT) was compared with the innovative strategy based on genomic testing (GEN). Statistical analyses involved bootstrap methods and sensitivity analyses. The AT and GEN strategies yielded similar 5-year metastasis-free survival rates. In comparison with AT, GEN was cost-effective when genomic testing costs were less than 2,090€. With genomic testing costs higher than 2,919€, AT was cost-effective. Considering a 30% decrease in the price of docetaxel (the patent rights being about to expire), GEN was cost-effective if the cost of genomic testing was in the 0€-1,139€ range, whereas AT was cost-effective if genomic testing costs were higher than 1,891€. The use of gene expression profiling to guide decision-making about chemotherapy for N+ breast cancer patients is potentially cost-effective. Since joint use of genomic testing and the drugs targeted by such tests yields greater well-being than the sum of their separate uses, questions arise about how to deal with this extra well-being in decision-making about coverage/reimbursement.
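As a rough illustration of the bootstrap step used in such cost-effectiveness comparisons, the following Python sketch resamples patient-level costs and outcomes to obtain an uncertainty interval for the incremental cost of one strategy over another. All numbers, sample sizes and the assumed genomic-test cost are synthetic placeholders, not PACS01 data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative per-patient costs (EUR) and 5-year metastasis-free survival indicators
# for two strategies; synthetic placeholders, not trial data.
cost_at = rng.normal(30000, 5000, 123)
cost_gen = rng.normal(28000, 5000, 123) + 1500   # includes an assumed genomic-test cost
eff_at = rng.binomial(1, 0.80, 123)
eff_gen = rng.binomial(1, 0.80, 123)

def incremental(c1, e1, c0, e0, rng, n_boot=2000):
    """Bootstrap the incremental cost and effect of strategy 1 versus strategy 0."""
    d_cost, d_eff = [], []
    for _ in range(n_boot):
        i1 = rng.integers(0, len(c1), len(c1))
        i0 = rng.integers(0, len(c0), len(c0))
        d_cost.append(c1[i1].mean() - c0[i0].mean())
        d_eff.append(e1[i1].mean() - e0[i0].mean())
    return np.array(d_cost), np.array(d_eff)

d_cost, d_eff = incremental(cost_gen, eff_gen, cost_at, eff_at, rng)
print("Mean incremental cost (GEN vs AT):", round(d_cost.mean(), 1))
print("95% bootstrap CI:", np.percentile(d_cost, [2.5, 97.5]).round(1))
print("P(GEN saves money):", (d_cost < 0).mean())
```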
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analysis techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
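A minimal sketch of this kind of regression-plus-discriminant workflow, using scikit-learn on synthetic multi-element assays; the element relationships and ore groupings below are assumptions for illustration, not the Carlin data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic stand-in for multi-element assays (ppm) from drill-hole intervals;
# columns might represent As, Sb, Hg, Ba, ... -- none of this is real deposit data.
n = 200
X = rng.lognormal(mean=2.0, sigma=0.8, size=(n, 6))
gold = 0.02 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.3, n)   # assumed Au relationship
ore_type = (gold > np.median(gold)).astype(int)                   # two hypothesised ore groups

# Linear regression: which elements co-vary with gold?
reg = LinearRegression().fit(np.log(X), gold)
print("regression coefficients (log-ppm):", reg.coef_.round(3))

# Discriminant analysis: can the element suite separate the two ore groups?
lda = LinearDiscriminantAnalysis().fit(np.log(X), ore_type)
print("resubstitution accuracy:", round(lda.score(np.log(X), ore_type), 2))
```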
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
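For readers unfamiliar with left-sided homogeneity testing, the sketch below computes Cochran's Q for a hypothetical set of log risk ratios and derives a left-sided P-value both asymptotically and by a simple parametric Monte Carlo; it illustrates the idea only and is not the authors' exact simulation procedure.

```python
import numpy as np
from scipy import stats

def cochran_q(log_rr, var):
    """Cochran's Q for per-study log risk ratios with known variances."""
    w = 1.0 / var
    pooled = np.sum(w * log_rr) / np.sum(w)
    return np.sum(w * (log_rr - pooled) ** 2), pooled

# Hypothetical meta-analysis of 8 unusually similar studies.
log_rr = np.array([0.10, 0.12, 0.09, 0.11, 0.10, 0.12, 0.11, 0.10])
var = np.array([0.02, 0.03, 0.02, 0.04, 0.03, 0.02, 0.03, 0.02])

q_obs, pooled = cochran_q(log_rr, var)
k = len(log_rr)

# Asymptotic left-sided P: probability of a Q this small or smaller under homogeneity.
p_left_asym = stats.chi2.cdf(q_obs, df=k - 1)

# Simple parametric Monte Carlo alternative.
rng = np.random.default_rng(0)
q_sim = np.array([cochran_q(rng.normal(pooled, np.sqrt(var)), var)[0] for _ in range(10000)])
p_left_mc = (q_sim <= q_obs).mean()

print(f"Q = {q_obs:.2f}, asymptotic left-sided P = {p_left_asym:.4f}, Monte Carlo P = {p_left_mc:.4f}")
```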
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
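A compact sketch of the analysis pipeline described (factor reduction followed by a neural-network classifier), run on synthetic water-quality data standing in for the reservoir measurements; the variable names and the two-factor choice mirror the abstract, everything else is assumed.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in: rows are water samples, columns are Ca, K, Mg, Na, COD;
# season = 0 (dry) or 1 (rainy). Not the reservoir data.
n = 120
season = rng.integers(0, 2, n)
base = rng.normal(0, 1, n) + 0.8 * season
X = np.column_stack([base + rng.normal(0, 0.3, n) for _ in range(4)] +
                    [0.5 * base + rng.normal(0, 0.5, n)])

Xz = StandardScaler().fit_transform(X)

# Factor analysis to reduce the variable set (two factors, as in the study design).
fa = FactorAnalysis(n_components=2, random_state=0).fit(Xz)
scores = fa.transform(Xz)
print("loadings (variables x factors):\n", fa.components_.T.round(2))

# Neural network classifying season from the factor scores.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(scores, season)
print("training accuracy:", round(clf.score(scores, season), 2))
```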
Li, Cheng-Wei; Tao, Ru-Jia; Zou, Dan-Feng; Li, Man-Hui; Xu, Xin; Cao, Wei-Jun
2018-02-16
Sarcoidosis is a multisystem disease characterised by the formation of granulomas within various organs, mainly the lungs. Several studies from different countries, but not China, have investigated sarcoidosis with extrapulmonary involvement. The objective of this study is to present a comparative clinical analysis of patients with pulmonary sarcoidosis with and without extrapulmonary involvement from China. Data from inpatients diagnosed with sarcoidosis at Shanghai Pulmonary Hospital (Shanghai, China) between January 2009 and December 2014 were retrospectively collected and analysed. Six hundred and thirty-six patients with biopsy-proven sarcoidosis were included in the study, comprising 378 with isolated pulmonary sarcoidosis and 258 with pulmonary sarcoidosis plus extrapulmonary involvement. Two hundred and fifty-eight (40.6%) patients with pulmonary sarcoidosis had extrapulmonary involvement. Extrapulmonary localisations were detected mostly in extrathoracic lymph nodes (n=147) and skin (n=86). Statistically significant differences were demonstrated between patients with pulmonary sarcoidosis plus extrapulmonary involvement and patients with isolated pulmonary sarcoidosis for fatigue (16.6% vs 8.3%, P<0.05), serum ACE (SACE) levels (79.0±46.9 IU/L vs 69.7±38.7 IU/L, P<0.05), and high-resolution CT (HRCT) findings (53.8% vs 46.2%, P<0.05). Extrapulmonary involvement is common in patients with pulmonary sarcoidosis, with the most common sites being extrathoracic lymph nodes and skin. Patients with sarcoidosis with extrapulmonary involvement are more symptomatic (fatigue) and have higher SACE levels and more severe HRCT findings, to which clinicians should pay attention. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Airaksinen, Noora K; Nurmi-Lüthje, Ilona S; Kataja, J Matti; Kröger, Heikki P J; Lüthje, Peter M J
2018-05-01
Most of the cycling accidents that occur in Finland do not end up in the official traffic accident statistics. Thus, there is minimal information on these accidents and their consequences, particularly in cases in which alcohol was involved. The focus of the present study is on cycling accidents and injuries involving alcohol in particular. Data on patients visiting the emergency department at North Kymi Hospital because of a cycling accident was prospectively collected for two years, from June 1, 2004 to May 31, 2006. Blood alcohol concentration (BAC) was measured on admission with a breath analyser. The severity of the cycling injuries was classified according to the Abbreviated Injury Scale (AIS). A total of 217 cycling accidents occurred. One third of the injured cyclists were involved with alcohol at the time of visiting the hospital. Of these, 85% were males. A blood alcohol concentration of ≥ 1.2 g/L was measured in nearly 90% of all alcohol-related cases. A positive BAC result was more common among males than females (p < 0.001), and head injuries were more common among cyclists where alcohol was involved (AI) (60%) than among sober cyclists (29%) (p < 0.001). Two thirds (64%) of the cyclists with AI were not wearing a bicycle helmet. The figure for serious injuries (MAIS ≥ 3) was similar in both groups. Intoxication with an alcohol level of more than 1.5 g/L and the age of 15 to 24 years were found to be risk factors for head injuries. The mean cost of treatment was higher among sober cyclists than among cyclists with AI (€2143 vs. €1629), whereas in respect of the cost of work absence, the situation was the opposite (€1348 vs. €1770, respectively). Cyclists involved with alcohol were, in most cases, heavily intoxicated and were not wearing a bicycle helmet. Head injuries were more common among these cyclists than among sober cyclists. As cycling continues to increase, it is important to monitor cycling accidents, improve the accident statistics and heighten awareness of the risks of head injuries when cycling under the influence of alcohol. Copyright © 2018 Elsevier Ltd. All rights reserved.
Carmichael, Suzan L; Yang, Wei; Ma, Chen; Roberts, Eric; Kegley, Susan; English, Paul; Lammer, Edward J; Witte, John S; Shaw, Gary M
2016-08-01
We examined risks associated with joint exposure of gene variants and pesticides. Analyses included 189 cases and 390 male controls born from 1991 to 2003 in California's San Joaquin Valley. We used logistic regression to examine risks associated with joint exposures of gene variants and pesticides that our previous work identified as associated with hypospadias. Genetic variables were based on variants in DGKK, genes involved in sex steroid synthesis/metabolism, and genes involved in genital tubercle development. Pesticide exposure was based on residential proximity to commercial agricultural pesticide applications. Odds ratios (ORs) were highest among babies with joint exposures, who had two- to fourfold increased risks; for example, the OR was 3.7 (95% confidence interval [CI], 0.8-16.5) among subjects with the risk-associated DGKK haplotype and pesticide exposure; OR, 1.5 (95% CI, 0.7-3.1) among subjects with the haplotype and no pesticide exposure; and OR, 0.9 (95% CI, 0.5-1.6) among subjects without the haplotype but with pesticide exposure, relative to subjects with neither. However, results did not provide statistical evidence that these risks were significantly greater than expected on an additive scale, relative to risks associated with one exposure at a time. We observed elevated risks associated with joint exposures to selected pesticides and genetic variants but no statistical evidence for interaction. Birth Defects Research (Part A) 106:653-658, 2016. © 2016 Wiley Periodicals, Inc.
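The joint-exposure odds ratios and the additive-scale interaction question can be illustrated with made-up case/control counts for the four exposure groups; the RERI shown is one standard additive interaction measure, not necessarily the statistic the authors used.

```python
import numpy as np

# Hypothetical case/control counts by joint exposure (gene variant G, pesticide P);
# illustrative numbers only, not the study data.
# rows: (G, P) in {(0,0), (0,1), (1,0), (1,1)}
cases    = np.array([100, 40, 30, 19])
controls = np.array([250, 90, 60, 20])

def odds_ratio(idx, ref=0):
    return (cases[idx] / controls[idx]) / (cases[ref] / controls[ref])

or_01 = odds_ratio(1)   # pesticide only
or_10 = odds_ratio(2)   # variant only
or_11 = odds_ratio(3)   # joint exposure

# Relative excess risk due to interaction (additive-scale interaction measure).
reri = or_11 - or_10 - or_01 + 1

print(f"OR pesticide only: {or_01:.2f}")
print(f"OR variant only:   {or_10:.2f}")
print(f"OR joint exposure: {or_11:.2f}")
print(f"RERI (0 = exactly additive): {reri:.2f}")
```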
Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation
NASA Astrophysics Data System (ADS)
Muhammad, Syahidah; Frew, Russell; Hayman, Alan
2015-02-01
Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, the alkanes, to generate carbon and hydrogen isotopic data for the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that the diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e. the very subtle differences in isotopic values between the samples.
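A hedged sketch of the multivariate step (PCA plus hierarchical clustering of per-alkane δ13C/δ2H profiles), using invented isotope values rather than the case-work samples:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic delta13C values for a series of n-alkanes (columns) in diesel samples (rows);
# two hypothetical sources plus analytical noise. Not the forensic data.
source_a = np.tile(np.linspace(-30, -27, 8), (6, 1)) + rng.normal(0, 0.15, (6, 8))
source_b = np.tile(np.linspace(-29, -25, 8), (6, 1)) + rng.normal(0, 0.15, (6, 8))
profiles = np.vstack([source_a, source_b])

X = StandardScaler().fit_transform(profiles)

# PCA summarises the multivariate isotopic fingerprint ...
pca = PCA(n_components=2).fit(X)
print("variance explained:", pca.explained_variance_ratio_.round(2))

# ... and hierarchical clustering groups samples of common origin.
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
print("cluster assignment:", labels)
```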
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
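As a point of reference, a standard DerSimonian-Laird random-effects pooling of per-study repeatability estimates is sketched below; the paper's alternative small-sample approaches are not reproduced here, and the input values are hypothetical.

```python
import numpy as np

def dersimonian_laird(estimates, variances):
    """Standard random-effects pooling of per-study estimates (e.g., log repeatability
    coefficients) with within-study variances. Returns pooled estimate, its SE and tau^2."""
    est, v = np.asarray(estimates, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * est) / np.sum(w)
    q = np.sum(w * (est - fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(est) - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * est) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star)), tau2

# Hypothetical log within-subject coefficients of variation from five test-retest studies.
pooled, se, tau2 = dersimonian_laird([2.3, 2.1, 2.6, 2.4, 2.2], [0.04, 0.09, 0.12, 0.05, 0.15])
print(f"pooled = {pooled:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
```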
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
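The recommended sex-stratified analysis, together with a simple test for effect heterogeneity between sexes, might look like the sketch below on simulated data; the z-difference test is one common choice, not a prescription from the commentary.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated genotype dosages and a quantitative trait with a sex-dependent effect;
# purely illustrative, not a real cohort.
n = 4000
sex = rng.integers(0, 2, n)                 # 0 = female, 1 = male
g = rng.binomial(2, 0.3, n)
y = 0.15 * g * (sex == 0) + 0.05 * g * (sex == 1) + rng.normal(0, 1, n)

def stratum_fit(mask):
    X = sm.add_constant(g[mask])
    fit = sm.OLS(y[mask], X).fit()
    return fit.params[1], fit.bse[1]        # beta and SE for the genotype term

b_f, se_f = stratum_fit(sex == 0)
b_m, se_m = stratum_fit(sex == 1)

# Simple z-test for a difference between sex-specific effects.
z = (b_f - b_m) / np.sqrt(se_f ** 2 + se_m ** 2)
p_het = 2 * stats.norm.sf(abs(z))
print(f"beta_F = {b_f:.3f}, beta_M = {b_m:.3f}, sex-heterogeneity P = {p_het:.3g}")
```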
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932 .
NASA Technical Reports Server (NTRS)
Svalbonas, V.
1973-01-01
The theoretical analysis background for the STARS-2 (shell theory automated for rotational structures) program is presented. The theory involved in the axisymmetric nonlinear and unsymmetric linear static analyses, and in the stability and vibration (including critical rotation speed) analyses involving axisymmetric prestress, is discussed. The theory for nonlinear static, stability, and vibration analyses involving shells with unsymmetric loadings is also included.
DiLorio, Colleen; Dudley, William N; Soet, Johanna E; McCarty, Frances
2004-12-01
To examine sexual possibility situations (SPS) and protective practices associated with involvement in intimate sexual behaviors and the initiation of sexual intercourse among young adolescents and to determine if protective factors moderate the relationship between SPS and sexual behaviors. Data for these analyses were obtained from the baseline assessment for adolescents conducted as part of an HIV prevention study called "Keepin' it R.E.A.L.!" The study was conducted with a community-based organization (CBO) in an urban area serving a predominantly African-American population. In addition to items assessing SPS, intimate sexual behaviors, and initiation of sexual intercourse, adolescents provided information on the following protective factors: educational goals, self-concept, future time perspective, orientation to health, self-efficacy, outcome expectations, parenting, communication, values, and prosocial activities. Background personal information, including age and gender, was also collected. The analyses were conducted on data from 491 predominantly African-American adolescents, 61% of whom were boys. Variables were combined to form SPS and protective indices that were used in the first set of regression analyses. In a second set of analyses, the indices were unbundled and individual variables were entered into regression analyses. Both SPS and protective indices explained significant portions of variance in intimate sexual behaviors, and the SPS index explained a significant portion of variance in the initiation of sexual intercourse. The regression analysis using the unbundled SPS and protective factors revealed the following statistically significant predictors for intimate sexual behaviors: age, gender, time alone with groups of peers, time alone with a member of the opposite sex, behavior self-concept, popularity self-concept, self-efficacy for abstinence, outcome expectations for abstinence, parental control, personal values, and parental values. A similar regression analysis revealed that age, time alone with a member of the opposite sex, and personal values were significant predictors of initiation of sexual intercourse. These results provide evidence for the important role of protective factors in explaining early involvement in sexual behaviors and show that protective factors extend beyond personal characteristics to include both familial and peer factors.
NASA Astrophysics Data System (ADS)
Wallace, Jon Michael
2003-10-01
Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
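The final probabilistic assessment step can be illustrated, in greatly simplified form, by a plain Monte Carlo estimate of a component failure probability; the limit state and the input distributions below are assumptions for the sketch, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative limit state for a component: failure when local stress exceeds strength.
# All distributions are assumed for the sketch.
n = 1_000_000
load = rng.normal(300.0, 40.0, n)            # system-driven local load parameter (MPa)
temp = rng.normal(650.0, 25.0, n)            # local temperature (deg C)
strength = rng.normal(900.0, 60.0, n) - 0.4 * (temp - 600.0)   # temperature-degraded strength

failure = load > strength
pf = failure.mean()
se = np.sqrt(pf * (1 - pf) / n)              # Monte Carlo standard error of the estimate
print(f"estimated probability of failure: {pf:.2e} +/- {1.96 * se:.1e}")
```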
Statistical Literacy: Developing a Youth and Adult Education Statistical Project
ERIC Educational Resources Information Center
Conti, Keli Cristina; Lucchesi de Carvalho, Dione
2014-01-01
This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…
Interpretation of Statistical Data: The Importance of Affective Expressions
ERIC Educational Resources Information Center
Queiroz, Tamires; Monteiro, Carlos; Carvalho, Liliane; François, Karen
2017-01-01
In recent years, research on teaching and learning of statistics emphasized that the interpretation of data is a complex process that involves cognitive and technical aspects. However, it is a human activity that involves also contextual and affective aspects. This view is in line with research on affectivity and cognition. While the affective…
A Laboratory Experiment on the Statistical Theory of Nuclear Reactions
ERIC Educational Resources Information Center
Loveland, Walter
1971-01-01
Describes an undergraduate laboratory experiment on the statistical theory of nuclear reactions. The experiment involves measuring the relative cross sections for formation of a nucleus in its meta stable excited state and its ground state by applying gamma-ray spectroscopy to an irradiated sample. Involves 3-4 hours of laboratory time plus…
Severgnini, Marco; Bicciato, Silvio; Mangano, Eleonora; Scarlatti, Francesca; Mezzelani, Alessandra; Mattioli, Michela; Ghidoni, Riccardo; Peano, Clelia; Bonnal, Raoul; Viti, Federica; Milanesi, Luciano; De Bellis, Gianluca; Battaglia, Cristina
2006-06-01
Meta-analysis of microarray data is increasingly important, considering both the availability of multiple platforms using disparate technologies and the accumulation in public repositories of data sets from different laboratories. We addressed the issue of comparing gene expression profiles from two microarray platforms by devising a standardized investigative strategy. We tested this procedure by studying MDA-MB-231 cells, which undergo apoptosis on treatment with resveratrol. Gene expression profiles were obtained using high-density, short-oligonucleotide, single-color microarray platforms: GeneChip (Affymetrix) and CodeLink (Amersham). Interplatform analyses were carried out on 8414 common transcripts represented on both platforms, as identified by LocusLink ID, representing 70.8% and 88.6% of annotated GeneChip and CodeLink features, respectively. We identified 105 differentially expressed genes (DEGs) on CodeLink and 42 DEGs on GeneChip. Among them, only 9 DEGs were commonly identified by both platforms. Multiple analyses (BLAST alignment of probes with target sequences, gene ontology, literature mining, and quantitative real-time PCR) permitted us to investigate the factors contributing to the generation of platform-dependent results in single-color microarray experiments. An effective approach to cross-platform comparison involves microarrays of similar technologies, samples prepared by identical methods, and a standardized battery of bioinformatic and statistical analyses.
Ayurveda: Between Religion, Spirituality, and Medicine
Kessler, C.; Wischnewsky, M.; Michalsen, A.; Eisenmann, C.; Melzer, J.
2013-01-01
Ayurveda is playing a growing part in Europe. Questions regarding the role of religion and spirituality within Ayurveda are discussed widely. Yet, there is little data on the influence of religious and spiritual aspects on its European diffusion. Methods. A survey was conducted with a new questionnaire. It was analysed by calculating frequency variables and testing differences in distributions with the χ²-test. Principal Component Analyses with Varimax Rotation were performed. Results. 140 questionnaires were analysed. Researchers found that individual religious and spiritual backgrounds influence attitudes and expectations towards Ayurveda. Statistical relationships were found between religious/spiritual backgrounds and decisions to offer/access Ayurveda. Accessing Ayurveda did not exclude the simultaneous use of modern medicine and CAM. From the majority's perspective Ayurveda is simultaneously a science, medicine, and a spiritual approach. Conclusion. Ayurveda seems to be able to satisfy the individual needs of therapists and patients, despite worldview differences. Ayurvedic concepts are based on anthropologic assumptions including different levels of existence in healing approaches. Thereby, Ayurveda can be seen in accordance with the prerequisites for a Whole Medical System. As a result of this, intimate and individual therapist-patient relationships can emerge. Larger surveys involving bigger participant numbers with fully validated questionnaires are warranted to support these results. PMID:24368928
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have extensively been used in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses could be used instead of diversity indices, such as cluster analyses or multiple regressions. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be a robust way to standardize all samples to a common size. Even the simplest method, such as reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
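For concreteness, the Shannon-Wiener index and a simple individual-based rarefaction can be computed as in the sketch below, using two invented community samples of different total size.

```python
import numpy as np

rng = np.random.default_rng(0)

def shannon(counts):
    """Shannon-Wiener index H' = -sum p_i ln p_i over observed species."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def rarefied_richness(counts, sample_size, n_rep=1000):
    """Expected species richness in random subsamples of a fixed number of individuals."""
    pool = np.repeat(np.arange(len(counts)), counts)
    reps = [len(np.unique(rng.choice(pool, sample_size, replace=False))) for _ in range(n_rep)]
    return np.mean(reps)

# Two hypothetical community samples (species abundance counts).
site_a = [50, 30, 10, 5, 3, 1, 1]
site_b = [200, 150, 60, 30, 10, 5, 3, 2, 1, 1]

print("H' site A:", round(shannon(site_a), 2), " H' site B:", round(shannon(site_b), 2))
m = min(sum(site_a), sum(site_b))
print("richness rarefied to", m, "individuals:",
      round(rarefied_richness(site_a, m), 1), "vs", round(rarefied_richness(site_b, m), 1))
```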
Phenotype/genotype correlations in Gaucher disease type 1: Clinical and therapeutic implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sibille, A.; Eng, C.M.; Kim, S.J.
1993-06-01
Gaucher disease is the most frequent lysosomal storage disease and the most prevalent genetic disease among Ashkenazi Jews. Gaucher disease type 1 is characterized by marked variability of the phenotype and by the absence of neuronopathic involvement. To test the hypothesis that this phenotypic variability was due to genetic compounds of several different mutant alleles, 161 symptomatic patients with Gaucher disease type 1 (> 90% Ashkenazi Jewish) were analyzed for clinical involvement, and their genotypes were determined. Qualitative and quantitative measures of disease involvement included age at onset of the disease manifestations, hepatic and splenic volumes, age at splenectomy, and severity of bony disease. Highly statistically significant differences (P < .005) were found in each clinical parameter in patients with the N370S/N370S genotype compared with those patients with the N370S/84GG, N370S/L444P, and N370/ genotypes. The symptomatic N370S homozygotes had onset of their disease two to three decades later than patients with the other genotypes. In addition, patients with the latter genotypes have much more severely involved livers, spleens, and bones and had a higher incidence of splenectomy at an earlier age. These predictive genotype analyses provide the basis for genetic care delivery and therapeutic recommendations in patients affected with Gaucher disease type 1. 38 refs., 1 fig., 4 tabs.
NASA Technical Reports Server (NTRS)
Komar, P. D.
1984-01-01
The diversity of proposed origins for large Martian outflow channels results from the differing interpretations given to the landforms associated with the outflow channels. In an attempt to help limit the possible mechanisms of channel erosion, detailed studies were done of three of the channel features: the streamlined islands, longitudinal grooves, and scour marks. This examination involved a comparison of the Martian streamlined islands with various streamlined landforms on Earth, including those found in the Channel Scabland, in large rivers, glacial drumlins, and desert yardangs. The comparisons included statistical analyses of the landform lengths versus widths and positions of maximum width, and an examination of the degree of shape agreement with the geometric lemniscate, which was in turn demonstrated to correspond closely with true airfoil shapes. The analyses showed that the shapes of the Martian islands correspond closely to the streamlined islands in rivers and the Channel Scabland. Drumlins show a much smaller correlation. Erosional rock islands formed by glaciers are very different in shape.
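The geometric lemniscate referred to here is commonly written in the following polar form in the streamlined-landform literature; the abstract does not give the exact parameterisation used, so this is the standard textbook version, with l the landform length and A its planform area.

```latex
% Lemniscate loop commonly used to describe streamlined planforms
% (l = length, A = planform area, k = elongation factor):
\rho = l \cos(k\theta), \qquad k = \frac{\pi l^{2}}{4A}
```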
The influence of car registration year on driver casualty rates in Great Britain.
Broughton, Jeremy
2012-03-01
A previous paper analysed data from the British national road accident reporting system to investigate the influence upon car driver casualty rates of the general type of car being driven and its year of first registration. A statistical model was fitted to accident data from 2001 to 2005, and this paper updates the principal results using accident data from 2003 to 2007. Attention focuses upon the role of year of first registration since this allows the influence of developments in car design upon occupant casualty numbers to be evaluated. Three additional topics are also examined with these accident data. Changes over time in frontal and side impacts are compared. Changes in the combined risk for the two drivers involved in a car-car collision are investigated, being the net result of changes in secondary safety and aggressivity. Finally, the results of the new model relating to occupant protection are related to an index that had been developed previously to analyse changes over time in the secondary safety of the car fleet. Copyright © 2011 Elsevier Ltd. All rights reserved.
Description and evaluation of an initiative to develop advanced practice nurses in mainland China.
Wong, Frances Kam Yuet; Peng, Gangyi; Kan, Eva C; Li, Yajie; Lau, Ada T; Zhang, Liying; Leung, Annie F; Liu, Xueqin; Leung, Vilna O; Chen, Weiju; Li, Ming
2010-05-01
This paper describes an initiative to develop Advanced Practice Nurses (APNs) in mainland China and evaluates the outcomes of the described programme. The pioneer project was an APN postgraduate programme involving 38 students conducted in Guangzhou, China during 2004-2005. Data related to curriculum content and process, student performance, self-reported competence and programme effects were collected. Quantitative data, such as demographic data and student performance, were analysed using descriptive statistics, and the pre- and post-programme self-reported practice of competence was compared using the chi-square test. Qualitative data such as case reports and interviews were examined using thematic analyses. Reflective journals and case studies revealed the attributes of APNs in managing clinical cases at an advanced level, applying theory to practice and exercising evidence-based practice. The relatively modest self-reported practice of competence suggested that the graduates were novice APNs and needed continued development after the completion of the programme. This study reports the experience of an initiative in China and suggests a useful curriculum framework for educating APNs. Copyright 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sanchez, Gerardo
A flipped laboratory model involves significant preparation by the students on lab material prior to entry to the laboratory. This allows laboratory time to be focused on active learning through experiments. The aim of this study was to observe changes in student performance through the transition from a traditional laboratory format to a flipped format. The data showed that for both Anatomy and Physiology (I and II) laboratories a more normal distribution of grades was observed once labs were flipped, and lecture grade averages increased. Chi-square and analysis of variance tests showed grade changes to a statistically significant degree, with a p value of less than 0.05 on both analyses. Regression analyses gave decreasing numbers after the flipped labs were introduced, with an r² value of .485 for A&P I and .564 for A&P II. Results indicate improved scores for the lecture part of the A&P course, decreased outlying scores above 100, and score distributions that approached a more normal distribution.
Wigington, Charles H; Sonderegger, Derek; Brussaard, Corina P D; Buchan, Alison; Finke, Jan F; Fuhrman, Jed A; Lennon, Jay T; Middelboe, Mathias; Suttle, Curtis A; Stock, Charles; Wilson, William H; Wommack, K Eric; Wilhelm, Steven W; Weitz, Joshua S
2017-11-01
The original publication of this Article included analysis of virus and microbial cell abundances and virus-to-microbial cell ratios. Data in the Article came from 25 studies intended to be exclusively from marine sites. However, 3 of the studies included in the original unified dataset were erroneously classified as marine sites during compilation. The records with mis-recorded longitude and latitude values were, in fact, taken from inland, freshwater sources. The three inland, freshwater datasets are ELA, TROUT and SWAT. The data from these three studies represent 163 of the 5,671 records in the original publication. In the updated version of the Article, all analyses have been recalculated using the same statistical analysis pipeline released via GitHub as part of the original publication. Removal of the three studies reduces the unified dataset to 5,508 records. Analyses involving all grouped datasets have been updated with changes noted in each figure. All key results remain qualitatively unchanged. All data and scripts used in this correction have been made available as a new, updated GitHub release to reflect the updated dataset and figures.
Comparison of two surface temperature measurement using thermocouples and infrared camera
NASA Astrophysics Data System (ADS)
Michalski, Dariusz; Strąk, Kinga; Piasecka, Magdalena
This paper compares two methods applied to measure surface temperatures on an experimental setup designed to analyse flow boiling heat transfer. The temperature measurements were performed in two parallel rectangular minichannels, both 1.7 mm deep, 16 mm wide and 180 mm long. The heating element for the fluid flowing in each minichannel was a thin foil made of Haynes-230. The two measurement methods employed to determine the surface temperature of the foil were: the contact method, which involved mounting thermocouples at several points in one minichannel, and the contactless method used to study the other minichannel, where the results were provided by an infrared camera. Calculations were necessary to compare the temperature results. Two sets of measurement data obtained for different values of the heat flux were analysed using basic statistical methods, the method error and the method accuracy. The experimental error and the method accuracy were taken into account. The comparative analysis showed that, although the values and distributions of the surface temperatures obtained with the two methods were similar, both methods had certain limitations.
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
[Clinical research=design*measurements*statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question. Formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have a knowledge of study design, measurements and statistical analyses: The first is taught by epidemiology, the second by psychometrics and the third by biostatistics.
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas
2016-11-01
In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as a data source for health services research. Within these data, hospitals can be distinguished via their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. Aiming to investigate the direction and extent of possible bias in hospital-level analyses, this study examines the continuity of the hospital identifier within a cross-sectional and longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013, the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volume. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., in 2013, 1,550 IK vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted over the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data, separating hospitals via the hospital identifier might lead to underestimating the number of hospitals and, consequently, to overestimating the caseload per hospital. Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.
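The cross-sectional and longitudinal counting of hospital identifiers can be sketched with pandas as below; the records are toy values, since the DRG statistics themselves are only accessible through the research data centres.

```python
import pandas as pd

# Toy discharge records: one row per case with discharge year and hospital identifier (IK).
# Synthetic values for illustration only.
cases = pd.DataFrame({
    "year": [2005, 2005, 2005, 2006, 2006, 2006, 2007, 2007],
    "ik":   ["A",  "A",  "B",  "A",  "B",  "C",  "A",  "C"],
})

# Cross-sectional view: annual number of distinct hospital identifiers.
per_year = cases.groupby("year")["ik"].nunique()
print(per_year)

# Longitudinal view: which identifiers from the first year persist in later years?
cohort = set(cases.loc[cases["year"] == 2005, "ik"])
for year, grp in cases.groupby("year"):
    present = cohort & set(grp["ik"])
    print(year, "persisting IK from 2005 cohort:", sorted(present))
```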
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical method. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcome. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping-using the entire three-component GRF waveform; and (ii) traditional approach-using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. Statistically significant differences were observed for the first GRF peak and similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths and thus that one may use the traditional approach to choose arbitrary metrics and make arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
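The contrast between peak-based and whole-waveform testing can be illustrated with plain pointwise t-tests on simulated GRF curves, as below; note that Statistical Parametric Mapping proper controls the family-wise error via random field theory, which this simplified sketch does not attempt.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated vertical GRF waveforms (101 nodes over % stance) for two groups;
# women given slightly higher early-stance loading. Not the study's data.
t = np.linspace(0, 100, 101)
template = 0.9 * np.exp(-((t - 25) / 12) ** 2) + 0.95 * np.exp(-((t - 75) / 12) ** 2)
men = template + rng.normal(0, 0.05, (20, 101))
women = template + 0.06 * np.exp(-((t - 20) / 10) ** 2) + rng.normal(0, 0.05, (20, 101))

# Discrete-parameter approach: compare only the first and second vertical peaks.
p1 = stats.ttest_ind(women[:, 25], men[:, 25]).pvalue
p2 = stats.ttest_ind(women[:, 75], men[:, 75]).pvalue
print(f"first-peak P = {p1:.3f}, second-peak P = {p2:.3f}")

# Waveform approach: a t statistic at every node of the normalised stance phase.
t_curve = stats.ttest_ind(women, men, axis=0).statistic
print("max |t| over the waveform:", round(np.abs(t_curve).max(), 2),
      "at", int(t[np.abs(t_curve).argmax()]), "% stance")
```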
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Empirical Nature and Statistical Treatment of Missing Data
ERIC Educational Resources Information Center
Tannenbaum, Christyn E.
2009-01-01
Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
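For reference, the one-parameter (Rasch) model underlying the item response theory analyses mentioned above has the standard form below, with θ the student ability and b the item difficulty.

```latex
% Rasch (one-parameter logistic) model: probability that student i answers item j correctly.
P(X_{ij}=1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}
```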
An analysis of empirical estimates of sexual aggression victimization and perpetration.
Spitzberg, B H
1999-01-01
Estimates of prevalence for several categories of sexual coercion, including rape and attempted rape, were statistically aggregated across 120 studies, involving over 100,000 subjects. According to the data, almost 13% of women and over 3% of men have been raped, and almost 5% of men claim to have perpetrated rape. In contrast, about 25% of women and men claim to have been sexually coerced and to have perpetrated sexual coercion. In general, the mediating variables examined--population type, decade, date of publication, and type of operationalization--were not consistently related to rates of victimization or perpetration. Nevertheless, the extensive variation among study estimates strongly suggests the possibility of systematic sources of variation that have yet to be identified. Further analyses are called for to disentangle such sources.
In-Situ Raman Spectroscopy of Single Microparticle Li-Intercalation Electrodes
NASA Technical Reports Server (NTRS)
Dokko, Kaoru; Shi, Qing-Fang; Stefan, Ionel C.; Scherson, Daniel A.
2003-01-01
Modifications in the vibrational properties of a single microparticle of LiMn2O4 induced by extraction and subsequent injection of Li(+) into the lattice have been monitored in situ via simultaneous acquisition of Raman scattering spectra and cyclic voltammetry data in 1 M LiClO4 solutions in ethylene carbonate (EC):diethyl carbonate (DEC) mixtures (1:1 by volume). Statistical analyses of the spectra in the range 15 < SOD < 45%, where SOD represents the state of discharge (in percent) of the nominally fully charged material, i.e. λ-MnO2, were found to be consistent with the coexistence of two distinct phases of lithiated metal oxide, in agreement with information derived from in situ X-ray diffraction (XRD) measurements involving more conventional battery-type electrodes.
Seeking a Balance between the Statistical and Scientific Elements in Psychometrics
ERIC Educational Resources Information Center
Wilson, Mark
2013-01-01
In this paper, I will review some aspects of psychometric projects that I have been involved in, emphasizing the nature of the work of the psychometricians involved, especially the balance between the statistical and scientific elements of that work. The intent is to seek to understand where psychometrics, as a discipline, has been and where it…
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed as summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and practical utility has not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that, genotype imputation boasts a 3- to 5-fold lower root-mean-square error, and better distinguishes true associations from null ones: We observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
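The core of summary statistics imputation described above can be written in a few lines: the z-score of an untyped variant is the conditional expectation of a multivariate normal given the z-scores of nearby typed variants and the LD (correlation) matrix of a reference panel. The sketch below is a simplified textbook version under that assumption; the published method additionally regularises the LD matrix and corrects for variable per-SNV sample size, and all numbers in the toy example are hypothetical.

import numpy as np

def impute_z(z_typed, ld_typed, ld_cross, ridge=0.1):
    """z_typed:  (m,) z-scores at typed SNVs
       ld_typed: (m, m) LD correlations among typed SNVs
       ld_cross: (k, m) LD correlations between untyped and typed SNVs
       returns imputed z-scores (k,) and an r^2-like imputation quality per SNV"""
    m = len(z_typed)
    ld_reg = ld_typed + ridge * np.eye(m)            # ridge-regularised LD matrix
    weights = ld_cross @ np.linalg.inv(ld_reg)       # C_uo C_oo^{-1}
    z_imputed = weights @ z_typed
    r2 = np.einsum("ij,ij->i", weights, ld_cross)    # explained variance per untyped SNV
    return z_imputed, r2

# Toy example: three typed SNVs, one untyped SNV (hypothetical correlations).
z = np.array([2.5, 1.8, 0.4])
C_oo = np.array([[1.0, 0.6, 0.2], [0.6, 1.0, 0.3], [0.2, 0.3, 1.0]])
C_uo = np.array([[0.8, 0.5, 0.1]])
print(impute_z(z, C_oo, C_uo))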
[A study of behavior patterns between smokers and nonsmokers].
Kim, H S
1990-04-01
Clinical and epidemiologic studies of coronary heart disease (CHD) over the last three decades have repeatedly found associations between the prevalence of CHD, behavioral attributes, and cigarette smoking. The main purpose of this study was to help reduce a major risk factor for coronary heart disease through smoking cessation and control of behavior pattern. The subjects were 120 smokers and 90 nonsmokers, all married men older than 30 years working in offices, who were surveyed by questionnaire from September 26 through October 6, 1989. The instrument was a self-administered measurement tool of 59 items developed through modifications of the Jenkins Activity Survey (JAS). The data were analysed with the SAS (Statistical Analysis System) program on a personal computer, using frequencies, chi-square tests, t-tests, ANOVA and Pearson correlation coefficients. Fifteen items with factor loadings above 0.3 were retained from the factor analysis. In the first factor analysis, 19 factors were extracted, accounting for 86% of the total variance. When the number of factors was limited to three in order to match the Jenkins classification, three factors were derived: Job Involvement, Speed & Impatience and Hard-Driving, comprising 21, 21 and 9 items, respectively. The results were as follows: 1. The smoker and non-smoker groups showed statistically significant differences in Job Involvement (t = 5.7147, p < 0.0001), Speed & Impatience (t = 4.6756, p < 0.0001), Hard-Driving (t = 8.0822, p < 0.0001) and total type A behavior pattern (t = 8.1224, p < 0.0001). 2. Type A behavior pattern scores did not differ significantly by the number of cigarettes smoked daily. 3. Type A behavior pattern scores did not differ significantly by duration of smoking. It was concluded that type A behavior pattern differed significantly between smokers and non-smokers, whereas the number of cigarettes smoked daily and the duration of smoking showed no significant differences. Nursing interventions addressing type A behavior pattern are therefore needed to increase the educational effect of smoking-cessation programmes.
Objects and categories: feature statistics and object processing in the ventral stream.
Tyler, Lorraine K; Chiu, Shannon; Zhuang, Jie; Randall, Billi; Devereux, Barry J; Wright, Paul; Clarke, Alex; Taylor, Kirsten I
2013-10-01
Recognizing an object involves more than just visual analyses; its meaning must also be decoded. Extensive research has shown that processing the visual properties of objects relies on a hierarchically organized stream in ventral occipitotemporal cortex, with increasingly more complex visual features being coded from posterior to anterior sites culminating in the perirhinal cortex (PRC) in the anteromedial temporal lobe (aMTL). The neurobiological principles of the conceptual analysis of objects remain more controversial. Much research has focused on two neural regions-the fusiform gyrus and aMTL, both of which show semantic category differences, but of different types. fMRI studies show category differentiation in the fusiform gyrus, based on clusters of semantically similar objects, whereas category-specific deficits, specifically for living things, are associated with damage to the aMTL. These category-specific deficits for living things have been attributed to problems in differentiating between highly similar objects, a process that involves the PRC. To determine whether the PRC and the fusiform gyri contribute to different aspects of an object's meaning, with differentiation between confusable objects in the PRC and categorization based on object similarity in the fusiform, we carried out an fMRI study of object processing based on a feature-based model that characterizes the degree of semantic similarity and difference between objects and object categories. Participants saw 388 objects for which feature statistic information was available and named the objects at the basic level while undergoing fMRI scanning. After controlling for the effects of visual information, we found that feature statistics that capture similarity between objects formed category clusters in fusiform gyri, such that objects with many shared features (typical of living things) were associated with activity in the lateral fusiform gyri whereas objects with fewer shared features (typical of nonliving things) were associated with activity in the medial fusiform gyri. Significantly, a feature statistic reflecting differentiation between highly similar objects, enabling object-specific representations, was associated with bilateral PRC activity. These results confirm that the statistical characteristics of conceptual object features are coded in the ventral stream, supporting a conceptual feature-based hierarchy, and integrating disparate findings of category responses in fusiform gyri and category deficits in aMTL into a unifying neurocognitive framework.
Landstad, Bodil J; Gelin, Gunnar; Malmquist, Claes; Vinberg, Stig
2002-09-15
The study had two primary aims. The first aim was to combine a human resources costing and accounting approach (HRCA) with a quantitative statistical approach in order to get an integrated model. The second aim was to apply this integrated model in a quasi-experimental study in order to investigate whether preventive intervention affected sickness absence costs at the company level. The intervention studied contained occupational organizational measures, competence development, physical and psychosocial working environmental measures and individual and rehabilitation measures on both an individual and a group basis. The study is a quasi-experimental design with a non-randomized control group. Both groups involved cleaning jobs at predominantly female workplaces. The study plan involved carrying out before and after studies on both groups. The study included only those who were at the same workplace during the whole of the study period. In the HRCA model used here, the cost of sickness absence is the net difference between the costs, in the form of the value of the loss of production and the administrative cost, and the benefits in the form of lower labour costs. According to the HRCA model, the intervention used counteracted a rise in sickness absence costs at the company level, giving an average net effect of 266.5 Euros per person (full-time working) during an 8-month period. Using an analogue statistical analysis on the whole of the material, the contribution of the intervention counteracted a rise in sickness absence costs at the company level giving an average net effect of 283.2 Euros. Using a statistical method it was possible to study the regression coefficients in sub-groups and calculate the p-values for these coefficients; in the younger group the intervention gave a calculated net contribution of 605.6 Euros with a p-value of 0.073, while the intervention net contribution in the older group had a very high p-value. Using the statistical model it was also possible to study contributions of other variables and interactions. This study established that the HRCA model and the integrated model produced approximately the same monetary outcomes. The integrated model, however, allowed a deeper understanding of the various possible relationships and quantified the results with confidence intervals.
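The HRCA definition quoted above (the cost of sickness absence as the value of lost production plus administrative costs minus the benefit of lower labour costs) amounts to simple arithmetic. The Python sketch below only illustrates that bookkeeping with hypothetical figures; it is not the study's model.

def sickness_absence_cost(lost_production, admin_cost, labour_cost_saving):
    # HRCA net cost: production loss + administration - saved labour costs
    return lost_production + admin_cost - labour_cost_saving

# Hypothetical per-person figures (Euros) before and after an intervention.
before = sickness_absence_cost(lost_production=1200.0, admin_cost=150.0,
                               labour_cost_saving=400.0)
after = sickness_absence_cost(lost_production=900.0, admin_cost=140.0,
                              labour_cost_saving=340.0)
print(f"net effect of the intervention per person: {before - after:.1f} Euros")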
Formalizing the definition of meta-analysis in Molecular Ecology.
ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E
2015-08-01
Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
Exploring partners' perspectives on participation in heart failure home care: a mixed-method design.
Näsström, Lena; Luttik, Marie Louise; Idvall, Ewa; Strömberg, Anna
2017-05-01
To describe the partners' perspectives on participation in the care for patients with heart failure receiving home care. Partners are often involved in care of patients with heart failure and have an important role in improving patients' well-being and self-care. Partners have described both negative and positive experiences of involvement, but knowledge of how partners of patients with heart failure view participation in care when the patient receives home care is lacking. A convergent parallel mixed-method design was used, including data from interviews and questionnaires. A purposeful sample of 15 partners was used. Data collection lasted between February 2010 - December 2011. Interviews were analysed with content analysis and data from questionnaires (participation, caregiving, health-related quality of life, depressive symptoms) were analysed statistically. Finally, results were merged, interpreted and labelled as comparable and convergent or as being inconsistent. Partners were satisfied with most aspects of participation, information and contact. Qualitative findings revealed four different aspects of participation: adapting to the caring needs and illness trajectory, coping with caregiving demands, interacting with healthcare providers and need for knowledge to comprehend the health situation. Results showed confirmatory results that were convergent and expanded knowledge that gave a broader understanding of partner participation in this context. The results revealed different levels of partner participation. Heart failure home care included good opportunities for both participation and contact during home visits, necessary to meet partners' ongoing need for information to comprehend the situation. © 2016 John Wiley & Sons Ltd.
Effects of the communities that care prevention system on youth reports of protective factors.
Kim, B K Elizabeth; Gloppen, Kari M; Rhew, Isaac C; Oesterle, Sabrina; Hawkins, J David
2015-07-01
Many interventions seeking to reduce problem behaviors and promote healthy youth development target both risk and protective factors, yet few studies have examined the effect of preventive interventions on overall levels of protection community wide. In a community-randomized controlled trial, this study tested the effect of Communities That Care (CTC) on protective factors in 24 communities across seven states. Data on protective factors were collected from a panel of 4407 youths in CTC and control communities followed from grade 5 through grade 8. Hierarchical linear modeling compared mean levels of 15 protective factors derived from the social development model in CTC and control communities in grade 8, adjusted for individual and community characteristics and baseline levels of protective factors in grade 5. Global test statistics were calculated to examine effects on protection overall and by domain. Analyses across all protective factors found significantly higher levels of overall protection in CTC compared to control communities. Analyses by domain found significantly higher levels of protection in CTC than control communities in the community, school, and peer/individual domains, but not in the family domain. Significantly higher levels of opportunities for prosocial involvement in the community, recognition for prosocial involvement in school, interaction with prosocial peers, and social skills among CTC compared to control youth contributed to the overall and domain-specific results. This is consistent with CTC's theory of change, which posits that strengthening protective factors is a mechanism through which CTC prevents behavior problems.
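A hierarchical (mixed) linear model of the kind used above, an individual-level outcome regressed on intervention condition and baseline with a random intercept for community, can be sketched with statsmodels. The data frame, column names and effect sizes below are simulated and hypothetical; the study's actual covariate set and software are not reproduced here.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate 24 communities with 50 youths each (hypothetical numbers).
rng = np.random.default_rng(1)
n_comm, n_per = 24, 50
df = pd.DataFrame({
    "community": np.repeat(np.arange(n_comm), n_per),
    "ctc": np.repeat(rng.integers(0, 2, n_comm), n_per),   # 1 = CTC community
    "baseline": rng.normal(0.0, 1.0, n_comm * n_per),      # grade-5 protection score
})
community_effect = rng.normal(0.0, 0.3, n_comm)[df["community"]]
df["protection_g8"] = (0.2 * df["ctc"] + 0.5 * df["baseline"]
                       + community_effect + rng.normal(0.0, 1.0, len(df)))

# Random-intercept model: grade-8 protection ~ condition + baseline, grouped by community.
model = smf.mixedlm("protection_g8 ~ ctc + baseline", df, groups=df["community"])
print(model.fit().summary())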
Cornwall, Amanda; Moore, Sally; Plant, Hilary
2008-07-01
This paper reports on a study exploring the usefulness of e-mail as a means of communication between nurse specialists and patients with lung cancer and their families. The study involved two lung cancer nurse specialists and 16 patients and family members who used e-mail with them during the 6-month study period. Data were collected from three sources: (1) e-mail contact between the nurse specialists and patients/family members, (2) patient/family member questionnaire and (3) a focus group/reflective session with the nurse specialists. Quantitative data collected from the e-mails and the questionnaires were analysed descriptively and are presented as summary statistics. Text data from the questionnaires and e-mails were analysed using content analysis. Findings suggest that e-mail can be an effective and convenient means of communication between nurse specialists, and patients and family members. Patients and family members reported high levels of satisfaction with this method of communication. It was found to be quick and easy, and patients and family members were satisfied with both the response and the speed of response from the nurse specialists. Nurse specialists were also positive about e-mail use and found that the benefits of using e-mail with patients/family members outweighed any disadvantages. Further investigation is recommended involving other health care professionals and different patient groups to ensure the safe and appropriate use of e-mail within health care.
Digitalizing historical high resolution water level data: Challenges and opportunities
NASA Astrophysics Data System (ADS)
Holinde, Lars; Hein, Hartmut; Barjenbruch, Ulrich
2017-04-01
Historical tide-gauge data offer the opportunity to determine variations in key characteristics of water levels and to analyse past extreme events (storm surges). This information is important for calculating future trends and scenarios. However, there are challenges involved because of the extensive effort needed to digitalize gauge sheets and to quality-control the resulting historical data. Two main sources of inaccuracy in historical time series can be identified. The first comprises challenges arising from the digitalization of the historical data, e.g. the general quality of the sheets, multiple crossing lines of the observed water levels, and additional comments on the sheets describing problems or additional information during the measurements. The second comprises problems during the measurements themselves, including incorrect positioning of the sheets, trouble with the tide gauge, and maintenance. Errors resulting from these problems include, for example, flat lines, discontinuities and outliers. In particular, outliers have to be characterized carefully to distinguish genuine errors from real extreme events. The quality-control process involves methods from statistics, machine learning and neural networks. These are described and applied to three time series from tide-gauge stations at the coast of Lower Saxony, Germany. Difficulties and outcomes of the quality-control process are presented and explained. Furthermore, we present a first glance at analyses of these time series.
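One simple statistical quality-control check of the kind mentioned above flags samples that deviate from a rolling median by more than a multiple of the local median absolute deviation (MAD). The sketch below uses a synthetic hourly series and hypothetical thresholds; flagged points would still need manual review to separate digitalization errors from genuine storm-surge peaks.

import numpy as np
import pandas as pd

def flag_outliers(levels: pd.Series, window: int = 49, n_mad: float = 6.0):
    # Deviation from a centred rolling median, scaled by the local MAD.
    med = levels.rolling(window, center=True, min_periods=1).median()
    mad = (levels - med).abs().rolling(window, center=True, min_periods=1).median()
    score = (levels - med).abs() / (1.4826 * mad + 1e-9)
    return score > n_mad

# Synthetic hourly water levels (cm) with one simulated digitalization error.
t = pd.date_range("1936-01-01", periods=500, freq="h")
levels = pd.Series(500.0 + 150.0 * np.sin(np.linspace(0.0, 60.0, 500)), index=t)
levels.iloc[250] += 2000.0
print(levels[flag_outliers(levels)])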
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
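For the bivariate correlation case mentioned above, an approximate power calculation can be done by hand with the Fisher z transformation, which is roughly what dedicated software reports for a two-sided test of H0: rho = 0. The sketch below is an approximation of that calculation, not G*Power's exact routine.

import numpy as np
from scipy.stats import norm

def correlation_power(rho, n, alpha=0.05):
    # Fisher z approximation: arctanh(r) is approximately normal with SD 1/sqrt(n - 3).
    z_effect = np.arctanh(rho) * np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(z_effect - z_crit) + norm.cdf(-z_effect - z_crit)

print(round(correlation_power(rho=0.3, n=84), 3))   # roughly 0.80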
Statistics without Tears: Complex Statistics with Simple Arithmetic
ERIC Educational Resources Information Center
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
Professional experience and traffic accidents/near-miss accidents among truck drivers.
Girotto, Edmarlon; Andrade, Selma Maffei de; González, Alberto Durán; Mesas, Arthur Eumann
2016-10-01
To investigate the relationship between the time working as a truck driver and the report of involvement in traffic accidents or near-miss accidents. A cross-sectional study was performed with truck drivers transporting products from the Brazilian grain harvest to the Port of Paranaguá, Paraná, Brazil. The drivers were interviewed regarding sociodemographic characteristics, working conditions, behavior in traffic and involvement in accidents or near-miss accidents in the previous 12 months. Subsequently, the participants answered a self-applied questionnaire on substance use. The time of professional experience as drivers was categorized in tertiles. Statistical analyses were performed through the construction of models adjusted by multinomial regression to assess the relationship between the length of experience as a truck driver and the involvement in accidents or near-miss accidents. This study included 665 male drivers with an average age of 42.2 (±11.1) years. Among them, 7.2% and 41.7% of the drivers reported involvement in accidents and near-miss accidents, respectively. In fully adjusted analysis, the 3rd tertile of professional experience (>22years) was shown to be inversely associated with involvement in accidents (odds ratio [OR] 0.29; 95% confidence interval [CI] 0.16-0.52) and near-miss accidents (OR 0.17; 95% CI 0.05-0.53). The 2nd tertile of professional experience (11-22 years) was inversely associated with involvement in accidents (OR 0.63; 95% CI 0.40-0.98). An evident relationship was observed between longer professional experience and a reduction in reporting involvement in accidents and near-miss accidents, regardless of age, substance use, working conditions and behavior in traffic. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mortality of Youth Offenders Along a Continuum of Justice System Involvement.
Aalsma, Matthew C; Lau, Katherine S L; Perkins, Anthony J; Schwartz, Katherine; Tu, Wanzhu; Wiehe, Sarah E; Monahan, Patrick; Rosenman, Marc B
2016-03-01
Black male youth are at high risk of homicide and criminal justice involvement. This study aimed to determine how early mortality among youth offenders varies based on race; gender; and the continuum of justice system involvement: arrest, detention, incarceration, and transfer to adult courts. Criminal and death records of 49,479 youth offenders (ages 10-18 years at first arrest) in Marion County, Indiana, from January 1, 1999, to December 31, 2011, were examined. Statistical analyses were completed in November 2014. From 1999 to 2011 (aggregate exposure, 386,709 person-years), 518 youth offender deaths occurred. The most common cause of death was homicide (48.2%). The mortality rate of youth offenders was nearly 1.5 times greater than that among community youth (standardized mortality ratio, 1.48). The youth offender mortality rate varied depending on the severity of justice system involvement. Arrested youth had the lowest rate of mortality (90/100,000), followed by detained youth (165/100,000); incarcerated youth (216/100,000); and youth transferred to adult court (313/100,000). A proportional hazards model demonstrated that older age, male gender, and more severe justice system involvement 5 years post-arrest predicted shorter time to mortality. Youth offenders face greater risk for early death than community youth. Among these, black male youth face higher risk of early mortality than their white male counterparts. However, regardless of race/ethnicity, mortality rates for youth offenders increase as youth involvement in the justice system becomes more protracted and severe. Thus, justice system involvement is a significant factor to target for intervention. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
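The standardized mortality ratio reported above is the ratio of observed deaths in the cohort to the deaths expected if reference (community) mortality rates applied to the cohort's person-years. The sketch below only illustrates that calculation; all counts, strata and rates are hypothetical and are not the study's data.

def smr(observed_deaths, person_years_by_stratum, reference_rates_by_stratum):
    # Expected deaths: sum over strata of person-years times the reference rate.
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, reference_rates_by_stratum))
    return observed_deaths / expected

person_years = [50_000, 40_000, 30_000]        # e.g. by age band (hypothetical)
reference_rates = [50e-5, 90e-5, 150e-5]       # reference deaths per person-year
print(round(smr(120, person_years, reference_rates), 2))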
Dwivedi, Jaya; Namdev, Kuldeep K; Chilkoti, Deepak C; Verma, Surajpal; Sharma, Swapnil
2018-06-06
Therapeutic drug monitoring (TDM) of anti-epileptic drugs provides a valid clinical tool for optimization of overall therapy. However, TDM is challenging because of the high storage/shipment costs of biological samples (plasma/blood) and the limited availability of laboratories providing TDM services. Sampling in the form of dry plasma spots (DPS) or dry blood spots (DBS) is a suitable alternative to overcome these issues. An improved, simple, rapid, and stability-indicating method for quantification of pregabalin in human plasma and DPS has been developed and validated. Analyses were performed on a liquid chromatography tandem mass spectrometer under the positive ionization mode of an electrospray interface. Pregabalin-d4 was used as internal standard, and the chromatographic separations were performed on a Poroshell 120 EC-C18 column using an isocratic mobile phase at a flow rate of 1 mL/min. Stability of pregabalin in DPS was evaluated under simulated real-time conditions. Extraction procedures from plasma and DPS samples were compared using statistical tests. The method was validated according to the FDA method validation guideline. The method was linear over the concentration ranges of 20-16000 ng/mL and 100-10000 ng/mL in plasma and DPS, respectively. DPS samples were found to be stable for only one week upon storage at room temperature and for at least four weeks at freezing temperature (-20 ± 5 °C). The method was applied for quantification of pregabalin in over 600 samples of a clinical study. Statistical analyses revealed that the two extraction procedures for plasma and DPS samples showed a statistically insignificant difference and can be used interchangeably without any bias. The proposed method involves simple and rapid sample-processing steps that do not require a pre- or post-column derivatization procedure. The method is suitable for routine pharmacokinetic analysis and therapeutic monitoring of pregabalin.
Mocellin, Simone; Pasquali, Sandro; Rossi, Carlo R; Nitti, Donato
2010-04-07
Based on previous meta-analyses of randomized controlled trials (RCTs), the use of interferon alpha (IFN-alpha) in the adjuvant setting improves disease-free survival (DFS) in patients with high-risk cutaneous melanoma. However, RCTs have yielded conflicting data on the effect of IFN-alpha on overall survival (OS). We conducted a systematic review and meta-analysis to examine the effect of IFN-alpha on DFS and OS in patients with high-risk cutaneous melanoma. The systematic review was performed by searching MEDLINE, EMBASE, Cancerlit, Cochrane, ISI Web of Science, and ASCO databases. The meta-analysis was performed using time-to-event data from which hazard ratios (HRs) and 95% confidence intervals (CIs) of DFS and OS were estimated. Subgroup and meta-regression analyses to investigate the effect of dose and treatment duration were also performed. Statistical tests were two-sided. The meta-analysis included 14 RCTs, published between 1990 and 2008, and involved 8122 patients, of which 4362 patients were allocated to the IFN-alpha arm. IFN-alpha alone was compared with observation in 12 of the 14 trials, and 17 comparisons (IFN-alpha vs comparator) were generated in total. IFN-alpha treatment was associated with a statistically significant improvement in DFS in 10 of the 17 comparisons (HR for disease recurrence = 0.82, 95% CI = 0.77 to 0.87; P < .001) and improved OS in four of the 14 comparisons (HR for death = 0.89, 95% CI = 0.83 to 0.96; P = .002). No between-study heterogeneity in either DFS or OS was observed. No optimal IFN-alpha dose and/or treatment duration or a subset of patients more responsive to adjuvant therapy was identified using subgroup analysis and meta-regression. In patients with high-risk cutaneous melanoma, IFN-alpha adjuvant treatment showed statistically significant improvement in both DFS and OS.
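The time-to-event meta-analysis above rests on inverse-variance pooling of log hazard ratios, with each trial's variance recovered from its reported 95% CI. The fixed-effect sketch below shows only that arithmetic with hypothetical hazard ratios; the published analysis also examined heterogeneity and ran subgroup and meta-regression models.

import numpy as np

def pool_hazard_ratios(hrs, ci_lows, ci_highs):
    log_hr = np.log(hrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)  # SE from the 95% CI
    w = 1.0 / se**2                                         # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Hypothetical hazard ratios and confidence limits from three trials.
print(pool_hazard_ratios(np.array([0.78, 0.90, 0.85]),
                         np.array([0.65, 0.74, 0.70]),
                         np.array([0.94, 1.10, 1.03])))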
NASA Astrophysics Data System (ADS)
Queirós, S. M. D.; Tsallis, C.
2005-11-01
The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by a time-dependent and correlated variance or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is recovered. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n = 1 recovers the Gaussian distribution. Matching low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 − Σ_i p_i^q)/(q − 1), the basis of nonextensive statistical mechanics, we obtain a single analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indices q_op, q and q_n of the problem, independent of the value of (b, c).
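A minimal simulation of a GARCH(1,1) process in the notation used above, sigma_t^2 = a + b * x_{t-1}^2 + c * sigma_{t-1}^2 with c = 0 recovering ARCH(1), is sketched below. Plain Gaussian noise is used for simplicity; the paper's generalised q-Gaussian noise would replace the standard normal draw, and the parameter values are hypothetical.

import numpy as np

def simulate_garch11(n, a=0.05, b=0.10, c=0.85, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    sigma2 = np.full(n, a / (1.0 - b - c))        # start at the stationary variance
    for t in range(1, n):
        sigma2[t] = a + b * x[t - 1] ** 2 + c * sigma2[t - 1]
        x[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return x, sigma2

returns, vol2 = simulate_garch11(10_000)
# Volatility clustering yields fat tails (positive excess kurtosis) even with Gaussian noise.
excess_kurtosis = np.mean(returns**4) / np.mean(returns**2) ** 2 - 3.0
print("sample excess kurtosis:", round(float(excess_kurtosis), 2))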
Mathematics authentic assessment on statistics learning: the case for student mini projects
NASA Astrophysics Data System (ADS)
Fauziah, D.; Mardiyana; Saputro, D. R. S.
2018-03-01
Mathematics authentic assessment is a form of meaningful measurement of student learning outcomes in the spheres of attitude, skill and knowledge in mathematics. Attitude, skill and knowledge are constructed through the fulfilment of tasks that involve the active and creative role of the students. One type of authentic assessment is the student mini project, which runs from planning and data collection through organizing, processing, analysing and presenting the data. The purpose of this research is to study the process of using authentic assessment in statistics learning as conducted by teachers, and to discuss specifically the use of mini projects for improving students' learning in schools in Surakarta. This research is an action research in which the data were collected through the assessment rubrics of the student mini projects. The data analysis shows that the average rubric score for the student mini projects is 82, with 96% classical completeness. This study shows that the application of authentic assessment can improve students' mathematics learning outcomes. The findings showed that teachers and students participate actively during the teaching and learning process, both inside and outside the school. Student mini projects also provide opportunities to interact with other people in a real context while collecting information and giving presentations to the community. Additionally, students are able to engage more deeply in the process of statistics learning using authentic assessment.
Magill, Molly
2012-01-01
Summary: Evidence-based practice involves the consistent and critical consumption of the social work research literature. As methodologies advance, primers to guide such efforts are often needed. In the present work, common statistical methods for testing moderation and mediation are identified and summarized, and corresponding examples, drawn from the substance abuse, domestic violence, and mental health literature, are provided. Findings: While methodologically complex, analyses of these third-variable effects can provide an optimal fit for the complexity involved in the provision of evidence-based social work services. While a moderator may identify the trait or state requirement for a causal relationship to occur, a mediator is concerned with the transmission of that relationship. In social work practice, these are questions of “under what conditions and for whom?” and of the “how?” of behavior change. Implications: Implications include a need for greater attention to these methods among practitioners and evaluation researchers. With knowledge gained through the present review, social workers can benefit from a more ecologically valid evidence base for practice. PMID:22833701
Effectiveness of Collision-Involved Motorcycle Helmets in Thailand
Wobrock, Jesse; Smith, Terry; Kasantikul, Vira; Whiting, William
2003-01-01
The purpose of this study was to analyze variables present in selected motorcycle crashes involving helmeted riders to find the best injury predictors. The helmets used in this study were collected from motorcycle crashes in Thailand. Pertinent data were collected, a conventional helmet impact drop test apparatus was used to quantify the head impact forces, and stepwise multiple regression analyses were performed. The results indicate that the geometry of the object impacting the head and GSI were the best predictors for MAIS (R2=.875) while geometry of the object, liner thickness and impact energy were the best predictors for ISS (R2=.911). Analysis of motor vehicle crashes in the United States in the year 2001 reveals that motorcyclist fatalities increased 7.2%, from 2,862 fatalities in 2000 to 3,067 in 2001 [NHTSA 2002]. In 2001, 59,000 motorcyclists were injured, which represents an increase of 2.0% from 2000. These statistics are indicative of the risk that motorcycle riders face in the traffic environment and warrant the need for further research focusing on injury potential in motorcycle crashes. PMID:12941212
A SYSTEMATIC REVIEW AND META-ANALYSIS OF DROPOUT RATES IN YOUTH SOCCER.
Møllerløkken, Nina Elise; Lorås, Håvard; Pedersen, Arve Vorland
2015-12-01
Despite the many benefits of involvement in youth sports, participation in them declines throughout childhood and adolescence. The present study performed a systematic review and meta-analysis of 12 studies reporting dropout rates in youth soccer, involving a total of 724,036 youths ages 10-18 years from five countries. The mixed effects meta-regression analyses took into account age and sex as statistical moderators of dropout rate. Potential articles were identified through computerized searches of the databases PubMed, MedLine, Embase, and SportDiscus up until August 2014, without any further time limit. Based on results reported in the 10 included articles, the annual weighted mean dropout rate is 23.9% across the included cohorts. Meta-regression indicated that annual dropout rates are stable from the ages of 10-19 years, with higher rates for girls (26.8%) compared to boys (21.4%). The present study suggests that youth soccer players are prone to dropout rates in which close to one-fourth of players leave the sport annually, which appears to be a consistent finding across ages 10-18 years.
Different effection of p.1125Val>Ala and rs11954856 in APC on Wnt signaling pathway.
Li, Fei-Feng; Zhao, Zhi-Xun; Yan, Peng; Wang, Song; Liu, Zheng; Zhang, Qiong; Zhang, Xiao-Ning; Sun, Chang-Hao; Wang, Xi-Shan; Wang, Gui-Yu; Liu, Shu-Lin
2017-09-19
Colorectal cancer (CRC) is among the most common and fatal forms of solid tumors worldwide, and more than two thirds of CRC and adenoma patients have APC gene mutations. APC is a key regulator in the Wnt/β-catenin signaling pathway, but its roles in CRC remain to be elucidated. In this study, we compared APC genes between CRC patients and controls to determine possible associations of nucleotide changes in the APC gene with the pathways involved in CRC pathogenesis. All participants received physical and enteroscopic examinations. The APC gene was sequenced for 300 Chinese Han CRC patients and 411 normal controls, and the expression levels of genes in the signaling pathway were analyzed using Western blotting. Statistical analyses were conducted using SPSS (version 19.0) software. We found that rs11954856 in the APC gene was associated with colorectal cancer and could increase the expression levels of the APC, β-catenin, TCF7L1, TCF7L2 and LEF1 genes in the pathway in the CRC patients, demonstrating the involvement of APC in the pathological processes leading to CRC.
Conceptual Models of Depression in Primary Care Patients: A Comparative Study
Karasz, Alison; Garcia, Nerina; Ferri, Lucia
2009-01-01
Conventional psychiatric treatment models are based on a biopsychiatric model of depression. A plausible explanation for low rates of depression treatment utilization among ethnic minorities and the poor is that members of these communities do not share the cultural assumptions underlying the biopsychiatric model. The study examined conceptual models of depression among depressed patients from various ethnic groups, focusing on the degree to which patients’ conceptual models ‘matched’ a biopsychiatric model of depression. The sample included 74 primary care patients from three ethnic groups screening positive for depression. We administered qualitative interviews assessing patients’ conceptual representations of depression. The analysis proceeded in two phases. The first phase involved a strategy called ‘quantitizing’ the qualitative data. A rating scheme was developed and applied to the data by a rater blind to study hypotheses. The data was subjected to statistical analyses. The second phase of the analysis involved the analysis of thematic data using standard qualitative techniques. Study hypotheses were largely supported. The qualitative analysis provided a detailed picture of primary care patients’ conceptual models of depression and suggested interesting directions for future research. PMID:20182550
SEER Cancer Query Systems (CanQues)
These applications provide access to cancer statistics including incidence, mortality, survival, prevalence, and probability of developing or dying from cancer. Users can display reports of the statistics or extract them for additional analyses.
Morales, Daniel R; Slattery, Jim; Evans, Stephen; Kurz, Xavier
2018-01-15
Antidepressant exposure during pregnancy has been associated with an increased risk of autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) in several observational studies. We performed a systematic review of these studies to highlight the effect that important methodological limitations have on such analyses and to consider approaches to the conduct, reporting and interpretation of future studies. A review of MEDLINE and EMBASE identified case-control, cohort and sibling studies assessing the risk of ASD and ADHD with antidepressant use during pregnancy. Approaches to confounding adjustment were described. Crude and adjusted effect estimates for comparisons between antidepressant exposure during pregnancy vs. all unexposed women were first meta-analysed using a generic inverse variance method of analysis, followed by effect estimates for alternative pre-selected comparison groups. A total of 15 studies measuring ASD as an outcome (involving 3,585,686 children and 40,585 cases) and seven studies measuring ADHD as an outcome (involving 2,765,723 patients and 52,313 cases) were identified. Variation in confounding adjustment existed between studies. Updated effect estimates for the association between maternal antidepressant exposure during pregnancy vs. all unexposed women remained statistically significant for ASD (adjusted random-effects risk ratio [RaRR] 1.53, 95% confidence interval [CI] 1.31-1.78). Similar significant associations were observed using pre-pregnancy maternal antidepressant exposure (RaRR 1.48, 95% CI 1.29-1.71) and paternal antidepressant exposure during pregnancy (1.29, 95% CI 1.08-1.53), but analyses restricted to using women with a history of affective disorder (1.18, 95% CI 0.91-1.52) and sibling studies (0.96, 95% CI 0.65-1.42) were not statistically significant. Corresponding associations for risk of ADHD with exposure were: RaRR 1.38, 95% CI 1.13-1.69 (during pregnancy), RaRR 1.38, 95% CI 1.14-1.69 (during pre-pregnancy), RaRR 1.71, 95% CI 1.31-2.23 (paternal exposure), RaRR 0.98, 95% CI 0.77-1.24 (women with a history of affective disorder) and RaRR 0.88, 95% CI 0.70-1.11 (sibling studies). Existing observational studies measuring the risk of ASD and ADHD with antidepressant exposure are heterogeneous in their design. Classical comparisons between exposed and unexposed women during pregnancy are at high risk of residual confounding. Alternative comparisons and sibling designs may aid the interpretation of causality and their utility requires further evaluation, including understanding potential limitations of undertaking meta-analyses with such data.
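The generic inverse-variance random-effects pooling referred to above can be sketched with the DerSimonian-Laird estimator of between-study variance. The log risk ratios and standard errors below are hypothetical and only illustrate the arithmetic; the review's exact software and models are not reproduced.

import numpy as np

def dersimonian_laird(y, se):
    """y: per-study log risk ratios; se: their standard errors."""
    w = 1.0 / se**2
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                    # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)               # between-study variance
    w_star = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    ci = np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])
    return np.exp(y_re), ci, tau2

log_rr = np.log(np.array([1.6, 1.4, 1.2, 1.8]))           # hypothetical study effects
se = np.array([0.20, 0.15, 0.25, 0.30])
print(dersimonian_laird(log_rr, se))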
Facilitating the Transition from Bright to Dim Environments
2016-03-04
For the parametric data, a multivariate ANOVA was used to determine the presence of any statistically significant performance differences. All significance levels were p < 0.05, and statistical analyses were performed with the Statistical Package for the Social Sciences (SPSS).
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
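A bare-bones per-site test for differential methylation, of the general kind the review above surveys, compares methylated/unmethylated read counts between two conditions with Fisher's exact test and controls the false discovery rate with Benjamini-Hochberg. Real DML/DMR tools add dispersion modelling, coverage filters and smoothing; the counts below are hypothetical.

import numpy as np
from scipy.stats import fisher_exact
from statsmodels.stats.multitest import multipletests

# Rows are CpG sites; columns are (meth_A, unmeth_A, meth_B, unmeth_B) read counts.
counts = np.array([
    [18,  2, 10, 11],
    [ 5, 15,  6, 14],
    [30,  1, 12,  9],
    [ 7,  8,  9,  7],
])
pvals = np.array([fisher_exact([[ma, ua], [mb, ub]])[1]
                  for ma, ua, mb, ub in counts])
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for i, (p, q, is_dml) in enumerate(zip(pvals, qvals, reject)):
    print(f"CpG {i}: p = {p:.3g}, q = {q:.3g}, DML = {is_dml}")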
Imaging Depression in Adults with ASD
2017-10-01
…collected temporally close enough to imaging data in Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk attrition between Phase 1 and 2, we chose to hold… Supervision is ongoing (since 9/2014). Co-I Dr. Lerner's 2nd-year Clinical Psychology PhD students have participated in ADOS-2 Introductory Clinical…
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a worldwide major environmental concern. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
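The rank-correlation trend tests mentioned above boil down to correlating annual litter abundance with survey year for each beach and category; a significant positive or negative Spearman rho indicates an increasing or decreasing trend. The counts below are hypothetical.

import numpy as np
from scipy.stats import spearmanr

years = np.arange(2001, 2012)
plastic_items = np.array([120, 135, 128, 150, 160, 158, 170, 182, 175, 190, 205])
rho, p = spearmanr(years, plastic_items)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")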
Zeitz, Kathryn; Haghighi, Pari Delir; Burstein, Frada; Williams, Jeffrey
2013-06-01
The present study was designed to further understand the psychosocial drivers of crowds impacting on the demand for healthcare. This involved analysing different spectator crowds for medical usage at mass gatherings; more specifically, whether different football team spectators (of the Australian Football League) generated different medical usage rates. In total, 317 games were analysed from 10 venues over 2 years. Data were analysed using ANOVA and Pearson correlation tests. RESULTS: Spectators who supported different football teams generated statistically significant differences in patient presentation rates (PPR) (F(15, 618) = 1.998, P = 0.014). The present study confirmed previous findings that there is a positive correlation between crowd size and PPR at mass gatherings but found a negative correlation between density and PPR (r = -0.206, n = 317, P < 0.0005). The present study has attempted to scientifically explore psychosocial elements of crowd behaviour as a driver of demand for emergency medical care. In measuring demand for emergency medical services there is a need to develop a more sophisticated understanding of a variety of drivers in addition to traditional metrics such as temperature, crowd size and other physical elements. In this study we saw that spectators who supported different football teams generated statistically significant differences in PPR. What is known about this topic? Understanding the drivers of emergency medical care is most important in the mass gathering setting. There has been minimal analysis of psychological 'crowd' variables. What does this paper add? This study explores the psychosocial impact of supporting a different team on the PPR of spectators at Australian Football League matches. The value of collecting and analysing these types of data sets is to support more balanced planning, better decision support and knowledge management, and more effective emergency medical demand management. What are the implications for practitioners? This information further expands the body of evidence being created to understand the drivers of emergency medical demand and usage. In addition, it supports the planning and management of emergency medical and health-related requirements by increasing our understanding of the effect of elements of 'crowd' that impact on medical usage and emergency healthcare.
Handwriting in healthy people aged 65 years and over.
van Drempt, Nadege; McCluskey, Annie; Lannin, Natasha A
2011-08-01
Handwriting is an important activity that is commonly affected by neurological and orthopaedic conditions. Handwriting research has predominantly involved children. Little is known about handwriting behaviour in healthy older adults. This study aims to describe the handwriting practices of 30 unimpaired adults aged 65 years and over. In this cross-sectional observational study, data were collected from 30 older adults using a self-report questionnaire, digital pen recordings over three days and a handwriting log. Data were analysed using descriptive statistics. The mean age of participants was 75.1 years (standard deviation=6.9). Variations in handwriting were evident in letter size, slant and spacing. Participants wrote very little--a median of 18 words per occasion (interquartile range=10.5-26.9 words). Most handwriting involved self-generated text (85%), not copied or transcribed text. Participants stood while writing for 17% of handwriting occasions. The most common reasons for handwriting were note taking (23%) and puzzles (22%). Legibility may not depend exclusively on the handwriting script that a beginning writer is taught, but may be a result of other factors as the person ages. A comprehensive adult handwriting assessment and retraining programme should be relevant to older adults, including common handwriting activities, involving self-generated text and few words. © 2011 The Authors. Australian Occupational Therapy Journal © 2011 Occupational Therapy Australia.
Memon, Aftab Hameed; Rahman, Ismail Abdul
2014-01-01
This study uncovered factors inhibiting cost performance in large construction projects in Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped into 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed; only 144 completed sets were received and analysed using the advanced multivariate Structural Equation Modelling software SmartPLS v2. The analysis involved three iterations in which several factors were deleted in order to make the model acceptable. The analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in exhibiting an effect on the cost performance of large construction projects. This finding is validated using advanced power analysis techniques. This rigorous multivariate analysis has explicitly identified the significant category, consisting of several causative factors, behind poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun. PMID:24693227
Moure-Leite, F R; Ramos-Jorge, J; Ramos-Jorge, M L; Paiva, S M; Vale, M P; Pordeus, I A
2011-12-01
To assess the impact of dental pain on the daily living of 5-year-old preschool children using reports from parents/guardians. A cross-sectional study was carried out involving 549 five-year-old children randomly selected from preschools in the city of Belo Horizonte, Brazil. Data were collected using a previously validated parent-reported questionnaire. The children received dental examinations from a single calibrated examiner. The following outcome variables were selected: age, gender, dental caries, filled teeth, missing teeth, caries involving pulp and social class. Simple and multiple logistic regression analyses were performed on the data. According to parents' reports, 11.1% of children were affected by dental pain in the previous 4 months and of these 72.6% had their daily activities hampered by pain. The majority of these children had difficulty in eating, brushing teeth, sleeping, playing and going to school. The impact of dental pain had a statistically significant association with gender (p=0.001), social class (p=0.009), dental caries (p<0.001), missing teeth (p<0.001), filled teeth (p<0.001) and caries involving pulp (p<0.001). The prevalence of difficulties performing tasks of daily living due to dental pain was relatively high among the children studied.
Roseman, Mary G; Riddell, Martha C; Haynes, Jessica N
2011-01-01
To review the literature, identifying proposed recommendations for school-based nutrition interventions, and evaluate kindergarten through 12th grade school-based nutrition interventions conducted from 2000-2008. Proposed recommendations from school-based intervention reviews were developed and used in conducting a content analysis of 26 interventions. Twenty-six school-based nutrition interventions in the United States first published in peer-reviewed journals from 2000-2008. VARIABLE MEASURED: Ten proposed recommendations based on prior analyses of school-based nutrition interventions: (1) behaviorally focused, (2) multicomponents, (3) healthful food/school environment, (4) family involvement, (5) self-assessments, (6) quantitative evaluation, (7) community involvement, (8) ethnic/heterogeneous groups, (9) multimedia technology, and (10) sequential and sufficient duration. Descriptive statistics. The most frequent recommendations used were: (1) behaviorally focused components (100%) and (2) quantitative evaluation of food behaviors (96%). Only 15% of the interventions included community involvement or ethnic/heterogeneous groups, whereas 31% included anthropometric measures. Five of the 10 proposed recommendations were included in over 50% of the interventions. Rising trend of overweight children warrants the need to synthesize findings from previous studies to inform research and program development and assist in identification of high-impact strategies and tactics. Copyright © 2011 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.
Memon, Aftab Hameed; Rahman, Ismail Abdul
2014-01-01
This study uncovered factors inhibiting cost performance in large construction projects in Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. The questionnaire presented a total of 35 inhibiting factors, grouped into 7 categories, for respondents to rate the significance level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using the advanced multivariate structural equation modelling software SmartPLS v2. The analysis involved three iterations in which several factors were deleted in order to make the model acceptable. The analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in its effect on the cost performance of large construction projects. This finding is validated using power analysis. This rigorous multivariate analysis explicitly identified the significant category, consisting of several causative factors, behind poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun.
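The study itself used PLS-SEM in SmartPLS v2; the following hedged sketch merely illustrates, with ordinary least squares and simulated data, how an R² of about 0.4 is obtained and converted into Cohen's f² effect size (the quantity usually used in the kind of power analysis mentioned). Column names are hypothetical.

```python
# Illustrative only: OLS stands in for the PLS-SEM model to show R² and Cohen's f².
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 144  # number of returned questionnaires; data simulated
df = pd.DataFrame(rng.normal(size=(n, 3)),
                  columns=["site_management", "finance", "external"])
df["cost_performance"] = (0.6 * df["site_management"] + 0.2 * df["finance"]
                          + rng.normal(scale=1.0, size=n))

fit = smf.ols("cost_performance ~ site_management + finance + external", data=df).fit()
r2 = fit.rsquared
f2 = r2 / (1 - r2)  # Cohen's f²; >= 0.35 is conventionally a "substantial" effect
print(f"R-squared = {r2:.3f}, f2 = {f2:.3f}")
```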
NASA Technical Reports Server (NTRS)
Falls, L. W.; Crutcher, H. L.
1976-01-01
Transformation of statistics from one dimensional set to another involves linear functions of the original set of statistics. Similarly, linear functions will transform statistics within a dimensional set such that the new statistics are relevant to a new set of coordinate axes. A restricted case of the latter is the rotation of axes in a coordinate system involving any two correlated random variables. A special case is the transformation of horizontal wind distributions. Wind statistics are usually provided in terms of wind speed and direction (measured clockwise from north) or in east-west and north-south components. A direct application of this technique allows the determination of appropriate wind statistics parallel and normal to any preselected flight path of a space vehicle. Among the constraints for launching space vehicles are critical values selected from the distributions of the expected winds parallel and normal to the flight path. These procedures are applied to space vehicle launches at Cape Kennedy, Florida.
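A minimal NumPy sketch of the rotation idea described above: given the mean vector and covariance matrix of eastward (u) and northward (v) wind components, the along-track and cross-track wind statistics follow from one linear transformation. The numerical values are made up for illustration.

```python
# Sketch: rotate (u, v) wind statistics into flight-path-parallel/normal components.
import numpy as np

mean_uv = np.array([3.0, -1.5])            # mean eastward, northward wind (m/s)
cov_uv = np.array([[25.0, 6.0],
                   [6.0, 16.0]])           # covariance of (u, v)

alpha = np.radians(70.0)                   # flight azimuth, clockwise from north
R = np.array([[np.sin(alpha),  np.cos(alpha)],    # along-track component
              [np.cos(alpha), -np.sin(alpha)]])   # cross-track component

mean_rot = R @ mean_uv                     # mean winds parallel / normal to path
cov_rot = R @ cov_uv @ R.T                 # covariance under the linear transform
print("rotated means:", mean_rot)
print("rotated std devs:", np.sqrt(np.diag(cov_rot)))
```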
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often rely on known distributions, most commonly the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerance. We use the percentile method (PM) to estimate the distribution parameters. The findings indicate that, when the distribution of the component data is unknown, the proposed method can expedite the design of component tolerance. Moreover, in the case of assembled sets, wider tolerances can be used for each component while meeting the same target performance.
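The generalized lambda distribution is not shipped with scipy.stats, but its quantile function is simple to write down. The sketch below (RS parameterization, with invented parameter values rather than estimates from the paper) shows how tolerance limits can be read off as extreme percentiles once λ1–λ4 have been fitted, e.g. by the percentile method.

```python
# Hedged sketch: GLD quantile function and percentile-based tolerance limits.
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """RS-parameterized GLD: Q(u) = lam1 + (u**lam3 - (1-u)**lam4) / lam2, 0 < u < 1."""
    u = np.asarray(u, dtype=float)
    return lam1 + (u**lam3 - (1.0 - u)**lam4) / lam2

# Hypothetical fitted parameters for one component dimension (mm)
lam = dict(lam1=10.00, lam2=5.0, lam3=0.15, lam4=0.15)

# Two-sided limits covering 99.73% of parts (the analogue of +/- 3 sigma)
lower = gld_quantile(0.00135, **lam)
upper = gld_quantile(0.99865, **lam)
print(f"tolerance interval: [{lower:.4f}, {upper:.4f}] mm")
```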
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
2015-08-01
the nine questions. The Statistical Package for the Social Sciences ( SPSS ) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to...Business Research Methods and Statistics using SPSS . P432. 11 IBM SPSS Statistics . (2012) 12 Burns, R.B., Burns, R.A. (2008) ‘Business Research
Newgard, Craig D; Lewis, Roger J
2005-06-01
Current recommendations regarding children traveling in passenger vehicles equipped with passenger air bags are based, in part, on evidence that the air-bag-related risk of injury and death is higher for children ≤12 years of age. However, the age or body size required to allow a child to be seated safely in front of a passenger air bag is unknown. To evaluate specific cutoff points for age, height, and weight as effect modifiers of the association between the presence of a passenger air bag and serious injury among children involved in motor vehicle crashes (MVCs), while controlling for important crash factors. A national population-based cohort of children involved in MVCs and included in the National Automotive Sampling System (NASS) Crashworthiness Data System (CDS) database from 1995 to 2002 was studied. NASS CDS clusters, strata, and weights were included in all analyses. Children 0 to 18 years of age involved in MVCs and seated in the right front passenger seat. Serious injury, defined as an Abbreviated Injury Scale score of ≥3 for any body region. A total of 3790 patients (1 month to 18 years of age) were represented in the NASS CDS database during the 8-year period. Sixty children (1.6%) were seriously injured (Abbreviated Injury Scale score of ≥3). Among age, height, and weight, age of 0 to 14 years (versus 15-18 years) was the only consistent effect modifier of the association between air-bag presence (or air-bag deployment) and serious injury, particularly for crashes with a moderate probability of injury. In analyses stratified according to age and adjusted for important crash factors, children 0 to 14 years of age involved in frontal collisions seemed to be at increased risk of serious injury from air-bag presence (odds ratio [OR]: 2.66; 95% confidence interval [CI]: 0.23-30.9) and deployment (OR: 6.13; 95% CI: 0.30-126), although these values did not reach statistical significance. Among children 15 to 18 years of age involved in frontal collisions, there was a protective effect on injury from both air-bag presence (OR: 0.19; 95% CI: 0.05-0.75) and deployment (OR: 0.31; 95% CI: 0.09-0.99). These findings persisted in analyses involving all collision types. We did not identify similar cutoff points for height or weight. Children up to 14 years of age may be at risk for serious preventable injury when seated in front of a passenger air bag, and children 15 to 18 years of age seem to experience protective effects of air-bag presence and deployment. Age may be a better marker than height or weight for risk assessment regarding children and air bags.
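A rough sketch of the stratified, adjusted odds-ratio analysis described above. Proper handling of the NASS CDS complex survey design (clusters and strata) requires survey-specific methods; here a GLM with frequency weights and simulated, hypothetical columns only illustrates the model form.

```python
# Hedged sketch: weighted logistic regression, stratified by the age cutoff.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 3790  # same nominal sample size; data simulated for illustration
df = pd.DataFrame({
    "airbag_present": rng.integers(0, 2, n),
    "delta_v": rng.gamma(2.0, 10.0, n),        # crash-severity proxy
    "age_years": rng.integers(0, 19, n),
    "case_weight": rng.integers(1, 500, n),    # NASS-style sampling weights
})
p = 1 / (1 + np.exp(-(-5 + 0.9 * df["airbag_present"] + 0.05 * df["delta_v"])))
df["serious_injury"] = rng.binomial(1, p)

young = df[df["age_years"] <= 14]              # the 0-14 year stratum
X = sm.add_constant(young[["airbag_present", "delta_v"]])
fit = sm.GLM(young["serious_injury"], X, family=sm.families.Binomial(),
             freq_weights=np.asarray(young["case_weight"])).fit()
print("OR:", np.exp(fit.params["airbag_present"]),
      "95% CI:", np.exp(fit.conf_int().loc["airbag_present"]).values)
```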
40 CFR Appendix IV to Part 265 - Tests for Significance
Code of Federal Regulations, 2010 CFR
2010-07-01
... introductory statistics texts. ... student's t-test involves calculation of the value of a t-statistic for each comparison of the mean... parameter with its initial background concentration or value. The calculated value of the t-statistic must...
Knowledge translation and implementation in spinal cord injury: a systematic review
Noonan, VK; Wolfe, DL; Thorogood, NP; Park, SE; Hsieh, JT; Eng, JJ
2015-01-01
Objective To conduct a systematic review examining the effectiveness of knowledge translation (KT) interventions in changing clinical practice and patient outcomes. Methods MEDLINE/PubMed, CINAHL, EMBASE and PsycINFO were searched for studies published from January 1980 to July 2012 that reported and evaluated an implemented KT intervention in spinal cord injury (SCI) care. We reviewed and summarized results from studies that documented the implemented KT intervention, its impact on changing clinician behavior and patient outcomes as well as the facilitators and barriers encountered during the implementation. Results A total of 13 articles featuring 10 studies were selected and abstracted from 4650 identified articles. KT interventions included developing and implementing patient care protocols, providing clinician education and incorporating outcome measures into clinical practice. The methods (or drivers) to facilitate the implementation included organizing training sessions for clinical staff, introducing computerized reminders and involving organizational leaders. The methodological quality of studies was mostly poor. Only 3 out of 10 studies evaluated the success of the implementation using statistical analyses, and all 3 reported significant behavior change. Out of the 10 studies, 6 evaluated the effect of the implementation on patient outcomes using statistical analyses, with 4 reporting significant improvements. The commonly cited facilitators and barriers were communication and resources, respectively. Conclusion The field of KT in SCI is in its infancy with only a few relevant publications. However, there is some evidence that KT interventions may change clinician behavior and improve patient outcomes. Future studies should ensure rigorous study methods are used to evaluate KT interventions. PMID:24796445
Subbaraman, Meenakshi Sabina; Kerr, William C.
2017-01-01
Background Support for the legalization of recreational marijuana continues to increase across the United States and globally. In 2016, recreational marijuana was legalized in the most populous US state of California, as well as three other states. The primary aim of this study was to examine trends in support for recreational marijuana legalization in Washington, a state which has had legal recreational marijuana for almost four years, using data collected over the four years post-legalization. A secondary aim was to examine trends in support for the cultivation of marijuana for personal use. Methods Data come from geographically representative general population samples of adult (aged 18 and over) Washington residents collected over five timepoints (every six months) between January 2014 and April 2016 (N = 4,101). Random Digit Dial was used for recruitment. Statistical analyses involved bivariate comparisons of proportions across timepoints and subgroups (defined by age, gender, and marijuana user status), and multivariable logistic regression controlling for timepoint (time) to formally test for trend while controlling for demographic and substance use covariates. All analyses adjusted for probability of selection. Results Support for legalization in Washington has significantly increased: support was 64.0% (95% CI: 61.2%–67.8%) at timepoint 1 and 77.9% (95% CI: 73.2%–81.9%) at timepoint 5. With each six months’ passing, support increased 19% on average. We found no statistically significant change in support for home-growing. Conclusions Support for marijuana legalization has continued to significantly increase in a state that has experienced the policy change for almost four years. PMID:28448904
Recovery definitions: Do they change?
Kaskutas, Lee Ann; Witbrodt, Jane; Grella, Christine E.
2015-01-01
Background The term “recovery” is widely used in the substance abuse literature and clinical settings, but data have not been available to empirically validate how recovery is defined by individuals who are themselves in recovery. The “What Is Recovery?” project developed a 39-item definition of recovery based on a large nationwide online survey of individuals in recovery. The objective of this paper is to report on the stability of those definitions one to two years later. Methods To obtain a sample for studying recovery definitions that reflected the different pathways to recovery, the parent study involved intensive outreach. Follow-up interviews (n = 1237) were conducted online and by telephone among respondents who consented to participate in follow-up studies. Descriptive analyses considered endorsement of individual recovery items at both surveys, and t-tests of summary scores studied significant change in the sample overall and among key subgroups. To assess item reliability, Cronbach’s alpha was estimated. Results Rates of endorsement of individual items at both interviews was above 90% for a majority of the recovery elements, and there was about as much transition into endorsement as out of endorsement. Statistically significant t-test scores were of modest magnitude, and reliability statistics were high (ranging from .782 to .899). Conclusions Longitudinal analyses found little evidence of meaningful change in recovery definitions at follow-up. Results thus suggest that the recovery definitions developed in the parent “What Is Recovery?” survey represent stable definitions of recovery that can be used to guide service provision in Recovery-Oriented Systems of Care. PMID:26166666
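As a small illustration of the reliability check reported above, Cronbach's alpha can be computed directly from an items-by-respondents matrix. The data below are simulated; the function itself is the standard formula.

```python
# Sketch: Cronbach's alpha for a set of binary recovery items (simulated data).
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_var_sum = items.var(ddof=1).sum()        # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summary score
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=1237)                    # follow-up sample size from the study
items = pd.DataFrame(
    {f"item_{j}": (latent + rng.normal(scale=1.0, size=latent.size) > 0).astype(int)
     for j in range(39)})                         # 39 recovery items
print(f"alpha = {cronbach_alpha(items):.3f}")
```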
Dental anomalies associated with cleft lip and palate in Northern Finland.
Lehtonen, V; Anttonen, V; Ylikontiola, L P; Koskinen, S; Pesonen, P; Sándor, G K
2015-12-01
Despite the reported occurrence of dental anomalies in cleft lip and palate, little is known about their prevalence in children from Northern Finland with cleft lip and palate. The aim was to investigate the prevalence of dental anomalies among patients with different types of clefts in Northern Finland. Design and Statistics: patient records of 139 subjects aged three years and older, with clefts treated in Oulu University Hospital, Finland, during the period 1996-2010 (total n = 183), were analysed for dental anomalies including the number of teeth, morphological and developmental anomalies, and their association with the cleft type. The analyses were carried out using the Chi-square test and Fisher's exact test. Differences between the groups were considered statistically significant at p values < 0.05. More than half of the patients had clefts of the hard palate, 18% of the lip and palate, and 13% of the lip. At least one dental anomaly was detected in 47% of the study population. Almost one in three (26.6%) subjects had at least one anomaly and 17.9% had two or three anomalies. The most common anomalies in permanent teeth were missing teeth, followed by supernumerary teeth. Supernumerary teeth were significantly more apparent when the lip was involved in the cleft compared with palatal clefts. Missing teeth were less prevalent among those 5 years or younger. The prevalence of different anomalies was significantly associated with the cleft type in both age groups. Dental anomalies are more prevalent among cleft children than in the general population in Finland. The most prevalent anomalies associated with clefts were missing and supernumerary teeth.
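A short sketch of the two tests named above, applied to a hypothetical 2x2 table of cleft type against presence of supernumerary teeth; the counts are invented, not taken from the paper.

```python
# Sketch: chi-square and Fisher's exact tests on a made-up 2x2 contingency table.
from scipy.stats import chi2_contingency, fisher_exact

table = [[14, 29],    # lip involved in cleft: anomaly present / absent
         [6, 90]]     # palatal cleft only:    anomaly present / absent

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)   # preferred when expected counts are small
print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")
```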
Subbaraman, Meenakshi Sabina; Kerr, William C
2017-06-01
Support for the legalization of recreational marijuana continues to increase across the United States and globally. In 2016, recreational marijuana was legalized in the most populous US state of California, as well as three other states. The primary aim of this study was to examine trends in support for recreational marijuana legalization in Washington, a state which has had legal recreational marijuana for almost four years, using data collected over the four years post-legalization. A secondary aim was to examine trends in support for the cultivation of marijuana for personal use. Data come from geographically representative general population samples of adult (aged 18 and over) Washington residents collected over five timepoints (every six months) between January 2014 and April 2016 (N=4101). Random Digit Dial was used for recruitment. Statistical analyses involved bivariate comparisons of proportions across timepoints and subgroups (defined by age, gender, and marijuana user status), and multivariable logistic regression controlling for timepoint (time) to formally test for trend while controlling for demographic and substance use covariates. All analyses adjusted for probability of selection. Support for legalization in Washington has significantly increased: support was 64.0% (95% CI: 61.2%-67.8%) at timepoint 1 and 77.9% (95% CI: 73.2%-81.9%) at timepoint 5. With each six months' passing, support increased 19% on average. We found no statistically significant change in support for home-growing. Support for marijuana legalization has continued to significantly increase in a state that has experienced the policy change for almost four years. Copyright © 2017. Published by Elsevier B.V.
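A hedged sketch of the trend test described above: a logistic regression of support on timepoint, where exp(coefficient) for timepoint is the change in odds of support per six-month interval (the study reports roughly a 19% increase). Survey weighting is omitted here, the covariates are reduced, and the data are simulated with a coefficient chosen near that value.

```python
# Illustration only: logistic trend model for support across survey timepoints.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 4101  # pooled sample size from the study; data simulated
df = pd.DataFrame({"timepoint": rng.integers(1, 6, n),
                   "marijuana_user": rng.integers(0, 2, n)})
p = 1 / (1 + np.exp(-(0.4 + 0.17 * df["timepoint"] + 0.8 * df["marijuana_user"])))
df["supports_legalization"] = rng.binomial(1, p)

fit = smf.logit("supports_legalization ~ timepoint + marijuana_user",
                data=df).fit(disp=False)
print("odds ratio per 6-month timepoint:",
      round(float(np.exp(fit.params["timepoint"])), 2))
```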
Cost-effectiveness of two vocational rehabilitation programs for persons with severe mental illness.
Dixon, Lisa; Hoch, Jeffrey S; Clark, Robin; Bebout, Richard; Drake, Robert; McHugo, Greg; Becker, Deborah
2002-09-01
This study sought to determine differences in the cost-effectiveness of two vocational programs: individual placement and support (IPS), in which employment specialists within a mental health center help patients obtain competitive jobs and provide them with ongoing support, and enhanced vocational rehabilitation (EVR), in which stepwise services that involve prevocational experiences are delivered by rehabilitation agencies. A total of 150 unemployed inner-city patients with severe mental disorders who expressed an interest in competitive employment were randomly assigned to IPS or EVR programs and were followed for 18 months. Wages from all forms of employment and the number of weeks and hours of competitive employment were tracked monthly. Estimates were made of direct mental health costs and vocational costs. Incremental cost-effectiveness ratios (ICERs) were calculated for competitive employment outcomes and total wages. No statistically significant differences were found in the overall costs of IPS and EVR. Participation in the IPS program was associated with significantly more hours and weeks of competitive employment. However, the average combined earnings (from competitive and noncompetitive employment) were virtually the same in both programs. The ICER estimates indicated that participants in the IPS program worked in competitive employment settings for an additional week over the 18-month period at a cost of $283 ($13 an hour). The analyses suggest that IPS participants engaged in competitive employment at a higher cost. When combined earnings were used as the outcome, data from the statistical analyses were insufficient to enable any firm conclusions to be drawn. The findings illustrate the importance of the choice of outcomes in evaluations of employment programs.
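The incremental cost-effectiveness ratio itself is a one-line calculation: the difference in mean costs divided by the difference in mean effectiveness. The sketch below uses invented cost and employment figures chosen only so that the ratio matches the $283-per-additional-week figure quoted above.

```python
# Sketch of the ICER calculation; dollar and week values are illustrative.
mean_cost_ips, mean_cost_evr = 26_000.0, 25_717.0   # hypothetical mean program costs
mean_weeks_ips, mean_weeks_evr = 10.0, 9.0          # hypothetical weeks competitively employed

icer = (mean_cost_ips - mean_cost_evr) / (mean_weeks_ips - mean_weeks_evr)
print(f"ICER = ${icer:.0f} per additional week of competitive employment")
```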
The influence of neighborhood unemployment on mortality after stroke.
Unrath, Michael; Wellmann, Jürgen; Diederichs, Claudia; Binse, Lisa; Kalic, Marianne; Heuschmann, Peter Ulrich; Berger, Klaus
2014-07-01
Few studies have investigated the impact of neighborhood characteristics on mortality after stroke. The aim of our study was to analyze the influence of district unemployment, as an indicator of neighborhood socioeconomic status (SES-NH), on poststroke mortality, and to compare these results with mortality in the underlying general population. Our analyses involve 2 prospective cohort studies from the city of Dortmund, Germany. In the Dortmund Stroke Register (DOST), consecutive stroke patients (N=1883) were recruited from acute care hospitals. In the Dortmund Health Study (DHS), a random general population sample was drawn (n=2291; response rate 66.9%). Vital status was ascertained in the city's registration office and information on district unemployment was obtained from the city's statistical office. We performed multilevel survival analyses to examine the association between district unemployment and mortality. The association between neighborhood unemployment and mortality was weak and not statistically significant in the stroke cohort. Only stroke patients exposed to the highest district unemployment (fourth quartile) had slightly higher mortality risks. In the general population sample, higher district unemployment was significantly associated with higher mortality, following a social gradient. After adjustment for education, health-related behavior, and morbidity, the strength of this association decreased. The impact of SES-NH on mortality was different for stroke patients and the general population. Differences in the association between SES-NH and mortality may be partly explained by disease-related characteristics of the stroke cohort such as homogeneous lifestyles, similar morbidity profiles, medical factors, and old age. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.
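The study used multilevel survival models; as a simplified stand-in (not the authors' approach), a Cox proportional hazards model with cluster-robust errors by district conveys the same idea of patients nested in districts. The data below are simulated and the column names hypothetical; the `cluster_col` option of lifelines is assumed to provide the district-level clustering.

```python
# Rough sketch: Cox model of post-stroke mortality with district-level clustering.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1883  # stroke cohort size from the study; data simulated
df = pd.DataFrame({
    "unemployment_quartile": rng.integers(1, 5, n),
    "age": rng.normal(70, 10, n),
    "district_id": rng.integers(0, 60, n),
})
hazard = np.exp(0.05 * df["unemployment_quartile"] + 0.03 * (df["age"] - 70))
df["followup_years"] = rng.exponential(5.0 / hazard)
df["died"] = (df["followup_years"] < 10).astype(int)
df.loc[df["followup_years"] > 10, "followup_years"] = 10.0  # administrative censoring

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died",
        cluster_col="district_id")   # robust errors for patients within districts
cph.print_summary()
```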
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. The main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the use of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in using correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems. PMID:25856194
Research design and statistical methods in Indian medical journals: a retrospective survey.
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S; Mayya, Shreemathi S
2015-01-01
Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. The main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design, statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the use of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in using correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems.
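A short sketch of the year-to-year comparison of proportions reported above: a 2x2 chi-square test with the phi coefficient as effect size, using the published counts of papers applying statistical tests in 2003 (250/588) versus 2013 (439/774). The uncorrected chi-square matches the reported 26.96 up to rounding; the paper's Φ may have been computed on a different base.

```python
# Sketch: chi-square test of two proportions with phi as effect size.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[250, 588 - 250],    # 2003: used tests / did not
                  [439, 774 - 439]])   # 2013: used tests / did not
chi2, p, dof, expected = chi2_contingency(table, correction=False)
phi = np.sqrt(chi2 / table.sum())
print(f"chi2 = {chi2:.2f}, phi = {phi:.2f}, p = {p:.2g}")
```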
Statistical Literacy in the Data Science Workplace
ERIC Educational Resources Information Center
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
de Haan, Bianca; Karnath, Hans-Otto
2017-12-01
Nowadays, different anatomical atlases exist for the anatomical interpretation of the results from neuroimaging and lesion analysis studies that investigate the contribution of white matter fiber tract integrity to cognitive (dys)function. A major problem with the use of different atlases in different studies, however, is that the anatomical interpretation of neuroimaging and lesion analysis results might vary as a function of the atlas used. This issue might be particularly prominent in studies that investigate the contribution of white matter fiber tract integrity to cognitive (dys)function. We used a single large-sample dataset of right brain damaged stroke patients with and without cognitive deficit (here: spatial neglect) to systematically compare the influence of three different, widely-used white matter fiber tract atlases (1 histology-based atlas and 2 DTI tractography-based atlases) on conclusions concerning the involvement of white matter fiber tracts in the pathogenesis of cognitive dysfunction. We both calculated the overlap between the statistical lesion analysis results and each long association fiber tract (topological analyses) and performed logistic regressions on the extent of fiber tract damage in each individual for each long association white matter fiber tract (hodological analyses). For the topological analyses, our results suggest that studies that use tractography-based atlases are more likely to conclude that white matter integrity is critical for a cognitive (dys)function than studies that use a histology-based atlas. The DTI tractography-based atlases classified approximately 10 times as many voxels of the statistical map as being located in a long association white matter fiber tract than the histology-based atlas. For hodological analyses on the other hand, we observed that the conclusions concerning the overall importance of long association fiber tract integrity to cognitive function do not necessarily depend on the white matter atlas used, but conclusions may vary as a function of atlas used at the level of individual fiber tracts. Moreover, these analyses revealed that hodological studies that express the individual extent of injury to each fiber tract as a binomial variable are more likely to conclude that white matter integrity is critical for a cognitive function than studies that express the individual extent of injury to each fiber tract as a continuous variable. Copyright © 2017 Elsevier Inc. All rights reserved.
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses and less than a third of the articles presented on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN-coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage are described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties, and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
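A toy Python sketch in the spirit of the program described: random disturbance parameters are propagated through a closed-form impulse estimate, and fuel consumption is summarized with statistics and a histogram. The disturbance model, coefficients, and specific impulse are invented for illustration and do not come from the NASA routine.

```python
# Illustrative Monte Carlo sketch: fuel-consumption statistics from closed-form impulses.
import numpy as np

rng = np.random.default_rng(1)
n_runs = 10_000

thrust_misalign = rng.normal(0.0, 0.25, n_runs)    # deg
static_unbalance = rng.normal(0.0, 5.0, n_runs)    # N*m equivalent
aero_disturbance = rng.uniform(0.0, 20.0, n_runs)  # N*m

# Closed-form impulse estimate per run, then propellant mass via Isp
impulse = (120.0 * np.abs(thrust_misalign)
           + 0.8 * np.abs(static_unbalance)
           + 0.5 * aero_disturbance)               # N*s (made-up coefficients)
fuel = impulse / (220.0 * 9.81)                    # kg, assuming Isp = 220 s

print(f"mean = {fuel.mean():.3f} kg, "
      f"99th percentile = {np.percentile(fuel, 99):.3f} kg")
hist, edges = np.histogram(fuel, bins=30)          # counterpart of the output histogram
```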
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
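For readers unfamiliar with the hotspot statistic used above, the following is a hedged sketch of the Getis-Ord Gi* z-score implemented directly from the standard formula, with a binary spatial-weights matrix that includes each tract itself (w_ii = 1). In practice a library such as PySAL/esda would typically be used; the small worked example is entirely synthetic.

```python
# Sketch: Getis-Ord Gi* statistic for area counts, from the standard formula.
import numpy as np

def getis_ord_gi_star(x: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Return the z-valued Gi* for each location; W includes self-weights."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x**2).mean() - xbar**2)
    Wi = W.sum(axis=1)                     # sum of weights for each location i
    S1i = (W**2).sum(axis=1)
    num = W @ x - xbar * Wi
    den = s * np.sqrt((n * S1i - Wi**2) / (n - 1))
    return num / den

# Worked example: 8 "tracts" in a line, neighbours = adjacent tracts plus self
x = np.array([2.0, 3.0, 40.0, 55.0, 48.0, 4.0, 1.0, 2.0])   # made-up case counts
W = np.eye(8)
for i in range(7):
    W[i, i + 1] = W[i + 1, i] = 1.0
print(np.round(getis_ord_gi_star(x, W), 2))   # |z| > 1.96 -> candidate hotspot
```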
NASA Astrophysics Data System (ADS)
Figueroa, M. C.; Gregory, D. D.; Lyons, T. W.; Williford, K. H.
2017-12-01
Life processes affect trace element abundances in pyrite such that sedimentary and hydrothermal pyrite have significantly different trace element signatures. Thus, we propose that these biogeochemical data could be used to identify pyrite that formed biogenically, either early in our planet's history or on other planets, particularly Mars. The potential for this approach is elevated because pyrite is common in diverse sedimentary settings, and its trace element content can be preserved despite secondary overprints up to greenschist facies, thus minimizing the concerns about remobilization that can plague traditional whole rock studies. We also include in-situ sulfur isotope analysis to further refine our understanding of the complex signatures of ancient pyrite. Sulfur isotope data can point straightforwardly to the involvement of life, because pyrite in sediments is inextricably linked to bacterial sulfate reduction and its diagnostic isotopic expressions. In addition to analyzing pyrite of known biological origin formed in the modern and ancient oceans under a range of conditions, we are building a data set for pyrite formed by hydrothermal and metamorphic processes to minimize the risk of false positives in life detection. We have used Random Forests (RF), a machine learning statistical technique with proven efficiency for classifying large geological datasets, to classify pyrite into biotic and abiotic end members. Coupling the trace element and sulfur isotope data from our analyses with a large existing dataset from diverse settings has yielded 4500 analyses with 18 different variables. Our initial results reveal the promise of the RF approach, correctly identifying biogenic pyrite 97 percent of the time. We will continue to couple new in-situ S-isotope and trace element analyses of biogenic pyrite grains from modern and ancient environments, using cutting-edge microanalytical techniques, with new data from high temperature settings. Our ultimate goal is a refined search tool with straightforward application in the search for early life on Earth and for distant life recorded in meteorites, returned samples, and in situ measurements.
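A minimal scikit-learn sketch of the Random Forests classification step described above; the dataset is a synthetic stand-in (same order of magnitude as the 4500 analyses and 18 variables mentioned), not the authors' compilation.

```python
# Sketch: Random Forest classification of pyrite analyses into biotic/abiotic classes.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for ~4500 analyses with 18 geochemical variables
X, y = make_classification(n_samples=4500, n_features=18, n_informative=8,
                           random_state=0)
features = [f"var_{i}" for i in range(18)]   # placeholders for Co, Ni, As, Se, ..., d34S
X = pd.DataFrame(X, columns=features)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, rf.predict(X_test)))
print(dict(zip(features, rf.feature_importances_.round(3))))  # variable importances
```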
Silveira-Neto, Nicolau; Flores, Mateus Ericson; De Carli, João Paulo; Costa, Max Dória; Matos, Felipe de Souza; Paranhos, Luiz Renato; Linden, Maria Salete Sandini
2017-11-01
This research evaluated detail registration in peri-implant bone using two different cone beam computer tomography systems and a digital periapical radiograph. Three different image acquisition protocols were established for each cone beam computer tomography apparatus, and three clinical situations were simulated in an ex vivo fresh pig mandible: buccal bone defect, peri-implant bone defect, and bone contact. Data were subjected to two analyses: quantitative and qualitative. The quantitative analyses involved a comparison of real specimen measures using a digital caliper in three regions of the preserved buccal bone - A, B and E (control group) - to cone beam computer tomography images obtained with different protocols (kp1, kp2, kp3, ip1, ip2, and ip3). In the qualitative analyses, the ability to register peri-implant details via tomography and digital periapical radiography was verified, as indicated by twelve evaluators. Data were analyzed with ANOVA and Tukey's test (α=0.05). The quantitative assessment showed means statistically equal to those of the control group under the following conditions: buccal bone defect B and E with kp1 and ip1, peri-implant bone defect E with kp2 and kp3, and bone contact A with kp1, kp2, kp3, and ip2. Qualitatively, only bone contacts were significantly different among the assessments, and the p3 results differed from the p1 and p2 results. The other results were statistically equivalent. The registration of peri-implant details was influenced by the image acquisition protocol, although metal artifacts were produced in all situations. The evaluators preferred the Kodak 9000 3D cone beam computer tomography in most cases. The evaluators identified buccal bone defects better with cone beam computer tomography and identified peri-implant bone defects better with digital periapical radiography.
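A hedged sketch of the statistical comparison named above (one-way ANOVA followed by Tukey's HSD at alpha = 0.05) on measurements grouped by acquisition protocol. The long-format table and its values are simulated; they are not the study's measurements.

```python
# Sketch: one-way ANOVA and Tukey HSD comparison of measurement protocols.
import numpy as np
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
protocols = ["caliper", "kp1", "kp2", "kp3", "ip1", "ip2", "ip3"]
df = pd.DataFrame({
    "protocol": np.repeat(protocols, 12),
    "measurement": np.concatenate(
        [rng.normal(loc, 0.08, 12)
         for loc in (2.00, 2.02, 2.10, 2.12, 2.03, 2.07, 2.15)]),  # mm, simulated
})

groups = [g["measurement"].to_numpy() for _, g in df.groupby("protocol")]
print("ANOVA:", f_oneway(*groups))
tukey = pairwise_tukeyhsd(df["measurement"], df["protocol"], alpha=0.05)
print(tukey.summary())   # which protocols differ from the caliper control group
```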
Henríquez-Hernández, Luis Alberto; Valenciano, Almudena; Foro-Arnalot, Palmira; Álvarez-Cubero, María Jesús; Cozar, José Manuel; Suárez-Novo, José Francisco; Castells-Esteve, Manel; Fernández-Gonzalo, Pablo; De-Paula-Carranza, Belén; Ferrer, Montse; Guedea, Ferrán; Sancho-Pardo, Gemma; Craven-Bartle, Jordi; Ortiz-Gordillo, María José; Cabrera-Roldán, Patricia; Rodríguez-Melcón, Juan Ignacio; Herrera-Ramos, Estefanía; Rodríguez-Gallego, Carlos; Lara, Pedro C
2015-07-01
Prostate cancer (PCa) is an androgen-dependent disease. Nonetheless, the role of single nucleotide polymorphisms (SNPs) in genes encoding androgen metabolism remains an unexplored area. To investigate the role of germline variations in cytochrome P450 17A1 (CYP17A1) and steroid-5α-reductase, α-polypeptides 1 and 2 (SRD5A1 and SRD5A2) genes in PCa. In total, 494 consecutive Spanish patients diagnosed with nonmetastatic localized PCa were included in this multicenter study and were genotyped for 32 SNPs in SRD5A1, SRD5A2, and CYP17A1 genes using a Biotrove OpenArray NT Cycler. Clinical data were available. Genotypic and allelic frequencies, as well as haplotype analyses, were determined using the web-based environment SNPator. All additional statistical analyses comparing clinical data and SNPs were performed using PASW Statistics 15. The call rate obtained (determined as the percentage of successful determinations) was 97.3% of detection. A total of 2 SNPs in SRD5A1-rs3822430 and rs1691053-were associated with prostate-specific antigen level at diagnosis. Moreover, G carriers for both SNPs were at higher risk of presenting initial prostate-specific antigen levels>20ng/ml (Exp(B) = 2.812, 95% CI: 1.397-5.657, P = 0.004) than those who are AA-AA carriers. Haplotype analyses showed that patients with PCa nonhomozygous for the haplotype GCTTGTAGTA were at an elevated risk of presenting bigger clinical tumor size (Exp(B) = 3.823, 95% CI: 1.280-11.416, P = 0.016), and higher Gleason score (Exp(B) = 2.808, 95% CI: 1.134-6.953, P = 0.026). SNPs in SRD5A1 seem to affect the clinical characteristics of Spanish patients with PCa. Copyright © 2015 Elsevier Inc. All rights reserved.
Al-Badriyeh, Daoud; Alameri, Marwah; Al-Okka, Randa
2017-01-01
Objective To perform a first-time analysis of the cost-effectiveness (CE) literature on chemotherapies, of all types, in cancer, in terms of trends and change over time, including the influence of industry funding. Design Systematic review. Setting A wide range of cancer-related research settings within healthcare, including health systems, hospitals and medical centres. Participants All comparative CE research on drug-based cancer therapies published in the period 1986 to 2015. Primary and secondary outcome measures Primary outcomes are the literature trends in relation to journal subject category, authorship, research design, data sources, funds and consultation involvement. An additional outcome measure is the association between industry funding and study outcomes. Analysis Descriptive statistics and the χ2, Fisher exact or Somers' D tests were used to perform non-parametric statistics, with a p value of <0.05 as the statistical significance measure. Results A total of 574 publications were analysed. The drug-related CE literature expands over time, with increased publishing in the healthcare sciences and services journal subject category (p<0.001). The use of retrospective data collection in studies increased over time (p<0.001). The usage of prospective data, however, has been decreasing (p<0.001) in relation to randomised clinical trials (RCTs), but is unchanging for non-RCT studies. Industry-sponsored CE studies have especially been increasing (p<0.001), in contrast to those sponsored by other sources. While paid consultation involvement grew throughout the years, the declaration of funding for this is relatively limited. Importantly, there is evidence that industry funding is associated with results favourable to the sponsor (p<0.001). Conclusions This analysis demonstrates clear trends in how the CE cancer research is presented to the practicing community, including in relation to journals, study designs, authorship and consultation, together with increased financial sponsorship by pharmaceutical industries, which may influence study outcomes more than other funding sources. PMID:28131999
Bagshaw, Clarence; Isdell, Allen E; Thiruvaiyaru, Dharma S; Brisbin, I Lehr; Sanchez, Susan
2014-06-01
More than thirty years have passed since canine parvovirus (CPV) emerged as a significant pathogen and it continues to pose a severe threat to world canine populations. Published information suggests that flies (Diptera) may play a role in spreading this virus; however, they have not been studied extensively and the degree of their involvement is not known. This investigation was directed toward evaluating the vector capacity of such flies and determining their potential role in the transmission and ecology of CPV. Molecular diagnostic methods were used in this cross-sectional study to detect the presence of CPV in flies trapped at thirty-eight canine facilities. The flies involved were identified as belonging to the house fly (Muscidae), flesh fly (Sarcophagidae) and blow/bottle fly (Calliphoridae) families. A primary surveillance location (PSL) was established at a canine facility in south-central South Carolina, USA, to identify fly-virus interaction within the canine facility environment. Flies trapped at this location were pooled monthly and assayed for CPV using polymerase chain reaction (PCR) methods. These insects were found to be positive for CPV every month from February through the end of November 2011. Fly vector behavior and seasonality were documented and potential environmental risk factors were evaluated. Statistical analyses were conducted to compare the mean numbers of each of the three fly families captured, and, after determining fly CPV status (positive or negative), it was determined whether there were significant relationships between the numbers of flies captured, seasonal numbers of CPV cases, temperature and rainfall. Flies were also sampled at thirty-seven additional canine facility surveillance locations (ASL) and at four non-canine animal industry locations serving as negative field controls. Canine facility risk factors were identified and evaluated. Statistical analyses were conducted on the number of CPV cases reported within the past year to determine the correlation with fly CPV status (positive or negative) for each facility, facility design (open or closed), mean number of dogs present monthly and number of flies captured. Significant differences occurred between fly CPV positive vs. negative sites with regard to their CPV case numbers, fly numbers captured, and number of dogs present. At the ASL, a statistically significant relationship was found between PCR-determined fly CPV status (positive or negative) and facility design (open vs. closed). Open-design facilities were more likely to have CPV outbreaks and more likely to have flies testing positive for CPV DNA. Copyright © 2014 Elsevier B.V. All rights reserved.
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
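A minimal PyMC sketch of the kind of Bayesian random-effects meta-analysis described above (not the authors' code): observed log response ratios with known standard errors, weakly informative priors on the overall effect and the between-study heterogeneity, and a normal likelihood. Swapping the Normal prior on the study effects for a StudentT mirrors the t-distribution variant discussed in the abstract. The effect sizes and standard errors below are invented.

```python
# Hedged sketch: Bayesian random-effects meta-analysis with PyMC.
import numpy as np
import arviz as az
import pymc as pm

y = np.array([-0.42, -0.18, -0.31, -0.05, -0.27])   # made-up log response ratios
se = np.array([0.20, 0.15, 0.25, 0.18, 0.22])       # made-up standard errors

with pm.Model() as meta:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)         # overall effect, vague prior
    tau = pm.HalfNormal("tau", sigma=1.0)            # between-study standard deviation
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(y))   # study-level effects
    pm.Normal("obs", mu=theta, sigma=se, observed=y)             # sampling model
    idata = pm.sample(1000, tune=1000, target_accept=0.9)

print(az.summary(idata, var_names=["mu", "tau"]))    # posterior mean and 94% CRI
```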
Smith, Jason E; Rockett, Mark; Squire, Rosalyn; Hayward, Christopher J; Creanor, Siobhan; Ewings, Paul; Barton, Andy; Pritchard, Colin; Benger, Jonathan Richard
2013-01-01
Introduction Pain is the commonest reason that patients present to an emergency department (ED), but it is often not treated effectively. Patient controlled analgesia (PCA) is used in other hospital settings but there is little evidence to support its use in emergency patients. We describe two randomised trials aiming to compare PCA to nurse titrated analgesia (routine care) in adult patients who present to the ED requiring intravenous opioid analgesia for the treatment of moderate to severe pain and are subsequently admitted to hospital. Methods and analysis Two prospective multi-centre open-label randomised trials of PCA versus routine care in emergency department patients who require intravenous opioid analgesia followed by admission to hospital; one trial involving patients with traumatic musculoskeletal injuries and the second involving patients with non-traumatic abdominal pain. In each trial, 200 participants will be randomised to receive either routine care or PCA, and followed for the first 12 h of their hospital stay. The primary outcome measure is hourly pain score recorded by the participant using a visual analogue scale (VAS) over the 12 h study period, with the primary statistical analyses based on the area under the curve of these pain scores. Secondary outcomes include total opioid use, side effects, time spent asleep, patient satisfaction, length of hospital stay and incremental cost effectiveness ratio. Ethics and dissemination The study is approved by the South Central—Southampton A Research Ethics Committee (REC reference 11/SC/0151). Data collection will be completed by August 2013, with statistical analyses starting after all final data queries are resolved. Dissemination plans include presentations at local, national and international scientific meetings held by relevant Colleges and societies. Publications should be ready for submission during 2014. A lay summary of the results will be available to study participants on request, and disseminated via a publically accessible website. Registration details The study is registered with the European Clinical Trials Database (EudraCT Number: 2011-000194-31) and is on the ISCRTN register (ISRCTN25343280). PMID:23418302
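A hedged sketch of the primary analysis described above: the area under the curve of hourly VAS pain scores over the 12-hour study period is computed per participant (trapezoidal rule) and compared between arms. The trial's own analysis plan may differ in detail; the score matrix and group labels here are simulated.

```python
# Sketch: per-participant AUC of hourly pain scores, compared between study arms.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
hours = np.arange(13)                                   # 0..12 h
n = 200
group = np.array(["PCA"] * (n // 2) + ["routine"] * (n // 2))
baseline = rng.uniform(60, 90, n)[:, None]              # initial VAS (0-100)
decay = np.where(group == "PCA", 0.12, 0.09)[:, None]   # assumed faster relief with PCA
scores = baseline * np.exp(-decay * hours) + rng.normal(0, 5, (n, 13))

auc = np.trapz(scores, x=hours, axis=1)                 # VAS-hours per participant
t, p = ttest_ind(auc[group == "PCA"], auc[group == "routine"])
print(f"mean AUC  PCA={auc[group == 'PCA'].mean():.0f}  "
      f"routine={auc[group == 'routine'].mean():.0f}  p={p:.3g}")
```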
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics
2016-01-01
Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
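The consistency metrics defined in that work are not reproduced here, but the general idea of scoring how coherently a gene set behaves across expression arrays can be illustrated with a minimal sketch. The metric below (mean pairwise correlation of genes within a set) and the toy expression matrix are assumptions for illustration only, not the authors' definitions.

```python
import numpy as np

def within_set_consistency(expr, gene_idx):
    """Mean pairwise Pearson correlation of the genes in one set.

    expr     : (n_genes, n_arrays) expression matrix
    gene_idx : indices of the genes belonging to the set
    """
    sub = expr[gene_idx, :]
    corr = np.corrcoef(sub)                  # (k, k) correlation matrix
    iu = np.triu_indices_from(corr, k=1)     # off-diagonal upper triangle
    return corr[iu].mean()

# Toy comparison: a "tight" set versus a random set of the same size.
rng = np.random.default_rng(0)
base = rng.normal(size=100)                          # shared profile
tight = base + 0.3 * rng.normal(size=(10, 100))      # correlated genes
random_set = rng.normal(size=(10, 100))              # unrelated genes
expr = np.vstack([tight, random_set])
print(within_set_consistency(expr, np.arange(10)))      # high
print(within_set_consistency(expr, np.arange(10, 20)))  # near zero
```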
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty three variables which could impact the performance of the motors during the ignition transient and thirty eight variables which could impact the performance of the motors during steady state operation of the motor were identified and treated as statistical variables for the analyses. The effects of motor to motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
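The actual analysis couples legacy internal ballistics codes with a Monte Carlo driver over dozens of variables; the sketch below only illustrates the envelope idea with a toy thrust function and assumed variation levels, so the function, the variable set, and the numbers are placeholders rather than the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_pairs = 1000
t = np.linspace(0.0, 2.0, 201)          # ignition-transient time, s (toy)

def toy_thrust(burn_rate, throat_dia, t):
    """Toy stand-in for an internal ballistics code (not the legacy codes)."""
    return 16e6 * burn_rate * throat_dia * (1.0 - np.exp(-5.0 * t))

# Motor-to-motor variation plus smaller within-pair variation (assumed values).
imbalance = np.empty((n_pairs, t.size))
for i in range(n_pairs):
    common = rng.normal(1.0, 0.02, size=2)         # lot-level variation
    within = rng.normal(1.0, 0.005, size=(2, 2))   # pair-level variation
    fa = toy_thrust(common[0] * within[0, 0], within[0, 1], t)
    fb = toy_thrust(common[1] * within[1, 0], within[1, 1], t)
    imbalance[i] = fa - fb

# Statistical envelope: e.g. the 0.135/99.865 percentiles (~3-sigma) over time.
lo, hi = np.percentile(imbalance, [0.135, 99.865], axis=0)
print(float(hi.max()), float(lo.min()))
```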
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random-effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
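ClusterZ itself handles censoring and cluster formation; as a minimal sketch of the cluster-wise random-effects step it describes, the standard DerSimonian-Laird estimator applied to a set of reported standardised effects could look like the following (the effect sizes and variances are invented for illustration, not data from the paper).

```python
import numpy as np
from scipy import stats

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects summary of per-study effects."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    z = mu / se
    p = 2.0 * stats.norm.sf(abs(z))
    return mu, se, tau2, p

# Example: standardised effects reported by five studies in one cluster.
print(random_effects_meta([0.42, 0.30, 0.55, 0.18, 0.47],
                          [0.02, 0.03, 0.05, 0.04, 0.02]))
```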
The EUSTACE project: delivering global, daily information on surface air temperature
NASA Astrophysics Data System (ADS)
Ghent, D.; Rayner, N. A.
2017-12-01
Day-to-day variations in surface air temperature affect society in many ways; however, daily surface air temperature measurements are not available everywhere. A global daily analysis cannot be achieved with measurements made in situ alone, so incorporation of satellite retrievals is needed. To achieve this, in the EUSTACE project (2015-2018, https://www.eustaceproject.eu) we have developed an understanding of the relationships between traditional (land and marine) surface air temperature measurements and retrievals of surface skin temperature from satellite measurements, i.e. Land Surface Temperature, Ice Surface Temperature, Sea Surface Temperature and Lake Surface Water Temperature. Here we discuss the science needed to produce a fully-global daily analysis (or ensemble of analyses) of surface air temperature on the centennial scale, integrating different ground-based and satellite-borne data types. Information contained in the satellite retrievals is used to create globally-complete fields in the past, using statistical models of how surface air temperature varies in a connected way from place to place. This includes developing new "Big Data" analysis methods as the data volumes involved are considerable. We will present recent progress along this road in the EUSTACE project, i.e.: • identifying inhomogeneities in daily surface air temperature measurement series from weather stations and correcting for these over Europe; • estimating surface air temperature over all surfaces of Earth from surface skin temperature retrievals; • using new statistical techniques to provide information on higher spatial and temporal scales than currently available, making optimum use of information in data-rich eras. Information will also be given on how interested users can become involved.
Probabilistic treatment of the uncertainty from the finite size of weighted Monte Carlo data
NASA Astrophysics Data System (ADS)
Glüsenkamp, Thorsten
2018-06-01
Parameter estimation in HEP experiments often involves Monte Carlo simulation to model the experimental response function. Typical applications are forward-folding likelihood analyses with re-weighting, or time-consuming minimization schemes with a new simulation set for each parameter value. Problematically, the finite size of such Monte Carlo samples carries intrinsic uncertainty that can lead to a substantial bias in parameter estimation if it is neglected and the sample size is small. We introduce a probabilistic treatment of this problem by replacing the usual likelihood functions with novel generalized probability distributions that incorporate the finite statistics via suitable marginalization. These new PDFs are analytic, and can be used to replace the Poisson, multinomial, and sample-based unbinned likelihoods, which covers many use cases in high-energy physics. In the limit of infinite statistics, they reduce to the respective standard probability distributions. In the general case of arbitrary Monte Carlo weights, the expressions involve the fourth Lauricella function F_D, for which we find a new finite-sum representation in a certain parameter setting. The result also represents an exact form for Carlson's Dirichlet average R_n with n > 0, and thereby an efficient way to calculate the probability generating function of the Dirichlet-multinomial distribution, the extended divided difference of a monomial, or arbitrary moments of univariate B-splines. We demonstrate the bias reduction of our approach with a typical toy Monte Carlo problem, estimating the normalization of a peak in a falling energy spectrum, and compare the results with previously published methods from the literature.
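As a hedged illustration of the underlying idea (not the paper's general weighted-MC expressions involving the Lauricella function), consider the special case where every Monte Carlo event in a bin carries the same weight w: marginalising the unknown expectation with a Gamma model for the unweighted count turns the Poisson likelihood into a negative binomial, which is broader for small MC samples and recovers the Poisson in the large-sample limit.

```python
import numpy as np
from scipy import stats

def poisson_like(d, k_mc, w):
    """Standard approach: plug in the point estimate lambda = w * k_mc."""
    return stats.poisson.pmf(d, w * k_mc)

def marginalised_like(d, k_mc, w):
    """Marginalise the unknown MC expectation (Gamma(k_mc, 1) model for the
    unweighted count, all events sharing one weight w). The integral gives a
    negative binomial with r = k_mc and p = 1 / (1 + w)."""
    return stats.nbinom.pmf(d, k_mc, 1.0 / (1.0 + w))

d = 12                      # observed data count in one bin
for k_mc in (3, 30, 3000):  # number of MC events behind the prediction
    w = 10.0 / k_mc         # weight chosen so the predicted mean is always 10
    print(k_mc, poisson_like(d, k_mc, w), marginalised_like(d, k_mc, w))
# With only a handful of MC events the two likelihoods differ noticeably;
# for large MC samples the marginalised form approaches the Poisson one.
```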
Clinical Evaluation of Papilla Reconstruction Using Subepithelial Connective Tissue Graft
Kaushik, Alka; PK, Pal; Chopra, Deepak; Chaurasia, Vishwajit Rampratap; Masamatti, Vinaykumar S; DK, Suresh; Babaji, Prashant
2014-01-01
Objective: The aesthetics of the patient can be improved by surgical reconstruction of the interdental papilla by using an advanced papillary flap interposed with a subepithelial connective tissue graft. Materials and Methods: A total of fifteen sites from ten patients having black triangles/papilla recession in the maxillary anterior region were selected and subjected to presurgical evaluation. The sites were treated with an interposed subepithelial connective tissue graft placed under a coronally advanced flap. The integrity of the papilla was maintained by moving the whole of the gingivopapillary unit coronally. The various parameters were analysed at different intervals. Results: There was a mean decrease in the papilla presence index score and the distance from the contact point to the gingival margin, but it was statistically not significant. Also, there was an increase in the width of the keratinized gingiva, which was statistically highly significant. Conclusion: An advanced papillary flap with an interposed subepithelial connective tissue graft can offer predictable results for the reconstruction of the interdental papilla. If papilla loss occurs solely due to soft-tissue damage, reconstructive techniques can completely restore it; but if it is due to periodontal disease involving bone loss, reconstruction is generally incomplete and multiple surgical procedures may be required. PMID:25386529
Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent
2017-01-01
Motivation Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
Bjelović, Miloš; Babić, Tamara; Dragicević, Igor; Corac, Aleksandar; Trajković, Goran
2015-01-01
Recent data from studies conducted in Western countries have shown that patients with gastroesophageal reflux disease have significantly impaired health-related quality of life compared to the general population. The study is aimed at evaluating the burden of reflux symptoms on patients' health-related quality of life. The study involved 1,593 patients with diagnosed gastroesophageal reflux disease. The Serbian version of a generic self-administered Centers for Disease Control and Prevention questionnaire was used. Statistical analyses included descriptive statistics, Pearson chi-square test and a multiple regression model. Among all participants, 43.9% reported fair or poor health. The mean number of unhealthy days during the past 30 days was 10.4 days, physically unhealthy days 6.4 days, mentally unhealthy days 5.3 days and activity limitation days 4.3 days. Furthermore, 24.8% of participants reported having ≥ 14 unhealthy days, 14.9% had ≥ 14 physically unhealthy days, 11.8% reported ≥ 14 mentally unhealthy days, and 9.4% had ≥ 14 activity limitation days. This study addressed complex relationships between reflux symptoms and patients' impaired everyday lives.
Hitchings, Julia E.; Spoth, Richard L.
2010-01-01
Conduct problems are strong positive predictors of substance use and problem substance use among teens, whereas predictive associations of depressed mood with these outcomes are mixed. Conduct problems and depressed mood often co-occur, and such co-occurrence may heighten risk for negative outcomes. Thus, this study examined the interaction of conduct problems and depressed mood at age 11 in relation to substance use and problem use at age 18, and possible mediation through peer substance use at age 16. Analyses of multirater longitudinal data collected from 429 rural youths (222 girls) and their families were conducted using a methodology for testing latent variable interactions. The link between the conduct problems × depressed mood interaction and adolescent substance use was negative and statistically significant. Unexpectedly, positive associations of conduct problems with substance use were stronger at lower levels of depressed mood. A significant negative interaction in relation to peer substance use also was observed, and the estimated indirect effect of the interaction on adolescent use through peer use as a mediator was statistically significant. Findings illustrate the complexity of multiproblem youth. PMID:18455886
Enhancement of CFD validation exercise along the roof profile of a low-rise building
NASA Astrophysics Data System (ADS)
Deraman, S. N. C.; Majid, T. A.; Zaini, S. S.; Yahya, W. N. W.; Abdullah, J.; Ismail, M. A.
2018-04-01
The aim of this study is to enhance the validation of CFD exercise along the roof profile of a low-rise building. An isolated gabled-roof house having a 26.6° roof pitch was simulated to obtain the pressure coefficient around the house. Validation of CFD analysis with experimental data requires many input parameters. This study performed CFD simulation based on the data from a previous study. Where the input parameters were not clearly stated, new input parameters were established from the open literature. The numerical simulations were performed in FLUENT 14.0 by applying the Computational Fluid Dynamics (CFD) approach based on the steady RANS equations together with the RNG k-ɛ model. The results from CFD were then analysed using quantitative tests (statistical analysis) and compared with the CFD results from the previous study. The statistical analysis results from the ANOVA test and error measures showed that the CFD results from the current study produced good agreement and exhibited the closest error compared to the previous study. All the input data used in this study can be extended to other types of CFD simulation involving wind flow over an isolated single-storey house.
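A minimal sketch of the kind of quantitative comparison described (error measures plus a one-way ANOVA across experimental and CFD pressure-coefficient sets) is shown below; the Cp values are invented placeholders, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical pressure coefficients along the roof profile (illustrative only).
cp_experiment = np.array([-0.45, -0.62, -0.80, -0.73, -0.55, -0.38, -0.21])
cp_cfd_current = np.array([-0.42, -0.58, -0.84, -0.70, -0.51, -0.40, -0.24])
cp_cfd_previous = np.array([-0.36, -0.50, -0.95, -0.62, -0.44, -0.47, -0.30])

def error_measures(pred, ref):
    """Basic error measures between a prediction and the reference data."""
    err = pred - ref
    return {"MAE": np.abs(err).mean(),
            "RMSE": np.sqrt((err ** 2).mean()),
            "bias": err.mean()}

print("current :", error_measures(cp_cfd_current, cp_experiment))
print("previous:", error_measures(cp_cfd_previous, cp_experiment))

# One-way ANOVA: do the three sets of Cp values differ significantly?
f_stat, p_value = stats.f_oneway(cp_experiment, cp_cfd_current, cp_cfd_previous)
print(f"ANOVA: F = {f_stat:.3f}, p = {p_value:.3f}")
```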
[Perception of professional identity in nursing amongst undergraduate students].
Albar, María-Jesús; Sivianes-Fernández, María
2016-01-01
To identify the perception of nursing professional identity among first- and fourth-year students. A descriptive study using a questionnaire. Random samples of 50 and 51 students were selected from the first and fourth year, respectively. The questionnaire was prepared by expert consensus, and it included a sociodemographic data register, 14 items, and two open questions. Descriptive and bivariate analyses were performed on the data, using the Chi-squared test to determine possible differences between both years. SPSS 22.0 statistics software was employed. The open questions were submitted to a content analysis. Statistically significant differences were found in the items related to the diversity of roles that nursing professionals can develop within the health care system (professional and academic), and in those related to the autonomous nature of their practice. These data were confirmed by the information obtained with the open questions. Academic training is of great importance in the process of acquiring the professional identity of future professionals in nursing, but changing the public image of the profession is the responsibility of all the social agents involved in its development. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Perception and intentions to quit among waterpipe smokers in Qatar: a cross-sectional survey
Jaam, M.; Al-Marridi, W.; Fares, H.; Izham, M.; Kheir, N.
2016-01-01
Objective: To evaluate the perceptions and attitudes of waterpipe (shisha) smokers in Qatar regarding the health risks associated with addiction and to determine their intentions to quit. Methods: A cross-sectional survey was conducted among 181 self-reported waterpipe smokers. Participants were approached in public places as well as in shisha cafes in Qatar. The questionnaire included items related to perception, attitude and intention to quit. Both descriptive and inferential statistics were performed for data analyses, with P ≤ 0.05 considered statistically significant. Results: About 44% of the respondents believed that waterpipe smoking was safer than cigarette smoking, and more than 70% would not mind if their children became involved in waterpipe smoking. More than half of the current smokers wanted to quit smoking shisha at some point, and 17% identified health concerns as the main motivating factor for their intention to quit. Conclusion: A large proportion of shisha smokers viewed shisha as a safer alternative to cigarettes, yet they admitted to intending to quit. These findings underscore the need to design educational interventions and awareness campaigns as well as impose stringent laws on waterpipe smoking in public places in Qatar. PMID:27051611
Prevalence of dry eye syndrome in residents of surgical specialties.
Castellanos-González, José Alberto; Torres-Martínez, Verónica; Martínez-Ruiz, Adriana; Fuentes-Orozco, Clotilde; Rendón-Félix, Jorge; Irusteta-Jiménez, Leire; Márquez-Valdez, Aída Rebeca; Cortés-Lares, José Antonio; González-Ojeda, Alejandro
2016-07-16
The aim of this study was to determine the prevalence and severity of dry eye syndrome in a group of Mexican residents of different surgical specialties. A cross-sectional descriptive study was conducted in which the residents were evaluated using the Ocular Surface Disease Index, together with diagnostic tests for dry eye syndrome, such as tear breakup time, the Oxford Schema, Schirmer's test I, and meibomian gland dysfunction testing. Statistical analyses were performed by Pearson's chi-squared test for categorical variables and Student's t-test for quantitative variables. Any P value < 0.05 was considered statistically significant. One hundred and twenty-three residents were included (246 eyes); 90 (73 %) were male and 33 (27 %) were female. The mean age was 27.8 ± 2.1 years. A higher number of residents with dry eye syndrome was found in the cardiothoracic surgery (75 %) and otorhinolaryngology (71 %) specialties; 70 % of them reported ocular symptoms, with teardrop quality involvement in >50 % of them. We found a prevalence of 56 % for mild-to-moderate/severe stages of the condition. Their presence in the operating room predisposes surgical residents to dry eye syndrome because of environmental conditions.
NASA Astrophysics Data System (ADS)
Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert
2016-05-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ˜ 20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
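The simple score-weighted averaging described above can be sketched as follows; the ensemble outputs, the misfit scores, and the mapping from aggregate score to weight are all assumptions for illustration, not the study's calibration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs = 625
# Hypothetical ensemble output: equivalent sea-level rise (m) per run, plus the
# aggregate model-data misfit score used to weight each run (illustrative only).
esl = rng.normal(3.3, 0.8, size=n_runs)
misfit = rng.gamma(shape=4.0, scale=1.0, size=n_runs)

weights = np.exp(-0.5 * misfit)          # one plausible score-to-weight mapping
weights /= weights.sum()

mean = np.sum(weights * esl)
var = np.sum(weights * (esl - mean) ** 2)

# Weighted 5-95% envelope from the weighted empirical CDF.
order = np.argsort(esl)
cdf = np.cumsum(weights[order])
lo = esl[order][np.searchsorted(cdf, 0.05)]
hi = esl[order][np.searchsorted(cdf, 0.95)]
print(f"weighted mean = {mean:.2f} m, 5-95% envelope = [{lo:.2f}, {hi:.2f}] m")
```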
Failures of reproduction: problematising 'success' in assisted reproductive technology.
Peters, Kathleen; Jackson, Debra; Rudge, Trudy
2007-06-01
This paper scrutinises the many ways in which 'success' is portrayed in representing assisted reproductive technology (ART) services and illuminates how these definitions differ from those held by participant couples. A qualitative approach informed by feminist perspectives guided this study and aimed to problematise the concept of 'success' by examining literature from ART clinics, government reports on ART, and by analysing narratives of couples who have accessed ART services. As many ART services have varying definitions of 'success' and as statistics are manipulated to promote further patronage of ART services, the likelihood of 'success' is often overstated. This paper is concerned with the effects this promotion has on the participants. We suggest that this very mobilisation of statistical success changes the ability of those who access ART services to make productive decisions about themselves inside these treatment regimes, as the basis for decision-making is hidden by the way numbers, objectivity and clinical reasoning operate to maintain participation in the program. In such an operation, the powerful mix of hope and technology kept participants enrolled far longer than they originally planned. Moreover, how success rates are manipulated raises ethical issues for all involved: clients, counsellors, and nursing and medical professionals.
Language experience changes subsequent learning.
Onnis, Luca; Thiessen, Erik
2013-02-01
What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. Copyright © 2012 Elsevier B.V. All rights reserved.
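A minimal sketch of the forward and backward transitional probabilities that generate the two competing parses might look like this; the syllable stream is invented, and the real stimuli balanced these statistics far more carefully.

```python
from collections import Counter

# Hypothetical syllable stream; the real stimuli were nonsense syllables,
# shapes and tones with controlled forward/backward probabilities.
stream = "pa bi ku ti bu pa bi do ti bu pa go ku ti la".split()

pairs = Counter(zip(stream, stream[1:]))
firsts = Counter(stream[:-1])
seconds = Counter(stream[1:])

def forward_tp(x, y):
    """P(y follows | x) = count(x, y) / count(x in first position)."""
    return pairs[(x, y)] / firsts[x]

def backward_tp(x, y):
    """P(x precedes | y) = count(x, y) / count(y in second position)."""
    return pairs[(x, y)] / seconds[y]

# The same pair can have different forward and backward probabilities,
# which is what lets the two parses be pitted against each other.
print(forward_tp("pa", "bi"), backward_tp("pa", "bi"))
print(forward_tp("ku", "ti"), backward_tp("ku", "ti"))
```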
Akboğa, Özge; Baradan, Selim
2017-02-07
The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees experience risks that emerge during the fabrication of concrete, as well as its delivery to the construction site. Statistics show that usage and demand of RMC have been increasing along with the number of producers and workers. Unfortunately, adequate OSH measures to meet this rapid growth are not in place even in top RMC-producing countries, such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by conducting data mining in Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analyses on 71 incidents in which RMC truck drivers were involved. Also, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results of this research, problem areas were identified: cleaning the truck mixer/pump is a hazardous activity in which operators are frequently injured, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods.
Dwan, Kerry; Altman, Douglas G.; Clarke, Mike; Gamble, Carrol; Higgins, Julian P. T.; Sterne, Jonathan A. C.; Williamson, Paula R.; Kirkham, Jamie J.
2014-01-01
Background Most publications about selective reporting in clinical trials have focussed on outcomes. However, selective reporting of analyses for a given outcome may also affect the validity of findings. If analyses are selected on the basis of the results, reporting bias may occur. The aims of this study were to review and summarise the evidence from empirical cohort studies that assessed discrepant or selective reporting of analyses in randomised controlled trials (RCTs). Methods and Findings A systematic review was conducted and included cohort studies that assessed any aspect of the reporting of analyses of RCTs by comparing different trial documents, e.g., protocol compared to trial report, or different sections within a trial publication. The Cochrane Methodology Register, Medline (Ovid), PsycInfo (Ovid), and PubMed were searched on 5 February 2014. Two authors independently selected studies, performed data extraction, and assessed the methodological quality of the eligible studies. Twenty-two studies (containing 3,140 RCTs) published between 2000 and 2013 were included. Twenty-two studies reported on discrepancies between information given in different sources. Discrepancies were found in statistical analyses (eight studies), composite outcomes (one study), the handling of missing data (three studies), unadjusted versus adjusted analyses (three studies), handling of continuous data (three studies), and subgroup analyses (12 studies). Discrepancy rates varied, ranging from 7% (3/42) to 88% (7/8) in statistical analyses, 46% (36/79) to 82% (23/28) in adjusted versus unadjusted analyses, and 61% (11/18) to 100% (25/25) in subgroup analyses. This review is limited in that none of the included studies investigated the evidence for bias resulting from selective reporting of analyses. It was not possible to combine studies to provide overall summary estimates, and so the results of studies are discussed narratively. Conclusions Discrepancies in analyses between publications and other study documentation were common, but reasons for these discrepancies were not discussed in the trial reports. To ensure transparency, protocols and statistical analysis plans need to be published, and investigators should adhere to these or explain discrepancies. Please see later in the article for the Editors' Summary PMID:24959719
A SURVEY OF LABORATORY AND STATISTICAL ISSUES RELATED TO FARMWORKER EXPOSURE STUDIES
Developing internally valid, and perhaps generalizable, farmworker exposure studies is a complex process that involves many statistical and laboratory considerations. Statistics are an integral component of each study beginning with the design stage and continuing to the final da...
P-Value Club: Teaching Significance Level on the Dance Floor
ERIC Educational Resources Information Center
Gray, Jennifer
2010-01-01
Courses: Beginning research methods and statistics courses, as well as advanced communication courses that require reading research articles and completing research projects involving statistics. Objective: Students will understand the difference between significant and nonsignificant statistical results based on p-value.
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
Jeffrey P. Prestemon
2009-01-01
Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T; Pereira, Carol; Rosenkranz, Susan L; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu Jeanne; Wang, Rui; Lok, Judith; Evans, Scott R
2017-03-15
The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 AACR.
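The peptide- and protein-level statistics offered by such a tool typically include group comparisons with multiple-testing control; a generic sketch (not P-MartCancer's own implementation) on simulated abundances is shown below.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

# Simulated protein abundance matrix (proteins x samples); stands in for the
# quantitated proteomics data a tool like this would analyse.
rng = np.random.default_rng(11)
n_prot, n_per_group = 500, 8
tumor = rng.normal(0, 1, size=(n_prot, n_per_group))
normal = rng.normal(0, 1, size=(n_prot, n_per_group))
tumor[:40] += 1.5                       # 40 truly differential proteins

# Per-protein two-sample t-test, then Benjamini-Hochberg FDR control.
t_stat, p = stats.ttest_ind(tumor, normal, axis=1)
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print("proteins flagged at 5% FDR:", int(reject.sum()))
```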
Developmental and Behavioral Needs and Service Use for Young Children in Child Welfare
Stahmer, Aubyn C.; Leslie, Laurel K.; Hurlburt, Michael; Barth, Richard P.; Webb, Mary Bruce; Landsverk, John; Zhang, Jinjin
2006-01-01
Objective To determine the level of developmental and behavioral need in young children entering child welfare (CW), estimate early intervention services use, and examine variation in need and service use based on age and level of involvement with CW by using a national probability sample in the United States. Methods As part of the National Survey of Child and Adolescent Well-Being, data were collected on 2813 children <6 years old for whom possible abuse or neglect was investigated by CW agencies. Analyses used descriptive statistics to determine developmental and behavioral needs across 5 domains (cognition, behavior, communication, social, and adaptive functioning) and service use. Logistic regression was used to examine the relationship between independent variables (age, gender, race-ethnicity, maltreatment history, level of CW involvement, and developmental or behavior problems) and service use. Results Results indicate that age and level of CW involvement predict service use when controlling for need. Both toddlers (41.8%) and preschoolers (68.1%) in CW have high developmental and behavioral needs; however, few children are receiving services for these issues (22.7% overall). Children that remain with their biological parents have similar needs to those in out-of-home care but are less likely to use services. Children <3 years of age are least likely to use services. Conclusions Children referred to CW have high developmental and behavioral need regardless of the level of CW involvement. Both age and level of involvement influence service use when controlling for need. Mechanisms need to be developed to address disparities in access to intervention. PMID:16199698
Advance care planning for nursing home residents with dementia: policy vs. practice.
Ampe, Sophie; Sevenants, Aline; Smets, Tinne; Declercq, Anja; Van Audenhove, Chantal
2016-03-01
The aims of this study were: to evaluate the advance care planning policy for people with dementia in nursing homes; to gain insight in the involvement of residents with dementia and their families in advance care planning, and in the relationship between the policy and the actual practice of advance care planning. Through advance care planning, nursing home residents with dementia are involved in care decisions, anticipating their reduced decision-making capacity. However, advance care planning is rarely realized for this group. Prevalence and outcomes have been researched, but hardly any research has focused on the involvement of residents/families in advance care planning. Observational cross-sectional study in 20 nursing homes. The ACP audit assessed the views of the nursing homes' staff on the advance care planning policy. In addition, individual conversations were analysed with 'ACP criteria' (realization of advance care planning) and the 'OPTION' instrument (involvement of residents/families). June 2013-September 2013. Nursing homes generally met three quarters of the pre-defined criteria for advance care planning policy. In almost half of the conversations, advance care planning was explained and discussed substantively. Generally, healthcare professionals only managed to involve residents/families on a baseline skill level. There were no statistically significant correlations between policy and practice. The evaluations of the policy were promising, but the actual practice needs improvement. Future assessment of both policy and practice is recommended. Further research should focus on communication interventions for implementing advance care planning in the daily practice. © 2015 John Wiley & Sons Ltd.
Glass-Kaastra, Shiona K.; Pearl, David L.; Reid-Smith, Richard J.; McEwen, Beverly; Slavic, Durda; McEwen, Scott A.; Fairles, Jim
2014-01-01
Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection. PMID:24688133
Jolley, Jennifer; Lomelin, Daniel; Simorov, Anton; Tadaki, Carl; Oleynikov, Dmitry
2016-09-01
Surgical procedures have a learning curve regarding the number of cases required for proficiency. Consequently, involvement of less experienced resident surgeons may impact patients and the healthcare system. This study examines basic and advanced laparoscopic procedures performed between 2010 and 2011 and evaluates the resident surgeon participation effect. Basic laparoscopic procedures (BL), appendectomy (LA), cholecystectomy (LC), and advanced Nissen fundoplication (LN) were queried from the American College of Surgeons National Surgical Quality Improvement Program database. Cases were identified using Current Procedural Terminology codes. Analyses were performed using IBM SPSS Statistics v.22, α-level = 0.05. Multiple logistic regression was used, accounting for age, race, gender, admission status, wound classification, and ASA classification. In total, 71,819 surgeries were reviewed, 66,327 BL (37,636 LC and 28,691 LA) and 5492 LN. Median age was 48 years for LC and 37 years for LA. In sum, 72.2 % of LC and 49.5 % of LA patients were female. LN median age was 59 years, and 67.7 % of patients were female. For BL, resident involvement was not significantly associated with mortality, morbidity, and return to the OR. Readmission was not related to resident involvement in LC. In LA, resident-involved surgeries had increased readmission and longer OR time, but decreased LOS. In LC, resident involvement was associated with longer LOS and OR time. Resident involvement was not a significant factor in the odds of mortality, morbidity, return to OR, or readmission in LN. Surgeries involving residents had increased odds of having longer LOS, and of lengthier surgery time. We demonstrate resident involvement is safe and does not result in poorer patient outcomes. Readmissions and LOS were higher in BL, and operative times were longer in all surgeries. Resident operations do appear to have real consequences for patients and may impact the healthcare system financially.
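A hedged sketch of the type of multiple logistic regression described, producing adjusted odds ratios with confidence intervals, is given below; the variable names and simulated data are assumptions for illustration, not the NSQIP extract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data standing in for the registry extract (names are assumptions).
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "resident": rng.integers(0, 2, n),    # resident involved in the case
    "age": rng.normal(48, 14, n),
    "female": rng.integers(0, 2, n),
    "asa3plus": rng.integers(0, 2, n),    # ASA class 3 or higher
})
logit_p = -3.0 + 0.05 * df["resident"] + 0.02 * (df["age"] - 48) + 0.6 * df["asa3plus"]
df["readmit"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

# Multiple logistic regression adjusting for patient covariates.
model = smf.logit("readmit ~ resident + age + female + asa3plus", data=df).fit(disp=0)
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```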
ERIC Educational Resources Information Center
Idris, Khairiani; Yang, Kai-Lin
2017-01-01
This article reports the results of a mixed-methods approach to develop and validate an instrument to measure Indonesian pre-service teachers' conceptions of statistics. First, a phenomenographic study involving a sample of 44 participants uncovered six categories of conceptions of statistics. Second, an instrument of conceptions of statistics was…
Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.
2017-07-17
The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10^-3 in scientific notation, or for brevity 10^-3). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10^-3 to 10^-6. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized log-normal, generalized Pareto, and Weibull. Uncertainties in streamflow estimates for corresponding AEP are depicted and quantified as two primary forms: quantile (aleatoric [random sampling] uncertainty) and distribution-choice (epistemic [model] uncertainty). Sampling uncertainties of a given distribution are relatively straightforward to compute from analytical or Monte Carlo-based approaches. Distribution-choice uncertainty stems from choices of potentially applicable probability distributions for which divergence among the choices increases as AEP decreases. Conventional goodness-of-fit statistics, such as Cramér-von Mises, and L-moment ratio diagrams are demonstrated in order to hone distribution choice. The results generally show that distribution choice uncertainty is larger than sampling uncertainty for very low AEP values.
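A simplified method-of-moments sketch of extrapolating a log-Pearson type III fit to very low AEPs is shown below; it omits EMA, regional skew weighting, and low-outlier handling, and the peak-flow record is synthetic, so it is not the USGS-PeakFQ procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical annual instantaneous peak streamflows (cfs); a real analysis
# would use the systematic and historical record for a specific streamgage.
rng = np.random.default_rng(7)
peaks = 10 ** rng.normal(loc=4.3, scale=0.25, size=80)

logq = np.log10(peaks)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)          # station skew, no regional weighting

# Method-of-moments log-Pearson type III quantiles at very low AEPs.
for aep in (1e-2, 1e-3, 1e-4, 1e-5, 1e-6):
    q = 10 ** stats.pearson3.ppf(1.0 - aep, skew, loc=mean, scale=std)
    print(f"AEP {aep:.0e}: {q:,.0f} cfs")
```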
Bullying as a risk for poor sleep quality among high school students in China.
Zhou, Ying; Guo, Lan; Lu, Ci-yong; Deng, Jian-xiong; He, Yuan; Huang, Jing-hui; Huang, Guo-liang; Deng, Xue-qing; Gao, Xue
2015-01-01
To determine whether involvement in bullying as a bully, victim, or bully-victim was associated with a higher risk of poor sleep quality among high school students in China. A cross-sectional study was conducted. A total of 23,877 high school students were surveyed in six cities in Guangdong Province. All students were asked to complete the adolescent health status questionnaire, which included the Chinese version of the Pittsburgh Sleep Quality Index (PSQI) and bullying involvement. Descriptive statistics were used to evaluate sleep quality and the prevalence of school bullying. Multi-level logistic regression analyses were conducted to examine the association between being victimized and bullying others with sleep quality. Among the 23,877 students, 6,127 (25.66%) reported having poor sleep quality, and 10.89% reported being involved in bullying behaviors. Of the respondents, 1,410 (5.91%) were pure victims of bullying, 401 (1.68%) were bullies and 784 (3.28%) were bully-victims. Frequently being involved in bullying behaviors (being bullied or bullying others) was related to increased risks of poor sleep quality compared with adolescents who were not involved in bullying behaviors. After adjusting for age, sex, and other confounding factors, the students who were being bullied (OR=2.05, 95%CI=1.81-2.32), bullied others (OR=2.30, 95%CI=1.85-2.86) or both (OR=2.58, 95%CI=2.20-3.03) were at a higher risk for poor sleep quality. Poor sleep quality among high school students is highly prevalent, and school bullying is prevalent among adolescents in China. The present results suggested that being involved in school bullying might be a risk factor for poor sleep quality among adolescents.
ERIC Educational Resources Information Center
Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.
2010-01-01
The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. These data were pre-existing and were given to the Evaluator by email from the Dean. Then, in-depth statistical analyses were run using SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…
A statistical package for computing time and frequency domain analysis
NASA Technical Reports Server (NTRS)
Brownlow, J.
1978-01-01
The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
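The same preparation and characterisation steps (trend removal, digital filtering, time-domain statistics, and a frequency-domain summary) can be sketched with standard routines; the signal below is synthetic and the filter settings are arbitrary choices, not the SPA program's defaults.

```python
import numpy as np
from scipy import signal

# Synthetic record: a 12 Hz tone plus a slow linear trend and noise (illustrative).
fs = 200.0                                   # sample rate, Hz
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(3)
x = 0.8 * np.sin(2 * np.pi * 12 * t) + 0.05 * t + 0.3 * rng.normal(size=t.size)

# Pre-analysis preparation: linear trend removal and low-pass digital filtering.
x_detrended = signal.detrend(x, type="linear")
b, a = signal.butter(4, 30.0, btype="low", fs=fs)
x_filtered = signal.filtfilt(b, a, x_detrended)

# Time-domain statistical characterisation.
print("mean", x_filtered.mean(), "std", x_filtered.std(),
      "rms", np.sqrt(np.mean(x_filtered ** 2)))

# Frequency-domain characterisation: Welch power spectral density.
freqs, psd = signal.welch(x_filtered, fs=fs, nperseg=1024)
print("dominant frequency:", freqs[np.argmax(psd)], "Hz")
```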
NASA Astrophysics Data System (ADS)
Nasaruddin, N. H.; Yusoff, A. N.; Kaur, S.
2014-11-01
The objective of this multiple-subjects functional magnetic resonance imaging (fMRI) study was to identify the common brain areas that are activated when viewing black-and-white checkerboard pattern stimuli of various shapes, pattern and size and to investigate specific brain areas that are involved in processing static and moving visual stimuli. Sixteen participants viewed the moving (expanding ring, rotating wedge, flipping hour glass and bowtie and arc quadrant) and static (full checkerboard) stimuli during an fMRI scan. All stimuli have black-and-white checkerboard pattern. Statistical parametric mapping (SPM) was used in generating brain activation. Differential analyses were implemented to separately search for areas involved in processing static and moving stimuli. In general, the stimuli of various shapes, pattern and size activated multiple brain areas mostly in the left hemisphere. The activation in the right middle temporal gyrus (MTG) was found to be significantly higher in processing moving visual stimuli as compared to static stimulus. In contrast, the activation in the left calcarine sulcus and left lingual gyrus were significantly higher for static stimulus as compared to moving stimuli. Visual stimulation of various shapes, pattern and size used in this study indicated left lateralization of activation. The involvement of the right MTG in processing moving visual information was evident from differential analysis, while the left calcarine sulcus and left lingual gyrus are the areas that are involved in the processing of static visual stimulus.
Maher, Nigel Gordon; Hoffman, Gary Russell
2014-03-01
Neck dissections that include sublevel IIb increase the risk of postoperative shoulder dysfunction. The purpose of this investigation was to document the incidence of level IIb metastatic lymphatic spread in a group of patients undergoing neck dissection as part of the surgical management of cutaneous squamous cell carcinoma of the head and neck. A retrospective review of the pathology records taken from 1 surgeon from June 2006 through June 2013 was carried out. The predictor variable was the primary tumor site. The outcome variable was the metastatic nodal involvement according to neck level and sublevel. Secondary variables included T stage, pathologist, tumor depth, and the presence of perineural, perilymphatic, and perivascular invasion. Data analyses were by descriptive statistics. Thirty-six patients with a total of 40 neck dissections met the inclusion criteria. The average primary site tumor depth was 14.7 mm, and there were 16 cases of poorly differentiated squamous cell carcinoma. Sublevel IIb was involved in 7.5% of cases, all of which occurred from lateralized primary sites of the head and neck. Cutaneous squamous cell carcinoma arising from the auricle and neck sites adjacent to sublevel IIb may have increased risk of metastatic involvement of sublevel IIb nodes. Further studies with larger numbers are required to determine the risk of metastasis to sublevel IIb from midline sites of the face. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.
[Information management in multicenter studies: the Brazilian longitudinal study for adult health].
Duncan, Bruce Bartholow; Vigo, Álvaro; Hernandez, Émerson; Luft, Vivian Cristine; Ahlert, Hubert; Bergmann, Kaiser; Mota, Eduardo
2013-06-01
Information management in large multicenter studies requires a specialized approach. The Estudo Longitudinal da Saúde do Adulto (ELSA-Brasil - Brazilian Longitudinal Study for Adult Health) has created a Datacenter to enter and manage its data system. The aim of this paper is to describe the steps involved, including the information entry, transmission and management methods. A web system was developed in order to allow, in a safe and confidential way, online data entry, checking and editing, as well as the incorporation of data collected on paper. Additionally, a Picture Archiving and Communication System was implemented and customized for echocardiography and retinography. It stores the images received from the Investigation Centers and makes them available at the Reading Centers. Finally, data extraction and cleaning processes were developed to create databases in formats that enable analyses in multiple statistical packages.
[Colonic perforation during colonoscopy. 100 cases].
Hureau, J; Avtan, L; Germain, M; Blanc, D; Chaussade, G
1992-01-01
The analysis of 100 cases of colonic perforation during colonoscopic examinations clearly illustrates this point. The perforation risk during colonoscopy is generally of the order of 0.2% for a diagnostic colonoscopy. According to the statistical data used, it can reach 0.5 to 3% for therapeutic colonoscopy. This is a risk inherent to the technique used. It is thus necessary to analyse the causes and take the appropriate measures to reduce it to a minimum. Mortality due to this complication remains high (14%), i.e., about 0.015 to 0.1% (≈2/10,000) of all colonoscopies. In 11% of the patients, serious sequelae are observed. This demonstrates the significance of the medico-legal problem posed by these perforations during colonoscopy. The responsibility of all personnel involved may be engaged: colonoscopist, surgeon, anesthetist and hospital unit.
NASA Technical Reports Server (NTRS)
Reichert, B. A.; Hingst, W. R.; Okiishi, T. H.
1991-01-01
An ethylene trace gas technique was used to map out fluid transport and mixing within a circular to rectangular transition duct. Ethylene gas was injected at several points in a cross stream plane upstream of the transition duct. Ethylene concentration contours were determined at several cross stream measurement planes spaced axially within the duct. The flow involved a uniform inlet flow at a Mach number level of 0.5. Statistical analyses were used to quantitatively interpret the trace gas results. Also, trace gas data were considered along with aerodynamic and surface flow visualization results to ascertain transition duct flow phenomena. Convection of wall boundary layer fluid by vortices produced regions of high total pressure loss in the duct. The physical extent of these high loss regions is governed by turbulent diffusion.
Sarkar, Rajarshi
2014-04-01
Although 3rd generation TSH assays are the most widely used immunoassays, credible comparison studies, especially those involving Indian sub-populations, are practically non-existent. The aim was to compare TSH measurements between chemiluminescence (Architect) and electrochemiluminescence (Cobas) immunoassays in an urban ambulatory Indian population. 1,615 subjects were selected randomly from the usual laboratory workflow, their TSH was measured on Architect and Cobas, and the paired data thus generated were statistically analysed. TSH values from Cobas were observed to be higher than the Architect values by 28.7%, with a significant proportional difference between the two, but the majority of the Cobas values (above 90%) were within the limits of agreement with the Architect values. In situations where both instruments are in use simultaneously, standardization of the methods is imperative, in the larger interest of the patient population.
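A minimal sketch of the agreement analysis implied here (relative difference plus Bland-Altman limits of agreement on log-transformed values) is shown below; the paired TSH values are invented and the 1.29 proportional factor is only a stand-in for the reported 28.7% difference.

```python
import numpy as np

# Hypothetical paired TSH results (mIU/L) from the two analysers; the study's
# own data are not reproduced here.
rng = np.random.default_rng(5)
architect = np.array([0.8, 1.4, 2.1, 3.0, 4.6, 6.2, 9.5, 15.0, 22.0, 40.0])
cobas = architect * 1.29 + rng.normal(0, 0.15, architect.size)

ratio = cobas / architect
print(f"mean relative difference: {100 * (ratio.mean() - 1):.1f}%")

# Bland-Altman on log-transformed values (appropriate when bias is proportional).
diff = np.log(cobas) - np.log(architect)
bias, sd = diff.mean(), diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"bias (log scale) = {bias:.3f}, 95% limits of agreement = "
      f"({loa[0]:.3f}, {loa[1]:.3f})")
print("fraction of pairs within limits:", np.mean((diff > loa[0]) & (diff < loa[1])))
```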