Sample records for subsequent statistical analysis

  1. Applications of statistics to medical science (1) Fundamental concepts.

    PubMed

    Watanabe, Hiroshi

    2011-01-01

    The conceptual framework of statistical tests and statistical inferences is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.

  2. Applications of statistics to medical science, II. Overview of statistical procedures for general use.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics that are dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
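
    The comparison of two population means covered by this survey is commonly carried out with a t test, and multiple comparisons with an adjusted significance level. A minimal sketch of Welch's t statistic (illustrative only, not code from the paper):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for comparing the means of two samples
    with possibly unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # unbiased sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

    The statistic would then be compared against a t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value.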

  3. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
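
    The relative signal power change underlying the ERD/ERS detection described above is a simple quantity; a hypothetical sketch (not the authors' code), with the statistical testing then applied across trials and time-frequency bins:

```python
def relative_power_change(power, baseline_power):
    """Relative signal power change (%) of a task interval with respect
    to a baseline interval: positive values indicate event-related
    synchronization (ERS), negative values desynchronization (ERD)."""
    return 100.0 * (power - baseline_power) / baseline_power
```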

  4. Applications of statistics to medical science, III. Correlation and regression.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    In this third part of a series surveying medical statistics, the concepts of correlation and regression are reviewed. In particular, methods of linear regression and logistic regression are discussed. Arguments related to survival analysis will be made in a subsequent paper.
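
    The logistic regression reviewed here models the probability of a binary outcome via the sigmoid of a linear predictor. A self-contained one-feature sketch fitted by plain gradient descent (illustrative; real analyses use maximum-likelihood routines in statistical packages):

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=2000):
    """Fit P(y=1|x) = sigmoid(w*x + b) by gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x / n
            gb += (p - y) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```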

  5. Impact of Medicare Advantage Prescription Drug Plan Star Ratings on Enrollment before and after Implementation of Quality-Related Bonus Payments in 2012

    PubMed Central

    Li, Pengxiang; Doshi, Jalpa A.

    2016-01-01

    Objective: Since 2007, the Centers for Medicare and Medicaid Services have published 5-star quality rating measures to aid consumers in choosing Medicare Advantage Prescription Drug Plans (MAPDs). We examined the impact of these star ratings on Medicare Advantage Prescription Drug (MAPD) enrollment before and after 2012, when star ratings became tied to bonus payments for MAPDs that could be used to improve plan benefits and/or reduce premiums in the subsequent year. Methods: A longitudinal design and multivariable hybrid models were used to assess whether star ratings had a direct impact on concurrent year MAPD contract enrollment (by influencing beneficiary choice) and/or an indirect impact on subsequent year MAPD contract enrollment (because ratings were linked to bonus payments). The main analysis was based on contract-year level data from 2009–2015. We compared effects of star ratings in the pre-bonus payment period (2009–2011) and post-bonus payment period (2012–2015). Extensive sensitivity analyses varied the analytic techniques, unit of analysis, and sample inclusion criteria. Similar analyses were conducted separately using stand-alone PDP contract-year data; since PDPs were not eligible for bonus payments, they served as an external comparison group. Results: The main analysis included 3,866 MAPD contract-years. A change of star rating had no statistically significant effect on concurrent year enrollment in any of the pre-, post-, or pre-post combined periods. On the other hand, a star rating increase was associated with a statistically significant increase in subsequent year enrollment (a 1-star increase associated with +11,337 enrollees, p<0.001) in the post-bonus payment period but had a very small and statistically non-significant effect on subsequent year enrollment in the pre-bonus payment period. Further, the difference in effects on subsequent year enrollment was statistically significant between the pre- and post-periods (p = 0.011). Sensitivity analyses indicated that the findings were robust. No statistically significant effect of star ratings was found on concurrent or subsequent year enrollment in the pre- or post-period in the external comparison group of stand-alone PDP contracts. Conclusion: Star ratings had no direct impact on concurrent year MAPD enrollment before or after the introduction of bonus payments tied to star ratings. However, after the introduction of these bonus payments, MAPD star ratings had a significant indirect impact of increasing subsequent year enrollment, likely via the reinvestment of bonuses to provide lower premiums and/or additional member benefits in the following year. PMID:27149092

  6. Impact of Medicare Advantage Prescription Drug Plan Star Ratings on Enrollment before and after Implementation of Quality-Related Bonus Payments in 2012.

    PubMed

    Li, Pengxiang; Doshi, Jalpa A

    2016-01-01

    Since 2007, the Centers for Medicare and Medicaid Services have published 5-star quality rating measures to aid consumers in choosing Medicare Advantage Prescription Drug Plans (MAPDs). We examined the impact of these star ratings on Medicare Advantage Prescription Drug (MAPD) enrollment before and after 2012, when star ratings became tied to bonus payments for MAPDs that could be used to improve plan benefits and/or reduce premiums in the subsequent year. A longitudinal design and multivariable hybrid models were used to assess whether star ratings had a direct impact on concurrent year MAPD contract enrollment (by influencing beneficiary choice) and/or an indirect impact on subsequent year MAPD contract enrollment (because ratings were linked to bonus payments). The main analysis was based on contract-year level data from 2009-2015. We compared effects of star ratings in the pre-bonus payment period (2009-2011) and post-bonus payment period (2012-2015). Extensive sensitivity analyses varied the analytic techniques, unit of analysis, and sample inclusion criteria. Similar analyses were conducted separately using stand-alone PDP contract-year data; since PDPs were not eligible for bonus payments, they served as an external comparison group. The main analysis included 3,866 MAPD contract-years. A change of star rating had no statistically significant effect on concurrent year enrollment in any of the pre-, post-, or pre-post combined periods. On the other hand, star rating increase was associated with a statistically significant increase in the subsequent year enrollment (a 1-star increase associated with +11,337 enrollees, p<0.001) in the post-bonus payment period but had a very small and statistically non-significant effect on subsequent year enrollment in the pre-bonus payment period. Further, the difference in effects on subsequent year enrollment was statistically significant between the pre- and post-periods (p = 0.011). 
Sensitivity analyses indicated that the findings were robust. No statistically significant effect of star ratings was found on concurrent or subsequent year enrollment in the pre- or post-period in the external comparison group of stand-alone PDP contracts. Star ratings had no direct impact on concurrent year MAPD enrollment before or after the introduction of bonus payments tied to star ratings. However, after the introduction of these bonus payments, MAPD star ratings had a significant indirect impact of increasing subsequent year enrollment, likely via the reinvestment of bonuses to provide lower premiums and/or additional member benefits in the following year.

  7. Statistical analysis of Thematic Mapper Simulator data for the geobotanical discrimination of rock types in southwest Oregon

    NASA Technical Reports Server (NTRS)

    Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.

    1984-01-01

    An evaluation of Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics is addressed in this research. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
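
    Of the univariate techniques listed, the analysis of variance reduces to an F statistic comparing between-group and within-group variability. A minimal one-way ANOVA sketch (illustrative, not the study's actual code):

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for a list of sample lists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```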

  8. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three African countries in terms of science research output, and Nigerian urologists' biomedical research contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not vetted as thoroughly as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual NAUS conferences, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed, and their quality was judged by the subsequent successful publication of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with statistics beyond basic frequencies and averages were more likely to be published than those with basic or no statistics. The quality of the abstracts, and thus subsequent publication success, is influenced by the use of such 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  9. Is the spatial distribution of brain lesions associated with closed-head injury predictive of subsequent development of attention-deficit/hyperactivity disorder? Analysis with brain-image database

    NASA Technical Reports Server (NTRS)

    Herskovits, E. H.; Megalooikonomou, V.; Davatzikos, C.; Chen, A.; Bryan, R. N.; Gerring, J. P.

    1999-01-01

    PURPOSE: To determine whether there is an association between the spatial distribution of lesions detected at magnetic resonance (MR) imaging of the brain in children after closed-head injury and the development of secondary attention-deficit/hyperactivity disorder (ADHD). MATERIALS AND METHODS: Data obtained from 76 children without prior history of ADHD were analyzed. MR images were obtained 3 months after closed-head injury. After manual delineation of lesions, images were registered to the Talairach coordinate system. For each subject, registered images and secondary ADHD status were integrated into a brain-image database, which contains depiction (visualization) and statistical analysis software. Using this database, we assessed visually the spatial distributions of lesions and performed statistical analysis of image and clinical variables. RESULTS: Of the 76 children, 15 developed secondary ADHD. Depiction of the data suggested that children who developed secondary ADHD had more lesions in the right putamen than children who did not develop secondary ADHD; this impression was confirmed statistically. After Bonferroni correction, we could not demonstrate significant differences between secondary ADHD status and lesion burdens for the right caudate nucleus or the right globus pallidus. CONCLUSION: Closed-head injury-induced lesions in the right putamen in children are associated with subsequent development of secondary ADHD. Depiction software is useful in guiding statistical analysis of image data.
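
    The Bonferroni correction applied in this study controls the familywise error rate by inflating each p-value by the number of tests. A minimal sketch of the adjustment (illustrative only):

```python
def bonferroni(p_values):
    """Bonferroni-adjusted p-values: each raw p-value is multiplied by
    the number of tests, capped at 1."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]
```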

  10. Gender Differences in Students' Mathematics Game Playing

    ERIC Educational Resources Information Center

    Lowrie, Tom; Jorgensen, Robyn

    2011-01-01

    The investigation monitored the digital game-playing behaviours of 428 primary-aged students (aged 10-12 years). Chi-square analysis revealed that boys tend to spend more time playing digital games than girls while boys and girls play quite different game genres. Subsequent analysis revealed statistically significant gender differences in terms of…
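
    The chi-square analysis used here tests independence between two categorical variables (e.g. gender and game genre). A sketch for the 2x2 case, with hypothetical counts (not the study's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    (a, b), (c, d) = table
    n = a + b + c + d
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    obs = [[a, b], [c, d]]
    return sum((obs[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in (0, 1) for j in (0, 1))
```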

  11. Use of statistical study methods for the analysis of the results of the imitation modeling of radiation transfer

    NASA Astrophysics Data System (ADS)

    Alekseenko, M. A.; Gendrina, I. Yu.

    2017-11-01

    Recently, owing to the abundance of various types of observational data in systems of vision through the atmosphere and the need to process these data, methods of statistical research such as correlation-regression analysis, the analysis of dynamic series, and the analysis of variance have become relevant to the study of such systems. We have attempted to apply elements of correlation-regression analysis to the study and subsequent prediction of the patterns of radiation transfer in these systems, as well as to the construction of radiation models of the atmosphere. In this paper, we present some results of statistical processing of the results of numerical simulation of the characteristics of vision systems through the atmosphere, obtained with the help of a special software package.
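
    The core of a correlation-regression analysis like the one described is the Pearson correlation coefficient and the least-squares regression line. A minimal sketch (illustrative, not the authors' software):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def regression_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx
```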

  12. Characterization of Inclusion Populations in Mn-Si Deoxidized Steel

    NASA Astrophysics Data System (ADS)

    García-Carbajal, Alfonso; Herrera-Trejo, Martín; Castro-Cedeño, Edgar-Ivan; Castro-Román, Manuel; Martinez-Enriquez, Arturo-Isaias

    2017-12-01

    Four plant heats of Mn-Si deoxidized steel were conducted to follow the evolution of the inclusion population through ladle furnace (LF) treatment and subsequent vacuum treatment (VT). The liquid steel was sampled, and the chemical composition and size distribution of the inclusion populations were characterized. The Gumbel, generalized extreme-value (GEV), and generalized Pareto (GP) distributions were used for the statistical analysis of the inclusion size distributions. The inclusions found at the beginning of the LF treatment were mostly fully liquid SiO2-Al2O3-MnO inclusions, which then evolved into the fully liquid SiO2-Al2O3-CaO-MgO and partly liquid SiO2-CaO-MgO-(Al2O3-MgO) inclusions detected at the end of the VT. The final fully liquid inclusions had a chemical composition desirable for plastic behavior in subsequent metallurgical operations. The GP distribution was found to be unsuitable for the statistical analysis. The GEV approach led to shape parameter values different from the zero value hypothesized under the Gumbel distribution. According to the GEV approach, some of the final inclusion size distributions had statistically significant differences, whereas the Gumbel approach predicted no statistically significant differences. Finally, the heats were ranked according to indicators of inclusion cleanliness derived from a statistical comparison of the size distributions.
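
    The Gumbel distribution tested here is the GEV with shape parameter zero; its location and scale can be estimated quickly by the method of moments. A hedged sketch (maximum-likelihood fitting, as typically used in inclusion-rating practice, is more involved):

```python
import math

def gumbel_moments_fit(sizes):
    """Method-of-moments estimates of the Gumbel location (mu) and
    scale (beta) parameters from a sample of extreme inclusion sizes."""
    n = len(sizes)
    mean = sum(sizes) / n
    sd = (sum((x - mean) ** 2 for x in sizes) / (n - 1)) ** 0.5
    beta = sd * math.sqrt(6) / math.pi
    mu = mean - 0.5772156649 * beta  # Euler-Mascheroni constant
    return mu, beta
```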

  13. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

    Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy-to-use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.

  14. A new strategy for statistical analysis-based fingerprint establishment: Application to quality assessment of Semen sojae praeparatum.

    PubMed

    Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting

    2018-08-30

    Semen sojae praeparatum, with homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometrics methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum. 21 chemicals were selected and characterized by liquid chromatography coupled with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal components analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy of "statistical analysis-based fingerprint establishment", which would be a valuable reference for further study.
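
    The similarity analysis step in fingerprint-based quality control commonly scores each sample against a reference chromatogram with the congruence (cosine) coefficient. A minimal sketch under that assumption (the paper does not specify its exact similarity measure):

```python
def cosine_similarity(f1, f2):
    """Congruence coefficient between two chromatographic fingerprints
    represented as equal-length peak-intensity vectors; 1.0 means
    identical profiles up to scale."""
    dot = sum(a * b for a, b in zip(f1, f2))
    n1 = sum(a * a for a in f1) ** 0.5
    n2 = sum(b * b for b in f2) ** 0.5
    return dot / (n1 * n2)
```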

  15. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.
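
    The clustering analysis that groups pixels by their electronic signature can be illustrated with plain Lloyd's k-means on per-pixel feature vectors; the authors' multivariate pipeline is more elaborate, so treat this as a hypothetical sketch:

```python
def kmeans(points, k=2, iters=50):
    """Lloyd's k-means on feature vectors (e.g. per-pixel tunneling
    spectra); returns a cluster label per point. Initial centroids are
    simply the first k points."""
    cents = [list(points[i]) for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by squared Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, cents[c])))
        # update step: centroid = mean of assigned points
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                cents[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return labels
```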

  16. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

    Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and subsequent declustering of earthquake catalogues plays a crucial role in determining the magnitude-dependent earthquake return period and its respective spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, subsequently analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which has been compared to existing methods; its performance is comparable to established methodologies. The analysis of earthquake clustering statistics led to various new and updated correlation functions, e.g. for ratios between mainshock and strongest aftershock, and for general aftershock activity metrics.
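
    The classical baseline that such adaptive methods improve on is window-based declustering: an event is an aftershock if a larger earlier event lies within magnitude-dependent time and distance windows. A deliberately simplified toy version with fixed windows (the SCM itself adapts its search space):

```python
def decluster(events, time_window, dist_window):
    """Toy window-based declustering. events: list of tuples
    (time, x, y, magnitude). An event is dropped as an aftershock if a
    larger, earlier retained event lies within both windows."""
    mains = []
    for t, x, y, m in sorted(events):
        is_aftershock = any(
            m0 > m and t - t0 <= time_window
            and ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 <= dist_window
            for t0, x0, y0, m0 in mains)
        if not is_aftershock:
            mains.append((t, x, y, m))
    return mains
```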

  17. Advanced statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Heron, K. H.

    1994-09-01

    A high-frequency theory, advanced statistical energy analysis (ASEA), is developed which takes account of the mechanism of tunnelling; it uses a ray theory approach to track the power flowing around a plate or a beam network and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system, which allows power flow between sub-systems that are physically separate. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA; subsequent higher-order models converge on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results, while SEA is shown to overpredict by up to 60 dB.
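
    The standard SEA model that ASEA's first-order term reproduces is a linear power balance in the subsystem energies. A toy two-subsystem solve with made-up loss factors (illustrative; the paper's six-rod assembly leads to a larger system of the same form):

```python
def sea_energies(omega, eta1, eta2, eta12, eta21, p1, p2):
    """Steady-state subsystem energies for a two-subsystem SEA model:
        P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
        P2 = omega * ((eta2 + eta21) * E2 - eta12 * E1)
    where eta1, eta2 are damping loss factors and eta12, eta21 coupling
    loss factors. Solved directly as a 2x2 linear system."""
    a, b = omega * (eta1 + eta12), -omega * eta21
    c, d = -omega * eta12, omega * (eta2 + eta21)
    det = a * d - b * c
    return (p1 * d - b * p2) / det, (a * p2 - c * p1) / det
```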

  18. Transcriptomic and bioinformatics analysis of the early time-course of the response to prostaglandin F2 alpha in the bovine corpus luteum

    USDA-ARS?s Scientific Manuscript database

    RNA expression analysis was performed on the corpus luteum tissue at five time points after prostaglandin F2 alpha treatment of midcycle cows using an Affymetrix Bovine Gene v1 Array. The normalized linear microarray data was uploaded to the NCBI GEO repository (GSE94069). Subsequent statistical ana...

  19. A Novel Image Encryption Algorithm Based on DNA Subsequence Operation

    PubMed Central

    Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng

    2012-01-01

    We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it simply combines the idea of DNA subsequence operations (such as elongation, truncation, and deletion) with the logistic chaotic map to scramble the locations and values of the pixels in the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large secret-key space and strong sensitivity to the secret key, and is able to resist exhaustive and statistical attacks. PMID:23093912
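
    The logistic-map component of such schemes can be sketched as a chaotic keystream XORed with the pixel values; the DNA subsequence operations of the actual algorithm are omitted here, so this is only a hypothetical illustration of the chaotic part:

```python
def logistic_keystream(x0, mu, n):
    """Keystream bytes from iterating the logistic map x -> mu*x*(1-x);
    (x0, mu) act as the secret key."""
    ks, x = [], x0
    for _ in range(n):
        x = mu * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return ks

def xor_cipher(pixels, x0=0.3456, mu=3.99):
    """Encrypt or decrypt (XOR is its own inverse) a flat list of 8-bit
    pixel values with the chaotic keystream."""
    ks = logistic_keystream(x0, mu, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]
```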

  20. Open dumps in the Hellenic prefecture of Laconia: statistical analysis of characteristics and restoration prioritization on the basis of a field survey.

    PubMed

    Tsatsarelis, Thomas; Antonopoulos, Ioannis; Karagiannidis, Avraam; Perkoulidis, George

    2007-10-01

    This study presents an assessment of the current status of open dumps in Laconia prefecture of Peloponnese in southern Greece, where all open dumps are targeted for closure by 2008. An extensive field survey was conducted in 2005 to register existing sites in the prefecture. The data collected included the site area and age, waste depth, type of disposed waste, distance from nearest populated area, local geographical features and observed practices of open burning and soil coverage. On the basis of the collected data, a GIS database was developed, and the above parameters were statistically analysed. Subsequently, a decision tool for the restoration of open dumps was implemented, which led to the prioritization of site restorations and specific decisions about appropriate restoration steps for each site. The sites requiring restoration were then further classified using Principal Component Analysis, in order to categorize them into groups suitable for similar restoration work, thus facilitating fund allocation and subsequent restoration project management.
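
    Principal Component Analysis, used above to group sites for similar restoration work, projects the site parameters onto directions of maximum variance. A minimal sketch of extracting the first principal component by power iteration (illustrative; GIS packages compute the full decomposition):

```python
def first_principal_component(rows, iters=200):
    """First principal component (unit vector along the direction of
    maximum variance) of a list of feature rows, via power iteration on
    the sample covariance matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]  # center data
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```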

  1. qFeature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-14

    This package contains statistical routines for extracting features from multivariate time-series data, which can then be used in subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits over moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain-agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
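
    The moving-window regression idea can be sketched by extracting just the local slope in each window (the actual qFeature package fits linear or quadratic models and summarizes several coefficients, so this is a simplification):

```python
def window_slopes(series, window):
    """Least-squares slope of a line fit within each moving window of a
    univariate series, as a simple local-trend feature."""
    xs = list(range(window))
    mx = sum(xs) / window
    sxx = sum((x - mx) ** 2 for x in xs)
    out = []
    for i in range(len(series) - window + 1):
        ys = series[i:i + window]
        my = sum(ys) / window
        out.append(sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx)
    return out
```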

  2. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  3. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  4. Discrepancies Between Plastic Surgery Meeting Abstracts and Subsequent Full-Length Manuscript Publications.

    PubMed

    Denadai, Rafael; Araujo, Gustavo Henrique; Pinho, Andre Silveira; Denadai, Rodrigo; Samartine, Hugo; Raposo-Amaral, Cassio Eduardo

    2016-10-01

    The purpose of this bibliometric study was to assess the discrepancies between plastic surgery meeting abstracts and subsequent full-length manuscript publications. Abstracts presented at the Brazilian Congress of Plastic Surgery from 2010 to 2011 were compared with matching manuscript publications. Discrepancies between an abstract and the subsequent manuscript were categorized as major (changes in the purpose, methods, study design, sample size, statistical analysis, results, and conclusions) or minor (changes in the title and authorship) variations. The overall discrepancy rate was 96%, with at least one major (76%) and/or minor (96%) variation. There were inconsistencies between the study title (56%), authorship (92%), purpose (6%), methods (20%), study design (36%), sample size (51.2%), statistical analysis (14%), results (20%), and conclusions (8%) of manuscripts compared with their corresponding meeting abstracts. As changes occur between abstract presentation and manuscript publication, caution should be exercised in referencing abstracts or altering surgical practices based on abstract content.

  5. RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.

    PubMed

    Glaab, Enrico; Schneider, Reinhard

    2015-07-01

    High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plot, heat map and principal component analysis visualizations to interpret omics data and derived statistics. Availability: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online.
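
    The abstract does not spell out RepExplore's statistical model; a generic illustration of using replicate variance rather than discarding it is inverse-variance weighting, where noisier replicate sets contribute less to the combined estimate:

```python
def inverse_variance_mean(means, variances):
    """Combine per-replicate-set summary means by inverse-variance
    weighting; also returns the variance of the combined estimate."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / total
    return mean, 1.0 / total
```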

  6. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with concurrent statistical-test-based false discovery rate (FDR) control. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution that provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
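    The "differential analysis with FDR control" step can be sketched as many per-peak t-tests followed by a Benjamini-Hochberg correction. The peak matrix below is synthetic and the code is an illustration of the statistical idea, not the portal's implementation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical peak-intensity matrix: 200 peaks, 10 samples per group;
# the first 20 peaks carry a true group difference
n_peaks, n = 200, 10
g1 = rng.normal(0.0, 1.0, size=(n_peaks, n))
g2 = rng.normal(0.0, 1.0, size=(n_peaks, n))
g2[:20] += 2.0

p = stats.ttest_ind(g1, g2, axis=1).pvalue  # one t-test per peak

def benjamini_hochberg(p, q=0.05):
    """Return a boolean mask of peaks significant at FDR level q."""
    m = len(p)
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    keep = np.zeros(m, dtype=bool)
    if below.any():
        keep[order[: below.nonzero()[0].max() + 1]] = True
    return keep

sig = benjamini_hochberg(p)
```

    Most of the 20 truly different peaks survive the correction, while false positives among the remaining 180 are held near the chosen FDR level.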

  7. Statistics, Uncertainty, and Transmitted Variation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
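    Transmitted variation can be illustrated with a short Monte Carlo sketch: variance in the inputs of a hypothetical response function propagates to the output, and a first-order (delta-method) approximation predicts the transmitted variance. All numbers here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical response function: variance in the inputs is transmitted
# through f to the output
def f(x1, x2):
    return x1 ** 2 + 3.0 * x2

x1 = rng.normal(1.0, 0.1, size=100_000)  # inputs carry their own variance
x2 = rng.normal(2.0, 0.2, size=100_000)
y = f(x1, x2)

# First-order (delta-method) approximation of the transmitted variance:
# Var(y) ~ (df/dx1)^2 Var(x1) + (df/dx2)^2 Var(x2), evaluated at the means
approx = (2 * 1.0) ** 2 * 0.1 ** 2 + 3.0 ** 2 * 0.2 ** 2
```

    The simulated output variance agrees closely with the delta-method value (about 0.40 here), showing how input variation is "transmitted" through the response function.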

  8. Characterization and classification of oral tissues using excitation and emission matrix: a statistical modeling approach

    NASA Astrophysics Data System (ADS)

    Kanniyappan, Udayakumar; Gnanatheepaminstein, Einstein; Prakasarao, Aruna; Dornadula, Koteeswaran; Singaravelu, Ganesan

    2017-02-01

    Cancer is one of the most common human threats around the world and diagnosis based on optical spectroscopy especially fluorescence technique has been established as the standard approach among scientist to explore the biochemical and morphological changes in tissues. In this regard, the present work aims to extract spectral signatures of the various fluorophores present in oral tissues using parallel factor analysis (PARAFAC). Subsequently, the statistical analysis also to be performed to show its diagnostic potential in distinguishing malignant, premalignant from normal oral tissues. Hence, the present study may lead to the possible and/or alternative tool for oral cancer diagnosis.

  9. Phenotypic mapping of metabolic profiles using self-organizing maps of high-dimensional mass spectrometry data.

    PubMed

    Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A

    2014-07-01

    A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
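    The self-organizing map at the heart of such a workflow can be sketched in a few lines of NumPy. This toy version (synthetic "feature" vectors, a small 5x5 grid, and hypothetical learning parameters) is not the authors' pipeline, but it shows how correlated features end up on the same or adjacent map nodes:

```python
import numpy as np

rng = np.random.default_rng(3)

def train_som(data, grid=(5, 5), epochs=20, lr=0.5, sigma=1.5):
    """Minimal self-organizing map: each grid node holds a weight vector
    that is pulled toward nearby data points during training."""
    h, w = grid
    nodes = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for epoch in range(epochs):
        decay = np.exp(-epoch / epochs)
        for x in rng.permutation(data):
            bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            g = np.exp(-d2 / (2 * (sigma * decay) ** 2))     # neighborhood kernel
            nodes += lr * decay * g[:, None] * (x - nodes)
    return nodes

# Hypothetical "metabolite feature" vectors drawn from two clusters
data = np.vstack([rng.normal(0, 0.3, (50, 4)), rng.normal(3, 0.3, (50, 4))])
som = train_som(data)
# Each feature maps to its best-matching node; correlated features co-locate
hits = np.array([np.argmin(((som - x) ** 2).sum(axis=1)) for x in data])
```

    Rendering the node weights as a heat map gives the "Gestalt" view described in the abstract, with correlated features clustered into contiguous map regions.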

  10. Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.

    PubMed

    Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A

    2017-04-01

    Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. 
In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
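    The statistical-support criterion used here (P < .05 from an interaction test) can be illustrated with a hedged sketch: given hypothetical subgroup effect estimates and standard errors from a single trial, a z-test on the difference of effects yields the interaction P value:

```python
import numpy as np
from scipy import stats

# Hypothetical subgroup-level results from one RCT: treatment effect
# (e.g., a log hazard ratio) and its standard error in men and in women
effect_men, se_men = -0.40, 0.15
effect_women, se_women = -0.10, 0.16

# Interaction z-test: does the treatment effect differ between subgroups?
z = (effect_men - effect_women) / np.hypot(se_men, se_women)
p_interaction = 2 * stats.norm.sf(abs(z))
```

    In this invented example the apparent subgroup difference (-0.40 vs -0.10) gives P ~ 0.17, i.e., a subgroup claim without statistical support — precisely the pattern the survey found in most abstracts.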

  11. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE PAGES

    Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...

    2014-12-02

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  13. Cognition, comprehension and application of biostatistics in research by Indian postgraduate students in periodontics.

    PubMed

    Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar

    2014-01-01

    Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, data obtained is subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data, thus, forms the basis for all evidence-based conclusions. The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among post graduate students in Periodontics, in India. A total of 391 post graduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of the research protocol were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Nearly 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with the MS-EXCEL spreadsheet for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, while 3.0% were successful in performing statistical analysis of their studies on their own. Biostatistics should play a central role in planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but the level of understanding and application is still basic and needs to be addressed.

  14. Grain boundary oxidation and an analysis of the effects of pre-oxidation on subsequent fatigue life

    NASA Technical Reports Server (NTRS)

    Oshida, Y.; Liu, H. W.

    1986-01-01

    The effects of preoxidation on subsequent fatigue life were studied. Surface oxidation and grain boundary oxidation of a nickel-base superalloy (TAZ-8A) were studied at 600 to 1000 C for 10 to 1000 hours in air. Surface oxides were identified and the kinetics of surface oxidation was discussed. Grain boundary oxide penetration and morphology were studied. Pancake-type grain boundary oxide penetrates deeper and grows larger than cone-type grain boundary oxide, and is therefore more detrimental to fatigue life. Oxide penetration depth, a_m, is related to oxidation temperature, T, and exposure time, t, by an empirical relation of the Arrhenius type. The effects of T and t on the statistical variation of a_m were analyzed according to the Weibull distribution function. Once the oxide is cracked, it serves as a fatigue crack nucleus. Statistical variation of the remaining fatigue life, after the formation of an oxide crack of a critical length, is related directly to the statistical variation of grain boundary oxide penetration depth.
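    An Arrhenius-type empirical relation of this kind can be fitted by linearization. The constants, temperatures, and the assumed sqrt(t) time dependence below are hypothetical, chosen only to show how the pre-exponential factor A and activation energy Q would be recovered from penetration-depth data:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)
rng = np.random.default_rng(4)

# Synthetic penetration depths following a hypothetical Arrhenius-type law:
# a_m = A * sqrt(t) * exp(-Q / (R * T))
T = np.array([873.0, 1073.0, 1273.0, 873.0, 1073.0, 1273.0])  # temperature, K
t = np.array([10.0, 10.0, 10.0, 1000.0, 1000.0, 1000.0])      # exposure time, h
A_true, Q_true = 5.0e3, 1.2e5
a_m = A_true * np.sqrt(t) * np.exp(-Q_true / (R * T)) * rng.normal(1.0, 0.02, T.size)

# Linearize: log(a_m) - 0.5*log(t) = log(A) - (Q/R) * (1/T),
# then a straight-line fit in 1/T recovers Q and A
yy = np.log(a_m) - 0.5 * np.log(t)
slope, intercept = np.polyfit(1.0 / T, yy, 1)
Q_fit, A_fit = -slope * R, np.exp(intercept)
```

    The same fitted depths could then feed a Weibull analysis of the scatter in a_m, as the abstract describes.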

  15. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    PubMed

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation of heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedogenesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment of agricultural soils in regions worldwide.
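    PCA-based source apportionment of this kind can be sketched on synthetic data: metals sharing a source load on the same principal component, while correlation analysis shows strong within-group and weak between-group correlations. The factor structure and all concentrations below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical soil samples: Cu/Ni/Pb/Cd follow a common "anthropogenic"
# factor, Zn/Cr a common "natural" (parent-material) factor
n = 200
anthro = rng.normal(size=n)
natural = rng.normal(size=n)
X = np.column_stack([
    anthro + 0.2 * rng.normal(size=n),   # Cu
    anthro + 0.2 * rng.normal(size=n),   # Ni
    anthro + 0.2 * rng.normal(size=n),   # Pb
    anthro + 0.2 * rng.normal(size=n),   # Cd
    natural + 0.2 * rng.normal(size=n),  # Zn
    natural + 0.2 * rng.normal(size=n),  # Cr
])

# Correlation analysis: Cu-Ni correlate strongly, Cu-Zn do not
C = np.corrcoef(X.T)

# PCA via SVD of the standardized data; the first two components
# separate the anthropogenic and natural source groups
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
loadings = Vt[:2]
```

    Two components capture nearly all the variance here because the six metals are driven by two underlying sources — the same logic used to attribute Zn and Cr to parent materials.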

  16. Statistics teaching in medical school: opinions of practising doctors.

    PubMed

    Miles, Susan; Price, Gill M; Swift, Louise; Shepstone, Lee; Leinster, Sam J

    2010-11-04

    The General Medical Council expects UK medical graduates to gain some statistical knowledge during their undergraduate education, but provides no specific guidance as to amount, content or teaching method. Published work on statistics teaching for medical undergraduates has been dominated by medical statisticians, with little input from the doctors who will actually be using this knowledge and these skills after graduation. Furthermore, doctors' statistical training needs may have changed due to advances in information technology and the increasing importance of evidence-based medicine. Thus there exists a need to investigate the views of practising medical doctors as to the statistical training required for undergraduate medical students, based on their own use of these skills in daily practice. A questionnaire was designed to investigate doctors' views about undergraduate training in statistics and the need for these skills in daily practice, with a view to informing future teaching. The questionnaire was emailed to all clinicians with a link to the University of East Anglia Medical School. Open ended questions were included to elicit doctors' opinions about both their own undergraduate training in statistics and recommendations for the training of current medical students. Content analysis was performed by two of the authors to systematically categorize and describe all the responses provided by participants. 130 doctors responded, including both hospital consultants and general practitioners. The findings indicated that most had not recognised the value of their undergraduate teaching in statistics and probability at the time, but had subsequently found the skills relevant to their career. Suggestions for improving undergraduate teaching in these areas included referring to actual research and ensuring relevance to, and integration with, clinical practice. 
Grounding the teaching of statistics in the context of real research studies and including examples of typical clinical work may better prepare medical students for their subsequent career.

  18. Statistical power analysis of cardiovascular safety pharmacology studies in conscious rats.

    PubMed

    Bhatt, Siddhartha; Li, Dingzhou; Flynn, Declan; Wisialowski, Todd; Hemkens, Michelle; Steidl-Nichols, Jill

    2016-01-01

    Cardiovascular (CV) toxicity and related attrition are a major challenge for novel therapeutic entities and identifying CV liability early is critical for effective derisking. CV safety pharmacology studies in rats are a valuable tool for early investigation of CV risk. Thorough understanding of data analysis techniques and statistical power of these studies is currently lacking and is imperative for enabling sound decision-making. Data from 24 crossover and 12 parallel design CV telemetry rat studies were used for statistical power calculations. Average values of telemetry parameters (heart rate, blood pressure, body temperature, and activity) were logged every 60s (from 1h predose to 24h post-dose) and reduced to 15min mean values. These data were subsequently binned into super intervals for statistical analysis. A repeated measure analysis of variance was used for statistical analysis of crossover studies and a repeated measure analysis of covariance was used for parallel studies. Statistical power analysis was performed to generate power curves and establish relationships between detectable CV (blood pressure and heart rate) changes and statistical power. Additionally, data from a crossover CV study with phentolamine at 4, 20 and 100mg/kg are reported as a representative example of data analysis methods. Phentolamine produced a CV profile characteristic of alpha adrenergic receptor antagonism, evidenced by a dose-dependent decrease in blood pressure and reflex tachycardia. Detectable blood pressure changes at 80% statistical power for crossover studies (n=8) were 4-5mmHg. For parallel studies (n=8), detectable changes at 80% power were 6-7mmHg. Detectable heart rate changes for both study designs were 20-22bpm. Based on our results, the conscious rat CV model is a sensitive tool to detect and mitigate CV risk in early safety studies. Furthermore, these results will enable informed selection of appropriate models and study design for early stage CV studies. 
Copyright © 2016 Elsevier Inc. All rights reserved.
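    The relationship between sample size and detectable change reported above can be approximated with a normal-approximation power formula for a paired (crossover) design. The within-animal SD used below is a hypothetical value chosen for illustration, not a figure from the study:

```python
import numpy as np
from scipy import stats

def detectable_change(sd_diff, n, power=0.80, alpha=0.05):
    """Smallest mean change detectable with a paired t-test (normal
    approximation), given the SD of within-subject differences."""
    z_alpha = stats.norm.ppf(1 - alpha / 2)
    z_beta = stats.norm.ppf(power)
    return (z_alpha + z_beta) * sd_diff / np.sqrt(n)

# Hypothetical: SD of within-animal blood-pressure differences of 4.5 mmHg
change_n8 = detectable_change(4.5, n=8)  # detectable change, mmHg
```

    With n=8 and this assumed SD, the detectable change comes out near 4.5 mmHg, the same order as the 4-5 mmHg reported for crossover studies; sweeping `n` or `sd_diff` generates the power curves described in the abstract.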

  19. Dissecting Sequences of Regulation and Cognition: Statistical Discourse Analysis of Primary School Children's Collaborative Learning

    ERIC Educational Resources Information Center

    Molenaar, Inge; Chiu, Ming Ming

    2014-01-01

    Extending past research showing that regulative activities (metacognitive and relational) can aid learning, this study tests whether sequences of cognitive, metacognitive and relational activities affect subsequent cognition. Scaffolded by a computer avatar, 54 primary school students (working in 18 groups of 3) discussed writing a report about a…

  20. Prediction of monthly-seasonal precipitation using coupled SVD patterns between soil moisture and subsequent precipitation

    Treesearch

    Yongqiang Liu

    2003-01-01

    It was suggested in a recent statistical correlation analysis that predictability of monthly-seasonal precipitation could be improved by using coupled singular value decomposition (SVD) patterns between soil moisture and precipitation instead of their values at individual locations. This study provides predictive evidence for this suggestion by comparing skills of two...

  1. Factors Influencing Student Prerequisite Preparation for and Subsequent Performance in College Chemistry Two: A Statistical Investigation

    ERIC Educational Resources Information Center

    Easter, David C.

    2010-01-01

    For students entering Chemistry Two following a Chemistry One course, an assessment exam was given and the results were evaluated in combination with other variables to develop a predictive model that forecasts student achievement in the course. Variables considered in the analysis included student major, GPA, classification (student standing:…

  2. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied to statistical analysis.
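    The "probability distribution" idea is easy to make concrete with the normal distribution highlighted here. A minimal example (the mean and SD are arbitrary illustrative values): the probability that a variable falls in an interval is the difference of the cumulative distribution function at the endpoints:

```python
from scipy import stats

# Probability that a normally distributed variable (mean 100, SD 15)
# falls between 85 and 115, i.e. within one SD of the mean
p = stats.norm.cdf(115, loc=100, scale=15) - stats.norm.cdf(85, loc=100, scale=15)
print(round(p, 3))  # ~ 0.683, the familiar "68% within one SD" rule
```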

  3. Variations in neutrophil count in preterm infants with respiratory distress syndrome who subsequently developed chronic lung disease.

    PubMed

    Kohelet, D; Arbel, E; Ballin, A; Goldberg, M

    2000-01-01

    Neutrophil counts were studied in 62 preterm infants receiving mechanical ventilation for neonatal respiratory distress syndrome (NRDS). Exploratory analysis indicated that the severity of NRDS, as demonstrated by fractional inspired oxygen (FiO2), mean airway pressure (MAP), arterial-alveolar PO2 ratio (a/APO2) and oxygenation index (OI), was correlated with percentage change of neutrophil counts during the first 5 days of life. Further analysis demonstrated that infants with NRDS who subsequently developed chronic lung disease (CLD) (n = 21) had statistically significant differences in variation of neutrophil counts when compared with the remainder (n = 41) without CLD (-35.0% +/- 4.3 vs. -16.9% +/- 5.8, p < 0.02). It is concluded that significant variations in neutrophil counts during the first 5 days of life may be found in infants with NRDS who subsequently develop CLD and that these changes may have predictive value regarding the development of CLD.

  4. Data Treatment for LC-MS Untargeted Analysis.

    PubMed

    Riccadonna, Samantha; Franceschi, Pietro

    2018-01-01

    Liquid chromatography-mass spectrometry (LC-MS) untargeted experiments require complex chemometrics strategies to extract information from the experimental data. Here we discuss "data preprocessing", the set of procedures performed on the raw data to produce a data matrix which will be the starting point for the subsequent statistical analysis. Data preprocessing is a crucial step on the path to knowledge extraction, which should be carefully controlled and optimized in order to maximize the output of any untargeted metabolomics investigation.

  5. Combined data preprocessing and multivariate statistical analysis characterizes fed-batch culture of mouse hybridoma cells for rational medium design.

    PubMed

    Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup

    2010-10-01

    We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis technique. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
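    The PLS step — identifying variables positively or negatively correlated with a response — can be sketched via the first-component weight vector of PLS1 on synthetic data. The "amino acid" matrix and the two driver variables below are hypothetical; this shows only the direction-finding core of PLS, not the authors' full regression:

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical specific consumption rates of 6 amino acids (columns) over
# 200 observations; the response is driven by columns 0 (+) and 3 (-)
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=200)

def pls1_weights(X, y):
    """First PLS component: the weight vector is proportional to cov(X, y),
    so positively/negatively correlated variables get large +/- weights."""
    Xc, yc = X - X.mean(0), y - y.mean()
    w = Xc.T @ yc
    return w / np.linalg.norm(w)

w = pls1_weights(X, y)
```

    The sign and magnitude of each weight indicate whether (and how strongly) that amino acid tracks the response — the basis for inferring which feed-medium concentrations to raise or lower.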

  6. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior and we can use the λ parameter to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
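    A cumulative Weibull model with a characteristic time λ and shape parameter n can be fitted with a few lines of SciPy. The functional form below, yield = y_max·(1 − exp(−(t/λ)^n)), and all numbers are illustrative assumptions, not the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_yield(t, y_max, lam, n):
    """Cumulative Weibull model of glucose yield versus hydrolysis time;
    lam (the characteristic time) summarizes overall performance."""
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

# Synthetic saccharification time course (hypothetical numbers)
t = np.array([2.0, 4, 8, 12, 24, 48, 72])  # hours
rng = np.random.default_rng(7)
y = weibull_yield(t, 90.0, 10.0, 0.8) * rng.normal(1.0, 0.02, t.size)

(y_max, lam, n), _ = curve_fit(weibull_yield, t, y, p0=(80, 8, 1))
```

    Comparing fitted λ values across substrates or enzyme cocktails gives a single-number performance ranking, which is the use of λ the abstract proposes.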

  7. Predicting Potential Changes in Suitable Habitat and Distribution by 2100 for Tree Species of the Eastern United States

    Treesearch

    Louis R. Iverson; Anantha M. Prasad; Mark W. Schwartz

    2005-01-01

    We predict current distribution and abundance for tree species present in eastern North America, and subsequently estimate potential suitable habitat for those species under a changed climate with 2 x CO2. We used a series of statistical models (i.e., Regression Tree Analysis (RTA), Multivariate Adaptive Regression Splines (MARS), Bagging Trees (...

  8. Potential errors and misuse of statistics in studies on leakage in endodontics.

    PubMed

    Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J

    2013-04-01

    To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling', 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases the chi-square test or parametric methods were subsequently selected inappropriately. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.

  9. EVALUATION OF THE EXTRACELLULAR MATRIX OF INJURED SUPRASPINATUS IN RATS

    PubMed Central

    Almeida, Luiz Henrique Oliveira; Ikemoto, Roberto; Mader, Ana Maria; Pinhal, Maria Aparecida Silva; Munhoz, Bruna; Murachovsky, Joel

    2016-01-01

ABSTRACT Objective: To evaluate the evolution of injuries of the supraspinatus muscle by immunohistochemistry (IHC) and anatomopathological analysis in an animal model (Wistar rats). Methods: Twenty-five Wistar rats were submitted to complete injury of the supraspinatus tendon and subsequently sacrificed in groups of five animals at the following time points: immediately, 24 h, 48 h, 30 days and three months after the injury. All groups underwent histological and IHC analysis. Results: Regarding vascular proliferation and inflammatory infiltrate, we found a statistically significant difference between groups 1 (control group) and 2 (24 h after injury). IHC analysis showed that expression of vascular endothelial growth factor (VEGF) differed significantly between groups 1 and 2, and collagen type 1 (Col-1) evaluation presented a statistically significant difference between groups 1 and 4. Conclusion: We observed changes in the extracellular matrix components compatible with remodeling and healing. Remodeling is more intense 24 h after injury. However, VEGF and Col-1 are substantially increased at 24 h and 30 days after the injury, respectively. Level of Evidence I, Experimental Study. PMID:26997907

  10. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

    The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore such sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

Formosat-2 imagery is a kind of high-spatial-resolution (2 meters GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 imagery is to estimate the cloud statistic of each image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistic is subsequently recorded as an important metadata item for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, thresholding, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence to determine the cloud statistic. For post-processing analysis, the box-counting fractal method is implemented. In other words, the cloud statistic is first determined via pre-processing analysis, and its correctness for each spectral band is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, has successfully extracted the cloudy pixels of Formosat-2 imagery for accurate cloud statistic estimation.
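The thresholding comparison in this record centers on Otsu's method, which picks the gray level that maximizes between-class variance. A minimal NumPy sketch of the idea (an illustration only, not the authors' Formosat-2 ACCA pipeline; the toy bimodal "image" is an invented assumption):

```python
import numpy as np

def otsu_threshold(gray, bins=256):
    """Return the threshold that maximizes between-class variance."""
    hist, edges = np.histogram(gray, bins=bins)
    p = hist.astype(float) / hist.sum()           # probability mass per bin
    w0 = np.cumsum(p)                             # weight of class 0 (<= t)
    w1 = 1.0 - w0                                 # weight of class 1 (> t)
    mids = 0.5 * (edges[:-1] + edges[1:])         # bin centers
    cum_mean = np.cumsum(p * mids)
    mu0 = cum_mean / np.where(w0 > 0, w0, 1)      # mean of class 0 up to each t
    mu_total = cum_mean[-1]
    mu1 = (mu_total - cum_mean) / np.where(w1 > 0, w1, 1)
    sigma_b = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance
    return mids[np.argmax(sigma_b)]

# Toy bimodal "image": many dark ground pixels, fewer bright cloud pixels
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(200, 15, 1000)])
t = otsu_threshold(img)
cloud_mask = img > t          # pixels above the threshold flagged as cloud
```

With well-separated modes the threshold lands in the valley between them, so the cloud fraction recovered here is close to the simulated one sixth.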

  12. Earth science research

    NASA Technical Reports Server (NTRS)

    Botkin, Daniel B.

    1987-01-01

    The analysis of ground-truth data from the boreal forest plots in the Superior National Forest, Minnesota, was completed. Development of statistical methods was completed for dimension analysis (equations to estimate the biomass of trees from measurements of diameter and height). The dimension-analysis equations were applied to the data obtained from ground-truth plots, to estimate the biomass. Classification and analyses of remote sensing images of the Superior National Forest were done as a test of the technique to determine forest biomass and ecological state by remote sensing. Data was archived on diskette and tape and transferred to UCSB to be used in subsequent research.

  13. The statistical big bang of 1911: ideology, technological innovation and the production of medical statistics.

    PubMed

    Higgs, W

    1996-12-01

    This paper examines the relationship between intellectual debate, technologies for analysing information, and the production of statistics in the General Register Office (GRO) in London in the early twentieth century. It argues that controversy between eugenicists and public health officials respecting the cause and effect of class-specific variations in fertility led to the introduction of questions in the 1911 census on marital fertility. The increasing complexity of the census necessitated a shift from manual to mechanised forms of data processing within the GRO. The subsequent increase in processing power allowed the GRO to make important changes to the medical and demographic statistics it published in the annual Reports of the Registrar General. These included substituting administrative sanitary districts for registration districts as units of analysis, consistently transferring deaths in institutions back to place of residence, and abstracting deaths according to the International List of Causes of Death.

  14. Ripening-dependent metabolic changes in the volatiles of pineapple (Ananas comosus (L.) Merr.) fruit: II. Multivariate statistical profiling of pineapple aroma compounds based on comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Steingass, Christof Björn; Jutzi, Manfred; Müller, Jenny; Carle, Reinhold; Schmarr, Hans-Georg

    2015-03-01

Ripening-dependent changes of pineapple volatiles were studied in a nontargeted profiling analysis. Volatiles were isolated via headspace solid-phase microextraction and analyzed by comprehensive 2D gas chromatography and mass spectrometry (HS-SPME-GC×GC-qMS). Profile patterns presented in the contour plots were evaluated applying image processing techniques and subsequent multivariate statistical data analysis. Statistical methods comprised unsupervised hierarchical cluster analysis (HCA) and principal component analysis (PCA) to classify the samples. Supervised partial least squares discriminant analysis (PLS-DA) and partial least squares (PLS) regression were applied to discriminate different ripening stages and to describe the development of volatiles during postharvest storage, respectively. Hereby, substantial chemical markers allowing for class separation were revealed. The workflow permitted the rapid distinction between premature green-ripe pineapples and postharvest-ripened sea-freighted fruits. When PCA was limited to the first two principal components, volatile profiles of fully ripe air-freighted pineapples were similar to those of green-ripe fruits postharvest ripened for 6 days after simulated sea freight export. However, PCA considering also the third principal component allowed differentiation between air-freighted fruits and the four progressing postharvest maturity stages of sea-freighted pineapples.
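The unsupervised PCA classification step used in this record can be sketched with plain NumPy via the SVD (a generic illustration on simulated data, not the authors' GC×GC-qMS pipeline; the matrix sizes and marker offsets are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical peak-intensity matrix: 12 fruit samples x 50 volatile features;
# samples 0-5 (one ripening class) have ten elevated marker compounds
X = rng.normal(size=(12, 50))
X[:6, :10] += 6.0

Xc = X - X.mean(axis=0)              # mean-center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                       # sample coordinates on the principal components
explained = S**2 / np.sum(S**2)      # fraction of variance per component

pc1 = scores[:, 0]                   # the two classes separate along PC1
```

Because the class offset dominates the noise here, the two ripening classes fall on opposite sides of the first principal component.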

  15. Enhancing Research in Networking & System Security, and Forensics, in Puerto Rico

    DTIC Science & Technology

    2015-03-03

Researcher and her research revolves around using Cognitive Systems, which are machines that can think, listen and see in order to help the disabled ...Subsequence. The implementation is being conducted using the R language because of its statistical and analysis capabilities. Because it works using a command line...Technology. 14-AUG-13, . : , Eduardo Melendez. FROM RANDOM EMBEDDING TECHNIQUES TO ENTROPY USING IMAGEPOINT ADJACENT SHADE VALUES, 12th Annual

  16. The role of ensemble-based statistics in variational assimilation of cloud-affected observations from infrared imagers

    NASA Astrophysics Data System (ADS)

    Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris

    2017-04-01

Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and operational variational data assimilation system, to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions, and allow more observations to be assimilated without the need for strict background checks that eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds.
In this presentation we provide details that explain the apparent benefit from using ensembles for cloudy radiance assimilation in an EnVar context.
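The Huber norm mentioned above tames fat-tailed innovation distributions by switching from a quadratic to a linear penalty beyond a transition point. A minimal sketch (generic robust-statistics form; the delta value is a common default, not taken from this presentation):

```python
import numpy as np

def huber(r, delta=1.345):
    """Huber penalty: quadratic for |r| <= delta, linear beyond it."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * a**2, delta * (a - 0.5 * delta))

# Small innovations are penalized like least squares; large (outlier-like)
# innovations grow only linearly, so they are down-weighted, not rejected
innovations = np.array([0.5, 1.0, 3.0, 8.0])
penalties = huber(innovations)
```

This is why a Huber-type cost can keep observations that a strict background check would discard: the largest innovation above incurs a penalty of about 9.9 rather than the quadratic 32.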

  17. Incidence and rates of visual field progression after longitudinally measured optic disc change in glaucoma.

    PubMed

    Chauhan, Balwantray C; Nicolela, Marcelo T; Artes, Paul H

    2009-11-01

To determine whether glaucoma patients with progressive optic disc change have subsequent visual field progression earlier and at a faster rate compared with those without disc change. Prospective, longitudinal, cohort study. Eighty-one patients with open-angle glaucoma. Patients underwent confocal scanning laser tomography and standard automated perimetry every 6 months. The complete follow-up was divided into initial and subsequent periods. Two initial periods-first 3 years (Protocol A) and first half of the total follow-up (Protocol B)-were used, with the respective remainder being the subsequent follow-up. Disc change during the initial follow-up was determined with liberal, moderate, or conservative criteria of the Topographic Change Analysis. Subsequent field progression was determined with significant pattern deviation change in ≥3 locations (criterion used in the Early Manifest Glaucoma Trial). As a control analysis, field change during the initial follow-up was determined with significant pattern deviation change in ≥1, ≥2, or ≥3 locations. Survival time to subsequent field progression, rates of mean deviation (MD) change, and positive and negative likelihood ratios. The median (interquartile range) total follow-up was 11.0 (8.0-12.0) years with 22 (18-24) examinations. More patients had disc changes during the initial follow-up compared with field changes. The mean time to field progression was consistently shorter (protocol A, 0.8-1.7 years; protocol B, 0.3-0.7 years) in patients with prior disc change. In the control analysis, patients with prior field change had statistically earlier subsequent field progression (protocol A, 2.9-3.0 years; protocol B, 0.7-0.9 years). Similarly, patients with either prior disc or field change always had worse mean rates of subsequent MD change, although the distributions overlapped widely. 
Patients with subsequent field progression were up to 3 times more likely to have prior disc change compared with those without, and up to 5 times more likely to have prior field change compared with those without. Longitudinally measured optic disc change is predictive of subsequent visual field progression and may be an efficacious end point for functional outcomes in clinical studies and trials in glaucoma.

  18. On Statistical Analysis of Neuroimages with Imperfect Registration

    PubMed Central

    Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas

    2016-01-01

A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on the acquired brain image scans for diagnosis as well as understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic: they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation-invariant representation of the image, the downstream analysis can be made more robust as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on the scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially, non-Euclidean wavelets) yield strategies for designing deformation and additive noise invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly under-perform and fail to identify the true signal. PMID:27042168

  19. Advanced functional network analysis in the geosciences: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (also known as climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn makes it possible to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined drawing on several examples from climatology.

  20. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

In the context of genome-wide association studies (GWAS), a variety of statistical techniques are available for conducting the analysis, but, in most cases, the underlying genetic model is usually unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2 were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a substantial gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
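The classical Cochran-Armitage trend test that GWAR builds on can be sketched in a few lines (the textbook score-test formulation, not the GWAR Stata code; the genotype counts below are invented):

```python
import numpy as np
from math import erf, sqrt

def catt(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2x3 genotype table.

    cases/controls: counts per genotype (aa, Aa, AA).
    scores: (0, 1, 2) additive, (0, 1, 1) dominant, (0, 0, 1) recessive.
    Returns (Z, two-sided p-value from the normal approximation).
    """
    r = np.asarray(cases, float)
    s = np.asarray(controls, float)
    w = np.asarray(scores, float)
    n = r + s                                        # genotype totals
    N, R = n.sum(), r.sum()
    phat = R / N                                     # overall case fraction
    T = np.sum(w * (r - n * phat))                   # observed - expected cases
    var = phat * (1 - phat) * (np.sum(w**2 * n) - np.sum(w * n) ** 2 / N)
    Z = T / sqrt(var)
    p = 2 * (1 - 0.5 * (1 + erf(abs(Z) / sqrt(2))))  # two-sided p-value
    return Z, p

# Risk rises with the number of risk alleles, so the additive trend is detected
Z, p = catt(cases=[20, 40, 40], controls=[40, 40, 20])
```

Swapping the `scores` tuple reproduces the dominant and recessive variants mentioned in the abstract; the robust MAX-type statistics combine such tests across models.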

  1. The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison

    PubMed Central

    Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth

    2006-01-01

    Background Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. 
In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497

  2. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, for the investigation of this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
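The functional-network construction that pyunicorn automates can be illustrated in plain NumPy (this sketches the general correlation-threshold idea behind climate networks, not pyunicorn's actual API; the field sizes, driver signal, and threshold are assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy "climate field": 30 grid points, 500 time steps; points 0-14 share a driver
driver = rng.normal(size=500)
series = rng.normal(size=(30, 500))
series[:15] += 2.0 * driver                    # correlated block of grid points

corr = np.corrcoef(series)                     # pairwise Pearson correlations
np.fill_diagonal(corr, 0.0)                    # ignore self-correlation
adj = (np.abs(corr) > 0.5).astype(int)         # threshold -> adjacency matrix

degree = adj.sum(axis=1)                       # node degree per grid point
density = adj.sum() / (adj.shape[0] * (adj.shape[0] - 1))
```

The 15 driver-coupled points form a densely connected cluster, while the independent points remain isolated; network measures such as degree then expose the shared dynamics.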

  3. Subsequent childbirth after a previous traumatic birth.

    PubMed

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  4. Bayesian selection of Markov models for symbol sequences: application to microsaccadic eye movements.

    PubMed

    Bettenbühl, Mario; Rusconi, Marco; Engbert, Ralf; Holschneider, Matthias

    2012-01-01

    Complex biological dynamics often generate sequences of discrete events which can be described as a Markov process. The order of the underlying Markovian stochastic process is fundamental for characterizing statistical dependencies within sequences. As an example for this class of biological systems, we investigate the Markov order of sequences of microsaccadic eye movements from human observers. We calculate the integrated likelihood of a given sequence for various orders of the Markov process and use this in a Bayesian framework for statistical inference on the Markov order. Our analysis shows that data from most participants are best explained by a first-order Markov process. This is compatible with recent findings of a statistical coupling of subsequent microsaccade orientations. Our method might prove to be useful for a broad class of biological systems.
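Markov-order selection of the kind described above can be sketched with a simple penalized-likelihood criterion (this uses BIC with maximum-likelihood transition counts as a stand-in for the paper's Bayesian integrated likelihood; the toy binary sequence is invented):

```python
import numpy as np
from collections import Counter
from math import log

def bic_markov(seq, order, alphabet):
    """Log-likelihood and BIC of a fixed-order Markov model (MLE transitions)."""
    ctx = Counter()      # counts of each length-`order` context
    trans = Counter()    # counts of (context, next symbol)
    for i in range(order, len(seq)):
        c = tuple(seq[i - order:i])
        ctx[c] += 1
        trans[(c, seq[i])] += 1
    ll = sum(n * log(n / ctx[c]) for (c, s), n in trans.items())
    k = (len(alphabet) ** order) * (len(alphabet) - 1)   # free parameters
    return ll, -2 * ll + k * log(len(seq) - order)

# Toy sequence with true order 1: strong tendency to repeat the last symbol
rng = np.random.default_rng(1)
seq = [0]
for _ in range(2000):
    seq.append(seq[-1] if rng.random() < 0.8 else 1 - seq[-1])

bics = {m: bic_markov(seq, m, (0, 1))[1] for m in (0, 1, 2, 3)}
best = min(bics, key=bics.get)   # order with the lowest BIC
```

Higher orders fit no better than order 1 here, so the parameter penalty makes the first-order model win, mirroring the paper's finding for most participants.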

  5. Superposed epoch analysis of physiological fluctuations: possible space weather connections

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events—space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.

  6. Superposed epoch analysis of physiological fluctuations: possible space weather connections.

    PubMed

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events-space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.

  7. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    surgical procedures and subsequent collection of tissues have been developed and are currently used on a regular basis. Major Task 4: Evaluating the...needed to evaluate the utility of the inhibitory antibody to reduce the flexion contracture of injured knee joints. The employed techniques include...second surgery to remove a pin, and it did not change by the end of the 32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation

  8. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs.

    PubMed

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-05-28

Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to run non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs ran about 14.4-15.9 times faster, while Unphased jobs ran 1.1-18.6 times faster compared to the accumulated computation duration. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.

  9. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs

    PubMed Central

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-01-01

Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing and its packages to run non-parallel genetic statistical packages on a centralized HPC system or distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted by the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs ran about 14.4–15.9 times faster, while Unphased jobs ran 1.1–18.6 times faster compared to the accumulated computation duration. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045

  10. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and reaction efficiencies (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E) · Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not require the pairing of samples that traditional analysis methods must perform in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
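    The efficiency-weighted Cq idea above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Cq values and the efficiency below are hypothetical, and an unpaired t-test stands in for the "subsequent statistical analyses in the log scale".

```python
import numpy as np
from scipy import stats

def efficiency_weighted_cq(cq, efficiency):
    """Efficiency-weighted Cq: log10(E) * Cq, keeping values in log scale."""
    return np.log10(efficiency) * np.asarray(cq)

# Hypothetical target-gene Cq values for a control and a treated group,
# with an assumed amplification efficiency E (E = 2 would be ideal doubling).
cq_control = np.array([24.1, 24.3, 23.9, 24.0])
cq_treated = np.array([22.0, 22.4, 21.8, 22.1])
E = 1.95

w_control = efficiency_weighted_cq(cq_control, E)
w_treated = efficiency_weighted_cq(cq_treated, E)

# Unpaired t-test applied in the log scale, as the method prescribes.
t, p = stats.ttest_ind(w_control, w_treated)

# Log10 expression ratio (treated relative to control) is the difference of
# mean efficiency-weighted Cq values; the fold change is 10 ** that value.
log10_ratio = w_control.mean() - w_treated.mean()
fold_change = 10 ** log10_ratio
```

    Because the group comparison happens on log10(E) · Cq directly, no sample pairing is needed to form relative expression ratios before testing.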

  11. Prolonged instability prior to a regime shift

    USGS Publications Warehouse

    Spanbauer, Trisha; Allen, Craig R.; Angeler, David G.; Eason, Tarsha; Fritz, Sherilyn C.; Garmestani, Ahjond S.; Nash, Kirsty L.; Stone, Jeffery R.

    2014-01-01

    Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term, high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system was undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia.

  12. Statistical properties of the radiation belt seed population

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd, A. J.; Spence, H. E.; Huang, C. -L.

    Here, we present a statistical analysis of phase space density data from the first 26 months of the Van Allen Probes mission. In particular, we investigate the relationship between the tens and hundreds of keV seed electrons and the >1 MeV core radiation belt electron population. Using a cross-correlation analysis, we find that the seed and core populations are well correlated with a coefficient of ≈0.73 at a time lag of 10–15 h. We present evidence of a seed population threshold that is necessary for subsequent acceleration. The depth of penetration of the seed population determines the inner boundary of the acceleration process. However, we show that an enhanced seed population alone is not enough to produce acceleration at the higher energies, implying that the seed population of hundreds of keV electrons is only one of several conditions required for MeV electron radiation belt acceleration.

  13. Statistical properties of the radiation belt seed population

    DOE PAGES

    Boyd, A. J.; Spence, H. E.; Huang, C. -L.; ...

    2016-07-25

    Here, we present a statistical analysis of phase space density data from the first 26 months of the Van Allen Probes mission. In particular, we investigate the relationship between the tens and hundreds of keV seed electrons and the >1 MeV core radiation belt electron population. Using a cross-correlation analysis, we find that the seed and core populations are well correlated with a coefficient of ≈0.73 at a time lag of 10–15 h. We present evidence of a seed population threshold that is necessary for subsequent acceleration. The depth of penetration of the seed population determines the inner boundary of the acceleration process. However, we show that an enhanced seed population alone is not enough to produce acceleration at the higher energies, implying that the seed population of hundreds of keV electrons is only one of several conditions required for MeV electron radiation belt acceleration.
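    A lagged cross-correlation of the kind used to relate the seed and core populations can be sketched as follows. The series below are synthetic stand-ins (a white-noise "seed" and a delayed, noisy "core"), not Van Allen Probes data; the point is only how the lag of peak correlation is found.

```python
import numpy as np

def lagged_xcorr(x, y, max_lag):
    """Pearson correlation between x[t] and y[t + lag] for lag = 0..max_lag."""
    corrs = {}
    for lag in range(max_lag + 1):
        a = x[:len(x) - lag]
        b = y[lag:]
        corrs[lag] = np.corrcoef(a, b)[0, 1]
    return corrs

rng = np.random.default_rng(0)
n, true_lag = 500, 12
driver = rng.standard_normal(n + true_lag)
seed_pop = driver[true_lag:]                          # hypothetical seed series
core_pop = driver[:n] + 0.3 * rng.standard_normal(n)  # seed delayed by true_lag, plus noise

corrs = lagged_xcorr(seed_pop, core_pop, max_lag=24)
best_lag = max(corrs, key=corrs.get)   # lag at which the correlation peaks
```

    With real data one would also correct for autocorrelation before interpreting the peak, but the lag-scan itself is just this loop.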

  14. Low-cost digital image processing at the University of Oklahoma

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.

    1981-01-01

    Computer-assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and depends on initial preprocessing of a LANDSAT computer-compatible tape using software developed for an IBM 370/158 computer. In-house preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general-purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and for image analysis using either of the two approaches to low-cost LANDSAT data processing are described.

  15. Analysis and discussion on the experimental data of electrolyte analyzer

    NASA Astrophysics Data System (ADS)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    In subsequent verification of electrolyte analyzers, we found that an instrument can achieve good repeatability and stability in repeated measurements over a short period of time, in line with the requirements of the verification regulation for linear error and cross-contamination rate. However, large indication errors are very common, and measurement results from different manufacturers differ greatly. To identify and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the resulting data statistically.
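    The two quantities at issue, repeatability and indication error, are simple to compute from repeated readings of a reference solution. The readings and reference value below are hypothetical, chosen only to illustrate the calculation:

```python
import numpy as np

# Hypothetical repeated measurements of a certified reference solution
# (nominal potassium concentration, mmol/L) on one electrolyte analyzer.
reference_value = 5.00
readings = np.array([5.08, 5.10, 5.07, 5.09, 5.11, 5.08])

# Repeatability: sample standard deviation of the repeated measurements.
repeatability = readings.std(ddof=1)

# Indication error: mean reading minus the certified reference value.
indication_error = readings.mean() - reference_value
```

    An analyzer can thus be tightly repeatable (small scatter) while still showing a large indication error (systematic bias), which is exactly the pattern the abstract reports.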

  16. Statistically Derived Subtypes and Associations with Cerebrospinal Fluid and Genetic Biomarkers in Mild Cognitive Impairment: A Latent Profile Analysis.

    PubMed

    Eppig, Joel S; Edmonds, Emily C; Campbell, Laura; Sanderson-Cimino, Mark; Delano-Wood, Lisa; Bondi, Mark W

    2017-08-01

    Research demonstrates heterogeneous neuropsychological profiles among individuals with mild cognitive impairment (MCI). However, few studies have included visuoconstructional ability or used latent mixture modeling to statistically identify MCI subtypes. Therefore, we examined whether unique neuropsychological MCI profiles could be ascertained using latent profile analysis (LPA), and subsequently investigated cerebrospinal fluid (CSF) biomarkers, genotype, and longitudinal clinical outcomes between the empirically derived classes. A total of 806 participants diagnosed by means of the Alzheimer's Disease Neuroimaging Initiative (ADNI) MCI criteria received a comprehensive neuropsychological battery assessing visuoconstructional ability, language, attention/executive function, and episodic memory. Test scores were adjusted for demographic characteristics using standardized regression coefficients based on "robust" normal control performance (n=260). Calculated Z-scores were subsequently used in the LPA, and CSF-derived biomarkers, genotype, and longitudinal clinical outcome were evaluated between the LPA-derived MCI classes. Statistical fit indices suggested a 3-class model was the optimal LPA solution. The three-class LPA consisted of a mixed impairment MCI class (n=106), an amnestic MCI class (n=455), and an LPA-derived normal class (n=245). Additionally, the amnestic and mixed classes were more likely to be apolipoprotein e4+ and have worse Alzheimer's disease CSF biomarkers than LPA-derived normal subjects. Our study supports significant heterogeneity in MCI neuropsychological profiles using LPA and extends prior work (Edmonds et al., 2015) by demonstrating a lower rate of progression in the approximately one-third of ADNI MCI individuals who may represent "false-positive" diagnoses. Our results underscore the importance of using sensitive, actuarial methods for diagnosing MCI, as current diagnostic methods may be over-inclusive. (JINS, 2017, 23, 564-576).
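    Latent profile analysis is a finite mixture model over continuous indicators, so its mechanics can be sketched with a diagonal-covariance Gaussian mixture and BIC-based class selection. The z-scores below are simulated stand-ins for demographically adjusted neuropsychological scores, not ADNI data, and sklearn's GaussianMixture stands in for dedicated LPA software:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical adjusted z-scores on 4 neuropsychological measures for three
# latent classes (normal, amnestic, mixed impairment), 200 subjects each.
normal = rng.normal([0.0, 0.0, 0.0, 0.0], 0.5, size=(200, 4))
amnestic = rng.normal([0.0, 0.0, 0.0, -2.0], 0.5, size=(200, 4))
mixed = rng.normal([-2.0, -2.0, -2.0, -2.0], 0.5, size=(200, 4))
z_scores = np.vstack([normal, amnestic, mixed])

# Fit mixtures with 1-5 classes; BIC plays the role of the statistical fit
# indices used to choose the optimal LPA solution.
bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         n_init=5, random_state=0).fit(z_scores)
    bics[k] = gm.bic(z_scores)
best_k = min(bics, key=bics.get)
```

    With well-separated simulated classes the BIC minimum falls at three components, mirroring the 3-class solution reported in the study.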

  17. From random microstructures to representative volume elements

    NASA Astrophysics Data System (ADS)

    Zeman, J.; Šejnoha, M.

    2007-06-01

    A unified treatment of random microstructures proposed in this contribution opens the way to efficient solutions of large-scale real-world problems. The paper introduces the notion of a statistically equivalent periodic unit cell (SEPUC) that replaces, in a computational step, the actual complex geometries on an arbitrary scale. A SEPUC is constructed such that its morphology conforms with images of real microstructures. Here, the well-known two-point probability function and the lineal path function are employed to classify, from the statistical point of view, the geometrical arrangement of various material systems. Examples of statistically equivalent unit cells constructed for a unidirectional fibre tow, a plain weave textile composite and an irregular-coursed masonry wall are given. A specific result promoting the applicability of the SEPUC as a tool for the derivation of homogenized effective properties that are subsequently used in an independent macroscopic analysis is also presented.
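    For a periodic binary microstructure, the two-point probability function S2(r), the probability that two points separated by r both fall in the same phase, can be evaluated by FFT autocorrelation. This is a generic sketch on a random synthetic image, not the paper's implementation:

```python
import numpy as np

def two_point_probability(img):
    """S2 for a periodic binary image via FFT autocorrelation: S2(r) is the
    probability that two points separated by r both lie in the phase == 1."""
    f = np.fft.fft2(img)
    return np.fft.ifft2(f * np.conj(f)).real / img.size

rng = np.random.default_rng(0)
# Hypothetical binary microstructure: random 1-phase, volume fraction ~0.3.
img = (rng.random((128, 128)) < 0.3).astype(float)
s2 = two_point_probability(img)

vf = img.mean()
# Sanity limits any valid S2 must satisfy:
#   s2[0, 0] equals the volume fraction;
#   at large separations S2 decays to vf ** 2 (no long-range order here).
```

    Matching this curve (and the lineal path function) between the SEPUC and images of the real material is what "statistically equivalent" means in this context.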

  18. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  19. [Pathogenetic therapy of mastopathies in the prevention of breast cancer].

    PubMed

    Iaritsyn, S S; Sidorenko, L N

    1979-01-01

    The breast cancer morbidity among the population of the city of Leningrad was analysed, showing a tendency toward an increasing number of breast cancer patients. In this respect, attention is given to the prophylactic measures carried out at the Leningrad City oncological dispensary. As proved statistically, pathogenetic therapy of mastopathy is a factor contributing to a lower risk of malignant transformation. For the statistical analysis the authors used data from 132 breast cancer patients previously operated on for local fibroadenomatosis, and from 259 control patients. Among the patients with fibroadenomatosis who subsequently developed cancer of the mammary gland, the proportion of untreated patients was 2.8 times that in the control group.

  20. The Role of Remote Sensing in Assessing Forest Biomass in Appalachian South Carolina

    NASA Technical Reports Server (NTRS)

    Shain, W.; Nix, L.

    1982-01-01

    Information is presented on the use of color infrared aerial photographs and ground sampling methods to quantify standing forest biomass in Appalachian South Carolina. Local tree biomass equations are given and subsequent evaluation of stand density and size classes using remote sensing methods is presented. Methods of terrain analysis, environmental hazard rating, and subsequent determination of accessibility of forest biomass are discussed. Computer-based statistical analyses are used to expand individual cover-type specific ground sample data to area-wide cover type inventory figures based on aerial photographic interpretation and area measurement. Forest biomass data are presented for the study area in terms of discriminant size classes, merchantability limits, accessibility (as related to terrain and yield/harvest constraints), and potential environmental impact of harvest.

  1. Experience in the management of ECMO therapy as a mortality risk factor.

    PubMed

    Guilló Moreno, V; Gutiérrez Martínez, A; Romero Berrocal, A; Sánchez Castilla, M; García-Fernández, J

    2018-02-01

    The extracorporeal membrane oxygenation (ECMO) system provides circulatory and respiratory assistance to patients in cardiac or respiratory failure refractory to conventional treatment. It is a therapy with numerous associated complications and high mortality; multidisciplinary management and experienced teams increase survival. Our purpose was to evaluate and analyse the effect of the learning curve on mortality. We performed a retrospective, observational study of 31 patients from January 2012 to December 2015. Patients were separated into 2 periods, divided by the establishment of an ECMO protocol. We compared quantitative variables with the Mann-Whitney U test; for categorical qualitative variables we used the chi-square test or the Fisher exact statistic, as appropriate. Survival curves were computed using the Kaplan-Meier method, with statistical significance assessed by the log-rank test. Data analysis was performed with Stata 14. Survival curves show a tendency toward lower mortality in the subsequent period (P=0.0601). The overall mortality rate in the initial period was higher than in the subsequent period (P=0.042). In a further analysis, we compared the characteristics of the 2 groups and concluded that they were homogeneous. The degree of experience is an independent factor for mortality. The application of a care protocol is fundamental to facilitate the management of ECMO therapy. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.

  2. Analysis of Loss-of-Offsite-Power Events 1997-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Nancy Ellen; Schroeder, John Alton

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that did not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer; switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  3. A change in humidification system can eliminate endotracheal tube occlusion.

    PubMed

    Doyle, Alex; Joshi, Manasi; Frank, Peter; Craven, Thomas; Moondi, Parvez; Young, Peter

    2011-12-01

    Inadequate airway humidification can result in endotracheal tube occlusion. There is evidence that heat and moisture exchangers (HMEs) are more prone to endotracheal tube occlusion than heated humidifiers (HHs) that contain a heated wire circuit. We aimed to compare the incidence of endotracheal tube occlusion while introducing a new dual-heated wire circuit HH in place of an established hydrophobic HME. This was a prospective observational study. All patients who required intubation were included in our analysis. Univariate statistical analysis was performed using a Fisher exact test. P < .05 was considered statistically significant. There were 158 patients in the HME group and 88 patients in the HH group. The incidence of endotracheal tube occlusion was 5.7% in the HME group and 0% in the HH group. Statistical analysis revealed a significant difference between the 2 groups (P = .02). In light of this finding, we changed our practice to provide humidification exclusively by HH. In the subsequent 18-month period, there were no further episodes of endotracheal tube occlusion. Our study demonstrates that there is a significant increase in the incidence of endotracheal tube occlusion when using a hydrophobic HME compared with an HH and that using a dual-heated wire circuit HH can eliminate endotracheal tube occlusion. Copyright © 2011 Elsevier Inc. All rights reserved.
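    The Fisher exact comparison reported above comes down to a 2x2 contingency table. The sketch below reconstructs it from the reported figures (assuming 5.7% of 158 HME patients means 9 occlusions, versus 0 of 88 HH patients); the exact counts are inferred, not quoted:

```python
from scipy.stats import fisher_exact

# 2x2 table reconstructed from the abstract (counts are inferred):
#                 occluded   not occluded
table = [[9, 149],   # HME group (n = 158, 5.7% occluded)
         [0, 88]]    # HH group  (n = 88, 0% occluded)

odds_ratio, p_value = fisher_exact(table)
# p_value lands near the reported P = .02, i.e. below the .05 threshold.
```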

  4. Statistical analysis of the effect of temperature and inlet humidities on the parameters of a semiempirical model of the internal resistance of a polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.
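    The workflow described, fit a resistance-versus-current model per operating condition, then test whether operating factors shift the fitted parameters, can be sketched with synthetic data. The linear model and all numbers below are placeholders for the Springer-type semiempirical form and the real measurements:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import f_oneway

# Stand-in for the semiempirical model: R(I) = r0 + a * I.
def resistance(I, r0, a):
    return r0 + a * I

rng = np.random.default_rng(1)
currents = np.linspace(1, 10, 10)

# Simulate 5 replicate R-vs-I curves at three temperatures; the true ohmic
# term r0 is assumed to drop with temperature (hypothetical dependence).
fitted_r0 = {T: [] for T in (40, 60, 80)}
for T in fitted_r0:
    for _ in range(5):
        true_r0 = 0.20 - 0.001 * T
        R = resistance(currents, true_r0, 0.004) + rng.normal(0, 0.002, 10)
        (r0_hat, a_hat), _ = curve_fit(resistance, currents, R)
        fitted_r0[T].append(r0_hat)

# One-way ANOVA on the fitted parameter: does temperature have a
# statistically significant effect on r0?
F, p = f_oneway(*fitted_r0.values())
```

    The paper's full analysis is a multi-factor ANOVA (temperature and both inlet humidities) followed by a response surface fit, but each factor test has this same fit-then-ANOVA shape.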

  5. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop, held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. The submissions were propagated to the ground and noise levels computed, allowing the grid convergence and the statistical distribution of the noise levels to be examined. While progress since the first workshop is documented, improvements to the analysis methods are suggested for a possible subsequent workshop. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness, and thus potentially lower annoyance, than the first workshop cases. These quieter models also exposed weaknesses in the analysis, particularly in convective discretization.

  6. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming’s many instructional visits to Japan. He wrote the Guide to Quality Control, which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes ... job data are essential for making a proper evaluation (Ishikawa, p. 14). The gathering of data and its subsequent analysis are the foundation of ...

  7. Asymptotic modal analysis and statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Dowell, Earl H.

    1988-01-01

    Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focuses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December, 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically, as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.

  8. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  9. A method for the measurement and analysis of ride vibrations of transportation systems

    NASA Technical Reports Server (NTRS)

    Catherines, J. J.; Clevenson, S. A.; Scholl, H. F.

    1972-01-01

    The measurement and recording of ride vibrations which affect passenger comfort in transportation systems and the subsequent data-reduction methods necessary for interpreting the data present exceptional instrumentation requirements and necessitate the use of computers for specialized analysis techniques. A method is presented for both measuring and analyzing ride vibrations of the type encountered in ground and air transportation systems. A portable system for measuring and recording low-frequency, low-amplitude accelerations and specialized data-reduction procedures are described. Sample vibration measurements in the form of statistical parameters representative of typical transportation systems are also presented to demonstrate the utility of the techniques.

  10. Efficacy of a randomized cell phone-based counseling intervention in postponing subsequent pregnancy among teen mothers.

    PubMed

    Katz, Kathy S; Rodan, Margaret; Milligan, Renee; Tan, Sylvia; Courtney, Lauren; Gantz, Marie; Blake, Susan M; McClain, Lenora; Davis, Maurice; Kiely, Michele; Subramanian, Siva

    2011-12-01

    Adolescent mothers in Washington, DC have a high rate of subsequent teen pregnancies, often within 24 months. Children of teen mothers are at risk for adverse psychosocial outcomes. When adolescents are strongly attached to parents, schools, and positive peers, they may be less likely to repeat a pregnancy. This study tested the efficacy of a counseling intervention delivered by cell phone and focused on postponing subsequent teen pregnancies by strengthening healthy relationships, reproductive practices, and positive youth assets. The objective of this study was to compare time to a repeat pregnancy between the intervention and usual care groups and, secondarily, to determine whether treatment intensity influenced time to subsequent conception. Primiparous pregnant teens ages 15-19 were recruited in Washington, DC. Of 849 teens screened, 29.3% (n = 249) met inclusion criteria, consented to participate, and completed baseline measures. They were then randomized to the intervention (N = 124) or to usual care (N = 125). Intervention group teens received cell phones for 18 months of counseling sessions, plus quarterly group sessions. Follow-up measures assessed subsequent pregnancy through 24 months post-delivery. A survival analysis compared time to subsequent conception in the two treatment groups; additional models examined the effect of treatment intensity. By 24 months, 31% of the intervention and 36% of usual care group teens had a subsequent pregnancy. Group differences were not statistically significant in the intent-to-treat analysis. Because there was variability in the degree of exposure of teens to the curriculum, a survival analysis accounting for treatment intensity was performed, and a significant interaction with age was detected. Participants who were aged 15-17 years at delivery showed a significant reduction in subsequent pregnancy with increased levels of intervention exposure (P < 0.01), but those ≥ 18 years did not. Adolescents ≥ 18 years faced considerable challenges to treatment success. Individual, social, and contextual factors are all important to consider in the prevention of repeat teen pregnancy. Cell phone-based approaches to counseling may not be ideal for addressing complex, socially mediated behaviors such as this, except for selective subgroups. A lack of community resources for older teens may interfere with program success.
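    A two-group survival comparison of this kind is typically a log-rank test. The sketch below implements the standard log-rank statistic from scratch (numpy only) and applies it to hypothetical months-to-repeat-conception data with censoring at 24 months; the numbers are illustrative, not study data:

```python
import numpy as np
from scipy.stats import chi2

def logrank_test(time1, event1, time2, event2):
    """Two-sample log-rank test; returns (chi-square statistic, p-value)."""
    t1, e1 = np.asarray(time1, float), np.asarray(event1, bool)
    t2, e2 = np.asarray(time2, float), np.asarray(event2, bool)
    times = np.unique(np.concatenate([t1[e1], t2[e2]]))  # distinct event times
    o_minus_e, var = 0.0, 0.0
    for t in times:
        n1, n2 = (t1 >= t).sum(), (t2 >= t).sum()        # numbers at risk
        d1 = ((t1 == t) & e1).sum()                      # events in group 1
        d2 = ((t2 == t) & e2).sum()
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        o_minus_e += d1 - d * n1 / n                     # observed - expected
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    stat = o_minus_e ** 2 / var if var > 0 else 0.0
    return stat, chi2.sf(stat, df=1)

# Hypothetical months to repeat conception (event=1 conceived, 0 censored at 24).
interv = [24, 24, 18, 24, 22, 24, 24, 15, 24, 24]
interv_evt = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
usual = [10, 12, 24, 9, 16, 24, 11, 20, 24, 14]
usual_evt = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]

stat, p = logrank_test(interv, interv_evt, usual, usual_evt)
```

    Production analyses would use a vetted library (e.g. the log-rank test in lifelines or R's survival package); the manual version just makes the observed-minus-expected bookkeeping explicit.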

  11. Professional Development in Statistics, Technology, and Cognitively Demanding Tasks: Classroom Implementation and Obstacles

    ERIC Educational Resources Information Center

    Foley, Gregory D.; Khoshaim, Heba Bakr; Alsaeed, Maha; Er, S. Nihan

    2012-01-01

    Attending professional development programmes can support teachers in applying new strategies for teaching mathematics and statistics. This study investigated (a) the extent to which the participants in a professional development programme subsequently used the techniques they had learned when teaching mathematics and statistics and (b) the…

  12. 49 CFR 369.11 - Quarterly reports of passenger revenues, expenses, and statistics.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 5 2011-10-01 2011-10-01 false Quarterly reports of passenger revenues, expenses, and statistics. 369.11 Section 369.11 Transportation Other Regulations Relating to Transportation..., and statistics. Commencing with reports for the quarter ended March 31, 1968, and for subsequent...

  13. 49 CFR 369.11 - Quarterly reports of passenger revenues, expenses, and statistics.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Quarterly reports of passenger revenues, expenses, and statistics. 369.11 Section 369.11 Transportation Other Regulations Relating to Transportation..., and statistics. Commencing with reports for the quarter ended March 31, 1968, and for subsequent...

  14. Kinetic analysis of single molecule FRET transitions without trajectories

    NASA Astrophysics Data System (ADS)

    Schrangl, Lukas; Göhring, Janett; Schütz, Gerhard J.

    2018-03-01

    Single molecule Förster resonance energy transfer (smFRET) is a popular tool to study biological systems that undergo topological transitions on the nanometer scale. smFRET experiments typically require recording long smFRET trajectories and subsequent statistical analysis to extract parameters such as the states' lifetimes. Alternatively, analysis of probability distributions exploits the shapes of smFRET distributions at well-chosen exposure times and hence works without the acquisition of time traces. Here, we describe a variant that utilizes statistical tests to compare experimental datasets with Monte Carlo simulations. For a given model, parameters are varied to cover the full realistic parameter space. As output, the method yields p-values that quantify the likelihood of each parameter setting being consistent with the experimental data. The method provides suitable results even if the actual lifetimes differ by an order of magnitude, and we also demonstrate its robustness to inaccurately determined input parameters. As proof of concept, the new method was applied to the determination of transition rate constants for Holliday junctions.
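    The scan-parameters-and-test idea can be illustrated with exponential dwell times. Here a Kolmogorov-Smirnov test against the model CDF stands in for the paper's Monte Carlo simulation plus statistical test, and the "experimental" data are simulated with a known rate:

```python
import numpy as np
from scipy.stats import expon, kstest

rng = np.random.default_rng(42)

# Hypothetical "experimental" dwell times of one FRET state; we pretend not
# to know that the true rate constant is 1.0 /s.
observed = rng.exponential(scale=1.0, size=2000)

# Scan candidate rate constants; for each, quantify agreement between the
# observed dwell times and the model distribution for that parameter.
candidate_rates = [0.5, 1.0, 3.0]
distance = {}
for k in candidate_rates:
    d, p = kstest(observed, expon(scale=1.0 / k).cdf)
    distance[k] = d

# The best-supported parameter minimizes the KS distance (equivalently,
# maximizes the p-value); wrong rates are clearly rejected.
best_rate = min(distance, key=distance.get)
```

    The published method compares against Monte Carlo simulations of the full photon-emission process rather than a closed-form CDF, but the parameter-scan-with-test structure is the same.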

  15. A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.

    PubMed

    Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain

    2015-10-01

    Clustering is a set of statistical learning techniques aimed at partitioning heterogeneous data into homogeneous groups called clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance, and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, real-world problems, especially in medicine, often involve heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies for handling quantitative and qualitative data simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.

  16. Length bias correction in gene ontology enrichment analysis using logistic regression.

    PubMed

    Mi, Gu; Di, Yanming; Emerson, Sarah; Cumbie, Jason S; Chang, Jeff H

    2012-01-01

    When assessing differential gene expression from RNA sequencing data, commonly used statistical tests tend to have greater power to detect differential expression of genes encoding longer transcripts. This phenomenon, called "length bias", will influence subsequent analyses such as Gene Ontology enrichment analysis. In the presence of length bias, Gene Ontology categories that include longer genes are more likely to be identified as enriched. These categories, however, are not necessarily biologically more relevant. We show that one can effectively adjust for length bias in Gene Ontology analysis by including transcript length as a covariate in a logistic regression model. The logistic regression model makes the statistical issue underlying length bias more transparent: transcript length becomes a confounding factor when it correlates with both the Gene Ontology membership and the significance of the differential expression test. The inclusion of the transcript length as a covariate allows one to investigate the direct correlation between the Gene Ontology membership and the significance of testing differential expression, conditional on the transcript length. We present both real and simulated data examples to show that the logistic regression approach is simple, effective, and flexible.
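
The covariate-adjustment idea here is simple to illustrate: length bias is a confounding problem, and conditioning on (log) transcript length in the logistic regression removes it. Below is a minimal pure-Python sketch on synthetic data, not the authors' implementation; the data-generating assumptions (GO membership and the differential-expression call both driven by log length, but not by each other) are hypothetical and chosen to mimic pure length bias.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=1000):
    """Plain batch gradient-ascent logistic regression; beta[0] is the intercept."""
    n, p = len(X), len(X[0])
    beta = [0.0] * (p + 1)
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = beta[0] + sum(b * x for b, x in zip(beta[1:], xi))
            err = yi - 1.0 / (1.0 + math.exp(-z))
            grad[0] += err
            for j, x in enumerate(xi):
                grad[j + 1] += err * x
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta

# Synthetic data: both GO membership and the DE call depend on log transcript
# length, but not on each other -- i.e., pure length bias.
rng = random.Random(0)
X, y = [], []
for _ in range(400):
    log_len = rng.gauss(0.0, 1.0)
    de = 1.0 if rng.random() < 1.0 / (1.0 + math.exp(-2.0 * log_len)) else 0.0
    member = 1.0 if rng.random() < 1.0 / (1.0 + math.exp(-2.0 * log_len)) else 0.0
    X.append([de, log_len])
    y.append(member)

beta_naive = fit_logistic([[row[0]] for row in X], y)  # membership ~ DE only
beta_adj = fit_logistic(X, y)                          # membership ~ DE + log length
# Conditioning on length shrinks the spurious DE coefficient toward zero.
```

The naive model attributes a sizable positive coefficient to the DE indicator even though it has no direct effect; adding the length covariate, as the paper proposes, collapses that coefficient.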

  17. The extraction and integration framework: a two-process account of statistical learning.

    PubMed

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved

  18. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    NASA Astrophysics Data System (ADS)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based, open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, amongst other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.

  19. Natural variation reveals relationships between pre-stress carbohydrate nutritional status and subsequent responses to xenobiotic and oxidative stress in Arabidopsis thaliana.

    PubMed

    Ramel, Fanny; Sulmon, Cécile; Gouesbet, Gwenola; Couée, Ivan

    2009-12-01

    Soluble sugars are involved in responses to stress, and act as signalling molecules that activate specific or hormone cross-talk transduction pathways. Thus, exogenous sucrose treatment efficiently induces tolerance to the herbicide atrazine in Arabidopsis thaliana plantlets, at least partially through large-scale modifications of expression of stress-related genes. Availability of sugars in planta for stress responses is likely to depend on complex dynamics of soluble sugar accumulation, sucrose-starch partition and organ allocation. The question of potential relationships between endogenous sugar levels and stress responses to atrazine treatment was investigated through analysis of natural genetic accessions of A. thaliana. Parallel quantitative and statistical analysis of biochemical parameters and of stress-sensitive physiological traits was carried out on a set of 11 accessions. Substantial natural variation was found between accessions of A. thaliana in pre-stress shoot endogenous sugar levels and in the responses of plantlets to subsequent atrazine stress. Moreover, consistent trends and statistically significant correlations were detected between specific endogenous sugar parameters, such as the pre-stress end-of-day sucrose level in shoots, and physiological markers of atrazine tolerance. These significant relationships between endogenous carbohydrate metabolism and stress response therefore point to an important integration of carbon nutritional status and induction of stress tolerance in plants. The specific correlation between pre-stress sucrose level and greater atrazine tolerance may reflect adaptive mechanisms that link sucrose accumulation, photosynthesis-related stress and sucrose induction of stress defences.

  20. The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis

    NASA Astrophysics Data System (ADS)

    Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali

    2018-04-01

    The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.

  1. Waiting time distribution revealing the internal spin dynamics in a double quantum dot

    NASA Astrophysics Data System (ADS)

    Ptaszyński, Krzysztof

    2017-07-01

    Waiting time distribution and the zero-frequency full counting statistics of unidirectional electron transport through a double quantum dot molecule attached to spin-polarized leads are analyzed using the quantum master equation. The waiting time distribution exhibits a nontrivial dependence on the value of the exchange coupling between the dots and the gradient of the applied magnetic field, which reveals the oscillations between the spin states of the molecule. The zero-frequency full counting statistics, on the other hand, is independent of the aforementioned quantities, thus giving no insight into the internal dynamics. The fact that the waiting time distribution and the zero-frequency full counting statistics provide nonequivalent information is associated with two factors. Firstly, it can be explained by their sensitivity to different timescales of the system dynamics. Secondly, it is associated with the presence of correlations between subsequent waiting times, which make the renewal theory, relating the full counting statistics and the waiting time distribution, no longer applicable. The study highlights the particular usefulness of the waiting time distribution for the analysis of the internal dynamics of mesoscopic systems.
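
The renewal-theory link mentioned above is concrete: for a renewal process (uncorrelated waiting times), the zero-frequency Fano factor of the counting statistics equals the squared coefficient of variation of the waiting-time distribution; correlations between subsequent waiting times break this identity. A toy numerical check of the renewal case, where both statistics are ≈ 1 for exponential (Poissonian) waiting times:

```python
import random

def waiting_time_cv2(taus):
    """Squared coefficient of variation of the waiting times: Var(tau)/<tau>^2."""
    m = sum(taus) / len(taus)
    v = sum((t - m) ** 2 for t in taus) / len(taus)
    return v / m ** 2

def count_fano(taus, window):
    """Fano factor estimated from event counts in consecutive fixed windows."""
    counts, t, n, edge = [], 0.0, 0, window
    for tau in taus:
        t += tau
        while t > edge:          # close finished windows (including empty ones)
            counts.append(n)
            n, edge = 0, edge + window
        n += 1
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return var / mean

rng = random.Random(7)
taus = [rng.expovariate(1.0) for _ in range(20000)]  # renewal: uncorrelated waits
cv2 = waiting_time_cv2(taus)
fano = count_fano(taus, 100.0)
# For this renewal process both statistics are close to 1; introducing
# correlations between successive taus would make them disagree.
```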

  2. Do cognitive attributions for smoking predict subsequent smoking development?

    PubMed

    Guo, Qian; Unger, Jennifer B; Azen, Stanley P; MacKinnon, David P; Johnson, C Anderson

    2012-03-01

    To develop more effective anti-smoking programs, it is important to understand the factors that influence people to smoke. Guided by attribution theory, a longitudinal study was conducted to investigate how individuals' cognitive attributions for smoking were associated with subsequent smoking development and through which pathways. Middle and high school students in seven large cities in China (N=12,382; 48.5% boys and 51.5% girls) completed two annual surveys. Associations between cognitive attributions for smoking and subsequent smoking initiation and progression were tested with multilevel analysis, taking into account plausible moderation effects of gender and baseline smoking status. Mediation effects of susceptibility to smoking were investigated using statistical mediation analysis (MacKinnon, 2008). Six out of eight tested themes of cognitive attributions were associated with subsequent smoking development. Curiosity (β=0.11, p<0.001) and autonomy (β=0.08, p=0.019) were associated with smoking initiation among baseline non-smokers. Coping (β=0.07, p<0.001) and social image (β=0.10, p<0.0001) were associated with smoking progression among baseline lifetime smokers. Social image (β=0.05, p=0.043), engagement (β=0.07, p=0.003), and mental enhancement (β=0.15, p<0.001) were associated with smoking progression among baseline past 30-day smokers. More attributions were associated with smoking development among males than among females. Susceptibility to smoking partially mediated most of the associations, with the proportion of mediated effects ranging from 4.3% to 30.8%. This study identifies the roles that cognitive attributions for smoking play in subsequent smoking development. These attributions could be addressed in smoking prevention programs. Published by Elsevier Ltd.
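
The "proportion mediated" figures quoted above come from the product-of-coefficients approach (MacKinnon, 2008): the indirect effect a·b divided by the total effect a·b + c′. A minimal illustration with hypothetical path coefficients (not values from the study):

```python
def proportion_mediated(a, b, c_prime):
    """Product-of-coefficients mediation: share of the total effect of X on Y
    that flows through the mediator M (indirect / total)."""
    indirect = a * b            # X -> M path (a) times M -> Y given X path (b)
    total = indirect + c_prime  # total effect = indirect + direct
    return indirect / total

# Hypothetical paths: attribution -> susceptibility (a),
# susceptibility -> smoking (b), direct attribution -> smoking (c').
share = proportion_mediated(a=0.30, b=0.20, c_prime=0.18)
# share = 0.06 / 0.24 = 0.25, i.e. 25% of the effect is mediated
```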

  3. Do Cognitive Attributions for Smoking Predict Subsequent Smoking Development?

    PubMed Central

    Guo, Qian; Unger, Jennifer B.; Azen, Stanley P.; MacKinnon, David P.; Johnson, C. Anderson

    2011-01-01

    To develop more effective anti-smoking programs, it is important to understand the factors that influence people to smoke. Guided by attribution theory, a longitudinal study was conducted to investigate how individuals’ cognitive attributions for smoking were associated with subsequent smoking development and through which pathways. Middle and high school students in seven large cities in China (N=12,382; 48.5% boys and 51.5% girls) completed two annual surveys. Associations between cognitive attributions for smoking and subsequent smoking initiation and progression were tested with multilevel analysis, taking into account plausible moderation effects of gender and baseline smoking status. Mediation effects of susceptibility to smoking were investigated using statistical mediation analysis (MacKinnon, 2008). Six out of eight tested themes of cognitive attributions were associated with subsequent smoking development. Curiosity (β=0.11, p<0.001) and autonomy (β=0.08, p=0.019) were associated with smoking initiation among baseline non-smokers. Coping (β=0.07, p<0.001) and social image (β=0.10, p<0.0001) were associated with smoking progression among baseline lifetime smokers. Social image (β=0.05, p=0.043), engagement (β=0.07, p=0.003), and mental enhancement (β=0.15, p<0.001) were associated with smoking progression among baseline past 30-day smokers. More attributions were associated with smoking development among males than among females. Susceptibility to smoking partially mediated most of the associations, with the proportion of mediated effects ranging from 4.3% to 30.8%. This study identifies the roles that cognitive attributions for smoking play in subsequent smoking development. These attributions could be addressed in smoking prevention programs. PMID:22112425

  4. Wash-out in N{sub 2}-dominated leptogenesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hahn-Woernle, F., E-mail: fhahnwo@mppmu.mpg.de

    2010-08-01

    We study the wash-out of a cosmological baryon asymmetry produced via leptogenesis by subsequent interactions. To this end, we focus on a scenario in which a lepton asymmetry is established in the out-of-equilibrium decays of the next-to-lightest right-handed neutrino. We apply the full classical Boltzmann equations, without the assumption of kinetic equilibrium and including all quantum statistical factors, to calculate the wash-out of the lepton asymmetry by interactions of the lightest right-handed state. We include scattering processes with top quarks in our analysis. This is of particular interest since the wash-out is enhanced by scatterings and by the use of mode equations with quantum statistical distribution functions. In this way we provide a restriction on the parameter space for this scenario.

  5. Automated Tracking of Cell Migration with Rapid Data Analysis.

    PubMed

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc.
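
The migratory statistics listed in this record (speed, distance, displacement, persistence, FMI) are straightforward to compute once coordinates are tracked. A small sketch on a hypothetical nucleus track, not the GUI's actual code; here persistence is taken as net displacement over total path length, and FMI as net axis displacement over path length:

```python
import math

def migration_stats(track, dt):
    """Basic migration metrics from a list of (x, y) positions sampled every dt."""
    steps = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(track, track[1:])]
    path = sum(math.hypot(dx, dy) for dx, dy in steps)  # total distance travelled
    disp_x = track[-1][0] - track[0][0]
    disp_y = track[-1][1] - track[0][1]
    displacement = math.hypot(disp_x, disp_y)           # start-to-end distance
    return {
        "speed": path / (dt * len(steps)),
        "path_length": path,
        "displacement": displacement,
        "persistence": displacement / path,
        "fmi_x": disp_x / path,
        "fmi_y": disp_y / path,
    }

# A short hypothetical track (positions in micrometers, one frame every 5 min):
track = [(0, 0), (3, 4), (6, 8), (9, 12)]
stats = migration_stats(track, dt=5.0)
# Straight-line track: persistence = 1.0, path_length = 15.0, speed = 1.0 um/min
```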

  6. Upside/Downside statistical mechanics of nonequilibrium Brownian motion. I. Distributions, moments, and correlation functions of a free particle.

    PubMed

    Craven, Galen T; Nitzan, Abraham

    2018-01-28

    Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
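
The upside/downside selective analysis described here can be illustrated on a toy trajectory: classify each energy increment by its sign and compute moments of the two subsets separately. This sketch uses a plain Gaussian random walk as a stand-in for the Brownian energy process; it is a schematic of the idea, not the paper's formalism:

```python
import random

def upside_downside_moments(energy):
    """Split energy increments into upside (E increases) and downside
    (E decreases) sets and return (mean, variance) for each."""
    ups = [e2 - e1 for e1, e2 in zip(energy, energy[1:]) if e2 > e1]
    downs = [e2 - e1 for e1, e2 in zip(energy, energy[1:]) if e2 < e1]

    def moments(xs):
        m = sum(xs) / len(xs)
        return m, sum((x - m) ** 2 for x in xs) / len(xs)

    return moments(ups), moments(downs)

# Toy "energy" trajectory of a free particle: a Gaussian random walk.
rng = random.Random(1)
energy = [0.0]
for _ in range(10000):
    energy.append(energy[-1] + rng.gauss(0.0, 1.0))

(up_mean, up_var), (down_mean, down_var) = upside_downside_moments(energy)
# Conditioning on the sign of the increment makes the two means nonzero and
# opposite in sign, even though the unconditioned increments average to ~0.
```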

  7. Upside/Downside statistical mechanics of nonequilibrium Brownian motion. I. Distributions, moments, and correlation functions of a free particle

    NASA Astrophysics Data System (ADS)

    Craven, Galen T.; Nitzan, Abraham

    2018-01-01

    Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.

  8. Principal component analysis and neurocomputing-based models for total ozone concentration over different urban regions of India

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Goutami; Chattopadhyay, Surajit; Chakraborthy, Parthasarathi

    2012-07-01

    The present study deals with daily total ozone concentration time series over four metro cities of India, namely Kolkata, Mumbai, Chennai, and New Delhi, in a multivariate setting. Using the Kaiser-Meyer-Olkin measure, it is established that the data set under consideration is suitable for principal component analysis. Subsequently, by introducing a rotated component matrix for the principal components, the predictors suitable for generating an artificial neural network (ANN) for daily total ozone prediction are identified. The multicollinearity is removed in this way. Models of ANN in the form of a multilayer perceptron trained through backpropagation learning are generated for all of the study zones, and the model outcomes are assessed statistically. Measuring various statistics like Pearson correlation coefficients, Willmott's indices, percentage errors of prediction, and mean absolute errors, it is observed that for Mumbai and Kolkata the proposed ANN model generates very good predictions. The results are supported by the linearly distributed coordinates in the scatterplots.
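
The assessment statistics named in this record (Pearson correlation, Willmott's index of agreement, mean absolute error) are standard and easy to reproduce. A sketch with hypothetical observed-versus-predicted total ozone values in Dobson units; the numbers are illustrative, not the study's data:

```python
import math

def pearson_r(obs, pred):
    """Pearson product-moment correlation between observations and predictions."""
    mo, mp = sum(obs) / len(obs), sum(pred) / len(pred)
    num = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    den = math.sqrt(sum((o - mo) ** 2 for o in obs) *
                    sum((p - mp) ** 2 for p in pred))
    return num / den

def willmott_d(obs, pred):
    """Willmott's index of agreement: 1 = perfect match, 0 = no agreement."""
    mo = sum(obs) / len(obs)
    sse = sum((p - o) ** 2 for o, p in zip(obs, pred))
    pot = sum((abs(p - mo) + abs(o - mo)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - sse / pot

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(p - o) for o, p in zip(obs, pred)) / len(obs)

# Hypothetical daily total-ozone observations vs. model output (Dobson units):
obs  = [260, 271, 255, 280, 265, 258]
pred = [262, 268, 257, 275, 266, 260]
r, d, err = pearson_r(obs, pred), willmott_d(obs, pred), mae(obs, pred)
# err = 2.5 DU for these values; r and d are both close to 1
```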

  9. Quantitative structure-activity relationships by neural networks and inductive logic programming. I. The inhibition of dihydrofolate reductase by pyrimidines

    NASA Astrophysics Data System (ADS)

    Hirst, Jonathan D.; King, Ross D.; Sternberg, Michael J. E.

    1994-08-01

    Neural networks and inductive logic programming (ILP) have been compared to linear regression for modelling the QSAR of the inhibition of E. coli dihydrofolate reductase (DHFR) by 2,4-diamino-5-(substituted benzyl)pyrimidines, and, in the subsequent paper [Hirst, J.D., King, R.D. and Sternberg, M.J.E., J. Comput.-Aided Mol. Design, 8 (1994) 421], the inhibition of rodent DHFR by 2,4-diamino-6,6-dimethyl-5-phenyl-dihydrotriazines. Cross-validation trials provide a statistically rigorous assessment of the predictive capabilities of the methods, with training and testing data selected randomly and all the methods developed using identical training data. For the ILP analysis, molecules are represented by attributes other than Hansch parameters. Neural networks and ILP perform better than linear regression using the attribute representation, but the difference is not statistically significant. The major benefit from the ILP analysis is the formulation of understandable rules relating the activity of the inhibitors to their chemical structure.

  10. Practicing Statistics by Creating Exercises for Fellow Students

    ERIC Educational Resources Information Center

    Bebermeier, Sarah; Reiss, Katharina

    2016-01-01

    This article outlines the execution of a workshop in which students were encouraged to actively review the course contents on descriptive statistics by creating exercises for their fellow students. In a first-year statistics course in psychology, 39 out of 155 students participated in the workshop. In a subsequent evaluation, the workshop was…

  11. An analysis of suicide trends in Scotland 1950-2014: comparison with England & Wales.

    PubMed

    Dougall, Nadine; Stark, Cameron; Agnew, Tim; Henderson, Rob; Maxwell, Margaret; Lambert, Paul

    2017-12-20

    Scotland has disproportionately high rates of suicide compared with England. An analysis of trends may help reveal whether rates appear driven more by birth cohort, period or age. A 'birth cohort effect' for England & Wales has been previously reported by Gunnell et al. (B J Psych 182:164-70, 2003). This study replicates this analysis for Scotland, makes comparisons between the countries, and provides information on 'vulnerable' cohorts. Suicide and corresponding general population data were obtained from the National Records of Scotland, 1950 to 2014. Age- and gender-specific mortality rates were estimated. Age, period and cohort patterns were explored graphically by trend analysis. A pattern was found whereby successive male birth cohorts born after 1940 experienced higher suicide rates, in increasingly younger age groups, echoing findings reported for England & Wales. Young men (aged 20-39) were found to have a marked and statistically significant increase in suicide between the 1960 and 1965 birth cohorts. The 1965 cohort's suicide rate peaked at ages 35-39, and the subsequent 1970 cohort peaked even younger, at ages 25-29; it is possible that these 1965 and 1970 cohorts are at greater mass vulnerability to suicide than earlier cohorts. This was reflected in data for England & Wales, but to a lesser extent. Suicide rates associated with male birth cohorts after 1975 were less severe, and not statistically significantly different from earlier cohorts, suggesting an amelioration of any possible 'cohort' effect. Scottish female suicide rates for all age groups converged and stabilised over time. Women have not been as affected as men, with less variation in patterns across birth cohorts and a much less convincing corresponding pattern suggestive of a 'cohort' effect. Trend analysis is useful in identifying 'vulnerable' cohorts, providing opportunities to develop suicide prevention strategies addressing these cohorts as they age.

  12. Mortality and long-term exposure to ambient air pollution: ongoing analyses based on the American Cancer Society cohort.

    PubMed

    Krewski, Daniel; Burnett, Richard; Jerrett, Michael; Pope, C Arden; Rainham, Daniel; Calle, Eugenia; Thurston, George; Thun, Michael

    This article provides an overview of previous analysis and reanalysis of the American Cancer Society (ACS) cohort, along with an indication of current ongoing analyses of the cohort with additional follow-up information through to 2000. Results of the first analysis conducted by Pope et al. (1995) showed that higher average sulfate levels were associated with increased mortality, particularly from cardiopulmonary disease. A reanalysis of the ACS cohort, undertaken by Krewski et al. (2000), found the original risk estimates for fine-particle and sulfate air pollution to be highly robust against alternative statistical techniques and spatial modeling approaches. A detailed investigation of covariate effects found a significant modifying effect of education, with the risk of mortality associated with fine particles declining with increasing educational attainment. Pope et al. (2002) subsequently reported the results of a study using an additional 10 yr of follow-up of the ACS cohort. This updated analysis included gaseous copollutant and new fine-particle measurements, more comprehensive information on occupational exposures, dietary variables, and the most recent developments in statistical modeling, integrating random effects and nonparametric spatial smoothing into the Cox proportional hazards model. Robust associations between ambient fine particulate air pollution and elevated risks of cardiopulmonary and lung cancer mortality were clearly evident, providing the strongest evidence to date that long-term exposure to fine particles is an important health risk. Current ongoing analysis using the extended follow-up information will explore the role of ecologic, economic, and demographic covariates in the particulate air pollution and mortality association. This analysis will also provide insight into the role of spatial autocorrelation at multiple geographic scales, and into whether critical instances in time of exposure to fine particles influence the risk of mortality from cardiopulmonary and lung cancer. Information on the influence of covariates at multiple scales and of critical exposure time windows can assist policymakers in establishing timelines for regulatory interventions that maximize population health benefits.

  13. "Suicide shall cease to be a crime": suicide and undetermined death trends 1970-2000 before and after the decriminalization of suicide in Ireland 1993.

    PubMed

    Osman, Mugtaba; Parnell, Andrew C; Haley, Clifford

    2017-02-01

    Suicide is criminalized in more than 100 countries around the world. A dearth of research exists into the effect of suicide legislation on suicide rates, and available statistics are mixed. This study investigates 10,353 suicide deaths in Ireland that took place between 1970 and 2000. Irish 1970-2000 annual suicide data were obtained from the Central Statistics Office and modelled via a negative binomial regression approach. We examined the effect of suicide legislation on different age groups and on both sexes. We used Bonferroni correction for multiple comparisons. Statistical analysis was performed using the R statistical package version 3.1.2. The coefficient for the effect of the suicide act on overall suicide deaths was -9.094 (95 % confidence interval (CI) -34.086 to 15.899), statistically non-significant (p = 0.476). The coefficient for the effect of the suicide act on undetermined deaths was statistically significant (p < 0.001) and was estimated to be -644.4 (95 % CI -818.6 to -469.9). The results of our study indicate that decriminalization of suicide is not associated with a significant increase in subsequent suicide deaths. However, undetermined death verdict rates dropped significantly following decriminalization.
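
The reported coefficient, confidence interval, and p-value in this record are internally consistent, which one can verify by backing out the Wald statistic from the interval under a normal approximation (SE ≈ CI width / 2·1.96):

```python
import math

def z_and_p_from_ci(coef, lo, hi, level_z=1.96):
    """Recover the approximate Wald z statistic and two-sided p-value from a
    point estimate and its 95% confidence interval (normal approximation)."""
    se = (hi - lo) / (2 * level_z)
    z = coef / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Figures from the abstract: overall-suicide coefficient -9.094,
# 95% CI -34.086 to 15.899, reported p = 0.476.
z, p = z_and_p_from_ci(-9.094, -34.086, 15.899)
# The CI straddles zero, so the effect is non-significant: p comes out ~0.476
```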

  14. Factors Associated With Surgery Clerkship Performance and Subsequent USMLE Step Scores.

    PubMed

    Dong, Ting; Copeland, Annesley; Gangidine, Matthew; Schreiber-Gregory, Deanna; Ritter, E Matthew; Durning, Steven J

    2018-03-12

    We conducted an in-depth empirical investigation to achieve a better understanding of the surgery clerkship from multiple perspectives, including the influence of clerkship sequence on performance, the relationship between self-logged work hours and performance, as well as the association between surgery clerkship performance with subsequent USMLE Step exams' scores. The study cohort consisted of medical students graduating between 2015 and 2018 (n = 687). The primary measures of interest were clerkship sequence (internal medicine clerkship before or after surgery clerkship), self-logged work hours during surgery clerkship, surgery NBME subject exam score, surgery clerkship overall grade, and Step 1, Step 2 CK, and Step 3 exam scores. We reported the descriptive statistics and conducted correlation analysis, stepwise linear regression analysis, and variable selection analysis of logistic regression to answer the research questions. Students who completed internal medicine clerkship prior to surgery clerkship had better performance on surgery subject exam. The subject exam score explained an additional 28% of the variance of the Step 2 CK score, and the clerkship overall score accounted for an additional 24% of the variance after the MCAT scores and undergraduate GPA were controlled. Our finding suggests that the clerkship sequence does matter when it comes to performance on the surgery NBME subject exam. Performance on the surgery subject exam is predictive of subsequent performance on future USMLE Step exams. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  15. Qualitative Analysis of Commercial Social Network Profiles

    NASA Astrophysics Data System (ADS)

    Melendez, Lester; Wolfson, Ouri; Adjouadi, Malek; Rishe, Naphtali

    Social-networking sites have become an integral part of many users' daily internet routine. Commercial enterprises have been quick to recognize this and are subsequently creating profiles for many of their products and services. Commercial enterprises use social network profiles to target and interact with potential customers as well as to provide a gateway for users of the product or service to interact with each other. Many commercial enterprises use the statistics from their product or service's social network profile to tout the popularity and success of the product or service being showcased. They will use statistics such as number of friends, number of daily visits, number of interactions, and other similar measurements to quantify their claims. These statistics are often not a clear indication of the true popularity and success of the product. In this chapter the term product is used to refer to any tangible or intangible product, service, celebrity, personality, film, book, or other entity produced by a commercial enterprise.

  16. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    NASA Astrophysics Data System (ADS)

    He, Weihua; Xin, Jianting; Zhao, Yongqiang; Chu, Genbai; Xi, Tao; Shui, Min; Lu, Feng; Gu, Yuqiu

    2017-06-01

    This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in laser shock-loaded dynamic fragmentation. In the shock experiments, the ejecta of a tin sample with a V-shaped groove etched into its free surface are collected by a soft-recovery technique. Subsequently, the produced fragments are automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem, yielding a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison shows that the proposed model provides a far more reasonable fit for the laser shock-loaded tin.
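    A minimal sketch of the fitting step, assuming fragment sizes drawn from a linear combination (mixture) of two exponential distributions. The data are synthetic and the expectation-maximization update is a generic estimator for such mixtures, not the authors' Poisson-mixture derivation:

    ```python
    import numpy as np

    # Synthetic "fragment sizes" from a two-component exponential mixture:
    # p(s) = w/l1 * exp(-s/l1) + (1-w)/l2 * exp(-s/l2)
    rng = np.random.default_rng(1)
    sizes = np.concatenate([rng.exponential(1.0, 2000),   # fine debris
                            rng.exponential(5.0, 1000)])  # coarse fragments

    w, l1, l2 = 0.5, 0.5, 3.0  # initial guesses: weight and mean sizes
    for _ in range(200):
        # E-step: responsibility of component 1 for each fragment
        p1 = w / l1 * np.exp(-sizes / l1)
        p2 = (1 - w) / l2 * np.exp(-sizes / l2)
        r = p1 / (p1 + p2)
        # M-step: re-estimate mixture weight and component mean sizes
        w = r.mean()
        l1 = (r * sizes).sum() / r.sum()
        l2 = ((1 - r) * sizes).sum() / (1 - r).sum()
    ```

    After convergence, `l1` and `l2` recover the two characteristic fragment scales and `w` their relative abundance, which is the kind of parameter estimate the abstract's model comparison rests on.
    
    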

  17. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Astrophysics Data System (ADS)

    Hendricks, R. C.; McDonald, G.

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma-spray-coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.
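    The summary statistics and the log-normal alternative mentioned above can be reproduced from first principles. The cycle counts below are invented values of similar magnitude to the reported sample, not the 22 measured specimens:

    ```python
    import math

    # Invented thermal-cycle-life sample (illustrative, not the NASA data).
    cycles = [720, 850, 980, 1100, 1200, 1250, 1300, 1350, 1400, 1500,
              1550, 1650, 1800, 1950, 2400]
    n = len(cycles)
    mean = sum(cycles) / n
    var = sum((c - mean) ** 2 for c in cycles) / (n - 1)  # sample variance
    std = math.sqrt(var)

    # Log-normal parameters by method of moments: if X ~ LogNormal(mu, sigma)
    # then E[X] = exp(mu + sigma^2/2) and Var[X]/E[X]^2 = exp(sigma^2) - 1,
    # so sigma^2 = ln(1 + var/mean^2) and mu = ln(mean) - sigma^2/2.
    sigma2 = math.log(1 + var / mean ** 2)
    mu = math.log(mean) - sigma2 / 2
    ```

    Whether the normal or log-normal form (or another failure model) describes the data better is exactly the question the abstract says a larger sample would be needed to settle.
    
    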

  18. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Technical Reports Server (NTRS)

    Hendricks, R. C.; Mcdonald, G.

    1981-01-01

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma-spray-coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.

  19. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
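    The entropy ingredient of such an approach can be illustrated in a few lines: symbolise a series by the direction of its increments, then estimate the Shannon entropy of short symbol words. This is a deliberately simplified sketch of the general idea, not the paper's prediction algorithm:

    ```python
    import math
    from collections import Counter

    # Toy series (made up); 'u'/'d' encode up/down moves between samples.
    series = [1.0, 1.3, 1.1, 1.4, 1.8, 1.6, 1.9, 2.0, 1.7, 1.8, 2.1, 2.0]
    symbols = ['u' if b > a else 'd' for a, b in zip(series, series[1:])]

    m = 3  # word length: entropy is estimated over length-m symbol blocks
    words = [''.join(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    entropy = -sum((c / total) * math.log2(c / total)
                   for c in counts.values())  # Shannon entropy in bits
    ```

    Low block entropy signals local order (predictability) in the series; the maximum possible value here is `m` bits, reached when all 2^m words are equally likely.
    
    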

  20. The Relationship Between Surface Curvature and Abdominal Aortic Aneurysm Wall Stress.

    PubMed

    de Galarreta, Sergio Ruiz; Cazón, Aitor; Antón, Raúl; Finol, Ender A

    2017-08-01

    The maximum diameter (MD) criterion is the most important factor when predicting risk of rupture of abdominal aortic aneurysms (AAAs). An elevated wall stress has also been linked to a high risk of aneurysm rupture, yet computing AAA wall stress remains uncommon in clinical practice. The purpose of this study is to assess whether other characteristics of the AAA geometry are statistically correlated with wall stress. Using in-house segmentation and meshing algorithms, 30 patient-specific AAA models were generated for finite element analysis (FEA). These models were subsequently used to estimate wall stress and maximum diameter and to evaluate the spatial distributions of wall thickness, cross-sectional diameter, mean curvature, and Gaussian curvature. Data analysis consisted of statistical correlations of the aforementioned geometry metrics with wall stress for the 30 AAA inner and outer wall surfaces. In addition, a linear regression analysis was performed with all the AAA wall surfaces to quantify the relationship of the geometric indices with wall stress. These analyses indicated that while all the geometry metrics have statistically significant correlations with wall stress, the local mean curvature (LMC) exhibits the highest average Pearson's correlation coefficient for both inner and outer wall surfaces. The linear regression analysis revealed coefficients of determination for the outer and inner wall surfaces of 0.712 and 0.516, respectively, with LMC having the largest effect on the linear regression equation with wall stress. This work underscores the importance of evaluating AAA mean wall curvature as a potential surrogate for wall stress.
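    The core statistics used here (Pearson's correlation coefficient per geometric index, and the coefficient of determination of a linear fit) can be sketched as follows. The `curvature` and `stress` arrays are fabricated stand-ins, not patient-derived surface data:

    ```python
    import numpy as np

    # Fabricated nodal values of a geometric index and wall stress.
    rng = np.random.default_rng(42)
    curvature = rng.normal(size=400)
    stress = 2.0 * curvature + rng.normal(scale=1.0, size=400)  # invented link

    r = np.corrcoef(curvature, stress)[0, 1]        # Pearson's r
    slope, intercept = np.polyfit(curvature, stress, 1)  # linear regression
    pred = slope * curvature + intercept
    r2 = 1 - ((stress - pred) ** 2).sum() / ((stress - stress.mean()) ** 2).sum()
    ```

    For a one-predictor least-squares fit, the coefficient of determination equals the square of Pearson's r, which is why the two quantities appear side by side in analyses like this one.
    
    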

  1. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    PubMed

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time-consuming and error-prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse in R a publicly available NCBI GEO gene expression dataset from a study of tumour-bearing mice treated with cyclophosphamide. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio, simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis.
This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
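    Because PathVisioRPC is an XML-RPC interface, it can in principle be reached from any language with an XML-RPC client. The sketch below shows only the calling pattern, demonstrated against a local dummy server; the method name `visualize_data`, its parameters, and the port are hypothetical stand-ins, not PathVisioRPC's actual API:

    ```python
    import threading
    from xmlrpc.client import ServerProxy
    from xmlrpc.server import SimpleXMLRPCServer

    # Local dummy server standing in for the real analysis server.
    server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)

    def visualize_data(pathway, data_file):  # hypothetical method name
        return f"rendered {pathway} with {data_file}"

    server.register_function(visualize_data)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Client side: this is the part a data-analysis script would contain.
    port = server.server_address[1]
    client = ServerProxy(f"http://127.0.0.1:{port}")
    result = client.visualize_data("WP254", "expression.tsv")
    server.shutdown()
    ```

    The same `ServerProxy` pattern works from R, Perl, or Python alike, which is the portability argument the abstract makes for an XML-RPC design.
    
    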

  2. Ankle plantarflexion strength in rearfoot and forefoot runners: a novel cluster-analytic approach.

    PubMed

    Liebl, Dominik; Willwacher, Steffen; Hamill, Joseph; Brüggemann, Gert-Peter

    2014-06-01

    The purpose of the present study was to test for differences in ankle plantarflexion strengths of habitually rearfoot and forefoot runners. In order to approach this issue, we revisit the problem of classifying different footfall patterns in human runners. A dataset of 119 subjects running shod and barefoot (speed: 3.5 m/s) was analyzed. The footfall patterns were clustered by a novel statistical approach, which is motivated by advances in the statistical literature on functional data analysis. We explain the novel statistical approach in detail and compare it to the classically used strike index of Cavanagh and Lafortune (1980). The two groups found by the new cluster approach are readily interpretable as forefoot and rearfoot footfall groups. The subsequent comparison study of the clustered subjects reveals that runners with a forefoot footfall pattern are capable of producing significantly higher joint moments in a maximum voluntary contraction (MVC) of their ankle plantarflexor muscle-tendon units; difference in means: 0.28 Nm/kg. This effect remains significant after controlling for an additional gender effect and for differences in training levels. Our analysis confirms the hypothesis that forefoot runners have a higher mean MVC plantarflexion strength than rearfoot runners. Furthermore, we demonstrate that our proposed stochastic cluster analysis provides a robust and useful framework for clustering foot strikes. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using a quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, Span 80, Tween 80 and the water ratio in primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using design of experiments (one-factor response surface method). Results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Evolutionary dynamics of selfish DNA explains the abundance distribution of genomic subsequences

    PubMed Central

    Sheinman, Michael; Ramisch, Anna; Massip, Florian; Arndt, Peter F.

    2016-01-01

    Since the sequencing of large genomes, many statistical features of their sequences have been found. One intriguing feature is that certain subsequences are much more abundant than others. In fact, abundances of subsequences of a given length are distributed with a scale-free power-law tail, resembling properties of human texts, such as Zipf’s law. Despite recent efforts, the understanding of this phenomenon is still lacking. Here we find that selfish DNA elements, such as those belonging to the Alu family of repeats, dominate the power-law tail. Interestingly, for the Alu elements the power-law exponent increases with the length of the considered subsequences. Motivated by these observations, we develop a model of selfish DNA expansion. The predictions of this model qualitatively and quantitatively agree with the empirical observations. This allows us to estimate parameters for the process of selfish DNA spreading in a genome during its evolution. The obtained results shed light on how evolution of selfish DNA elements shapes non-trivial statistical properties of genomes. PMID:27488939

  5. Classification of edible oils by employing 31P and 1H NMR spectroscopy in combination with multivariate statistical analysis. A proposal for the detection of seed oil adulteration in virgin olive oils.

    PubMed

    Vigli, Georgia; Philippidis, Angelos; Spyros, Apostolos; Dais, Photis

    2003-09-10

    A combination of (1)H NMR and (31)P NMR spectroscopy and multivariate statistical analysis was used to classify 192 samples from 13 types of vegetable oils, namely, hazelnut, sunflower, corn, soybean, sesame, walnut, rapeseed, almond, palm, groundnut, safflower, coconut, and virgin olive oils from various regions of Greece. 1,2-Diglycerides, 1,3-diglycerides, the ratio of 1,2-diglycerides to total diglycerides, acidity, iodine value, and fatty acid composition determined upon analysis of the respective (1)H NMR and (31)P NMR spectra were selected as variables to establish a classification/prediction model by employing discriminant analysis. This model, obtained from the training set of 128 samples, resulted in a significant discrimination among the different classes of oils, whereas 100% of correct validated assignments for 64 samples were obtained. Different artificial mixtures of olive-hazelnut, olive-corn, olive-sunflower, and olive-soybean oils were prepared and analyzed by (1)H NMR and (31)P NMR spectroscopy. Subsequent discriminant analysis of the data allowed detection of adulteration as low as 5% w/w, provided that fresh virgin olive oil samples were used, as reflected by their high 1,2-diglycerides to total diglycerides ratio (D > or = 0.90).

  6. Integrating statistical and clinical research elements in intervention-related grant applications: summary from an NIMH workshop.

    PubMed

    Sherrill, Joel T; Sommers, David I; Nierenberg, Andrew A; Leon, Andrew C; Arndt, Stephan; Bandeen-Roche, Karen; Greenhouse, Joel; Guthrie, Donald; Normand, Sharon-Lise; Phillips, Katharine A; Shear, M Katherine; Woolson, Robert

    2009-01-01

    The authors summarize points for consideration generated in a National Institute of Mental Health (NIMH) workshop convened to provide an opportunity for reviewers from different disciplines (specifically, clinical researchers and statisticians) to discuss how their differing and complementary expertise can be well integrated in the review of intervention-related grant applications. A 1-day workshop was convened in October 2004. The workshop featured panel presentations on key topics followed by interactive discussion. This article summarizes the workshop and subsequent discussions, which centered on topics including weighting the statistics/data analysis elements of an application in the assessment of the application's overall merit; the level of statistical sophistication appropriate to different stages of research and for different funding mechanisms; some key considerations in the design and analysis portions of applications; appropriate statistical methods for addressing essential questions posed by an application; and the role of the statistician in the application's development, study conduct, and interpretation and dissemination of results. A number of key elements crucial to the construction and review of grant applications were identified. It was acknowledged that intervention-related studies unavoidably involve trade-offs. Reviewers are helped when applications acknowledge such trade-offs and provide good rationale for their choices. Clear linkage among the design, aims, hypotheses, and data analysis plan and avoidance of disconnections among these elements also strengthens applications. The authors identify multiple points to consider when constructing intervention-related grant applications. The points are presented here as questions and do not reflect institute policy or comprise a list of best practices, but rather represent points for consideration.

  7. Identifying technical aliases in SELDI mass spectra of complex mixtures of proteins

    PubMed Central

    2013-01-01

    Background Biomarker discovery datasets created using mass spectrum protein profiling of complex mixtures of proteins contain many peaks that represent the same protein with different charge states. Correlated variables such as these can confound the statistical analyses of proteomic data. Previously we developed an algorithm that clustered mass spectrum peaks that were biologically or technically correlated. Here we demonstrate an algorithm that clusters correlated technical aliases only. Results In this paper, we propose a preprocessing algorithm that can be used for grouping technical aliases in mass spectrometry protein profiling data. The stringency of the variance allowed for clustering is customizable, thereby affecting the number of peaks that are clustered. Subsequent analysis of the clusters, instead of individual peaks, helps reduce difficulties associated with technically-correlated data, and can aid more efficient biomarker identification. Conclusions This software can be used to pre-process and thereby decrease the complexity of protein profiling proteomics data, thus simplifying the subsequent analysis of biomarkers by decreasing the number of tests. The software is also a practical tool for identifying which features to investigate further by purification, identification and confirmation. PMID:24010718

  8. Negative emotionality moderates associations among attachment, toddler sleep, and later problem behaviors.

    PubMed

    Troxel, Wendy M; Trentacosta, Christopher J; Forbes, Erika E; Campbell, Susan B

    2013-02-01

    Secure parent-child relationships are implicated in children's self-regulation, including the ability to self-soothe at bedtime. Sleep, in turn, may serve as a pathway linking attachment security with subsequent emotional and behavioral problems in children. We used path analysis to examine the direct relationship between attachment security and maternal reports of sleep problems during toddlerhood and the degree to which sleep serves as a pathway linking attachment with subsequent teacher-reported emotional and behavioral problems. We also examined infant negative emotionality as a vulnerability factor that may potentiate attachment-sleep-adjustment outcomes. Data were drawn from 776 mother-infant dyads participating in the National Institute of Child Health and Human Development Study of Early Child Care. After statistically adjusting for mother and child characteristics, including child sleep and emotional and behavioral problems at 24 months, we found no evidence for a statistically significant direct path between attachment security and sleep problems at 36 months; however, there was a direct relationship between sleep problems at 36 months and internalizing problems at 54 months. Path models that examined the moderating influence of infant negative emotionality demonstrated significant direct relationships between attachment security and toddler sleep problems and between sleep problems and subsequent emotional and behavioral problems, but only among children characterized by high negative emotionality at 6 months. In addition, among this subset, there was a significant indirect path between attachment and internalizing problems through sleep problems. These longitudinal findings implicate sleep as one critical pathway linking attachment security with adjustment difficulties, particularly among temperamentally vulnerable children. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  9. Negative Emotionality Moderates Associations among Attachment, Toddler Sleep, and Later Problem Behaviors

    PubMed Central

    Troxel, Wendy M.; Trentacosta, Christopher J.; Forbes, Erika E.; Campbell, Susan B.

    2013-01-01

    Secure parent-child relationships are implicated in children’s self-regulation, including the ability to self-soothe at bedtime. Sleep, in turn, may serve as a pathway linking attachment security with subsequent emotional and behavioral problems in children. We used path analysis to examine the direct relationship between attachment security and maternal-reports of sleep problems during toddlerhood, and the degree to which sleep serves as a pathway linking attachment with subsequent teacher-reported emotional and behavioral problems. We also examined infant negative emotionality as a vulnerability factor that may potentiate attachment-sleep-adjustment outcomes. Data were drawn from 776 mother-infant dyads participating in the NICHD Study of Early Child Care (SECC). In the full sample, after statistically adjusting for mother and child characteristics, including child sleep and emotional and behavioral problems at 24 months, we did not find evidence for a statistically significant direct path between attachment security and sleep problems at 36 months; however, there was a direct relationship between sleep problems at 36 months and internalizing problems at 54 months. Path models that examined the moderating influence of infant negative emotionality demonstrated significant direct relationships between attachment security and toddler sleep problems, and sleep problems and subsequent emotional and behavioral problems, but only among children characterized by high negative emotionality at 6 months of age. In addition, among this subset, there was a significant indirect path between attachment and internalizing problems through sleep problems. These longitudinal findings implicate sleep as one critical pathway linking attachment security with adjustment difficulties, particularly among temperamentally vulnerable children. PMID:23421840

  10. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
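    The relative quantitation rule described above reduces to a ratio of extracted ion chromatogram (XIC) areas for each light/heavy isotope pair. The glycan names and area values below are made-up placeholders, shown only to make the ratio calculation concrete:

    ```python
    # Invented XIC areas for light- and heavy-labelled glycan pairs.
    xic_areas = {
        "G0F": {"light": 1.8e6, "heavy": 2.1e6},
        "G1F": {"light": 9.5e5, "heavy": 4.6e5},
    }

    # Relative abundance of each glycan between the two samples:
    # the light/heavy area ratio per isotope-labelled pair.
    ratios = {g: a["light"] / a["heavy"] for g, a in xic_areas.items()}
    ```

    A ratio near 1 indicates comparable abundance in the two samples, while deviations quantify the relative change, which is the comparability readout the method targets.
    
    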

  11. Numerical analysis of the effect of surface roughness on mechanical fields in polycrystalline aggregates

    NASA Astrophysics Data System (ADS)

    Guilhem, Yoann; Basseville, Stéphanie; Curtit, François; Stéphan, Jean-Michel; Cailletaud, Georges

    2018-06-01

    This paper is dedicated to the study of the influence of surface roughness on local stress and strain fields in polycrystalline aggregates. Finite element computations are performed with a crystal plasticity model on a 316L stainless steel polycrystalline material element with different roughness states on its free surface. The subsequent analysis of the plastic strain localization patterns shows that surface roughness strongly affects the plastic strain localization induced by crystallography. Nevertheless, this effect mainly takes place at the surface and vanishes under the first layer of grains, which implies the existence of a critical perturbed depth. A statistical analysis based on the plastic strain distribution obtained for different roughness levels provides a simple rule to define the size of the affected zone depending on the rough surface parameters.

  12. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  13. Prescription drug advertising trends: a study of oral hypoglycemics.

    PubMed

    Mehta, K K; Sorofman, B A; Rowland, C R

    1989-01-01

    A content analysis of oral hypoglycemic drug advertisements was performed in selected medical journals published in the United States from 1963 to 1986. The 665 advertisements subsequently examined were studied for certain predetermined parameters in order to indicate trends. The trend results may be summarized as follows. Consistent with prescription drug advertising trends in general, oral hypoglycemic drug advertisements increased in length, with a subsequent decrease in the amount of space devoted to the copy portion of the advertisement. They also showed a decrease in the use of statistical information and rarely made reference to competitors. Non-gender-specific, colored advertisements with product- and use-related appeals have become more common with the passage of time. Although the rationale and purpose behind advertising are unchanged, the format has changed considerably. These changes are primarily due to the enhancement of print technology and, to some extent, the changing social environment.

  14. The Student-to-Student Chemistry Initiative: Training High School Students To Perform Chemistry Demonstration Programs for Elementary School Students

    NASA Astrophysics Data System (ADS)

    Voegel, Phillip D.; Quashnock, Kathryn A.; Heil, Katrina M.

    2004-05-01

    The Student-to-Student Chemistry Initiative is an outreach program started in the fall of 2001 at Midwestern State University (MSU). The on-campus program trains high school science students to perform a series of chemistry demonstrations and subsequently provides kits containing necessary supplies and reagents for the high school students to perform demonstration programs at elementary schools. The program focuses on improving student perception of science. The program's impact on high school student perception is evaluated through statistical analysis of paired preparticipation and postparticipation surveys. The surveys focus on four areas of student perception: general attitude toward science, interest in careers in science, science awareness, and interest in attending MSU for postsecondary education. Increased scores were observed in all evaluation areas including a statistically significant increase in science awareness following participation.

  15. The use of open source bioinformatics tools to dissect transcriptomic data.

    PubMed

    Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera

    2012-01-01

    Microarrays are a valuable technology to study fungal physiology on a transcriptomic level. Various microarray platforms are available, comprising both single- and two-channel arrays. Despite different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe-level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open-source and open-development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.

  16. Evaluation of Masood’s and Modified Masood’s Scoring Systems in the Cytological Diagnosis of Palpable Breast Lump Aspirates

    PubMed Central

    Chithrabhanu, Savithri Moothiringode

    2017-01-01

    Introduction Fine Needle Aspiration Cytology (FNAC) has a leading role in the assessment of breast lesions. Masood’s Scoring Index (MSI) and its modification (Modified Masood’s Scoring Index; MMSI) have been proposed to aid in sub-grouping breast lesions and to help in subsequent management. Aim To assess and compare the diagnostic accuracy of MSI and MMSI by subsequent correlation with histopathology. Materials and Methods The study was cross-sectional in nature and was conducted in a tertiary care setting. The study included 207 cases presenting as palpable breast lump, which had undergone FNAC and subsequent excision biopsy for histopathology. Statistical Analysis The cases were grouped into four categories as suggested by Masood et al., (MSI) and Nandini et al., (MMSI) and concordance analysis with reference to histopathological diagnosis was done. Results In comparison to MSI, MMSI showed better concordance with histopathological diagnosis and superior diagnostic accuracy in the non-proliferative breast disease category (p-value = 0.046) as well as in the proliferative breast disease without atypia category. The overall diagnostic accuracy of the cytological scoring was 97.5%, with 94.5% sensitivity and 100% specificity. Conclusion Though both MSI and MMSI were found effective in subcategorizing breast lesions, MMSI was found to have better concordance with histopathology. Inclusion of cellular pattern and background material may further help in increasing the accuracy. PMID:28571141
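    Diagnostic-accuracy figures like those reported above follow from a standard 2x2 confusion matrix. The counts below are invented to be consistent with the abstract's 207 cases, 94.5% sensitivity, and 100% specificity (the resulting overall accuracy, about 97.1%, only approximates the reported 97.5%):

    ```python
    # Invented confusion-matrix counts (not the study's actual tabulation).
    tp, fn = 104, 6   # disease present: correctly called / missed
    tn, fp = 97, 0    # disease absent: correctly called / false alarms

    sensitivity = tp / (tp + fn)            # true-positive rate
    specificity = tn / (tn + fp)            # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # overall agreement
    ```

    Sensitivity and specificity describe performance separately on diseased and non-diseased cases, so a test can report 100% specificity (no false positives) while still missing some true cases, exactly the pattern in this abstract.
    
    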

  17. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    DTIC Science & Technology

    2015-12-01

    Publications. Field, A. (2005). Discovering statistics using SPSS (2nd ed.). Thousand Oaks, CA: Sage Publications. Fisher, S. D., Gettys, C. F. ... therefore, subsequent F statistics are reported using the Huynh-Feldt correction (Greenhouse-Geisser Epsilon > .775). Experienced and inexperienced ... change in hypothesis using experience and initial confidence as predictors. In the Dog Day scenario, the regression was not statistically

  18. Natural variation reveals relationships between pre-stress carbohydrate nutritional status and subsequent responses to xenobiotic and oxidative stress in Arabidopsis thaliana

    PubMed Central

    Ramel, Fanny; Sulmon, Cécile; Gouesbet, Gwenola; Couée, Ivan

    2009-01-01

    Background Soluble sugars are involved in responses to stress, and act as signalling molecules that activate specific or hormone cross-talk transduction pathways. Thus, exogenous sucrose treatment efficiently induces tolerance to the herbicide atrazine in Arabidopsis thaliana plantlets, at least partially through large-scale modifications of expression of stress-related genes. Methods Availability of sugars in planta for stress responses is likely to depend on complex dynamics of soluble sugar accumulation, sucrose–starch partition and organ allocation. The question of potential relationships between endogenous sugar levels and stress responses to atrazine treatment was investigated through analysis of natural genetic accessions of A. thaliana. Parallel quantitative and statistical analysis of biochemical parameters and of stress-sensitive physiological traits was carried out on a set of 11 accessions. Key Results Important natural variation was found between accessions of A. thaliana in pre-stress shoot endogenous sugar levels and responses of plantlets to subsequent atrazine stress. Moreover, consistent trends and statistically significant correlations were detected between specific endogenous sugar parameters, such as the pre-stress end of day sucrose level in shoots, and physiological markers of atrazine tolerance. Conclusions These significant relationships between endogenous carbohydrate metabolism and stress response therefore point to an important integration of carbon nutritional status and induction of stress tolerance in plants. The specific correlation between pre-stress sucrose level and greater atrazine tolerance may reflect adaptive mechanisms that link sucrose accumulation, photosynthesis-related stress and sucrose induction of stress defences. PMID:19789177

  19. Subsequent health-care utilization associated with early physical therapy for new episodes of low back pain in older adults.

    PubMed

    Karvelas, Deven A; Rundell, Sean D; Friedly, Janna L; Gellhorn, Alfred C; Gold, Laura S; Comstock, Bryan A; Heagerty, Patrick J; Bresnahan, Brian W; Nerenz, David R; Jarvik, Jeffrey G

    2017-03-01

    The association between early physical therapy (PT) and subsequent health-care utilization following a new visit for low back pain is not clear, particularly in the setting of acute low back pain. This study aimed to estimate the association between initiating early PT following a new visit for an episode of low back pain and subsequent back pain-specific health-care utilization in older adults. This is a prospective cohort study. Data were collected at three integrated health-care systems in the United States through the Back Pain Outcomes using Longitudinal Data (BOLD) registry. We recruited 4,723 adults, aged 65 and older, presenting to a primary care setting with a new episode of low back pain. The primary outcome was total back pain-specific relative value units (RVUs) from days 29 to 365. Secondary outcomes included overall RVUs for all health care and use of specific health-care services including imaging (x-ray and magnetic resonance imaging [MRI] or computed tomography [CT]), emergency department visits, physician visits, PT, spinal injections, spinal surgeries, and opioid use. We compared patients who had early PT (initiated within 28 days of the index visit) with those not initiating early PT, using generalized linear models to adjust for potential confounding variables. Adjusted analysis found no statistically significant difference in total spine RVUs between the two groups (ratio of means 1.19, 95% CI 0.72-1.96, p=.49). Among the secondary outcomes, only the differences in total PT RVUs and total spine imaging RVUs were statistically significant. The early PT group had greater PT RVUs (ratio of means 2.56, 95% CI 2.17-3.03, p<.001) and greater imaging RVUs (ratio of means 1.37, 95% CI 1.09-1.71, p=.01).
CONCLUSIONS: We found that in a group of older adults presenting for a new episode of low back pain, the use of early PT is not associated with any statistically significant difference in subsequent back pain-specific health-care utilization compared with patients not receiving early PT. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Tandem mass spectrometry of human tryptic blood peptides calculated by a statistical algorithm and captured by a relational database with exploration by a general statistical analysis system.

    PubMed

    Bowden, Peter; Beavis, Ron; Marshall, John

    2009-11-02

    A goodness of fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database can capture the mass spectral data and the best-fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the Hupo blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides was correlated by X!TANDEM, which was then collapsed to a set of 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were represented by only a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probability of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different from that expected from random assignment of peptides.
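The protein-level score described above is simply the product of the peptide expectation values, so with the best-fit results captured in a relational database it can be recomputed in one line per protein. A sketch; the function name is ours, not the paper's:

```python
from math import prod

def protein_misid_probability(peptide_evalues):
    """Probability that the parent protein was mis-identified, taken
    (as in the abstract) as the product of the expectation values of
    its matched peptides."""
    return prod(peptide_evalues)

# Two peptide matches, each with a 1-in-20 expectation of being a
# false positive, give a protein-level expectation of 1 in 400.
```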

  1. Unconscious analyses of visual scenes based on feature conjunctions.

    PubMed

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of the visual elements embedded in the scene. It is controversial, however, whether such scene analyses also work for stimuli perceived unconsciously. Here we show that the brain performs these unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) in which 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of the invisible bars. The information about those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions by which the visual system constructs coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  2. Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach

    NASA Astrophysics Data System (ADS)

    Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Onoe, Hironori; Mok, Chin Man W.; Wen, Jet-Chau; Huang, Shao-Yang; Wang, Wenke

    2017-04-01

    Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information about aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize the hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional means and covariances that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale fractured granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes in the HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.

  3. Comparison of extended colectomy and limited resection in patients with Lynch syndrome.

    PubMed

    Natarajan, Nagendra; Watson, Patrice; Silva-Lopez, Edibaldo; Lynch, Henry T

    2010-01-01

    The purpose of the study was to determine the advantages and disadvantages of prophylactic/extended colectomy (subtotal colectomy) in patients with Lynch syndrome who manifest colorectal cancer. A retrospective cohort drawn from Creighton University's hereditary cancer database was used to identify cases and controls. Cases were patients who underwent subtotal colectomy, either with no colorectal cancer diagnosis (prophylactic) or at diagnosis of the first colorectal cancer; controls for these 2 types of cases were, respectively, patients who underwent no colon surgery or those having limited resection at the time of diagnosis of the first colorectal cancer. The Kaplan-Meier and proportional hazard regression models of the Statistical Analysis Software program were used to calculate the differences in survival, time to subsequent colorectal cancer, and time to subsequent abdominal surgery between cases and controls. Because event-free survival in our study never fell to 50%, we used event-free survival at 5 years as the parameter for comparing the 2 groups. The event-free survival for subsequent colorectal cancer, subsequent abdominal surgery, and death was 94%, 84%, and 93%, respectively, for cases and 74%, 63%, and 88%, respectively, for controls. Times to subsequent colorectal cancer and subsequent abdominal surgery were significantly shorter in the control group (P < .006 and P < .04, respectively). No significant difference was identified with respect to survival time between the cases and controls. Even though no survival benefit was identified, the increased incidence of metachronous colorectal cancer and the increased number of abdominal surgeries among controls warrant the recommendation of subtotal colectomy in patients with Lynch syndrome.

  4. Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Morgenstern, John M.

    2014-01-01

    A summary is provided for the First AIAA Sonic Boom Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and a simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant-generated and solution-adapted grids. Signatures are also provided for a series of uniformly refined workshop-provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (difference norm between computed and wind-tunnel-measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.

  5. Synchronized LES for acoustic near-field analysis of a supersonic jet

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, S.; Gaitonde, Datta; The Ohio State University Team

    2014-11-01

    We develop a novel method using simultaneous, synchronized Large Eddy Simulations (LES) to examine the manner in which the plume of a supersonic jet generates the near acoustic field. Starting from a statistically stationary state, at each time-step, the first LES (Baseline) is used to obtain native perturbations, which are then localized in space, scaled to small values and injected into the second LES (Twin). At any subsequent time, the difference between the two simulations can be processed to discern how disturbances from any particular zone in the jet are modulated and filtered by the non-linear core to form the combined hydrodynamic and acoustic near field and the fully acoustic farfield. Unlike inverse techniques that use correlations between jet turbulence and far-field signals to infer causality, the current forward analysis effectively tags and tracks native perturbations as they are processed by the jet. Results are presented for a Mach 1.3 cold jet. Statistical analysis of the baseline and perturbation boost provides insight into different mechanisms of disturbance propagation, amplification, directivity, generation of intermittent wave-packet like events and the direct and indirect effect of different parts of the jet on the acoustic field. Office of Naval Research.

  6. Noise exposure-response relationships established from repeated binary observations: Modeling approaches and applications.

    PubMed

    Schäffer, Beat; Pieren, Reto; Mendolia, Franco; Basner, Mathias; Brink, Mark

    2017-05-01

    Noise exposure-response relationships are used to estimate the effects of noise on individuals or a population. Such relationships may be derived from independent or repeated binary observations and modeled by different statistical methods. Depending on the method by which they were established, their application in population risk assessment or estimation of individual responses may yield different results, i.e., predict "weaker" or "stronger" effects. In the present body of literature on noise effect studies, however, the statistical methodology underlying exposure-response relationships has not always received sufficient attention. This paper gives an overview of two statistical approaches (subject-specific and population-averaged logistic regression analysis) for establishing noise exposure-response relationships from repeated binary observations, and of their appropriate applications. The considerations are illustrated with data from three noise effect studies, which also allow the magnitude of the differences in results to be estimated when exposure-response relationships derived from the two statistical approaches are applied. Depending on the underlying data set and the probability range of the binary variable it covers, the two approaches yield similar to very different results. The adequate choice of a statistical approach and its application in subsequent studies, both depending on the research question, are therefore crucial.
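Why the two approaches can diverge is easy to see in a toy random-intercept simulation (ours, not the studies' actual models): averaging the subject-specific logistic curves of heterogeneous subjects yields a flatter population-averaged curve.

```python
import math
import random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def population_average(exposure, slope=1.0, intercept_sd=2.0, n=10000, seed=0):
    """Population-averaged response probability at a given exposure,
    obtained by averaging subject-specific probabilities over simulated
    subjects with normally distributed random intercepts."""
    rng = random.Random(seed)
    probs = [logistic(slope * exposure + rng.gauss(0.0, intercept_sd))
             for _ in range(n)]
    return sum(probs) / n

# At an exposure where a typical subject (intercept 0) responds with
# probability logistic(2), about 0.88, the population-averaged
# probability is noticeably closer to 0.5. This attenuation is what
# distinguishes the two modeling approaches.
```

The attenuation grows with the between-subject variance, which is why the paper stresses matching the statistical approach to the research question (individual response vs. population risk).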

  7. Manual tracing versus smartphone application (app) tracing: a comparative study.

    PubMed

    Sayar, Gülşilay; Kilinc, Delal Dara

    2017-11-01

    This study aimed to compare the results of conventional manual cephalometric tracing with those acquired with smartphone application (app) cephalometric tracing. The cephalometric radiographs of 55 patients (25 females and 30 males) were traced via the manual and app methods and subsequently examined with Steiner's analysis. Five skeletal measurements, five dental measurements and two soft tissue measurements were made based on 21 landmarks. The durations of the two methods were also compared. SNA (Sella, Nasion, A point angle) and SNB (Sella, Nasion, B point angle) values for the manual method were statistically lower (p < .001) than those for the app method. The ANB value for the manual method was statistically lower than that of the app method. L1-NB (°) and upper lip protrusion values for the manual method were statistically higher than those for the app method. Go-GN/SN, U1-NA (°) and U1-NA (mm) values for the manual method were statistically lower than those for the app method. No differences between the two methods were found in the L1-NB (mm), occlusal plane to SN, interincisal angle or lower lip protrusion values. Although statistically significant differences were found between the two methods, tracing proceeded faster with the app method than with the manual method.

  8. MTHFR gene polymorphism and risk of myeloid leukemia: a meta-analysis.

    PubMed

    Dong, Song; Liu, Yueling; Chen, Jieping

    2014-09-01

    An increasing body of evidence has shown that the amino acid change at position 1298 might eliminate methylenetetrahydrofolate reductase (MTHFR) enzyme activity, leading to insufficient folic acid and subsequent human chromosome breakage. Epidemiological studies have linked MTHFR single-nucleotide polymorphism (SNP) rs1801131 to myeloid leukemia risk, with considerable discrepancy in their results. We were therefore prompted to clarify this issue by means of a meta-analysis. Search terms were chosen to cover the possible reports in the MEDLINE, Web of Knowledge, and China National Knowledge Infrastructure (CNKI) databases. Odds ratios were estimated to assess the association of SNP rs1801131 with myeloid leukemia risk. Statistical heterogeneity was detected using the Q-statistic and the I² metric. Subgroup analysis was performed by ethnicity, histological subtype, and Hardy-Weinberg equilibrium (HWE). This meta-analysis of eight publications with a total of 1,114 cases and 3,227 controls revealed no global association. Nor did the subgroup analyses by histological subtype and HWE show any significant associations. However, Asian individuals who harbored the CC genotype were found to have a 1.66-fold higher risk of myeloid leukemia (odds ratio, 1.66; 95% confidence interval, 1.10 to 2.49; P_h = 0.342; I² = 0.114). Our meta-analysis has presented evidence supporting a possible association between the CC genotype of MTHFR SNP rs1801131 and myeloid leukemia in Asian populations.

  9. Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.

    DTIC Science & Technology

    1985-12-27

    Refraction Measurement ... 4.0 RESULTS ... 4.1 Descriptive Statistics ... 4.2 Predictive Statistics ...mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91...relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics In the original plan

  10. [Histologic assessment of tissue healing of hyaline cartilage by use of semiquantitative evaluation scale].

    PubMed

    Vukasović, Andreja; Ivković, Alan; Jezek, Davor; Cerovecki, Ivan; Vnuk, Drazen; Kreszinger, Mario; Hudetz, Damir; Pećina, Marko

    2011-01-01

    Articular cartilage is an avascular and aneural tissue lacking lymph drainage, hence its inability to repair spontaneously following injury. Thus, it offers an interesting model for scientific research. A number of methods have been suggested to enhance cartilage repair, but none has yet produced significant success. The possible application of these methods has brought about the necessity to evaluate their results. The objective of this study was to analyze the results of a study of the effects of TGF-beta gene transduced bone marrow clot on articular cartilage defects using the ICRS visual histological assessment scale. The research was conducted on 28 skeletally mature sheep that were randomly assigned to four groups; femoral chondral defects were then surgically created. The articular surfaces were treated with TGF-beta1 gene transduced bone marrow clot (TGF group), GFP transduced bone marrow clot (GFP group), or untransduced bone marrow clot (BM group), or left untreated (NC group). The analysis was performed by visual examination of cartilage samples, and results were obtained using the ICRS visual histological assessment scale. The results were subsequently subjected to statistical assessment using the Kruskal-Wallis and Mann-Whitney tests. The Kruskal-Wallis test yielded a statistically significant difference with respect to cell distribution. The Mann-Whitney test showed statistically significant differences between the TGF and NC groups (P = 0.002) and between the BM and NC groups (P = 0.002 with Bonferroni correction). Twenty-six of the twenty-eight samples were subjected to histologic and subsequent statistical analysis; two were discarded owing to faulty histology technique. Our results indicate, with a degree of certainty, a positive effect of TGF-beta1 gene transduced bone marrow clot on the restoration of articular cartilage defects. However, additional research in the field is necessary.
One of the significant drawbacks of histologic assessment of cartilage samples was errors in histologic preparation, because of which some samples had to be discarded and the analytical quality of the others was significantly impaired. Defects of structures surrounding the articular cartilage, e.g., subchondral bone or connective tissue, might also impair the quality of histologic analysis. Additional analyses, e.g., polarizing microscopy, should be performed to determine the degree of integration of the newly formed tissue with the surrounding cartilage. The semiquantitative ICRS scale, although of great practical value, has limitations as to the objectivity of the assessment, given the analytical ability of the evaluator as well as the accuracy of semiquantitative analysis in comparison with quantitative methods. Overall, the results of histologic analysis indicated that the application of TGF-beta1 gene transduced bone marrow clot could have measurable clinical effects on articular cartilage repair. The ICRS visual histological assessment scale is a valuable analytical method for cartilage repair evaluation. In this respect, further analyses of the method's value would be of great importance.
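The Bonferroni correction used with the pairwise Mann-Whitney tests above simply multiplies each raw p-value by the number of comparisons in the family. A minimal sketch; the helper name is ours:

```python
def bonferroni(pvals, alpha=0.05):
    """Bonferroni-adjusted p-values and accept/reject decisions for a
    family of simultaneous comparisons."""
    m = len(pvals)
    adjusted = [min(1.0, p * m) for p in pvals]
    return adjusted, [p_adj <= alpha for p_adj in adjusted]

# With two pairwise comparisons at raw p = 0.002 (as reported above for
# TGF vs NC and BM vs NC), the adjusted values stay well below 0.05,
# so both differences survive the correction.
```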

  11. Stress and adult smartphone addiction: Mediation by self-control, neuroticism, and extraversion.

    PubMed

    Cho, Hea-Young; Kim, Dai Jin; Park, Jae Woo

    2017-12-01

    This study employed descriptive statistics and correlation analysis, followed by structural equation analysis, to examine the influence of stress on smartphone addiction and the mediating effects of self-control, neuroticism, and extraversion in 400 men and women in their 20s to 40s. Our findings indicate that stress had a significant influence on smartphone addiction and that self-control mediates this influence. As stress increases, self-control decreases, which subsequently leads to increased smartphone addiction. Self-control was confirmed as an important factor in the prevention of smartphone addiction. Finally, among personality factors, neuroticism and extraversion mediate the influence of stress on smartphone addiction. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Use of the Analysis of the Volatile Faecal Metabolome in Screening for Colorectal Cancer

    PubMed Central

    2015-01-01

    Diagnosis of colorectal cancer requires an invasive and expensive colonoscopy, which is usually carried out after a positive screening test. Unfortunately, existing screening tests lack specificity and sensitivity; hence, many unnecessary colonoscopies are performed. Here we report on a potential new screening test for colorectal cancer based on the analysis of volatile organic compounds (VOCs) in the headspace of faecal samples. Faecal samples were obtained from subjects who had a positive faecal occult blood test (FOBT). Subjects subsequently had colonoscopies performed to classify them into low risk (non-cancer) and high risk (colorectal cancer) groups. Volatile organic compounds were analysed by selected ion flow tube mass spectrometry (SIFT-MS), and the data were then analysed using both univariate and multivariate statistical methods. Ions most likely from hydrogen sulphide, dimethyl sulphide and dimethyl disulphide are statistically significantly higher in samples from high risk than from low risk subjects. Results using multivariate methods show that the test gives a correct classification of 75%, with 78% specificity and 72% sensitivity, on FOBT-positive samples, offering a potentially effective alternative to the FOBT. PMID:26086914
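The reported figures follow from the standard confusion-matrix definitions. A small sketch with hypothetical counts chosen only to reproduce the abstract's percentages; the counts are ours, not the study's:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and overall accuracy of a binary
    screening test, computed from its confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for 200 FOBT-positive subjects that reproduce
# 72% sensitivity, 78% specificity, and 75% correct classification.
sens, spec, acc = diagnostic_metrics(tp=72, fp=22, tn=78, fn=28)
```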

  13. A proposed method to minimize waste from institutional radiation safety surveillance programs through the application of expected value statistics.

    PubMed

    Emery, R J

    1997-03-01

    Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, the limited grouping of samples for analysis based on expected value statistical techniques is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
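The expected saving from grouping samples can be quantified with the classic Dorfman group-screening calculation. This two-stage retesting scheme illustrates the expected-value logic the paper relies on, but it is not the paper's exact protocol:

```python
def expected_analyses_per_wipe(p, k):
    """Expected number of LSC analyses per wipe when k wipes share one
    vial and every member of a positive pool is re-counted individually.
    p is the probability that any single wipe is contaminated."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# With a 1% contamination rate and pools of 10, roughly 0.2 analyses
# are needed per wipe, about an 80% reduction in scintillation-fluid
# waste compared with one vial per wipe.
```

The benefit shrinks as the contamination probability rises, which matches the paper's premise that grouping pays off only where large numbers of negative results are routine.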

  14. Prolonged Instability Prior to a Regime Shift | Science ...

    EPA Pesticide Factsheets

    Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000-year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system is undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia. This manuscript explores various methods of assessing the transition between alternative states in an ecological system described by a long-term high-resolution paleoecological dataset.

  15. Estimation of the amount of asbestos-cement roofing in Poland.

    PubMed

    Wilk, Ewa; Krówczyńska, Małgorzata; Pabjanek, Piotr; Mędrzycki, Piotr

    2017-05-01

    The unique set of physical and chemical properties of asbestos has led to many industrial applications worldwide, one of them roof covering. Asbestos is harmful to human health, and its use has therefore been legally forbidden. Since there are no adequate data in Poland on the amount of asbestos-cement roofing, the objective of this study was to estimate its quantity on the basis of a physical inventory using aerial imagery and the application of selected statistical features. Data pre-processing and analysis were executed in the R Statistical Environment v. 3.1.0. The best random forest models were computed; the model explaining 72.9% of the variance was subsequently used to prepare a prediction map of the amount of asbestos-cement roofing in Poland. Variables describing the number of farms, the number and age of buildings, and regional differences were crucial for the analysis. The total amount of asbestos roofing in Poland was estimated at 738,068,000 m² (8.2 million tonnes). This estimate is crucial for the landfill development programme, the distribution of financial resources, and the application of monitoring policies.
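The "72.9% of the variance" used to select the best model is the pseudo-R² conventionally reported for random-forest regressions, 1 - MSE/Var(y). A sketch of that statistic; the function name is ours, and the model fitting itself (done in R here) is omitted:

```python
def variance_explained(y_true, y_pred):
    """'Variance explained' as reported for random-forest regression
    models: 1 minus the mean squared error divided by the variance of
    the observed response."""
    n = len(y_true)
    mean_y = sum(y_true) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    var_y = sum((t - mean_y) ** 2 for t in y_true) / n
    return 1.0 - mse / var_y

# Perfect predictions give 1.0; predicting the mean everywhere gives 0.0.
```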

  16. Collaborative classification of hyperspectral and visible images with convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Mengmeng; Li, Wei; Du, Qian

    2017-10-01

    Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, the low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task, while visible (VIS) images with high spatial resolution enable high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, a convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. Experiments evaluated on two standard data sets demonstrate the better classification performance offered by this framework.

  17. Performing Inferential Statistics Prior to Data Collection

    ERIC Educational Resources Information Center

    Trafimow, David; MacDonald, Justin A.

    2017-01-01

    Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…

  18. Linking sounds to meanings: infant statistical learning in a natural language.

    PubMed

    Hay, Jessica F; Pelucchi, Bruna; Graf Estes, Katharine; Saffran, Jenny R

    2011-09-01

    The processes of infant word segmentation and infant word learning have largely been studied separately. However, the ease with which potential word forms are segmented from fluent speech seems likely to influence subsequent mappings between words and their referents. To explore this process, we tested the link between the statistical coherence of sequences presented in fluent speech and infants' subsequent use of those sequences as labels for novel objects. Notably, the materials were drawn from a natural language unfamiliar to the infants (Italian). The results of three experiments suggest that there is a close relationship between the statistics of the speech stream and subsequent mapping of labels to referents. Mapping was facilitated when the labels contained high transitional probabilities in the forward and/or backward direction (Experiment 1). When no transitional probability information was available (Experiment 2), or when the internal transitional probabilities of the labels were low in both directions (Experiment 3), infants failed to link the labels to their referents. Word learning appears to be strongly influenced by infants' prior experience with the distribution of sounds that make up words in natural languages. Copyright © 2011 Elsevier Inc. All rights reserved.
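The forward and backward transitional probabilities at the heart of this work can be computed as follows (a toy syllable stream, not the Italian materials used in the study):

```python
# Forward TP(b|a) = freq(ab) / freq(a); backward TP(a|b) = freq(ab) / freq(b)
from collections import Counter

stream = "ti bu do ti bu do pa go la ti bu do pa go la".split()
syll = Counter(stream)
bigram = Counter(zip(stream, stream[1:]))

def forward_tp(a, b):   # probability of b given that a just occurred
    return bigram[(a, b)] / syll[a]

def backward_tp(a, b):  # probability that a preceded, given b occurred
    return bigram[(a, b)] / syll[b]

print(forward_tp("ti", "bu"))   # within a "word": 1.0
print(forward_tp("do", "pa"))   # across a "word" boundary: lower
print(backward_tp("ti", "bu"))  # backward coherence can be computed the same way
```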

  19. Biomarkers of tolerance: searching for the hidden phenotype.

    PubMed

    Perucha, Esperanza; Rebollo-Mesa, Irene; Sagoo, Pervinder; Hernandez-Fuentes, Maria P

    2011-08-01

    Induction of transplantation tolerance remains the ideal long-term clinical and logistic solution to the current challenges facing the management of renal allograft recipients. In this review, we describe the recent studies and advances made in identifying biomarkers of renal transplant tolerance, from study inceptions, to the lessons learned and their implications for current and future studies with the same goal. With the age of biomarker discovery entering a new dimension of high-throughput technologies, here we also review the current approaches, developments, and pitfalls faced in the subsequent statistical analysis required to identify valid biomarker candidates.

  20. Integrative pathway analysis of a genome-wide association study of V̇o2max response to exercise training

    PubMed Central

    Vivar, Juan C.; Sarzynski, Mark A.; Sung, Yun Ju; Timmons, James A.; Bouchard, Claude; Rankinen, Tuomo

    2013-01-01

    We previously reported the findings from a genome-wide association study of the response of maximal oxygen uptake (V̇o2max) to an exercise program. Here we follow up on these results to generate hypotheses on genes, pathways, and systems involved in the ability to respond to exercise training. A systems biology approach can help us better establish a comprehensive physiological description of what underlies V̇o2max trainability. The primary material for this exploration was the individual single-nucleotide polymorphism (SNP), SNP-gene mapping, and statistical significance levels. We aimed to generate novel hypotheses through analyses that go beyond statistical association of single-locus markers. This was accomplished through three complementary approaches: 1) building de novo evidence of gene candidacy through informatics-driven literature mining; 2) aggregating evidence from statistical associations to link variant enrichment in biological pathways to V̇o2max trainability; and 3) predicting possible consequences of variants residing in the pathways of interest. We started with candidate gene prioritization followed by pathway analysis focused on overrepresentation analysis and gene set enrichment analysis. Subsequently, leads were followed using in silico analysis of predicted SNP functions. Pathways related to cellular energetics (pantothenate and CoA biosynthesis; PPAR signaling) and immune functions (complement and coagulation cascades) had the highest levels of SNP burden. In particular, long-chain fatty acid transport and fatty acid oxidation genes and sequence variants were found to influence differences in V̇o2max trainability. Together, these methods allow for the hypothesis-driven ranking and prioritization of genes and pathways for future experimental testing and validation. PMID:23990238
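The overrepresentation-analysis step mentioned above reduces to a hypergeometric test; a minimal sketch with toy counts (not the study's numbers):

```python
from scipy.stats import hypergeom

# Illustrative counts: M genes in the genome, n genes in the pathway,
# N trait-associated genes, k of which fall in the pathway.
M, n, N, k = 20000, 150, 300, 12
p_enrich = hypergeom.sf(k - 1, M, n, N)   # P(overlap >= k) under no enrichment
print(p_enrich < 0.05)                    # expected overlap is only ~2.25, so True
```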

  1. Motor vehicle crashes during pregnancy and cerebral palsy during infancy: a longitudinal cohort analysis.

    PubMed

    Redelmeier, Donald A; Naqib, Faisal; Thiruchelvam, Deva; Barrett, Jon F R

    2016-09-20

    To assess the incidence of cerebral palsy among children born to mothers who had their pregnancy complicated by a motor vehicle crash. Retrospective longitudinal cohort analysis of children born from 1 April 2002 to 31 March 2012 in Ontario, Canada. Cases were defined as pregnancies complicated by a motor vehicle crash and controls as the remaining pregnancies with no crash. Subsequent diagnosis of cerebral palsy by age 3 years. A total of 1 325 660 newborns were analysed, of whom 7933 were involved in a motor vehicle crash during pregnancy. A total of 2328 were subsequently diagnosed with cerebral palsy, equal to an absolute risk of 1.8 per 1000 newborns. For the entire cohort, motor vehicle crashes correlated with a 29% increased risk of subsequent cerebral palsy that was not statistically significant (95% CI -16 to +110, p=0.274). The increased risk was only significant for those with preterm birth, who showed an 89% increased risk of subsequent cerebral palsy associated with a motor vehicle crash (95% CI +7 to +266, p=0.037). No significant increase was apparent for those with a term delivery (95% CI -62 to +79, p=0.510). A propensity score-matched analysis of preterm births (n=4384) yielded a 138% increased relative risk of cerebral palsy associated with a motor vehicle crash (95% CI +27 to +349, p=0.007), equal to an absolute increase of about 10.9 additional cases per 1000 newborns (18.2 vs 7.3, p=0.010). Motor vehicle crashes during pregnancy may be associated with an increased risk of cerebral palsy among the subgroup of cases with preterm birth. The increase highlights a specific role for traffic safety advice in prenatal care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
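The propensity score-matched analysis mentioned above pairs each exposed subject with an unexposed subject of similar estimated exposure probability; a minimal sketch with synthetic covariates (not the cohort data):

```python
# Hedged illustration of propensity-score matching: logistic model for exposure,
# then nearest-neighbour matching (with replacement) on the estimated score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
covars = rng.normal(size=(n, 3))                        # illustrative covariates
exposed = (covars @ [0.8, 0.2, -0.4] + rng.normal(size=n)) > 0.5

ps = LogisticRegression().fit(covars, exposed).predict_proba(covars)[:, 1]

controls = np.where(~exposed)[0]
matches = [controls[np.abs(ps[controls] - ps[i]).argmin()]
           for i in np.where(exposed)[0]]
# After matching, propensity scores are balanced between groups:
balance_gap = float(np.abs(ps[exposed].mean() - ps[matches].mean()))
print(round(balance_gap, 3))
```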

  2. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14-day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
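The information loss from binning can be made concrete with a simulation: for exponentially distributed inter-event times, the unbinned maximum-likelihood estimate of the rate is simply 1/mean, while an estimate from coarse 14-day bins discards within-bin information and picks up discretization bias (illustrative parameters, not the mass-killings data):

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate = 0.1                  # events per day (illustrative)
n_events, n_sims = 200, 500
edges = np.arange(0, 71, 14)     # coarse 14-day bins, cf. the binned analysis

unbinned, binned = [], []
for _ in range(n_sims):
    t = rng.exponential(1 / true_rate, size=n_events)
    unbinned.append(1 / t.mean())                        # unbinned MLE of the rate
    counts, _ = np.histogram(t, bins=edges)
    mids = (edges[:-1] + edges[1:]) / 2
    binned.append(1 / np.average(mids, weights=counts))  # crude binned estimate

mse_unbinned = np.mean((np.array(unbinned) - true_rate) ** 2)
mse_binned = np.mean((np.array(binned) - true_rate) ** 2)
print(mse_unbinned < mse_binned)   # binning inflates the mean-squared error
```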

  3. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14-day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115

  4. Response surface method in geotechnical/structural analysis, phase 1

    NASA Astrophysics Data System (ADS)

    Wong, F. S.

    1981-02-01

    In the response surface approach, an approximating function is fit to a long running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of the response surface development and feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
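The surrogate idea can be sketched in a few lines: fit a cheap polynomial to a handful of "expensive" code evaluations, then reuse the fit in a Monte Carlo statistical analysis. The `expensive_code` function below is a hypothetical stand-in for the long-running code:

```python
import numpy as np

def expensive_code(x1, x2):            # placeholder for a long-running model
    return 1.0 + 0.5 * x1 - 0.3 * x2 + 0.2 * x1 * x2

# A few design points (cf. "as few as four code calculations")
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([expensive_code(a, b) for a, b in X])

# Fit response surface y ~ c0 + c1*x1 + c2*x2 + c3*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surface(x1, x2):                   # fast surrogate for repeated statistics
    return coef @ [1.0, x1, x2, x1 * x2]

samples = np.random.default_rng(0).normal(size=(10000, 2))
mc = [surface(a, b) for a, b in samples]   # cheap probabilistic analysis
print(round(float(np.mean(mc)), 2))
```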

  5. Psychological aptitude evaluation of the special forces candidate.

    PubMed

    Genoni, Luca; Jelmini, F; Lang, M; Muggli, F

    2017-02-01

    Changes in recruitment procedures reduced early dismissal rates from Swiss military basic recruitment schools; however, such improvements were not reflected in premature discharge rates from the special forces (SF) (Grenadier) recruitment school. A six-item questionnaire designed to identify recruits likely to be subject to premature dismissal on psychological or psychiatric grounds was developed and prospectively validated. The questionnaire was based on an analysis of medical and psychiatric/psychological records of 26 recruits dismissed from a SF recruitment school. Six items were identified that appeared to have prognostic value for early discharge. These six questions were submitted to the remaining applicants in the recruitment school by a suitably qualified psychologist or psychiatrist and effectively identified candidates who would be discharged early. Based on these results a 0-6 scale was developed and applied prospectively to subsequent Grenadier recruitment courses. Statistical analysis showed that 75% of candidates with the lowest scores would eventually complete the course and that no candidates with the highest scores would subsequently complete the recruitment course. In prospective studies of subsequent recruitment courses, candidates with high scores were classified as not qualified to enter the course, and those with intermediate scores were subject to additional in-depth interviews with a psychologist or psychiatrist to determine their suitability. In the following courses a correlation was established between the questionnaire score and week of discharge for those discharged. Application of this method during subsequent recruitment courses has reduced early dismissal from Swiss SF recruitment schools. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  6. A whole brain morphometric analysis of changes associated with pre-term birth

    NASA Astrophysics Data System (ADS)

    Thomaz, C. E.; Boardman, J. P.; Counsell, S.; Hill, D. L. G.; Hajnal, J. V.; Edwards, A. D.; Rutherford, M. A.; Gillies, D. F.; Rueckert, D.

    2006-03-01

    Pre-term birth is strongly associated with subsequent neuropsychiatric impairment. To identify structural differences in preterm infants we have examined a dataset of magnetic resonance (MR) images containing 88 preterm infants and 19 term born controls. We have analyzed these images by combining image registration, deformation based morphometry (DBM), multivariate statistics, and effect size maps (ESM). The methodology described has been performed directly on the MR intensity images rather than on segmented versions of the images. The results indicate that the approach described makes clear the statistical differences between the control and preterm samples, showing leave-one-out classification accuracies of 94.74% and 95.45%, respectively. In addition, finding the most discriminant direction between the groups and using DBM features and ESM we are able to identify not only the changes between the preterm and term groups but also their relative importance in terms of volume expansion and contraction.
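The leave-one-out accuracy figure quoted above can be reproduced in structure (though not in substance) with a small sketch on synthetic feature vectors; the group sizes mirror the study, everything else is illustrative:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
preterm = rng.normal(0.0, 1.0, size=(88, 5))   # 88 preterm-like samples
term = rng.normal(2.0, 1.0, size=(19, 5))      # 19 term-control-like samples
X = np.vstack([preterm, term])
y = np.array([0] * 88 + [1] * 19)

# Leave-one-out: train on n-1 samples, test on the held-out one, average
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut()).mean()
print(round(acc, 3))
```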

  7. Development of chemistry attitudes and experiences questionnaire (CAEQ)

    NASA Astrophysics Data System (ADS)

    Dalgety, Jacinta; Coll, Richard K.; Jones, Alister

    2003-09-01

    In this article we describe the development of the Chemistry Attitudes and Experiences Questionnaire (CAEQ) that measures first-year university chemistry students' attitude toward chemistry, chemistry self-efficacy, and learning experiences. The instrument was developed as part of a larger study and sought to fulfill a need for an instrument to investigate factors that influence student enrollment choice. We set out to design the instrument in a manner that would maximize construct validity. The CAEQ was piloted with a cohort of science and technology students (n = 129) at the end of their first year. Based on statistical analysis the instrument was modified and subsequently administered on two occasions at two tertiary institutions (n = 669). Statistical data along with additional data gathered from interviews suggest that the CAEQ possesses good construct validity and will prove a useful tool for tertiary level educators who wish to gain an understanding of factors that influence student choice of chemistry enrolment.
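One routine statistic in this kind of instrument refinement is internal consistency; a hedged sketch of Cronbach's alpha on synthetic item scores (the pilot sample size mirrors the study's n = 129; nothing else is the CAEQ's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x item-scores matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=(129, 1))                      # latent attitude
items = trait + rng.normal(scale=0.7, size=(129, 6))   # 6 correlated items
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```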

  8. Recovery of several volatile organic compounds from simulated water samples: Effect of transport and storage

    USGS Publications Warehouse

    Friedman, L.C.; Schroder, L.J.; Brooks, M.G.

    1986-01-01

    Solutions containing volatile organic compounds were prepared in organic-free water and 2% methanol and submitted to two U.S. Geological Survey laboratories. Data from the determination of volatile compounds in these samples were compared to analytical data for the same volatile compounds that had been kept in solutions 100 times more concentrated until immediately before analysis; there was no statistically significant difference in the analytical recoveries. Addition of 2% methanol to the storage containers hindered the recovery of bromomethane and vinyl chloride. Methanol addition did not enhance sample stability. Further, there was no statistically significant difference in results from the two laboratories, and the recovery efficiency was more than 80% in more than half of the determinations made. In a subsequent study, six of eight volatile compounds showed no significant loss of recovery after 34 days.
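The "no statistically significant difference" finding between laboratories is the outcome of a two-sample comparison; a minimal sketch with synthetic recovery percentages (not the USGS data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lab1 = rng.normal(85, 5, size=20)   # percent recovery, laboratory 1 (synthetic)
lab2 = rng.normal(85, 5, size=20)   # percent recovery, laboratory 2 (synthetic)

t_stat, p_value = stats.ttest_ind(lab1, lab2)  # two-sample t-test
print(round(p_value, 3))            # large p -> no significant lab difference
```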

  9. Inferring action structure and causal relationships in continuous sequences of human action.

    PubMed

    Buchsbaum, Daphna; Griffiths, Thomas L; Plunkett, Dillon; Gopnik, Alison; Baldwin, Dare

    2015-02-01

    In the real world, causal variables do not come pre-identified or occur in isolation, but instead are embedded within a continuous temporal stream of events. A challenge faced by both human learners and machine learning algorithms is identifying subsequences that correspond to the appropriate variables for causal inference. A specific instance of this problem is action segmentation: dividing a sequence of observed behavior into meaningful actions, and determining which of those actions lead to effects in the world. Here we present a Bayesian analysis of how statistical and causal cues to segmentation should optimally be combined, as well as four experiments investigating human action segmentation and causal inference. We find that both people and our model are sensitive to statistical regularities and causal structure in continuous action, and are able to combine these sources of information in order to correctly infer both causal relationships and segmentation boundaries. Copyright © 2014. Published by Elsevier Inc.

  10. Fracture load and failure analysis of zirconia single crowns veneered with pressed and layered ceramics after chewing simulation.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Roos, Malgorzata; Trottmann, Albert; Hämmerle, Christoph H F

    2011-01-01

    This study determined the fracture load of zirconia crowns veneered with four overpressed and four layered ceramics after chewing simulation. The veneered zirconia crowns were cemented and subjected to chewing cycling. Subsequently, the specimens were loaded at an angle of 45° in a Universal Testing Machine to determine the fracture load. One-way ANOVA, followed by a post-hoc Scheffé test, t-test and Weibull statistic were performed. Overpressed crowns showed significantly lower fracture load (543-577 N) compared to layered ones (805-1067 N). No statistical difference was found between the fracture loads within the overpressed group. Within the layered groups, LV (1067 N) presented significantly higher results compared to LC (805 N). The mean values of all other groups were not significantly different. Single zirconia crowns veneered with overpressed ceramics exhibited lower fracture load than those of the layered ones after chewing simulation.
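The analysis pipeline named above (one-way ANOVA across groups, then Weibull statistics within a group) can be sketched with synthetic fracture loads centred on the reported ranges; the group sizes and scatter are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
overpressed = rng.normal(560, 60, size=15)    # ~543-577 N range (synthetic)
layered_lc = rng.normal(805, 90, size=15)
layered_lv = rng.normal(1067, 110, size=15)

f_stat, p_value = stats.f_oneway(overpressed, layered_lc, layered_lv)
print(p_value < 0.05)                         # group means clearly differ

# Two-parameter Weibull fit (location fixed at 0) for one group
shape, loc, scale = stats.weibull_min.fit(layered_lv, floc=0)
print(shape > 1)                              # Weibull modulus of the loads
```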

  11. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. 
Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
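A pyramid of local statistics of the kind described can be sketched with standard filters; window sizes stand in for the spatial scales, and the image is synthetic with deliberately spatially variant noise:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.normal(size=(64, 64))
img[:, 32:] *= 3.0                         # spatially variant noise level

for size in (3, 9, 27):                    # pyramid of spatial scales
    local_med = ndimage.median_filter(img, size=size)
    # local median absolute deviation after local median shifting
    local_mad = ndimage.median_filter(np.abs(img - local_med), size=size)
    # the local MAD tracks the different noise scales in the two halves
    print(size,
          round(float(local_mad[:, :32].mean()), 2),
          round(float(local_mad[:, 32:].mean()), 2))
```

The per-scale MAD maps could then drive spatially adaptive thresholds in an otherwise globally parametrized filter.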

  12. Clinical outcomes and medication adherence in acute coronary syndrome patients with and without type 2 diabetes mellitus: a longitudinal analysis 2006-2011.

    PubMed

    Cziraky, Mark J; Reddy, Vanessa S; Luthra, Rakesh; Xu, Yaping; Wilhelm, Kenneth; Power, Thomas P; Fisher, Maxine D

    2015-06-01

    The presence of type 2 diabetes mellitus magnifies the risks associated with acute coronary syndrome (ACS), increasing the risk of recurrent cardiovascular events (CVEs) and doubling the risk of death. Managing cardiovascular risk factors has little effect on lowering the mortality risk in patients with type 2 diabetes. To evaluate the relationship between type 2 diabetes mellitus and subsequent CVEs and medication adherence following ACS hospitalization. Patients with ACS were identified using ICD-9-CM codes for acute myocardial infarction or unstable angina. The risk of subsequent CVEs was assessed at 1 and 3 years after the index ACS event based on type 2 diabetes status, adjusting for baseline demographic characteristics, comorbidities, medication use, and index ACS characteristics. Of 140,903 patients with ACS (mean age 66.8 years, 58.6% male), 27.4% had type 2 diabetes. During follow-up, 22.0% had subsequent CVEs (26.2% type 2 diabetes, 19.0% nondiabetes). After adjusting for other covariates, type 2 diabetes was associated with increased risk of subsequent CVEs by 9.7% at 1 year and 10.2% at 3 years (both P < 0.001). Most patients were not revascularized at first recurrence after index ACS discharge (79.2% type 2 diabetes, 77.5% nondiabetes). Patients with type 2 diabetes had statistically significant higher adherence rates for antiplatelet agents at 1 year and antihypertensives at 1 and 3 years versus nondiabetes patients. Persistence was higher in the type 2 diabetes group for antihypertensives and in the nondiabetes group for antiplatelet agents and statins. This analysis demonstrates that patients with type 2 diabetes have a higher risk of subsequent CVEs following an initial event versus those without diabetes, despite evidence of higher treatment persistence for certain medications. Adherence rates remained suboptimal, suggesting a continuing need for patient education.

  13. Grain boundary oxidation and an analysis of the effects of oxidation on fatigue crack nucleation life

    NASA Technical Reports Server (NTRS)

    Oshida, Y.; Liu, H. W.

    1988-01-01

    The effects of preoxidation on subsequent fatigue life were studied. Surface oxidation and grain boundary oxidation of a nickel-base superalloy (TAZ-8A) were studied at 600 to 1000 C for 10 to 1000 hours in air. Surface oxides were identified and the kinetics of surface oxidation was discussed. Grain boundary oxide penetration and morphology were studied. Pancake-type grain boundary oxide penetrates deeper and its size is larger; therefore, it is more detrimental to fatigue life than cone-type grain boundary oxide. Oxide penetration depth, a_m, is related to oxidation temperature, T, and exposure time, t, by an empirical relation of the Arrhenius type. Effects of T and t on statistical variation of a_m were analyzed according to the Weibull distribution function. Once the oxide is cracked, it serves as a fatigue crack nucleus. Statistical variation of the remaining fatigue life, after the formation of an oxide crack of a critical length, is related directly to the statistical variation of grain boundary oxide penetration depth.
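An Arrhenius-type empirical relation of the form described can be sketched as follows; the constants are illustrative placeholders, not the TAZ-8A values fitted in the paper:

```python
import numpy as np

# a_m = A * t^n * exp(-Q / (R*T)): prefactor, time exponent, activation energy
A, n, Q, R = 1.0e3, 0.5, 250e3, 8.314   # Q in J/mol, R in J/(mol K) (assumed)

def oxide_depth(T_kelvin, t_hours):
    """Arrhenius-type grain boundary oxide penetration depth a_m(T, t)."""
    return A * t_hours ** n * np.exp(-Q / (R * T_kelvin))

# Depth grows with both exposure time and temperature:
d1 = oxide_depth(873.0, 100.0)    # 600 C, 100 h
d2 = oxide_depth(1273.0, 100.0)   # 1000 C, 100 h
d3 = oxide_depth(1273.0, 1000.0)  # 1000 C, 1000 h
print(d1 < d2 < d3)  # True
```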

  14. Correlative weighted stacking for seismic data in the wavelet domain

    USGS Publications Warehouse

    Zhang, S.; Xu, Y.; Xia, J.; ,

    2004-01-01

    Horizontal stacking plays a crucial role in modern seismic data processing, for it not only compresses random noise and multiple reflections, but also provides foundational data for subsequent migration and inversion. However, a number of examples showed that random noise in adjacent traces exhibits correlation and coherence. Average stacking and weighted stacking based on the conventional correlative function both result in false events, which are caused by noise. Wavelet transform and high-order statistics are very useful methods for modern signal processing. The multiresolution analysis in wavelet theory can decompose signal on different scales, and high-order correlative functions can inhibit correlative noise, for which the conventional correlative function is of no use. Based on the theory of wavelet transform and high-order statistics, a high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common midpoint gathers after the normal moveout correction by weights that are calculated through high-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and compressing the correlative random noise.
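The general pattern of correlation-weighted stacking can be sketched in the time domain with synthetic traces (the paper's method applies higher-order statistics per wavelet scale; this simplified version weights each trace by its ordinary correlation with a robust pilot trace):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
signal = np.sin(2 * np.pi * 8 * t) * np.exp(-4 * t)          # synthetic wavelet
traces = signal + rng.normal(scale=0.8, size=(12, t.size))   # NMO-corrected gather
traces[0] += rng.normal(scale=5.0, size=t.size)              # one corrupted trace

pilot = np.median(traces, axis=0)                    # robust pilot trace
weights = np.clip([np.corrcoef(tr, pilot)[0, 1] for tr in traces], 0, None)
stack = weights @ traces / weights.sum()             # correlation-weighted stack

plain = traces.mean(axis=0)                          # plain average stack
err_w = np.mean((stack - signal) ** 2)
err_p = np.mean((plain - signal) ** 2)
print(err_w < err_p)   # down-weighting the noisy trace improves the stack
```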

  15. Principal component analysis and analysis of variance on the effects of Entellan New on the Raman spectra of fibers.

    PubMed

    Yu, Marcia M L; Sandercock, P Mark L

    2012-01-01

    During the forensic examination of textile fibers, fibers are usually mounted on glass slides for visual inspection and identification under the microscope. One method that has the capability to accurately identify single textile fibers without subsequent demounting is Raman microspectroscopy. The effect of the mountant Entellan New on the Raman spectra of fibers was investigated to determine if it is suitable for fiber analysis. Raman spectra of synthetic fibers mounted in three different ways were collected and subjected to multivariate analysis. Principal component analysis score plots revealed that while spectra from different fiber classes formed distinct groups, fibers of the same class formed a single group regardless of the mounting method. The spectra of bare fibers and those mounted in Entellan New were found to be statistically indistinguishable by analysis of variance calculations. These results demonstrate that fibers mounted in Entellan New may be identified directly by Raman microspectroscopy without further sample preparation. © 2011 American Academy of Forensic Sciences.
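The chemometric workflow (spectra in, PCA scores out, class grouping regardless of mounting) can be sketched on synthetic "spectra"; peak positions, the mounting offset, and sample counts are all illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 300)                        # pseudo Raman-shift axis
class_a = np.exp(-((x - 0.3) / 0.02) ** 2)        # peak for fiber class A
class_b = np.exp(-((x - 0.7) / 0.02) ** 2)        # peak for fiber class B

spectra, labels = [], []
for base, lab in ((class_a, 0), (class_b, 1)):
    for mount_offset in (0.0, 0.05):              # bare vs "mounted" baseline
        for _ in range(10):
            spectra.append(base + mount_offset
                           + rng.normal(scale=0.01, size=x.size))
            labels.append(lab)

labels = np.array(labels)
scores = PCA(n_components=2).fit_transform(np.array(spectra))
# Class separation dominates PC1; the mounting offset is a minor effect.
gap = abs(scores[labels == 0, 0].mean() - scores[labels == 1, 0].mean())
print(gap > 1.0)
```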

  16. Terrain-analysis procedures for modeling radar backscatter

    USGS Publications Warehouse

    Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis

    1978-01-01

    The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscattering modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated by a comprehensive set of seven programmed calculations for radar-backscatter modeling from sets of field measurements. The seven programs are 1) formatting of data in readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
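One of the listed calculations, slope angle along a profile, reduces to a finite-difference computation; a small sketch on a synthetic micro-relief profile (sample spacing and relief amplitude are assumptions):

```python
import numpy as np

dx = 0.05                                  # sample spacing, metres (assumed)
x = np.arange(0, 10, dx)
z = 0.1 * np.sin(2 * np.pi * x / 2.5)      # synthetic micro-relief profile

# Slope angle between successive elevation samples, in degrees
slope_deg = np.degrees(np.arctan(np.diff(z) / dx))
print(round(float(np.abs(slope_deg).max()), 1))   # max slope magnitude
```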

  17. Optimization of photocatalytic degradation of palm oil mill effluent in UV/ZnO system based on response surface methodology.

    PubMed

    Ng, Kim Hoong; Cheng, Yoke Wang; Khan, Maksudur R; Cheng, Chin Kui

    2016-12-15

    This paper reports on the optimization of palm oil mill effluent (POME) degradation in a UV-activated-ZnO system based on central composite design (CCD) in response surface methodology (RSM). Three potential factors, viz. O2 flowrate (A), ZnO loading (B) and initial concentration of POME (C), were evaluated for significance using a 2³ full factorial design before the optimization process. All three main factors were found to be significant, contributing 58.27% (A), 15.96% (B) and 13.85% (C), respectively, to the POME degradation. In addition, the factor interactions AB, AC and BC contributed a further 4.02%, 3.12% and 1.01%. Subsequently, all three factors were subjected to statistical central composite design (CCD) analysis. Quadratic models were developed and rigorously checked, and a 3D response surface was subsequently generated. Two successive validation experiments were carried out; the degradations achieved were 55.25% and 55.33%, compared with the predicted value of 52.45%. Copyright © 2016 Elsevier Ltd. All rights reserved.
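
    The significance analysis described above rests on standard 2³ full-factorial arithmetic: each effect's sum of squares is computed from a ±1 contrast column, and its percent contribution is the ratio to the total sum of squares. The sketch below illustrates this with invented response values (the real POME degradation data are not reproduced here):

```python
import itertools
import numpy as np

# Hypothetical 2^3 full-factorial results: one degradation response (%)
# per run, with coded levels -1/+1 for factors A, B and C.
levels = np.array(list(itertools.product([-1, 1], repeat=3)))  # runs x (A,B,C)
y = np.array([30.0, 35.0, 33.0, 40.0, 42.0, 50.0, 46.0, 55.0])

def contributions(levels, y):
    """Percent contribution of each main effect and two-way interaction,
    estimated from the sums of squares of the contrast columns."""
    cols = {}
    for i, name in enumerate("ABC"):
        cols[name] = levels[:, i]
    for (i, a), (j, b) in itertools.combinations(enumerate("ABC"), 2):
        cols[a + b] = levels[:, i] * levels[:, j]          # interaction column
    n = len(y)
    ss = {k: (v @ y) ** 2 / n for k, v in cols.items()}    # effect sums of squares
    total = sum((y - y.mean()) ** 2)                       # total sum of squares
    return {k: 100 * s / total for k, s in ss.items()}

pct = contributions(levels, y)
```

The invented responses were chosen so that factor A dominates, mirroring the structure (but not the numbers) of the study's result.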

  18. EEG-tomographic studies with LORETA on vigilance differences between narcolepsy patients and controls and subsequent double-blind, placebo-controlled studies with modafinil.

    PubMed

    Saletu, M; Anderer, P; Saletu-Zyhlarz, G M; Mandl, M; Arnold, O; Zeitlhofer, J; Saletu, B

    2004-11-01

    The aim of the present study was to identify brain regions associated with vigilance in untreated and modafinil-treated narcoleptic patients by means of low-resolution brain electromagnetic tomography (LORETA). 16 drug-free narcoleptics and 16 normal controls were included in the baseline investigation. Subsequently, patients participated in a double-blind, placebo-controlled crossover study receiving a three-week fixed titration of modafinil (200, 300, 400 mg) and placebo. Measurements comprised LORETA, the Multiple Sleep Latency Test (MSLT) and the Epworth Sleepiness Scale (ESS) obtained before and after three weeks' therapy. Statistical overall analysis by means of the omnibus significance test demonstrated significant inter-group differences in the resting (R-EEG), but not in the vigilance-controlled recordings (V-EEG). Subsequent univariate analysis revealed a decrease in alpha-2 and beta-1 to beta-3 power in prefrontal, temporal and parietal cortices, with the right hemisphere slightly more involved in this vigilance decrement. Modafinil 400 mg/d, as compared with placebo, induced changes opposite to the aforementioned baseline differences (key-lock principle), with a preponderance in the left hemisphere. This increase in vigilance resulted in an improvement in the MSLT and the ESS. LORETA provided evidence of a functional deterioration of the fronto-temporo-parietal network of the right-hemispheric vigilance system in narcolepsy and a therapeutic effect of modafinil on the left hemisphere, which is less affected by the disease.

  19. The relationship between appetite scores and subsequent energy intake: an analysis based on 23 randomized controlled studies.

    PubMed

    Sadoul, Bastien C; Schuring, Ewoud A H; Mela, David J; Peters, Harry P F

    2014-12-01

    Several studies have assessed relationships of self-reported appetite (eating motivations, mainly by Visual Analogue Scales, VAS) with subsequent energy intake (EI), though usually in small data sets with limited power and variable designs. The objectives were therefore to better quantify the relationships of self-reports (incorporating subject characteristics) to subsequent EI, and to estimate the quantitative differences in VAS corresponding to consistent, significant differences in EI. Data were derived from an opportunity sample of 23 randomized controlled studies involving 549 subjects, testing the effects of various food ingredients in meal replacers or 100-150 ml mini-drinks. In all studies, scores on several VAS were recorded for 30 min to 5 h post-meal, when EI was assessed by ad libitum meal consumption. The relationships between pre-meal VAS scores and EI were examined using correlation, linear models (including subject characteristics) and a cross-validation procedure. VAS correlations with subsequent EI were statistically significant, but of low magnitude, up to r = 0.26. Hunger, age, gender, body weight and estimated basal metabolic rate explained 25% of the total variance in EI. Without hunger the prediction of EI was modestly but significantly lower (19%, P < 0.001). A change of ≥15-25 mm on a 100 mm VAS was the minimum effect consistently corresponding to a significant change in subsequent EI, depending on the starting VAS level. Eating motivations add in a small but consistently significant way to other known predictors of acute EI. Differences of about 15 mm on a 100 mm VAS appear to be the minimum effect expected to result in consistent, significant differences in subsequent EI. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Corra: Computational framework and tools for LC-MS discovery and targeted mass spectrometry-based proteomics

    PubMed Central

    Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D

    2008-01-01

    Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, and information input/output, and they do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, along with statistical algorithms originally developed for microarray data analysis that are appropriate for LC-MS data. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. 
Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345

  1. Outcome and risk factors assessment for adverse events in advanced esophageal cancer patients after self-expanding metal stents placement.

    PubMed

    Rodrigues-Pinto, E; Pereira, P; Coelho, R; Andrade, P; Ribeiro, A; Lopes, S; Moutinho-Ribeiro, P; Macedo, G

    2017-02-01

    Self-expanding metal stents (SEMS) are the treatment of choice for advanced esophageal cancers. Literature is scarce on risk factors predicting adverse events after SEMS placement. We assessed risk factors for adverse events after SEMS placement in advanced esophageal cancer and evaluated survival after SEMS placement, in a cross-sectional study of patients with advanced esophageal cancer referred for SEMS placement over a period of 3 years. Ninety-seven patients with advanced esophageal cancer underwent SEMS placement. Adverse events were more common when tumors were located at the level of the distal esophagus/cardia (47% vs 23%, P = 0.011, OR 3.1), with statistical significance being kept in the multivariate analysis (OR 3.1, P = 0.018). Time until adverse events was shorter for tumors located at the level of the distal esophagus/cardia (P = 0.036). Survival was higher in patients in whom SEMS were placed with curative intent (327 days [126-528] vs. 119 days [91-147], P = 0.002) and in patients who subsequently underwent surgery compared with those who received only chemo/radiotherapy or no further treatment (563 days [378-748] vs. 154 days [133-175] vs. 46 days [20-72], P < 0.001). Subsequent treatment kept statistical significance in the multivariate analysis (HR 3.4, P < 0.001). SEMS allow palliation of dysphagia in advanced esophageal cancer and are associated with increased out-of-hospital survival, as long as there are conditions for further treatments. Tumors located at the level of the distal esophagus/cardia are associated with a greater number of adverse events, which also occur earlier. © 2016 International Society for Diseases of the Esophagus.

  2. The short-term effects of non-surgical periodontal therapy on the circulating levels of interleukin-6 and C-reactive protein in patients with chronic periodontitis

    PubMed Central

    George, Annie Kitty; Janam, Prasanthila

    2013-01-01

    Background: Recent epidemiological studies have shown that periodontal infection is a risk factor for a number of systemic diseases and conditions. In addition to the conventional risk factors, chronic infection and the subsequent generation of a systemic inflammatory response may be associated with this increased risk. Aims: This study was conducted to determine whether the presence of chronic periodontitis and subsequent non-surgical periodontal therapy could influence the serum levels of interleukin-6 and C-reactive protein (CRP) in patients with severe chronic generalized periodontitis. Settings and Design: Participants were selected from subjects who attended the Department of Periodontics and Oral Implantology, Government Dental College, Thiruvananthapuram. Materials and Methods: Sera were obtained from 25 patients with periodontitis at the baseline examination and at reassessment after completion of treatment. As a control, sera were also obtained from 20 subjects without periodontitis. Interleukin-6 was determined by sensitive enzyme-linked immunosorbent assay, and high-sensitivity CRP (hsCRP) was measured using latex turbidometric immunoassay. Statistical Analysis: Data were analyzed using the computer software Statistical Package for the Social Sciences (SPSS) version 10. Results: The levels of interleukin-6 and hsCRP in the sera of periodontitis patients were higher than those of healthy controls. The interleukin-6 level tended to decrease with improvement of the periodontal condition following treatment and approached that of control subjects; this decline was statistically significant. The hsCRP levels also showed a decreasing trend following periodontal treatment. Conclusions: In this study, we were able to show that periodontal disease significantly affects the serum levels of systemic inflammatory markers and that non-surgical periodontal therapy could bring about a decrease in the levels of these inflammatory markers. PMID:23633770

  3. Analysis of STAT laboratory turnaround times before and after conversion of the hospital information system.

    PubMed

    Lowe, Gary R; Griffin, Yolanda; Hart, Michael D

    2014-08-01

    Modern electronic health record systems (EHRS) reportedly offer advantages including improved quality, error prevention, cost reduction, and increased efficiency. This project reviewed the impact on specimen turnaround times (TAT) and percent compliance for specimens processed in a STAT laboratory after implementation of an upgraded EHRS. Before EHRS implementation, laboratory personnel received instruction and training for specimen processing. One laboratory member per shift received additional training. TAT and percent compliance data sampling occurred 4 times monthly for 13 months post-conversion and were compared with the mean of data collected for 3 months pre-conversion. Percent compliance was gauged using a benchmark of reporting 95% of all specimens within 7 min from receipt. Control charts were constructed for TAT and percent compliance with control limits set at 2 SD and applied continuously through the data collection period. TAT recovered to pre-conversion levels by the 6th month post-conversion. Percent compliance consistently returned to pre-conversion levels by the 10th month post-conversion. Statistical analyses revealed the TAT were significantly longer for 3 months post-conversion (P < .001) compared with pre-conversion levels. Statistical significance was not observed for subsequent groups. Percent compliance results were significantly lower for 6 months post-conversion (P < .001). Statistical significance was not observed for subsequent groups. Extensive efforts were made to train and prepare personnel for challenges expected after the EHRS upgrade. Specific causes identified with the upgraded EHRS included multiple issues involving personnel and the EHRS. These data suggest that system and user issues contributed to delays in returning to pre-conversion TAT and percent compliance levels following the upgrade in the EHRS.
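
    The control-chart construction described above (a centre line with limits at ±2 SD of the pre-conversion baseline) can be sketched as follows; the TAT values are hypothetical, not the study's data:

```python
import statistics

def control_limits(baseline, k=2.0):
    """Centre line and +/- k*SD control limits from pre-conversion data."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)          # sample standard deviation
    return mean - k * sd, mean, mean + k * sd

# Hypothetical pre-conversion STAT-lab turnaround times (minutes)
baseline_tat = [5.1, 5.4, 4.9, 5.2, 5.0, 5.3, 4.8, 5.5, 5.1, 5.2]
lcl, centre, ucl = control_limits(baseline_tat)

# Flag post-conversion samples that fall outside the control limits
out_of_control = [x for x in [5.0, 6.9, 5.2] if not lcl <= x <= ucl]
```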

  4. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
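
    At the core of recurrence quantification analysis, one of the methods pyunicorn implements, is the binary recurrence matrix. The sketch below computes it from scratch with plain NumPy rather than through pyunicorn's own API, so the function names are illustrative only:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 when |x_i - x_j| <= eps,
    the basic object behind recurrence plots and recurrence networks."""
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent pairs, excluding the trivial diagonal."""
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))

signal = np.sin(np.linspace(0, 4 * np.pi, 200))   # toy periodic series
R = recurrence_matrix(signal, eps=0.1)
rr = recurrence_rate(R)
```

In a recurrence-network reading, R (diagonal removed) is simply the adjacency matrix of an undirected graph over the time points.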

  5. Re-Thinking Statistics Education for Social Science Majors

    ERIC Educational Resources Information Center

    Reid, Howard M.; Mason, Susan E.

    2008-01-01

    Many college students majoring in the social sciences find the required statistics course to be dull as well as difficult. Further, they often do not retain much of the material, which limits their success in subsequent courses. We describe a few simple changes, the incorporation of which may enhance student learning. (Contains 1 table.)

  6. Review of validation and reporting of non-targeted fingerprinting approaches for food authentication.

    PubMed

    Riedl, Janet; Esslinger, Susanne; Fauhl-Hassek, Carsten

    2015-07-23

    Food fingerprinting approaches are expected to become a very potent tool in authentication processes aiming at a comprehensive characterization of complex food matrices. By non-targeted spectrometric or spectroscopic chemical analysis with a subsequent (multivariate) statistical evaluation of acquired data, food matrices can be investigated in terms of their geographical origin, species variety or possible adulterations. Although many successful research projects have already demonstrated the feasibility of non-targeted fingerprinting approaches, their uptake and implementation into routine analysis and food surveillance is still limited. In many proof-of-principle studies, the prediction ability of only one data set was explored, measured within a limited period of time using one instrument within one laboratory. Thorough validation strategies that guarantee the reliability of the respective data basis and that allow conclusions on whether the respective approaches are fit for purpose have not yet been proposed. Within this review, critical steps of the fingerprinting workflow were explored to develop a generic scheme for multivariate model validation. As a result, a proposed scheme for "good practice" shall guide users through validation and reporting of non-targeted fingerprinting results. Furthermore, food fingerprinting studies were selected by a systematic search approach and reviewed with regard to (a) transparency of data processing and (b) validity of study results. Subsequently, the studies were inspected for measures of statistical model validation, analytical method validation and quality assurance. In this context, issues and recommendations were found that might be considered a starting point for developing validation standards of non-targeted metabolomics approaches for food authentication in the future. 
Hence, this review intends to contribute to the harmonization and standardization of food fingerprinting, both required as a prior condition for the authentication of food in routine analysis and official control. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. The role of enamel thickness and refractive index on human tooth colour.

    PubMed

    Oguro, Rena; Nakajima, Masatoshi; Seki, Naoko; Sadr, Alireza; Tagami, Junji; Sumi, Yasunori

    2016-08-01

    To investigate the role of enamel thickness and refractive index (n) on tooth colour. The colour and enamel thickness of fifteen extracted human central incisors were determined according to the CIELab colour scale using a spectrophotometer (Crystaleye) and swept-source optical coherence tomography (SS-OCT), respectively. Subsequently, labial enamel was trimmed by approximately 100 μm, and the colour and remaining enamel thickness were measured again. This cycle was repeated until dentin appeared. Enamel blocks were prepared from the same teeth and their n values were obtained using SS-OCT. Multiple regression analysis was performed to reveal any effects of enamel thickness and n on colour difference (ΔE00) and on differences in colour parameters in the CIELCh and CIELab colour scales. Multiple regression analysis revealed that enamel thickness (p=0.02) and n of enamel (p<0.001) were statistically significant predictors of ΔE00 after complete enamel trimming. The n was also a significant predictor of ΔH' (p=0.01). Enamel thickness and n were not statistically significant predictors of ΔL', ΔC', Δa* and Δb*. Enamel affected tooth colour, with n a statistically significant predictor of tooth colour change. Understanding the role of enamel in tooth colour could contribute to the development of aesthetic restorative materials that mimic the colour of natural tooth with minimal reduction of the existing enamel. Copyright © 2016 Elsevier Ltd. All rights reserved.
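
    The multiple-regression step described above can be sketched with ordinary least squares; all data and coefficients below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

def fit_multiple_regression(X, y):
    """Ordinary least squares: y ~ b0 + X @ b, returned as (b0, b)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return coef[0], coef[1:]

# Synthetic data: colour difference as a function of enamel thickness (mm)
# and refractive index, with true slopes -2 and +30 plus small noise.
rng = np.random.default_rng(42)
thickness = rng.uniform(0.3, 1.2, 15)
n_enamel = rng.uniform(1.60, 1.65, 15)
delta_e = (5.0 - 2.0 * thickness + 30.0 * (n_enamel - 1.62)
           + rng.normal(0, 0.1, 15))

b0, (b_thick, b_n) = fit_multiple_regression(
    np.column_stack([thickness, n_enamel]), delta_e)
```

With the noise level chosen here, the fitted slopes recover the planted values to within sampling error.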

  8. Discovering genetic variants in Crohn's disease by exploring genomic regions enriched of weak association signals.

    PubMed

    D'Addabbo, Annarita; Palmieri, Orazio; Maglietta, Rosalia; Latiano, Anna; Mukherjee, Sayan; Annese, Vito; Ancona, Nicola

    2011-08-01

    A meta-analysis has re-analysed previous genome-wide association scans, definitively confirming eleven genes and identifying 21 new loci. However, the identified genes/loci still explain only a minority of the genetic predisposition to Crohn's disease. To identify genes weakly involved in disease predisposition, we analysed chromosomal regions enriched in single nucleotide polymorphisms with modest statistical association. We utilized the WTCCC data set evaluating 1748 CD cases and 2938 controls. The identification of candidate genes/loci was performed by a two-step procedure: first, chromosomal regions enriched in weak association signals were localized; subsequently, weak signals clustered in gene regions were identified. Statistical significance was assessed by nonparametric permutation tests. The cytoband enrichment analysis highlighted 44 regions (P≤0.05) enriched in single nucleotide polymorphisms significantly associated with the trait, including 23 out of 31 previously confirmed and replicated genes. Importantly, we highlight a further 20 novel chromosomal regions carrying approximately one hundred genes/loci with modest association. Amongst these we find compelling functional candidate genes such as MAPT, GRB2, CREM, LCT, and IL12RB2. Our study suggests a different statistical perspective to discover genes weakly associated with a given trait, although further confirmatory functional studies are needed. Copyright © 2011 Editrice Gastroenterologica Italiana S.r.l. All rights reserved.
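
    A nonparametric permutation test of regional enrichment can be sketched as follows: repeatedly draw random SNP sets of the same size as the region and count how often they contain at least as many sub-threshold signals as observed. All numbers below are illustrative, not the WTCCC data:

```python
import random

def permutation_p_value(hits_in_region, region_size, snp_pvalues,
                        threshold=0.05, n_perm=10000, seed=0):
    """One-sided permutation p-value for regional enrichment: how often
    does a random draw of `region_size` genome-wide SNP p-values contain
    at least `hits_in_region` values below `threshold`?"""
    rng = random.Random(seed)
    count = 0
    for _ in range(n_perm):
        draw = rng.sample(snp_pvalues, region_size)
        if sum(p < threshold for p in draw) >= hits_in_region:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one correction avoids p = 0

# Illustrative genome-wide p-values under the null (uniform), plus a
# hypothetical region of 20 SNPs in which 8 show p < 0.05.
rng = random.Random(1)
genome_pvalues = [rng.random() for _ in range(5000)]
p_region = permutation_p_value(hits_in_region=8, region_size=20,
                               snp_pvalues=genome_pvalues)
```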

  9. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krause, E.; et al.

    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood Δχ² ≤ 0.045 with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc h⁻¹) and galaxy-galaxy lensing (12 Mpc h⁻¹) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.

  10. Differentiation of wines according to grape variety and geographical origin based on volatiles profiling using SPME-MS and SPME-GC/MS methods.

    PubMed

    Ziółkowska, Angelika; Wąsowicz, Erwin; Jeleń, Henryk H

    2016-12-15

    Among methods to detect wine adulteration, volatiles profiling is one with great potential in terms of robustness, analysis time and abundance of information for subsequent data treatment. Volatile fraction fingerprinting by solid-phase microextraction with direct analysis by mass spectrometry without compound separation (SPME-MS) was used for differentiation of white as well as red wines. The aim was to differentiate wines by the grape varieties used for their production and also by country of origin. The results obtained were compared to SPME-GC/MS analysis, in which compounds were resolved by gas chromatography. For both approaches the same statistical procedure was used to compare samples: principal component analysis (PCA) followed by linear discriminant analysis (LDA). White wines (38) and red wines (41) representing different grape varieties and various regions of origin were analysed. SPME-MS proved advantageous due to better discrimination and higher sample throughput. Copyright © 2016 Elsevier Ltd. All rights reserved.
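
    The first stage of the PCA-then-LDA workflow can be sketched with a plain-NumPy principal component analysis; the two "wine groups" below are synthetic Gaussian fingerprints, not the paper's spectra:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centred rows of X onto the leading principal
    components via SVD; returns (scores, explained variance ratios)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    var_ratio = (s ** 2) / (s ** 2).sum()
    return scores, var_ratio[:n_components]

# Synthetic "fingerprints": two groups of 10 samples x 50 m/z bins whose
# mean intensities differ; purely illustrative.
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, size=(10, 50))
group_b = rng.normal(1.5, 1.0, size=(10, 50))
X = np.vstack([group_a, group_b])
scores, evr = pca_scores(X)
```

In the full workflow, the low-dimensional scores (rather than the raw high-dimensional fingerprints) would then be fed to LDA for supervised classification.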

  11. Different reaction of the core histones H2A and H2B to red laser irradiation

    NASA Astrophysics Data System (ADS)

    Brill, G. E.; Egorova, A. V.; Bugaeva, I. O.; Postnov, D. E.; Ushakova, O. V.

    2017-03-01

    Analysis of the influence of red laser irradiation on the processes of self-assembly of the core histones H2A and H2B was performed using a wedge dehydration method. Image analysis of facies included their qualitative characterization and calculation of quantitative parameters with subsequent statistical processing. It was established that linearly polarized red laser light (λ = 660 nm, 1 J/cm2) significantly modified the process of self-assembly of core histone H2B, whereas the structure of the facies of histone H2A changed to a lesser extent. Histones were used in the form of aqueous salt solutions. The effect of red light appears to result from the formation of singlet oxygen by direct laser excitation of molecular oxygen.

  12. Urine metabolic fingerprinting using LC-MS and GC-MS reveals metabolite changes in prostate cancer: A pilot study.

    PubMed

    Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bujak, Renata; Yumba Mpanga, Arlette; Markuszewski, Marcin; Jacyna, Julia; Matuszewski, Marcin; Kaliszan, Roman; Markuszewski, Michał J

    2015-01-01

    Prostate cancer (CaP) is a leading cause of cancer deaths in men worldwide. Despite these alarming statistics, the currently applied biomarkers are still not sufficiently specific and selective. In addition, the pathogenesis of CaP development is not fully understood. Therefore, in the present work, a metabolomics study based on urinary metabolic fingerprinting was performed in order to scrutinize potential biomarkers that could help explain the pathomechanism of the disease and be potentially useful in its diagnosis and prognosis. Urine samples from CaP patients and healthy volunteers were analyzed using high performance liquid chromatography coupled with time-of-flight mass spectrometry detection (HPLC-TOF/MS) in positive and negative polarity, as well as gas chromatography hyphenated with triple quadrupole mass spectrometry detection (GC-QqQ/MS) in scan mode. The obtained data sets were statistically analyzed using univariate and multivariate statistical analyses. Principal Component Analysis (PCA) was used to check the systems' stability and possible outliers, whereas Partial Least Squares Discriminant Analysis (PLS-DA) was performed to evaluate the quality of the model as well as its predictive ability using statistically significant metabolites. The subsequent identification of selected metabolites using the NIST library and commonly available databases allowed creation of a list of putative biomarkers and the related biochemical pathways they are involved in. The selected pathways, such as the urea and tricarboxylic acid cycles, amino acid and purine metabolism, can play a crucial role in the pathogenesis of prostate cancer. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. miRNA Temporal Analyzer (mirnaTA): a bioinformatics tool for identifying differentially expressed microRNAs in temporal studies using normal quantile transformation.

    PubMed

    Cer, Regina Z; Herrera-Galeano, J Enrique; Anderson, Joseph J; Bishop-Lilly, Kimberly A; Mokashi, Vishwesh P

    2014-01-01

    Understanding the biological roles of microRNAs (miRNAs) is an active area of research that has produced a surge of publications in PubMed, particularly in cancer research. Along with this increasing interest, many open-source bioinformatics tools to identify existing and/or discover novel miRNAs in next-generation sequencing (NGS) reads have become available. While miRNA identification and discovery tools have improved significantly, the development of miRNA differential expression analysis tools, especially for temporal studies, remains substantially challenging. Further, installing the currently available software is non-trivial, and the steps of testing with example datasets, trying one's own dataset, and interpreting the results require notable expertise and time. Consequently, there is a strong need for a tool that allows scientists to normalize raw data, perform statistical analyses, and obtain intuitive results without having to invest significant effort. We have developed miRNA Temporal Analyzer (mirnaTA), a bioinformatics package to identify differentially expressed miRNAs in temporal studies. mirnaTA is written in Perl and R (Version 2.13.0 or later) and can be run across multiple platforms, such as Linux, Mac and Windows. In the current version, mirnaTA requires users to provide a simple, tab-delimited matrix file containing miRNA names and count data from a minimum of two to a maximum of 20 time points and three replicates. To recalibrate data and remove technical variability, raw data are normalized using Normal Quantile Transformation (NQT), and a linear regression model is used to locate any miRNAs which are differentially expressed in a linear pattern. Subsequently, remaining miRNAs which do not fit a linear model are further analyzed using two different non-linear methods: 1) cumulative distribution function (CDF) or 2) analysis of variance (ANOVA). 
After both linear and non-linear analyses are completed, statistically significant miRNAs (P < 0.05) are plotted as heat maps using hierarchical cluster analysis and Euclidean distance matrix computation methods. mirnaTA is an open-source bioinformatics tool to aid scientists in identifying differentially expressed miRNAs which could be further mined for biological significance. It is expected to provide researchers with a fast and intuitive means of moving from raw data to statistical summaries.
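
    The Normal Quantile Transformation step that mirnaTA applies to raw counts can be sketched with the standard rank-based recipe (this is an illustrative reimplementation, not mirnaTA's code; the tie-breaking rule is a simplifying assumption):

```python
from statistics import NormalDist

def normal_quantile_transform(counts):
    """Map values to standard-normal quantiles by rank: the i-th smallest
    value gets z = Phi^{-1}((i - 0.5) / n). Ties are broken by input
    order here, a simplification for the sketch."""
    n = len(counts)
    order = sorted(range(n), key=lambda i: counts[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        z[i] = nd.inv_cdf((rank - 0.5) / n)   # rankit-style plotting position
    return z

raw = [3, 120, 7, 44, 9]          # toy miRNA counts at one time point
z = normal_quantile_transform(raw)
```

The transform preserves the ordering of the counts while forcing their marginal distribution toward a standard normal, which is what makes the downstream linear-model fits comparable across miRNAs.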

  14. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection sites (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated by numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed by sub-Debye-length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale-dependent kurtosis are affected by particle noise and sub-Debye-length filtering. We subsequently apply these analysis methods to a large-scale kinetic PIC turbulence simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as 0.5, with the bulk of the rates below approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.

  15. Cancerouspdomains: comprehensive analysis of cancer type-specific recurrent somatic mutations in proteins and domains.

    PubMed

    Hashemi, Seirana; Nowzari Dalini, Abbas; Jalali, Adrin; Banaei-Moghaddam, Ali Mohammad; Razaghi-Moghadam, Zahra

    2017-08-16

    Discriminating driver mutations from those that play no role in cancer is a severe bottleneck in elucidating the molecular mechanisms underlying cancer development. Since protein domains are representatives of functional regions within proteins, mutations in them may disturb protein functionality. Therefore, studying mutations at the domain level may point researchers toward a more accurate assessment of the functional impact of mutations. This article presents a comprehensive study mapping mutations from 29 cancer types to both sequence- and structure-based domains. Statistical analysis was performed to identify candidate domains in which mutations occur with high statistical significance. For each cancer type, the corresponding type-specific domains were distinguished among all candidate domains. Subsequently, cancer type-specific domains facilitated the identification of specific proteins for each cancer type. Moreover, interactome analysis of the specific proteins of each cancer type showed high levels of interconnectivity among them, which implies a functional relationship. To evaluate the role of mitochondrial genes, stem cell-specific genes and DNA repair genes in cancer development, their mutation frequency was determined via further analysis. This study provides researchers with a publicly available data repository for studying both CATH and Pfam domain regions on protein-coding genes. Moreover, the associations between different groups of genes/domains and various cancer types have been clarified. The work is available at http://www.cancerouspdomains.ir.
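    One plausible reading of the domain-level significance analysis described above is a binomial enrichment test: do more mutations fall within a domain than its share of sequence would predict by chance? A minimal sketch with invented counts (not the paper's actual statistics):

```python
from scipy.stats import binomtest

# Hypothetical numbers: 40 of 500 somatic mutations in a cancer type fall in
# one Pfam domain covering 2% of the coding sequence considered, so about
# 500 * 0.02 = 10 mutations would be expected by chance.
result = binomtest(k=40, n=500, p=0.02, alternative="greater")
print(result.pvalue < 0.05)   # the domain is a significant candidate
```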

  16. Statistical strategies for averaging EC50 from multiple dose-response experiments.

    PubMed

    Jiang, Xiaoqi; Kopp-Schneider, Annette

    2015-11-01

    In most dose-response studies, repeated experiments are conducted to determine the EC50 value for a chemical, requiring EC50 estimates to be averaged over a series of experiments. Two statistical strategies, mixed-effects modeling and the meta-analysis approach, can be applied to estimate the average behavior of EC50 values over all experiments by considering the variabilities within and among experiments. We investigated these two strategies in two common cases of multiple dose-response experiments: (a) complete and explicit dose-response relationships are observed in all experiments, and (b) they are observed only in a subset of experiments. In case (a), the meta-analysis strategy is a simple and robust method for averaging EC50 estimates. In case (b), all experimental data sets can first be screened using the dose-response screening plot, which allows visualization and comparison of multiple dose-response experimental results. As long as more than three experiments provide information about complete dose-response relationships, the experiments that cover incomplete relationships can be excluded from the meta-analysis strategy of averaging EC50 estimates. If only two experiments contain complete dose-response information, the mixed-effects model approach is suggested. We additionally provide a web application for non-statisticians to implement the proposed meta-analysis strategy of averaging EC50 estimates from multiple dose-response experiments.
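    In its simplest fixed-effect form, the meta-analysis strategy of averaging EC50 estimates is an inverse-variance weighted mean; a minimal sketch with invented numbers (the paper's actual model may differ, e.g. a random-effects variant):

```python
import math

def pooled_ec50(log_ec50, se):
    """Fixed-effect inverse-variance pooled estimate of log-EC50 values
    reported by several experiments, each with a standard error."""
    w = [1.0 / s ** 2 for s in se]
    est = sum(wi * xi for wi, xi in zip(w, log_ec50)) / sum(w)
    se_pooled = math.sqrt(1.0 / sum(w))
    return est, se_pooled

# Three hypothetical experiments reporting log10(EC50) and its standard error.
est, se_p = pooled_ec50([-6.1, -5.9, -6.0], [0.10, 0.15, 0.12])
print(est, se_p)   # pooled estimate is more precise than any single experiment
```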

  17. Maternal cigarette smoking during pregnancy and criminal/deviant behavior: a meta-analysis.

    PubMed

    Pratt, Travis C; McGloin, Jean Marie; Fearn, Noelle E

    2006-12-01

    A growing body of empirical literature has emerged examining the somewhat inconsistent relationship between maternal cigarette smoking (MCS) during pregnancy and children's subsequent antisocial behavior. To systematically assess what existing studies reveal regarding MCS as a criminogenic risk factor for offspring, the authors subjected this body of literature to a meta-analysis. The analysis reveals a statistically significant--yet rather small--overall mean "effect size" of the relationship between MCS and the likelihood children will engage in deviant/criminal behavior. In addition to being rather moderate in size, the MCS-crime/deviance relationship is sensitive to a number of methodological specifications across empirical studies--particularly those associated with sample characteristics. The implications of this modest, and somewhat unstable, relationship are discussed in terms of guidelines for future research on this subject and how existing theoretical perspectives may be integrated to explain the MCS-crime/deviance link.

  18. A description of how metal pollution occurs in the Tinto-Odiel rias (Huelva-Spain) through the application of cluster analysis.

    PubMed

    Grande, J A; Borrego, J; Morales, J A; de la Torre, M L

    2003-04-01

    In recent decades, the space-time distribution and variations of heavy metals in estuaries have been extensively studied as an environmental indicator. In the case described here, the combination of acid water from mines, industrial effluents and sea water plays a determining role in the evolutionary process of the chemical makeup of the water in the estuary of the Tinto and Odiel Rivers, located in the southwest of the Iberian Peninsula. Based on the statistical treatment of the data from the analysis of water samples from this system, which has been affected by processes of industrial and mining pollution, the 16 variables analyzed can be grouped into two large families. Each family presents high, positive Pearson r values that suggest common origins (fluvial or marine) for the pollutants present in the water analyzed and allow their subsequent contrast through cluster analysis.

  19. In situ and in-transit analysis of cosmological simulations

    DOE PAGES

    Friesen, Brian; Almgren, Ann; Lukic, Zarija; ...

    2016-08-24

    Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent ‘offline’ analysis. We demonstrate this approach in the compressible gasdynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own (‘in situ’). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other ‘sidecar’ group, which post-processes it while the simulation continues (‘in-transit’). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.

  20. Authenticating concealed private data while maintaining concealment

    DOEpatents

    Thomas, Edward V [Albuquerque, NM; Draelos, Timothy J [Albuquerque, NM

    2007-06-26

    A method of and system for authenticating concealed and statistically varying multi-dimensional data, comprising: acquiring an initial measurement of an item, wherein the initial measurement is subject to measurement error; applying a transformation to the initial measurement to generate reference template data; acquiring a subsequent measurement of the item, wherein the subsequent measurement is subject to measurement error; applying the transformation to the subsequent measurement; and calculating a Euclidean distance metric between the transformed measurements; wherein the calculated Euclidean distance metric is identical to the Euclidean distance metric between the measurements prior to transformation.
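    The key property claimed above, a transformation that conceals the data yet leaves Euclidean distances unchanged, is satisfied by, for example, any orthogonal matrix. A small numerical sketch of that idea, not the patented method itself:

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 8

# Any orthogonal matrix is one distance-preserving transform: it conceals the
# raw measurement vector while leaving Euclidean distances exactly unchanged.
Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))

reference = rng.standard_normal(dim)                      # initial measurement
candidate = reference + 0.01 * rng.standard_normal(dim)   # later, noisy measurement

d_plain = np.linalg.norm(reference - candidate)
d_hidden = np.linalg.norm(Q @ reference - Q @ candidate)
print(np.isclose(d_plain, d_hidden))   # distances match, so matching still works
```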

  1. Analyzing Large Gene Expression and Methylation Data Profiles Using StatBicRM: Statistical Biclustering-Based Rule Mining

    PubMed Central

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify a special type of rule and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. First, a novel statistical strategy is used to eliminate insignificant/low-significance/redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it thus saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to determine how accurately the evolved rules describe the remaining test (unknown) data. Subsequently, we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we include an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level. PMID:25830807

  2. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    PubMed

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify a special type of rule and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. First, a novel statistical strategy is used to eliminate insignificant/low-significance/redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it thus saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to determine how accurately the evolved rules describe the remaining test (unknown) data. Subsequently, we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we include an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level.
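    The notion of an inclusion-maximal (closed) binary bicluster that StatBicRM builds on can be illustrated on a toy matrix; this brute-force enumeration is only a conceptual sketch, not the StatBicRM algorithm:

```python
from itertools import combinations
import numpy as np

def closed_biclusters(binary, min_rows=2):
    """Brute-force enumeration of closed (inclusion-maximal) all-ones
    biclusters of a small binary matrix."""
    n_rows, n_cols = binary.shape
    found = set()
    for r in range(min_rows, n_rows + 1):
        for rows in combinations(range(n_rows), r):
            cols = tuple(c for c in range(n_cols)
                         if all(binary[i, c] for i in rows))
            if not cols:
                continue
            # Closure: extend to every row that also contains these columns.
            support = tuple(i for i in range(n_rows)
                            if all(binary[i, c] for c in cols))
            found.add((support, cols))
    return found

m = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
biclusters = sorted(closed_biclusters(m))
for rows, cols in biclusters:
    print(rows, cols)   # each (row set, column set) pair is a closed bicluster
```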

  3. Aircraft Maneuvers for the Evaluation of Flying Qualities and Agility. Volume 1. Maneuver Development Process and Initial Maneuver Set

    DTIC Science & Technology

    1993-08-01

    subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used

  4. Is there an association between flow diverter fish mouthing and delayed-type hypersensitivity to metals?-a case-control study.

    PubMed

    Kocer, Naci; Mondel, Prabath Kumar; Yamac, Elif; Kavak, Ayse; Kizilkilic, Osman; Islak, Civan

    2017-11-01

    Flow diverters are increasingly used in the treatment of complex and giant intracranial aneurysms. However, they are associated with complications such as late aneurysmal rupture. Additionally, flow diverters can show a focal structural decrease in luminal diameter without any intimal hyperplasia, which resembles a "fish mouth" when viewed en face. In this pilot study, we tested the hypothesis of a possible association between flow diverter fish mouthing and delayed-type hypersensitivity to its metal constituents. We retrospectively reviewed patient records from our center between May 2010 and November 2015. A total of nine patients had flow diverter fish mouthing. A control group of 25 patients was selected. All study participants underwent a prospective patch test to detect hypersensitivity to flow diverter metal constituents. Analysis was performed using logistic regression and the Wilcoxon signed-rank test. Univariate and multivariate analyses were performed to test variables predicting flow diverter fish mouthing. The association between flow diverter fish mouthing and a positive patch test was not statistically significant. In multivariate analysis, history of allergy and maximum aneurysm size category were associated with flow diverter fish mouthing. This was further confirmed by the Wilcoxon signed-rank test. The study showed a statistically significant association between flow diverter fish mouthing and a history of contact allergy and small aneurysm size. Further large-scale studies are needed to detect a statistically significant association between flow diverter fish mouthing and the patch test. We recommend early and more frequent follow-up imaging in patients with contact allergy to detect flow diverter fish mouthing and its subsequent evolution.

  5. Investigation of the Effects of High-Intensity, Intermittent Exercise and Unanticipation on Trunk and Lower Limb Biomechanics During a Side-Cutting Maneuver Using Statistical Parametric Mapping.

    PubMed

    Whyte, Enda F; Richter, Chris; OʼConnor, Siobhan; Moran, Kieran A

    2018-06-01

    Whyte, EF, Richter, C, O'Connor, S, and Moran, KA. Investigation of the effects of high-intensity, intermittent exercise and unanticipation on trunk and lower limb biomechanics during a side-cutting maneuver using statistical parametric mapping. J Strength Cond Res 32(6): 1583-1593, 2018-Anterior cruciate ligament (ACL) injuries frequently occur during side-cutting maneuvers when fatigued or reacting to the sporting environment. Trunk and hip biomechanics are proposed to influence ACL loading during these activities. However, analysis of the effects of fatigue and unanticipation on the biomechanics of the kinetic chain may be limited by traditional discrete point analysis. We recruited 28 male, varsity, Gaelic footballers (21.7 ± 2.2 years; 178.7 ± 14.6 cm; 81.8 ± 11.4 kg) to perform anticipated and unanticipated side-cutting maneuvers before and after a high-intensity, intermittent exercise protocol (HIIP). Statistical parametric mapping (repeated-measures analysis of variance) identified differences in phases of trunk and stance leg biomechanics during weight acceptance. Unanticipation resulted in less trunk flexion (p < 0.001) and greater side flexion away from the direction of cut (p < 0.001). This led to smaller (internal) knee flexor and greater (internal) knee extensor (p = 0.002-0.007), hip adductor (p = 0.005), and hip external rotator (p = 0.007) moments. The HIIP resulted in increased trunk flexion (p < 0.001) and side flexion away from the direction of cut (p = 0.038), resulting in smaller (internal) knee extensor moments (p = 0.006). One interaction effect was noted, demonstrating greater hip extensor moments in the unanticipated condition post-HIIP (p = 0.025). Results demonstrate that unanticipation resulted in trunk kinematics considered an ACL injury risk factor. A subsequent increase in frontal and transverse plane hip loading and sagittal plane knee loading was observed, which may increase ACL strain. 
Conversely, HIIP-induced trunk kinematic alterations resulted in reduced sagittal plane knee and subsequent ACL loading. Therefore, adequate hip and knee control is important during unanticipated side-cutting maneuvers.

  6. A Framework for Estimating Causal Effects in Latent Class Analysis: Is There a Causal Link Between Early Sex and Subsequent Profiles of Delinquency?

    PubMed Central

    Lanza, Stephanie T.; Coffman, Donna L.

    2013-01-01

    Prevention scientists use latent class analysis (LCA) with increasing frequency to characterize complex behavior patterns and profiles of risk. Often, the most important research questions in these studies involve establishing characteristics that predict membership in the latent classes, thus describing the composition of the subgroups and suggesting possible points of intervention. More recently, prevention scientists have begun to adopt modern methods for drawing causal inference from observational data because of the bias that can be introduced by confounders. This same issue of confounding exists in any analysis of observational data, including prediction of latent class membership. This study demonstrates a straightforward approach to causal inference in LCA that builds on propensity score methods. We demonstrate this approach by examining the causal effect of early sex on subsequent delinquency latent classes using data from 1,890 adolescents in 11th and 12th grade from wave I of the National Longitudinal Study of Adolescent Health. Prior to the statistical adjustment for potential confounders, early sex was significantly associated with delinquency latent class membership for both genders (p=0.02). However, the propensity score adjusted analysis indicated no evidence for a causal effect of early sex on delinquency class membership (p=0.76) for either gender. Sample R and SAS code is included in an Appendix in the ESM so that prevention scientists may adopt this approach to causal inference in LCA in their own work. PMID:23839479

  7. A framework for estimating causal effects in latent class analysis: is there a causal link between early sex and subsequent profiles of delinquency?

    PubMed

    Butera, Nicole M; Lanza, Stephanie T; Coffman, Donna L

    2014-06-01

    Prevention scientists use latent class analysis (LCA) with increasing frequency to characterize complex behavior patterns and profiles of risk. Often, the most important research questions in these studies involve establishing characteristics that predict membership in the latent classes, thus describing the composition of the subgroups and suggesting possible points of intervention. More recently, prevention scientists have begun to adopt modern methods for drawing causal inference from observational data because of the bias that can be introduced by confounders. This same issue of confounding exists in any analysis of observational data, including prediction of latent class membership. This study demonstrates a straightforward approach to causal inference in LCA that builds on propensity score methods. We demonstrate this approach by examining the causal effect of early sex on subsequent delinquency latent classes using data from 1,890 adolescents in 11th and 12th grade from wave I of the National Longitudinal Study of Adolescent Health. Prior to the statistical adjustment for potential confounders, early sex was significantly associated with delinquency latent class membership for both genders (p = 0.02). However, the propensity score adjusted analysis indicated no evidence for a causal effect of early sex on delinquency class membership (p = 0.76) for either gender. Sample R and SAS code is included in an Appendix in the ESM so that prevention scientists may adopt this approach to causal inference in LCA in their own work.
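    The propensity score adjustment underlying the approach above can be illustrated with inverse-probability-of-treatment weighting on simulated data. Here the true propensity model is known by construction, whereas in practice it would be estimated (e.g. by logistic regression); the example is a sketch of the general idea, not the paper's LCA model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

conf = rng.standard_normal(n)            # confounder (e.g. background risk)
p_treat = sigmoid(1.5 * conf)            # true propensity score
treat = rng.random(n) < p_treat          # "exposure" indicator
outcome = 2.0 * conf + rng.standard_normal(n)   # NO causal effect of exposure

# Naive group difference is badly confounded...
naive = outcome[treat].mean() - outcome[~treat].mean()

# ...but inverse-probability-of-treatment weighting recovers (near) zero.
w = np.where(treat, 1.0 / p_treat, 1.0 / (1.0 - p_treat))
adj = (np.average(outcome[treat], weights=w[treat])
       - np.average(outcome[~treat], weights=w[~treat]))
print(naive, adj)
```

    This mirrors the paper's pattern: a significant unadjusted association that disappears after propensity score adjustment.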

  8. Sex Differences in Application, Success, and Funding Rates for NIH Extramural Programs

    PubMed Central

    Pohlhaus, Jennifer Reineke; Jiang, Hong; Wagner, Robin M.; Schaffer, Walter T.; Pinn, Vivian W.

    2011-01-01

    Purpose The authors provide an analysis of sex differences in National Institutes of Health (NIH) award programs to inform potential initiatives for promoting diversity in the research workforce. Method In 2010, the authors retrieved data for NIH extramural grants in the electronic Research Administration Information for Management, Planning, and Coordination II database, and used statistical analysis to determine any sex differences in securing NIH funding, as well as subsequent success of researchers who had already received independent NIH support. Results Success and funding rates for men and women were not significantly different in most award programs. Furthermore, in programs where participation was lower for women than men, the disparity was primarily related to a lower percentage of women applicants compared to men, rather than decreased success rates or funding rates. However, for subsequent grants, both application and funding rates were generally higher for men than for women. Conclusions Cross-sectional analysis showed that women and men were generally equally successful at all career stages, but longitudinal analysis showed that men with previous experience as NIH grantees had higher application and funding rates than women at similar career points. On average, although women received larger R01 awards than men, men had more R01 awards than women at all points in their careers. Therefore, while greater participation of women in NIH programs is underway, further action will be required to eradicate remaining sex differences. PMID:21512358

  9. Codebook-based electrooculography data analysis towards cognitive activity recognition.

    PubMed

    Lagodzinski, P; Shirahama, K; Grzegorzek, M

    2018-04-01

    With the advancement of mobile/wearable technology, people have started to use a variety of sensing devices to track their daily activities as well as health and fitness conditions in order to improve quality of life. This work addresses eye movement analysis, which, owing to its strong correlation with cognitive tasks, can be successfully utilized in activity recognition. Eye movements are recorded using an electrooculographic (EOG) system built into the frames of glasses, which can be worn more unobtrusively and comfortably than other devices. Since the obtained information is low-level sensor data expressed as a sequence of values sampled at constant intervals (100 Hz), the cognitive activity recognition problem is formulated as sequence classification. However, it is unclear what kind of features are useful for accurate cognitive activity recognition. Thus, a machine learning method known as the codebook approach is applied which, instead of relying on feature engineering, uses a distribution of characteristic subsequences (codewords) to describe sequences of recorded EOG data, where the codewords are obtained by clustering a large number of subsequences. Further, statistical analysis of the codeword distribution results in discovering features that are characteristic of a certain activity class. Experimental results demonstrate good accuracy of codebook-based cognitive activity recognition, reflecting the effective usage of the codewords. Copyright © 2017 Elsevier Ltd. All rights reserved.
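    The codebook approach described above (cluster subsequences into codewords, then describe each recording by its codeword distribution) might be sketched as follows on synthetic one-dimensional signals; the window length and number of codewords are arbitrary choices, not the paper's settings:

```python
import numpy as np

def subsequences(signal, win, step):
    """Sliding-window subsequences of a 1-D sensor stream."""
    return np.array([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

def kmeans(data, k, iters=20, seed=0):
    """Tiny k-means; the cluster centres form the codebook."""
    rng = np.random.default_rng(seed)
    centres = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centres) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = data[labels == j].mean(axis=0)
    return centres

def codeword_histogram(signal, centres, win, step):
    """Describe one recording by its normalized codeword distribution."""
    subs = subsequences(signal, win, step)
    labels = np.argmin(((subs[:, None] - centres) ** 2).sum(-1), axis=1)
    return np.bincount(labels, minlength=len(centres)) / len(labels)

rng = np.random.default_rng(3)
t = np.arange(1000) / 100.0                    # 10 s sampled at 100 Hz
reading = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
resting = 0.1 * rng.standard_normal(t.size)

codebook = kmeans(subsequences(np.concatenate([reading, resting]), 20, 5), k=4)
h_read = codeword_histogram(reading, codebook, 20, 5)
h_rest = codeword_histogram(resting, codebook, 20, 5)
print(np.abs(h_read - h_rest).sum())   # the two activities get distinct histograms
```

    The histograms, not the raw samples, are then what a downstream classifier would consume.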

  10. Stats on the Cheap: Using Free and Inexpensive Internet Resources to Enhance the Teaching of Statistics and Research Methods

    ERIC Educational Resources Information Center

    Hartnett, Jessica L.

    2013-01-01

    The present article describes four free or inexpensive Internet-based activities that can be used to supplement statistics/research methods/general psychology classes. Each activity and subsequent homework assessment is described, as well as homework performance outcome and student opinion data for each activity. (Contains 1 table.)

  11. Retrieving Essential Material at the End of Lectures Improves Performance on Statistics Exams

    ERIC Educational Resources Information Center

    Lyle, Keith B.; Crawford, Nicole A.

    2011-01-01

    At the end of each lecture in a statistics for psychology course, students answered a small set of questions that required them to retrieve information from the same day's lecture. These exercises constituted retrieval practice for lecture material subsequently tested on four exams throughout the course. This technique is called the PUREMEM…

  12. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented, based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed which maintains single-cell resolution and reports the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
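    A nonhomogeneous Poisson process with an exponentially increasing rate, the model fitted to the mitotic event series above, can be simulated by Lewis thinning; the parameters below are illustrative, not the fitted values:

```python
import numpy as np

def nhpp_events(rate0, tau, t_end, seed=0):
    """Simulate a nonhomogeneous Poisson process with rate
    lam(t) = rate0 * exp(t / tau) by Lewis thinning."""
    rng = np.random.default_rng(seed)
    lam_max = rate0 * np.exp(t_end / tau)      # rate is maximal at t_end
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)    # candidate event at the max rate
        if t > t_end:
            return np.array(events)
        if rng.random() < (rate0 * np.exp(t / tau)) / lam_max:
            events.append(t)                   # keep with probability lam(t)/lam_max

events = nhpp_events(rate0=1.0, tau=12.0, t_end=48.0)   # times in hours
gaps = np.diff(events)
early, late = gaps[: gaps.size // 2], gaps[gaps.size // 2:]
print(early.mean() > late.mean())   # interevent times shrink as the rate grows
```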

  13. Quantitative topographic differentiation of the neonatal EEG.

    PubMed

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. 21 healthy, full-term newborns were examined polygraphically during sleep (EEG-8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal are largely accountable for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG, and it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method represents a promise for its application in clinical practice.

  14. A probabilistic framework for microarray data analysis: fundamental probability models and statistical inference.

    PubMed

    Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S

    2010-05-21

    Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceeds the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble, and developing theoretical distributions for these ensembles, provides a framework that plays instead to the strength of microarrays. We present theoretical results showing that, under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set the fractional intensities can be represented as a mixture of Beta densities, and develop a procedure for using these results to draw statistical inferences regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage, using cDNA microarrays. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
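    The Gamma-to-Beta connection behind the abstract's fractional intensities can be illustrated with a small simulation (shape and scale values are hypothetical, and this is not the paper's full inference procedure): if two channel intensities are Gamma-distributed with a shared scale, their fractional intensity x/(x+y) is exactly Beta-distributed.

```python
import numpy as np
from scipy import stats

# Illustrative check: X ~ Gamma(a, scale), Y ~ Gamma(b, scale) independent
# implies X / (X + Y) ~ Beta(a, b). Parameter values are made up.
rng = np.random.default_rng(0)
a, b, scale, n = 2.5, 4.0, 100.0, 20000

x = rng.gamma(a, scale, n)          # e.g. one channel's intensities
y = rng.gamma(b, scale, n)          # e.g. the other channel's intensities
frac = x / (x + y)                  # fractional intensity, lies in (0, 1)

# Kolmogorov-Smirnov test of the sample against the theoretical Beta(a, b) law
ks = stats.kstest(frac, stats.beta(a, b).cdf)
print(f"KS statistic {ks.statistic:.4f}")
```

A mixture of such Beta components, rather than a single one, is what the paper fits to real two-channel data.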

  15. Second primary malignancies after treatment for malignant lymphoma

    PubMed Central

    Okines, A; Thomson, C S; Radstone, C R; Horsman, J M; Hancock, B W

    2005-01-01

    To determine the incidence and possible causes of second primary malignancies after treatment for Hodgkin's and non-Hodgkin's lymphoma (HL and NHL). A cohort of 3764 consecutive patients diagnosed with HL or NHL between January 1970 and July 2001 was identified using the Sheffield Lymphoma Group database. A search was undertaken for all patients diagnosed with a subsequent primary malignancy. Two matched controls were identified for each case. Odds ratios were calculated to detect and quantify any risk factors in the cases compared with their matched controls. Mean follow-up for the cohort was 5.2 years. A total of 68 patients who developed second cancers at least 6 months after their primary diagnosis were identified, giving a crude incidence of 1.89% overall: 3.21% among the patients treated for HL and 1.32% in those treated for NHL. Most common were bronchial, breast, colorectal and haematological malignancies. High stage at diagnosis almost reached statistical significance in the analysis of just the NHL patients (odds ratio = 3.48; P = 0.068) after adjustment for other factors. Treatment modality was not statistically significant in any analysis. High stage at diagnosis of NHL may be a risk factor for developing a second primary cancer. PMID:16106249
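    The odds-ratio computation underlying case-control comparisons like the one above can be sketched in a few lines. The 2x2 counts below are fabricated for illustration and are not taken from the lymphoma study:

```python
import math

# Hypothetical 2x2 table:          exposed  unexposed
cases    = (30, 38)    # patients with a second malignancy
controls = (40, 96)    # matched controls

a, b = cases
c, d = controls
odds_ratio = (a * d) / (b * c)          # cross-product odds ratio

# Woolf's method: 95% CI computed on the log-odds scale
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A matched analysis such as the study's would typically use conditional logistic regression rather than this unadjusted table, so this is only the basic building block.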

  16. Early Father Involvement and Subsequent Child Behaviour at Ages 3, 5 and 7 Years: Prospective Analysis of the UK Millennium Cohort Study.

    PubMed

    Kroll, Mary E; Carson, Claire; Redshaw, Maggie; Quigley, Maria A

    Fathers are increasingly involved in care of their babies and young children. We assessed the association of resident fathers' involvement with subsequent behaviour of their children, examining boys and girls separately. We used longitudinal data from the UK Millennium Cohort Study for children born in 2000-2001, divided into three separate analysis periods: ages 9 months to 3 years, 3 to 5 years, and 5 to 7 years. By exploratory factor analysis of self-reported attitudes and engagement in caring activities, we derived composite measures of various types of father involvement at 9 months, 3 and 5 years. Where possible we created equivalent measures of mother involvement. Child behaviour was assessed by the Strengths and Difficulties Questionnaire (SDQ), which was completed by the mother when the child was aged 3, 5 and 7 years. We estimated gender-specific odds ratios for behaviour problems per quintile of father involvement, using separate logistic regression models for boys and girls in each analysis period. We controlled for a wide range of potential confounders: characteristics of the child (temperament and development at 9 months, and illness and exact age at outcome), equivalent mother involvement where appropriate, and factors related to socioeconomic status, household change, and parental well-being, where statistically significant. Paternal positive parenting beliefs at age 9 months and increased frequency of creative play at age 5 years were significantly associated with lower risk of subsequent behaviour problems (SDQ total difficulties) in both boys and girls (p<0.05), odds ratios ranging between 0.81 and 0.89 per quintile of involvement. No associations were observed for other composite measures of caring activity by the father at 9 months, 3 years or 5 years. Quality of parenting, rather than the division of routine care between parents, was associated with child behavioural outcomes.

  17. NATbox: a network analysis toolbox in R.

    PubMed

    Chavan, Shweta S; Bauer, Michael A; Scutari, Marco; Nagarajan, Radhakrishnan

    2009-10-08

    There has been recent interest in capturing functional relationships (FRs) from high-throughput assays using suitable computational techniques. FRs elucidate the working of genes in concert as a system, as opposed to independent entities, and hence may provide preliminary insights into biological pathways and signalling mechanisms. Bayesian structure learning (BSL) techniques and their extensions have been used successfully for modelling FRs from expression profiles. Such techniques are especially useful in discovering undocumented FRs, investigating non-canonical signalling mechanisms, and studying cross-talk between pathways. The objective of the present study is to develop a graphical user interface (GUI), NATbox: Network Analysis Toolbox, in the language R, that houses a battery of BSL algorithms in conjunction with suitable statistical tools for modelling FRs in the form of acyclic networks from gene expression profiles and for their subsequent analysis. NATbox is a menu-driven open-source GUI implemented in the R statistical language for modelling and analysis of FRs from gene expression profiles. It provides options to (i) impute missing observations in the given data; (ii) model FRs and network structure from gene expression profiles using a battery of BSL algorithms and identify robust dependencies using a bootstrap procedure; (iii) present the FRs in the form of acyclic graphs for visualization and investigate their topological properties using network analysis metrics; (iv) retrieve FRs of interest from published literature and subsequently use them as structural priors in BSL; and (v) enhance the scalability of BSL across high-dimensional data by parallelizing the bootstrap routines. NATbox provides a menu-driven GUI for modelling and analysis of FRs from gene expression profiles. By incorporating readily available functions from existing R packages, it minimizes redundancy and improves reproducibility, transparency and sustainability, characteristics of open-source environments. NATbox is especially suited for interdisciplinary researchers and biologists who have minimal programming experience and would like to use systems biology approaches without delving into the algorithmic aspects. The GUI provides appropriate parameter recommendations for the various menu options, including default parameter choices for the user. NATbox can also prove to be a useful demonstration and teaching tool in graduate and undergraduate courses in systems biology. It has been tested successfully under Windows and Linux operating systems. The source code, along with installation instructions and an accompanying tutorial, can be found at http://bioinformatics.ualr.edu/natboxWiki/index.php/Main_Page.

  18. The Impact of Celebrity Suicide on Subsequent Suicide Rates in the General Population of Korea from 1990 to 2010.

    PubMed

    Park, Juhyun; Choi, Nari; Kim, Seog Ju; Kim, Soohyun; An, Hyonggin; Lee, Heon-Jeong; Lee, Yu Jin

    2016-04-01

    The association between celebrity suicide and a subsequent increase in suicide rates in the general population has been suggested. Previous studies primarily focused on celebrity suicides in the 2000s. To better understand the association, this study examined the impacts of celebrity suicides on subsequent suicide rates using data on Korean celebrity suicides between 1990 and 2010. Nine celebrity suicides were selected through an investigation of media reports of suicide deaths published in three major newspapers in Korea between 1990 and 2010. Suicide mortality data were obtained from the National Statistical Office of Korea. Seasonal autoregressive integrated moving average models with intervention analysis were used to test the impacts of celebrity suicides, controlling for seasonality. Six of the 9 celebrity suicides had significant impacts on suicide rates, both in the total population and in the same-gender or same-age subgroups. The incident that occurred in the 1990s had no significant impact on overall suicide rates, whereas the majority of the incidents in the 2000s had significant influences for 30 or 60 days following each incident. The influence of celebrity suicide reached its peak following the suicide death of a renowned actress in 2008. The findings may suggest a link between media coverage and the impact of celebrity suicide. Future studies should focus more on the underlying processes and confounding factors that may contribute to the impact of celebrity suicide on subsequent suicide rates.
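    The intervention idea in the abstract can be illustrated with a deliberately simplified stand-in: instead of a full seasonal ARIMA intervention model, the sketch below fits an ordinary least-squares "interrupted time series" with a step dummy at the event date to synthetic data. All numbers are invented; a real analysis would also model seasonality and autocorrelation, as the study does.

```python
import numpy as np

rng = np.random.default_rng(1)
n, event_day = 120, 60

trend = 0.02 * np.arange(n)
step = (np.arange(n) >= event_day).astype(float)        # 1 after the event
rate = 10 + trend + 2.5 * step + rng.normal(0, 1.0, n)  # synthetic daily rate

# Design matrix: intercept, linear trend, post-event level shift
X = np.column_stack([np.ones(n), np.arange(n), step])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(f"estimated post-event level shift: {beta[2]:.2f} (true value 2.5)")
```

The coefficient on the step dummy plays the role of the intervention effect tested in the study's SARIMA framework.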

  19. The Impact of Celebrity Suicide on Subsequent Suicide Rates in the General Population of Korea from 1990 to 2010

    PubMed Central

    2016-01-01

    The association between celebrity suicide and a subsequent increase in suicide rates in the general population has been suggested. Previous studies primarily focused on celebrity suicides in the 2000s. To better understand the association, this study examined the impacts of celebrity suicides on subsequent suicide rates using data on Korean celebrity suicides between 1990 and 2010. Nine celebrity suicides were selected through an investigation of media reports of suicide deaths published in three major newspapers in Korea between 1990 and 2010. Suicide mortality data were obtained from the National Statistical Office of Korea. Seasonal autoregressive integrated moving average models with intervention analysis were used to test the impacts of celebrity suicides, controlling for seasonality. Six of the 9 celebrity suicides had significant impacts on suicide rates, both in the total population and in the same-gender or same-age subgroups. The incident that occurred in the 1990s had no significant impact on overall suicide rates, whereas the majority of the incidents in the 2000s had significant influences for 30 or 60 days following each incident. The influence of celebrity suicide reached its peak following the suicide death of a renowned actress in 2008. The findings may suggest a link between media coverage and the impact of celebrity suicide. Future studies should focus more on the underlying processes and confounding factors that may contribute to the impact of celebrity suicide on subsequent suicide rates. PMID:27051245

  20. Sugar and acid content of Citrus prediction modeling using FT-IR fingerprinting in combination with multivariate statistical analysis.

    PubMed

    Song, Seung Yeob; Lee, Young Koung; Kim, In-Jung

    2016-01-01

    A high-throughput screening system for Citrus lines with higher sugar and acid contents was established using Fourier transform infrared (FT-IR) spectroscopy in combination with multivariate analysis. FT-IR spectra confirmed typical spectral differences in the frequency regions of 950-1100 cm(-1), 1300-1500 cm(-1), and 1500-1700 cm(-1). Principal component analysis (PCA) and subsequent partial least squares discriminant analysis (PLS-DA) were able to discriminate five Citrus lines into three separate clusters corresponding to their taxonomic relationships. Quantitative predictive modeling of the sugar and acid contents of Citrus fruits was established using partial least squares regression algorithms applied to the FT-IR spectra. The regression coefficients (R(2)) between predicted and estimated sugar and acid content values were 0.99. These results demonstrate that, by using FT-IR spectra and applying quantitative prediction modeling to Citrus sugar and acid contents, superior Citrus lines can be detected early and with greater accuracy. Copyright © 2015 Elsevier Ltd. All rights reserved.
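    The first step of such a chemometrics workflow, PCA on spectra, can be sketched with plain numpy. The synthetic "FT-IR spectra" below are invented: two citrus groups differing only in a sugar-related absorption band (the wavenumber grid, band positions and intensities are all hypothetical).

```python
import numpy as np

rng = np.random.default_rng(7)
wavenumbers = np.linspace(900, 1800, 300)

def spectrum(sugar_level):
    # Gaussian absorption bands; the 1050 cm(-1) band scales with "sugar"
    band = lambda c, w, h: h * np.exp(-((wavenumbers - c) / w) ** 2)
    return (band(1050, 40, sugar_level) + band(1630, 50, 0.8)
            + rng.normal(0, 0.02, wavenumbers.size))

low  = np.array([spectrum(0.5) for _ in range(20)])   # "low-sugar" lines
high = np.array([spectrum(1.2) for _ in range(20)])   # "high-sugar" lines
Xc = np.vstack([low, high])
Xc = Xc - Xc.mean(axis=0)                             # mean-center the spectra

# PCA via SVD: scores are projections onto the right singular vectors
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]

# PC1 should separate the two groups along the sugar band
print(f"PC1 group means: {pc1[:20].mean():.2f} vs {pc1[20:].mean():.2f}")
```

PLS-DA and PLS regression, as used in the study, would follow the same preprocessing but additionally use the class labels or reference sugar/acid values.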

  1. Investigation of Sunspot Area Varying with Sunspot Number

    NASA Astrophysics Data System (ADS)

    Li, K. J.; Li, F. Y.; Zhang, J.; Feng, W.

    2016-11-01

    The statistical relationship between sunspot area (SA) and sunspot number (SN) is investigated through analysis of their daily observation records from May 1874 to April 2015. For a total of 1607 days, representing 3% of the total interval considered, either SA or SN had a value of zero while the other parameter did not. These occurrences most likely reflect the report of short-lived spots by a single observatory and the subsequent averaging of zero values over multiple stations. The main results obtained are as follows: i) The number of spotless days around the minimum of a solar cycle is statistically negatively correlated with the maximum strength of solar activity of that cycle. ii) The probability distribution of SA generally decreases monotonically with SA, whereas the distribution of SN generally increases first and then decreases. The different probability distributions of SA and SN should strengthen their non-linear relation, and the correction factor [k] in the definition of SN may be one of the factors that cause the non-linearity. iii) The non-linear relation of SA and SN indeed exists statistically, and it is clearer during the maximum epoch of a solar cycle.

  2. Bayesian statistics in medicine: a 25 year review.

    PubMed

    Ashby, Deborah

    2006-11-15

    This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.

  3. Hilda Mary Woods MBE, DSc, LRAM, FSS (1892–1971): reflections on a Fellow of the Royal Statistical Society

    PubMed Central

    Farewell, Vern; Johnson, Tony; Gear, Rosemary

    2012-01-01

    We have previously described the content of a text by Woods and Russell, An Introduction to Medical Statistics, compared it with Principles of Medical Statistics by Hill and set both volumes against the background of vital statistics up until 1937. The two books mark a watershed in the history of medical statistics. Very little has been recorded about the life and career of the first author of the earlier textbook, who was a Fellow of the Royal Statistical Society for at least 25 years, an omission which we can now rectify with this paper. We describe her education, entry into medical statistics, relationship with Major Greenwood and her subsequent career and life in Ceylon, Kenya, Australia, England and South Africa. PMID:22973076

  4. Beyond Transitional Probability Computations: Extracting Word-Like Units when Only Statistical Information Is Available

    ERIC Educational Resources Information Center

    Perruchet, Pierre; Poulin-Charronnat, Benedicte

    2012-01-01

    Endress and Mehler (2009) reported that when adult subjects are exposed to an unsegmented artificial language composed from trisyllabic words such as ABX, YBC, and AZC, they are unable to distinguish between these words and what they coined as the "phantom-word" ABC in a subsequent test. This suggests that statistical learning generates knowledge…

  5. Comparison of oral nicotinamide adenine dinucleotide (NADH) versus conventional therapy for chronic fatigue syndrome.

    PubMed

    Santaella, María L; Font, Ivonne; Disdier, Orville M

    2004-06-01

    To compare the effectiveness of oral therapy with reduced nicotinamide adenine dinucleotide (NADH) to conventional modalities of treatment in patients with chronic fatigue syndrome (CFS). CFS is a potentially disabling condition of unknown etiology. Although its clinical presentation is associated with a myriad of symptoms, fatigue is a universal and essential finding for its diagnosis. No therapeutic regimen has proven effective for this condition. A total of 31 patients fulfilling the Centers for Disease Control criteria for CFS were randomly assigned to either NADH or nutritional supplements and psychological therapy for 24 months. A thorough medical history, physical examination and completion of a questionnaire on the severity of fatigue and other symptoms were performed each trimester of therapy. In addition, all patients underwent evaluation of immunological parameters and viral antibody titers. Statistical analysis was applied to the demographic data, as well as to symptom scores at baseline and at each trimester of therapy. The twelve patients who received NADH had a dramatic and statistically significant reduction of the mean symptom score in the first trimester (p < 0.001). However, symptom scores in the subsequent trimesters of therapy were similar in both treatment groups. Elevated IgG and IgE antibody levels were found in a significant number of patients. The observed effectiveness of NADH over conventional treatment in the first trimester of the trial, and the trend of improvement with that modality in the subsequent trimesters, should be further assessed in a larger patient sample.

  6. Formulation and Statistical Optimization of Culture Medium for Improved Production of Antimicrobial Compound by Streptomyces sp. JAJ06

    PubMed Central

    Arul Jose, Polpass; Sivakala, Kunjukrishnan Kamalakshi; Jebakumar, Solomon Robinson David

    2013-01-01

    Streptomyces sp. JAJ06 is a seawater-dependent antibiotic producer, previously isolated and characterised from an Indian coastal solar saltern. This paper reports the replacement of seawater with a defined salt formulation in the production medium and subsequent statistical media optimization to ensure consistent as well as improved antibiotic production by Streptomyces sp. JAJ06. The strain proved capable of producing the antibiotic compound when a chemically defined sodium-chloride-based salt formulation was incorporated into the production medium instead of seawater. A Plackett-Burman design experiment was applied, and three medium constituents, starch, KBr, and CaCO3, were recognised to have a significant effect on the antibiotic production of Streptomyces sp. JAJ06 at their individual levels. Subsequently, response surface methodology with a Box-Behnken design was employed to optimize these influential medium constituents for improved antibiotic production by Streptomyces sp. JAJ06. A total of 17 experiments were conducted towards the construction of a quadratic model and a second-order polynomial equation. Optimum levels of the medium constituents were obtained by analysis of the model and a numerical optimization method. When strain JAJ06 was cultivated in the optimized medium, the antibiotic activity increased to 173.3 U/mL, a 26.8% increase compared with the original (136.7 U/mL). This study found a useful way to cultivate Streptomyces sp. JAJ06 for enhanced production of the antibiotic compound. PMID:24454383

  7. Multiplicative point process as a model of trading activity

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Signals consisting of a sequence of pulses show that an inherent origin of 1/f noise is a Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits power-law spectral density S(f) ~ 1/f^β for various values of β, including β = 1/2, 1 and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events is analyzed analytically and numerically as well. The specific interest of our analysis is related to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces the spectral properties of real markets and explains the mechanism of the power-law distribution of trading activity. The study provides evidence that the statistical properties of the financial markets are enclosed in the statistics of the time intervals between trades. A multiplicative point process serves as a consistent model generating these statistics.
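    A rough numerical illustration of the multiplicative interevent-time idea is sketched below: the time between successive events performs a multiplicative random walk between reflecting-style bounds, the resulting event stream is binned into counts, and the periodogram of the counts shows the excess low-frequency power characteristic of such models. The exponent, step sizes and boundaries here are chosen purely for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n, mu, gamma, sigma = 100_000, 0.5, 0.0002, 0.02
tau_min, tau_max = 1e-3, 1.0

# Multiplicative stochastic update of the interevent time tau_k
tau = np.empty(n)
tau[0] = 0.1
for k in range(n - 1):
    step = gamma * tau[k] ** (2 * mu - 1) + sigma * tau[k] ** mu * rng.normal()
    tau[k + 1] = min(max(tau[k] + step, tau_min), tau_max)  # keep within bounds

# Build the event stream, bin it into counts, and take the periodogram
times = np.cumsum(tau)
counts, _ = np.histogram(times, bins=2048)
signal = counts - counts.mean()
psd = np.abs(np.fft.rfft(signal)) ** 2

# Slow wandering of tau should concentrate power at low frequencies
low, high = psd[1:41].mean(), psd[-40:].mean()
print(f"low/high frequency power ratio: {low / high:.1f}")
```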

  8. The association between miR-499 polymorphism and cancer susceptibility: a meta-analysis.

    PubMed

    Xu, Zhongfei; Zhang, Enjiao; Duan, Weiyi; Sun, Changfu; Bai, Shuang; Tan, Xuexin

    2015-01-01

    MicroRNAs are a class of noncoding RNAs that play important roles in the pathogenesis of tumors. Rs3746444 in miR-499 is suggested to be associated with cancer susceptibility. In the present study, we assess the association between the miR-499 rs3746444 polymorphism and cancer susceptibility through a meta-analysis. We searched for relevant articles in the PubMed and Embase databases and screened all resulting articles for adherence to the inclusion and exclusion criteria. The associations between the miR-499 polymorphism and cancer susceptibility were estimated by computing odds ratios (ORs) and 95% confidence intervals (CIs). All analyses were performed using Stata software. Eighteen datasets were included in the analysis. Statistically significant associations were found between the miR-499 rs3746444 polymorphism and susceptibility to cancer (GG versus AA: OR = 1.24, 95% CI: 1.01-1.52; G versus A: OR = 1.11, 95% CI: 1.01-1.23). A subsequent analysis stratified by ethnicity showed that Asians had increased susceptibility to cancer (GG versus AA: OR = 1.32, 95% CI: 1.09-1.59; GG + AG versus AA: OR = 1.17, 95% CI: 1.01-1.37). In the subgroup analysis by tumor type, none of the genetic models yielded statistically significant results. Meta-regression suggested that race and cancer type are not sources of heterogeneity in the present meta-analysis. No publication bias was detected by either the inverted funnel plot or Egger's test. Rs3746444 in miR-499 might be related to susceptibility to cancer.
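    The pooling step behind a meta-analytic OR such as "G versus A: OR = 1.11 (1.01-1.23)" is typically inverse-variance weighting on the log scale. The per-study ORs and CIs below are fabricated for illustration, not the 18 real datasets:

```python
import math

studies = [  # (odds ratio, lower 95% CI, upper 95% CI) -- hypothetical
    (1.25, 0.95, 1.64),
    (1.05, 0.88, 1.25),
    (1.40, 1.02, 1.92),
]

weights, weighted_logs = [], []
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # recover SE from the CI
    w = 1.0 / se**2                                   # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * log_or)

# Fixed-effect pooled estimate on the log scale, then back-transform
pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
pooled_or = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled OR = {pooled_or:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

A random-effects model (e.g. DerSimonian-Laird) adds a between-study variance component to these weights when heterogeneity is present.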

  9. Open Reduction and Internal Fixation versus Non-Surgical Treatment in Displaced Midshaft Clavicle Fractures: A Meta-Analysis.

    PubMed

    Ahmed, Abdulaziz F; Salameh, Motasem; AlKhatib, Nidal; Elmhiregh, Aissam; Ahmed, Ghalib O

    2018-04-17

    To compare open reduction and internal fixation (ORIF) and non-surgical treatment outcomes in displaced midshaft clavicle fractures. PubMed, MEDLINE, EMBASE, Web of Science, the Cochrane Library, and ClinicalTrials.gov were searched in September 2017. Inclusion criteria were randomized controlled trials reporting nonunion, shoulder functional outcomes, and subsequent surgery rates or pain scores. We excluded studies with patients younger than 16 years, a maximum follow-up of less than nine months, or inaccessible full text. Extracted data included the first author, publication year, number of patients, number of nonunions, Constant scores, disabilities of the arm, shoulder and hand (DASH) scores, number of subsequent surgeries, and pain measured using the visual analog scale. The risk ratio (RR) of nonunion was 0.15 (95% confidence interval [CI], 0.08, 0.31) with ORIF compared with non-surgical treatment. Constant and DASH scores were significantly better with ORIF up to 6 months. The mean difference (MD) in DASH scores at 12 months was statistically insignificant between treatments (MD, -4.19; 95% CI, -9.34, 0.96), whereas Constant scores remained significantly better with ORIF (MD, 4.39; 95% CI, 1.03, 7.75). Subsequent surgery and pain scores were similar in both treatments. Significant reduction in nonunions and favorable early functional outcomes are associated with ORIF. Nevertheless, late functional outcomes, subsequent surgeries, and pain scores are similar to those of non-surgical treatment. Although patients treated with ORIF mainly had subsequent elective plate removals, non-surgically treated patients had more surgical fixations for nonunions. As a result, there remains inconsistent evidence regarding the best treatment for displaced midshaft clavicle fractures. Therapeutic Level I.

  10. Two-way DF relaying assisted D2D communication: ergodic rate and power allocation

    NASA Astrophysics Data System (ADS)

    Ni, Yiyang; Wang, Yuxi; Jin, Shi; Wong, Kai-Kit; Zhu, Hongbo

    2017-12-01

    In this paper, we investigate the ergodic rate of a device-to-device (D2D) communication system aided by a two-way decode-and-forward (DF) relay node. We first derive closed-form expressions for the ergodic rate of the D2D link under asymmetric and symmetric cases, respectively. We subsequently discuss two special scenarios, the weak-interference case and the high signal-to-noise-ratio case, and derive tight approximations for each of the considered scenarios. Assuming that each transmitter has access only to its own statistical channel state information (CSI), we further derive a closed-form power allocation strategy to improve system performance based on the analytical results for the ergodic rate. Furthermore, some insights are provided into the power allocation strategy based on the analytical results. The strategies are easy to compute and require only knowledge of the channel statistics. Numerical results confirm the accuracy of the analysis under various conditions and validate the power allocation strategy.

  11. Changes in infant disposable diaper weights at selected intervals post-wetting.

    PubMed

    Carlisle, Joan; Moore, Amanda; Cooper, Alyssa; Henderson, Terri; Mayfield, Debbie; Taylor, Randa; Thomas, Jennifer; Van Fleet, Laduska; Askanazi, David; Fineberg, Naomi; Sun, Yanhui

    2012-01-01

    Pediatric acute care nurses questioned the practice of weighing disposable infant diapers immediately after voiding. This study asked the research question, "Do volume of saline, diaper configuration, and/or size of diaper statistically affect changes in diaper weights over time?" The method was an experimental laboratory model. Pre-set volumes of saline were added to disposable diapers that were then left folded or unfolded. Each diaper was weighed immediately post-wetting and re-weighed at hourly intervals for seven hours. Data were analyzed using a repeated measures analysis of variance (RMANOVA) with balanced data (F-test). Diaper weight changes over time were statistically significant for all time points and for all volumes regardless of diaper size; however, the changes in weight were small and without clinical significance. It is appropriate to weigh diapers at the end of eight hours without risk of altering subsequent fluid management of patients in open-air, non-humidified environments. This practice has led to more efficient use of nurses' time with fewer interruptions for patients and families.

  12. Methods for estimating drought streamflow probabilities for Virginia streams

    USGS Publications Warehouse

    Austin, Samuel H.

    2014-01-01

    Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million daily streamflow values collected over the period of record (January 1, 1900, through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded 46,704 equations with statistically significant fit statistics and parameter ranges, published in two tables in this report. These model equations produce summer-month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
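    A toy version of this kind of model can be fitted in a few lines: a maximum-likelihood logistic regression predicting whether summer flow drops below a drought threshold from mean winter streamflow, estimated here with iteratively reweighted least squares (IRLS). The data are synthetic and the coefficients invented; this is not one of the report's 46,704 published equations.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500

winter_flow = rng.lognormal(mean=3.0, sigma=0.5, size=n)  # hypothetical flows
# Assumed "true" model: higher winter flow -> lower drought probability
logit = 2.0 - 0.05 * winter_flow
p_true = 1.0 / (1.0 + np.exp(-logit))
drought = (rng.random(n) < p_true).astype(float)          # 1 = summer drought

# Maximum-likelihood fit by Newton's method / IRLS
X = np.column_stack([np.ones(n), winter_flow])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (drought - p)                 # score vector
    hess = X.T @ (X * (p * (1 - p))[:, None])  # observed information
    beta = beta + np.linalg.solve(hess, grad)

print(f"intercept {beta[0]:.2f} (true 2.0), slope {beta[1]:.3f} (true -0.05)")
```

Evaluating the fitted logistic curve at a given winter flow yields the drought-probability forecast the report's equations provide months in advance.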

  13. Management of undescended testis may be improved with educational updates and new transferring model.

    PubMed

    Yi, Wei; Sheng-de, Wu; Lian-Ju, Shen; Tao, Lin; Da-Wei, He; Guang-Hui, Wei

    2018-05-24

    To investigate whether the management of undescended testis (UDT) may be improved with educational updates and a new transferring model among referring providers (RPs). The ages at which orchidopexies were performed in the Children's Hospital of Chongqing Medical University were reviewed. We then introduced educational updates and a new transferring model among RPs, and collected the ages at orchidopexy after our intervention. Data were represented graphically, and the chi-square test for trend was used for statistical analysis. A total of 1543 orchidopexies were performed. The median age at orchidopexy did not match the target age of 6-12 months in any subsequent year. A survey of the RPs showed that 48.85% recommended an age below 12 months, yet only 25.50% would directly make a surgical referral to pediatric surgery at this point. After we introduced the educational updates, tracking the age at orchidopexy revealed a statistically significant downward trend. The management of undescended testis may be improved with educational updates and a new transferring model among primary healthcare practitioners.

  14. Association between pathology and texture features of multi-parametric MRI of the prostate

    NASA Astrophysics Data System (ADS)

    Kuess, Peter; Andrzejewski, Piotr; Nilsson, David; Georg, Petra; Knoth, Johannes; Susani, Martin; Trygg, Johan; Helbich, Thomas H.; Polanec, Stephan H.; Georg, Dietmar; Nyholm, Tufve

    2017-10-01

    The role of multi-parametric (mp)MRI in the diagnosis and treatment of prostate cancer has increased considerably. An alternative to visual inspection of mpMRI is the evaluation using histogram-based (first order statistics) parameters and textural features (second order statistics). The aims of the present work were to investigate the relationship between benign and malignant sub-volumes of the prostate and textures obtained from mpMR images. The performance of tumor prediction was investigated based on the combination of histogram-based and textural parameters. Subsequently, the relative importance of mpMR images was assessed and the benefit of additional imaging analyzed. Finally, sub-structures based on the PI-RADS classification were investigated as potential regions to automatically detect maligned lesions. Twenty-five patients who received mpMRI prior to radical prostatectomy were included in the study. The imaging protocol included T2, DWI, and DCE. Delineation of tumor regions was performed based on pathological information. First and second order statistics were derived from each structure and for all image modalities. The resulting data were processed with multivariate analysis, using PCA (principal component analysis) and OPLS-DA (orthogonal partial least squares discriminant analysis) for separation of malignant and healthy tissue. PCA showed a clear difference between tumor and healthy regions in the peripheral zone for all investigated images. The predictive ability of the OPLS-DA models increased for all image modalities when first and second order statistics were combined. The predictive value reached a plateau after adding ADC and T2, and did not increase further with the addition of other image information. The present study indicates a distinct difference in the signatures between malign and benign prostate tissue. This is an absolute prerequisite for automatic tumor segmentation, but only the first step in that direction. 
For the specific identified signature, DCE did not add complementary information to T2 and ADC maps.
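    The PCA separation step described above can be illustrated with a minimal sketch. The feature matrix below is synthetic stand-in data (not mpMRI textures), and the projection uses plain SVD rather than the study's OPLS-DA:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for first/second order features: two tissue classes
    # ("healthy", "tumor") with shifted means in a 6-dimensional feature space.
    healthy = rng.normal(loc=0.0, scale=1.0, size=(40, 6))
    tumor = rng.normal(loc=2.0, scale=1.0, size=(40, 6))
    X = np.vstack([healthy, tumor])

    # PCA via SVD of the mean-centered feature matrix.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:2].T  # projection onto the first two principal components

    # If the classes differ, their PC1 scores should separate clearly.
    separation = abs(scores[:40, 0].mean() - scores[40:, 0].mean())
    ```

    When a between-class difference dominates the variance, the first principal component aligns with it and the two classes separate along PC1, which is the pattern the study reports for tumor versus healthy regions.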

  15. Efficacy and Physicochemical Evaluation of an Optimized Semisolid Formulation of Povidone Iodine Proposed by Extreme Vertices Statistical Design; a Practical Approach

    PubMed Central

    Lotfipour, Farzaneh; Valizadeh, Hadi; Shademan, Shahin; Monajjemzadeh, Farnaz

    2015-01-01

    One of the most significant issues in the pharmaceutical industry, prior to commercialization of a pharmaceutical preparation, is the "preformulation" stage. However, far too little attention has been paid to verification of software-assisted statistical designs in preformulation studies. The main aim of this study was to report a step-by-step preformulation approach for a semisolid preparation based on a statistical mixture design and to verify the predictions made by the software with an in-vitro efficacy bioassay test. An extreme vertices mixture design (4 factors, 4 levels) was applied for preformulation of a semisolid povidone iodine preparation as a water-removable ointment using different polyethylene glycols. Software-assisted (Minitab) analysis was then performed using four practically assessed response values: available iodine, viscosity (N index and yield value), and water absorption capacity. Subsequently, mixture analysis was performed and an optimized formulation was proposed. The efficacy of this formulation was bioassayed using in-vitro microbial tests, and MIC values were calculated for Escherichia coli, Pseudomonas aeruginosa, Staphylococcus aureus, and Candida albicans. Results indicated acceptable conformity of the measured responses. Thus, it can be concluded that the proposed design had adequate power to predict the responses in practice. Stability studies showed no significant change for the optimized formulation during the one-year study. Efficacy was acceptable against all tested species, and against Staphylococcus aureus the prepared semisolid formulation was even more effective. PMID:26664368

  16. Electromagnetic wave scattering from rough terrain

    NASA Astrophysics Data System (ADS)

    Papa, R. J.; Lennon, J. F.; Taylor, R. L.

    1980-09-01

    This report presents two aspects of a program designed to calculate electromagnetic scattering from rough terrain: (1) the use of statistical estimation techniques to determine topographic parameters and (2) the results of a single-roughness-scale scattering calculation based on those parameters, including comparison with experimental data. In the statistical part of the present calculation, digitized topographic maps are used to generate data bases for the required scattering cells. The application of estimation theory to the data leads to the specification of statistical parameters for each cell. The estimated parameters are then used in a hypothesis test to decide on a probability density function (PDF) that represents the height distribution in the cell. Initially, the formulation uses a single observation of the multivariate data. A subsequent approach involves multiple observations of the heights on a bivariate basis, and further refinements are being considered. The electromagnetic scattering analysis, the second topic, calculates the amount of specular and diffuse multipath power reaching a monopulse receiver from a pulsed beacon positioned over a rough Earth. The program allows for spatial inhomogeneities and multiple specular reflection points. The analysis of shadowing by the rough surface has been extended to the case where the surface heights are distributed exponentially. The calculated loss of boresight pointing accuracy attributable to diffuse multipath is then compared with the experimental results. The extent of the specular region, the use of localized height variations, and the effect of the azimuthal variation in power pattern are all assessed.

  17. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    PubMed

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
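    The core idea, updating a local linkage statistic with a prior built from other genome scans, can be sketched as a normal-normal precision-weighted update. This is a generic conjugate-update illustration with hypothetical numbers, not the paper's exact estimator (which also handles differing marker maps and heterogeneity):

    ```python
    import numpy as np

    def eb_update(z_local, se_local, z_other, se_other):
        """Precision-weighted normal-normal update: shrink a local linkage
        statistic toward the combined evidence from related genome scans.
        A generic conjugate-update sketch, not the paper's exact method."""
        w = 1.0 / np.asarray(se_other, dtype=float) ** 2
        prior_mean = np.sum(w * np.asarray(z_other, dtype=float)) / np.sum(w)
        prior_var = 1.0 / np.sum(w)
        post_var = 1.0 / (1.0 / se_local**2 + 1.0 / prior_var)
        post_mean = post_var * (z_local / se_local**2 + prior_mean / prior_var)
        return post_mean, post_var

    # Hypothetical LOD-like statistics at one locus from three related scans.
    z_hat, v_hat = eb_update(z_local=3.2, se_local=1.0,
                             z_other=[2.1, 2.8, 1.9], se_other=[0.8, 1.2, 1.0])
    ```

    The updated statistic lies between the local estimate and the prior mean, and its variance is smaller than either input's, which is the mechanism behind the narrower confidence intervals reported above.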

  18. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    PubMed

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). Image noise was measured in the first study using a body phantom, CNR was measured in the second study using a contrast phantom, and spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.001). Qualitative analysis of the third study showed that images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.001). Our phantom studies showed that ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
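    The CNR metric used in such phantom comparisons is commonly computed as the absolute difference of mean ROI values divided by the background noise. A sketch on synthetic ROIs (hypothetical Hounsfield-unit values, not the study's phantom data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical phantom ROIs (Hounsfield units): a contrast insert and its
    # background, each a 20x20 pixel region; values are illustrative only.
    insert_roi = rng.normal(loc=120.0, scale=10.0, size=(20, 20))
    background_roi = rng.normal(loc=40.0, scale=10.0, size=(20, 20))

    noise = background_roi.std(ddof=1)  # image noise from the background ROI
    cnr = abs(insert_roi.mean() - background_roi.mean()) / noise
    ```

    Because noise appears in the denominator, a reconstruction that lowers image noise at equal contrast (as reported for ASIR-V) directly raises the CNR.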

  19. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis.

    PubMed

    Park, Yoonah; Yong, Yuen Geng; Yun, Seong Hyeon; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-05-01

    This study aimed to compare the learning curves and early postoperative outcomes for conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phase of SIL and CL RHC was completed between 26 and 30 cases, and 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase.
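    The moving-average and CUSUM criteria described above can be sketched on synthetic operative times (illustrative values, not the study's data): a downward drift during the learning phase followed by a plateau.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical operative times (minutes) for 35 consecutive cases:
    # a learning phase drifting down toward a plateau, plus noise.
    trend = np.concatenate([np.linspace(240, 180, 25), np.full(10, 180.0)])
    times = trend + rng.normal(0, 10, size=35)

    # Moving average over a 5-case window, used to spot the plateau.
    window = 5
    moving_avg = np.convolve(times, np.ones(window) / window, mode="valid")

    # CUSUM of deviations from the overall mean: an initial rise followed by
    # a sustained fall suggests the learning phase has been passed.
    cusum = np.cumsum(times - times.mean())
    ```

    The case index where the moving average flattens, or where the CUSUM curve turns over, marks the end of the learning phase in this framework.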

  20. Risk models for post-endoscopic retrograde cholangiopancreatography pancreatitis (PEP): smoking and chronic liver disease are predictors of protection against PEP.

    PubMed

    DiMagno, Matthew J; Spaete, Joshua P; Ballard, Darren D; Wamsteker, Erik-Jan; Saini, Sameer D

    2013-08-01

    We investigated which variables were independently associated with protection against or development of post-endoscopic retrograde cholangiopancreatography (ERCP) pancreatitis (PEP), and with PEP severity. Subsequently, we derived predictive risk models for PEP. In a case-control design, 6505 patients had 8264 ERCPs, 211 patients had PEP, and 22 patients had severe PEP. We randomly selected 348 non-PEP controls. We examined 7 established and 9 investigational variables. In univariate analysis, 7 variables predicted PEP: younger age, female sex, suspected sphincter of Oddi dysfunction (SOD), pancreatic sphincterotomy, moderate-difficult cannulation (MDC), pancreatic stent placement, and lower Charlson score. Protective variables were current smoking, former drinking, diabetes, and chronic liver disease (CLD, biliary/transplant complications). Multivariate analysis identified 7 independent variables for PEP, 3 protective (current smoking, CLD-biliary, CLD-transplant/hepatectomy complications) and 4 predictive (younger age, suspected SOD, pancreatic sphincterotomy, MDC). Pre- and post-ERCP risk models of 7 variables have a C-statistic of 0.74. Removing age (the seventh variable) did not significantly affect the predictive value (C-statistic of 0.73) and reduced model complexity. PEP severity was not associated with any variable in multivariate analysis. By using the newly identified protective variables together with 3 predictive variables, we derived 2 risk models with a higher predictive value for PEP compared with prior studies.
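    The C-statistic reported above is the area under the ROC curve, which can be computed directly from predicted risks via the Mann-Whitney formulation: the probability that a randomly chosen case outranks a randomly chosen control. The risk scores below are hypothetical, not the study's fitted model:

    ```python
    def c_statistic(case_scores, control_scores):
        """Mann-Whitney form of the C-statistic (ROC AUC): fraction of
        case/control pairs where the case has the higher predicted risk,
        counting ties as half."""
        pairs = concordant = 0.0
        for c in case_scores:
            for n in control_scores:
                pairs += 1
                if c > n:
                    concordant += 1
                elif c == n:
                    concordant += 0.5
        return concordant / pairs

    cases = [0.9, 0.8, 0.7, 0.55, 0.4]    # predicted risk in PEP patients
    controls = [0.6, 0.5, 0.3, 0.2, 0.1]  # predicted risk in controls
    print(c_statistic(cases, controls))   # → 0.88
    ```

    A C-statistic of 0.74, as in the study's risk models, means a randomly chosen PEP case receives a higher predicted risk than a randomly chosen control 74% of the time.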

  1. Bladder radiotherapy treatment: A retrospective comparison of 3-dimensional conformal radiotherapy, intensity-modulated radiation therapy, and volumetric-modulated arc therapy plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasciuti, Katia, E-mail: k.pasciuti@virgilio.it; Kuthpady, Shrinivas; Anderson, Anne

    To examine tumor and organ response when different radiotherapy planning techniques are used. Ten patients with confirmed bladder tumors were first treated using 3-dimensional conformal radiotherapy (3DCRT), and subsequently the original plans were re-optimized using the intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) techniques. Target coverage in terms of conformity and homogeneity indices, tumor control probability (TCP), and organ dose limits, including integral dose analysis, were evaluated. In addition, MUs and treatment delivery times were compared. Better minimum target coverage (1.3%) was observed in VMAT plans compared with 3DCRT and IMRT plans, confirmed by statistically significant conformity index (CI) results. Large differences were observed among techniques in the integral dose results for the femoral heads. Although no statistically significant differences were found for the rectum and normal tissue, a large amount of energy deposition was observed in 3DCRT plans. In any case, VMAT plans provided better organ and tissue sparing, confirmed also by normal tissue complication probability (NTCP) analysis, as well as a better TCP result. Our analysis showed better overall results for plans using VMAT techniques. Furthermore, the reduction in total treatment time observed among techniques, including gantry and collimator rotation, could encourage use of the more recent technique, reducing target movement and patient discomfort.

  2. Assessment of contribution of Australia's energy production to CO2 emissions and environmental degradation using statistical dynamic approach.

    PubMed

    Sarkodie, Samuel Asumadu; Strezov, Vladimir

    2018-10-15

    Energy production remains the major emitter of atmospheric emissions; thus, in accordance with Australia's Emissions Projections by 2030, this study analyzed the impact of Australia's energy portfolio on environmental degradation and CO2 emissions using locally compiled data on disaggregate energy production, energy imports and exports spanning from 1974 to 2013. This study employed the fully modified ordinary least squares, dynamic ordinary least squares, and canonical cointegrating regression estimators, as well as statistically inspired modification of partial least squares regression analysis, with a subsequent sustainability sensitivity analysis. The validity of the environmental Kuznets curve hypothesis proposes a paradigm shift from energy-intensive and carbon-intensive industries to less-energy-intensive and green energy industries and related services, leading to a structural change in the economy. Thus, decoupling energy services provides a better interpretation of the role of the energy sector portfolio in environmental degradation and CO2 emissions assessment. The sensitivity analysis revealed that nonrenewable energy production above 10% and energy imports above 5% will dampen the goals for the 2030 emission reduction target. Increasing the share of renewable energy penetration in the energy portfolio decreases the level of CO2 emissions, while increasing the share of non-renewable energy sources in the energy mix increases the level of atmospheric emissions, thus intensifying climate change and its impacts. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. [Continuity of hospital identifiers in hospital discharge data - Analysis of the nationwide German DRG Statistics from 2005 to 2013].

    PubMed

    Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas

    2016-11-01

    In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as data source for health services research. Within this data hospitals can be separated via their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. Aiming to investigate direction and extent of possible bias in hospital-level analyses this study examines the continuity of the hospital identifier within a cross-sectional and longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013 the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volume. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., in 2013 1,550 IK vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted in the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data the separation of hospitals via the hospital identifier might lead to underestimating the number of hospitals and consequential overestimation of caseload per hospital. 
Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.

  4. Recent intensified impact of December Arctic Oscillation on subsequent January temperature in Eurasia and North Africa

    NASA Astrophysics Data System (ADS)

    He, Shengping; Wang, Huijun; Gao, Yongqi; Li, Fei

    2018-03-01

    This study reveals an intensified influence of December Arctic Oscillation (AO) on the subsequent January surface air temperature (SAT) over Eurasia and North Africa in recent decades. The connection is statistically insignificant during 1957/58-1979/80 (P1) but becomes statistically significant during 1989/90-2011/12 (P2). The possible causes are further investigated. Associated with positive December AO during P2, a significant anomalous anticyclone emerges over the central North Atlantic, accompanied by significant westerly and easterly anomalies along 45°-65°N and 20°-40°N, respectively. This favors the significant influence of December AO on the subsequent January SAT and atmospheric circulation over Eurasia and North Africa via triggering the North Atlantic tripole sea surface temperature (SST) anomaly that persists into the subsequent January. By contrast, the December AO-related anomalous anticyclone during P1 is weak and is characterized by two separate centers located in the eastern and western North Atlantic. Correspondingly, the westerly and easterly anomalies over the North Atlantic Ocean are weak and the related tripole SST anomaly is not well formed, unfavorable for the persistent impact of the December AO into the subsequent January. Further analyses indicate that the different anomalous anticyclone associated with the December AO over the North Atlantic may be induced by the strengthened synoptic-scale eddy feedbacks over the North Atlantic, which may be related to the interdecadal intensification of storm track activity. Additionally, the planetary stationary wave related to the December AO propagates from the surface into the upper stratosphere at mid-latitudes during P2, and further propagates downward to the troposphere, causing anomalous atmospheric circulation in the subsequent January.

  5. A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects

    PubMed Central

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936
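    The co-occurrence of two same-type nodes is the size of their common neighbourhood in the bipartite graph. Its significance can be illustrated with a hypergeometric null model, in which each node's neighbours are drawn uniformly at random; this is a simplified stand-in for SICORE's randomization-based test, for illustration only:

    ```python
    from math import comb

    def cooccurrence_pvalue(deg_a, deg_b, n_targets, observed_common):
        """P(common neighbours >= observed) when the deg_a and deg_b
        neighbours of two same-type nodes are drawn uniformly from
        n_targets opposite-type nodes (hypergeometric null)."""
        total = comb(n_targets, deg_b)
        upper = min(deg_a, deg_b)
        return sum(comb(deg_a, k) * comb(n_targets - deg_a, deg_b - k)
                   for k in range(observed_common, upper + 1)) / total

    # Two hypothetical miRNAs, each targeting 8 of 100 proteins, sharing 5.
    p = cooccurrence_pvalue(deg_a=8, deg_b=8, n_targets=100, observed_common=5)
    ```

    With an expected overlap well below one common target, observing five shared targets is highly unlikely under this null, which is the kind of signal a co-occurrence test is designed to flag.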

  6. Unwanted pregnancy and induced abortion among young women 16-22 years old in Greece: a retrospective study of the risk factors.

    PubMed

    Salakos, N; Koumousidis, A; Bakalianou, K; Paltoglou, G; Kalampokas, T; Iavazzo, C

    2010-01-01

    Unwanted pregnancies and the subsequent induced abortions are common problems of our youths in modern Greece. The aim of this study was to recognize the risk factors of the problem in an effort to find the best possible solution out of this social dead end. We interviewed 1,320 young female individuals and analyzed their answers using statistical analysis. Several useful conclusions were reached concerning the forces that are involved in unwanted pregnancy/induced abortions. We have tried to underline the strategy to combat the problem. Sexual education and the proper use of contraception remain the essential tools in this effort.

  7. Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity

    NASA Astrophysics Data System (ADS)

    Codano, C.; Alonzo, M. L.; Vilardo, G.

    The clustering structure of Vesuvian earthquakes is investigated by means of statistical tools: the inter-event time distribution, the running mean, and multifractal analysis. The first cannot reliably distinguish a Poissonian process from a clustered one, owing to the difficulty of separating an exponential distribution from a power-law one. The running mean test reveals the clustering of the earthquakes, but loses information about the structure of the distribution at global scales. The multifractal approach can highlight the clustering at small scales, while the global behaviour remains Poissonian. Subsequently, the clustering of the events is interpreted in terms of diffusive processes of the stress in the earth's crust.
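    The inter-event-time diagnostic can be sketched numerically: a Poisson process has exponentially distributed inter-event times with coefficient of variation (CV) near 1, while temporal clustering inflates the CV. The catalogues below are synthetic, not Vesuvian data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def cv_interevent(times):
        """Coefficient of variation of inter-event times: ~1 for a Poisson
        process (exponential waiting times), >1 for clustered activity."""
        dt = np.diff(np.sort(times))
        return dt.std(ddof=1) / dt.mean()

    # Synthetic catalogues: a homogeneous Poisson process vs. 20 tight
    # clusters of 25 events each, scattered over the same interval.
    poisson_times = rng.uniform(0, 1000, size=500)
    clustered_times = np.concatenate(
        [c + rng.exponential(0.5, size=25)
         for c in rng.uniform(0, 1000, size=20)])

    cv_poisson = cv_interevent(poisson_times)      # close to 1
    cv_clustered = cv_interevent(clustered_times)  # well above 1
    ```

    As the abstract notes, this summary alone cannot settle whether the tail is exponential or power-law, which is why the running mean and multifractal analyses are brought in.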

  8. Quantification of hemoglobin and its derivatives in oral cancer diagnosis by diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Kaniyappan, Udayakumar; Gnanatheepam, Einstein; Aruna, Prakasarao; Dornadula, Koteeswaran; Ganesan, Singaravelu

    2017-02-01

    Cancer is one of the most common threats to human beings, and its incidence is increasing at an alarming rate around the globe. In recent years, due to advancements in opto-electronic technology, various optical spectroscopy techniques have emerged to assess the photophysicochemical and morphological conditions of normal and malignant tissues at micro as well as macroscopic scales. In this regard, diffuse reflectance spectroscopy is considered to be the simplest, most cost-effective, and rapid technique for the diagnosis of cancerous tissues. In the present study, the hemoglobin concentration in normal and cancerous oral tissues was quantified and a subsequent statistical analysis was carried out to verify the diagnostic potential of the technique.

  9. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    Approaches to regional frequency analysis using L moments and LH moments are revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for estimation of large return period events.
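    Ordinary sample L-moments, the baseline against which the PL-moment method is compared, can be computed from probability-weighted moments. A sketch with a hypothetical annual-maximum rainfall sample (the PL-moment variant, which censors low observations, is not shown):

    ```python
    from math import comb

    def sample_l_moments(data):
        """First three sample L-moments via probability-weighted moments
        b_r, using the standard unbiased estimators:
        l1 = b0, l2 = 2*b1 - b0, l3 = 6*b2 - 6*b1 + b0."""
        x = sorted(data)
        n = len(x)
        b = [sum(comb(j, r) * x[j] for j in range(r, n)) / (n * comb(n - 1, r))
             for r in range(3)]
        return b[0], 2 * b[1] - b[0], 6 * b[2] - 6 * b[1] + b[0]

    # Hypothetical annual-maximum rainfall sample (mm).
    l1, l2, l3 = sample_l_moments([87, 120, 95, 160, 140, 110, 102, 133])
    t3 = l3 / l2  # L-skewness, as plotted on an L-moment ratio diagram
    ```

    The ratio t3 = l3/l2 (L-skewness) is the quantity plotted against L-kurtosis on the moment ratio diagram used, alongside the Z test, to pick the best-fit distribution.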

  10. Characterization of abdominal pain during methylnaltrexone treatment of opioid-induced constipation in advanced illness: a post hoc analysis of two clinical trials.

    PubMed

    Slatkin, Neal E; Lynn, Richard; Su, Chinyu; Wang, Wenjin; Israel, Robert J

    2011-11-01

    Methylnaltrexone is a selective peripherally acting mu-opioid receptor antagonist that decreases the constipating effects of opioids without affecting centrally mediated analgesia. In two double-blind, placebo-controlled, Phase III studies of methylnaltrexone for opioid-induced constipation in patients with advanced illness, abdominal pain was the most common adverse event (AE) reported. This analysis sought to further characterize the Medical Dictionary for Regulatory Activities-defined abdominal pain AEs experienced in these studies. A post hoc analysis of verbatim descriptions was used to further assess AEs characterized as abdominal pain in both trials. Descriptive summary statistics were used to assess severity of abdominal pain, effect of abdominal pain on global pain scores, and other characteristics. Logistic regression analysis was used to determine the association of baseline characteristics with abdominal pain. Most verbatim descriptions of abdominal pain referred to "abdominal cramps" or "cramping." Abdominal pain AEs were mostly mild to moderate in severity and did not affect patients' global evaluation of pain. The incidence of abdominal pain AEs in methylnaltrexone-treated patients was greatest after the first dose and decreased with subsequent doses. No association between abdominal pain AEs and most baseline patient characteristics was noted. Abdominal pain AEs in methylnaltrexone-treated patients in clinical trials are usually described as "cramps" or "cramping," are mostly mild to moderate in severity, and decrease in incidence with subsequent dosing. Copyright © 2011 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  11. A double-blind placebo-controlled cross-over clinical trial of DONepezil In Posterior cortical atrophy due to underlying Alzheimer's Disease: DONIPAD study.

    PubMed

    Ridha, Basil H; Crutch, Sebastian; Cutler, Dawn; Frost, Christopher; Knight, William; Barker, Suzie; Epie, Norah; Warrington, Elizabeth K; Kukkastenvehmas, Riitta; Douglas, Jane; Rossor, Martin N

    2018-05-01

    The study investigated whether donepezil exerts symptomatic benefit in patients with posterior cortical atrophy (PCA), an atypical variant of Alzheimer's disease. A single-centre, double-blind, placebo-controlled, cross-over clinical trial was performed to assess the efficacy of donepezil in patients with PCA. Each patient received either donepezil (5 mg once daily in the first 6 weeks and 10 mg once daily in the second 6 weeks) or placebo for 12 weeks. After a 2-week washout period, each patient received the other treatment arm during the following 12 weeks followed by another 2-week washout period. The primary outcome was the Mini-Mental State Examination (MMSE) at 12 weeks. Secondary outcome measures were five neuropsychological tests reflecting parieto-occipital function. Intention-to-treat analysis was used. For each outcome measure, carry-over effects were first assessed. If present, then analysis was restricted to the first 12-week period. Otherwise, the standard approach to the analysis of a 2 × 2 cross-over trial was used. Eighteen patients (13 females) were recruited (mean age 61.6 years). There was a protocol violation in one patient, who subsequently withdrew from the study due to gastrointestinal side effects. There was statistically significant (p < 0.05) evidence of a carry-over effect on MMSE. Therefore, the analysis of treatment effect on MMSE was restricted to the first 12-week period. Treatment effect at 6 weeks was statistically significant (difference = 2.5 in favour of donepezil, 95% CI 0.1 to 5.0, p < 0.05). Treatment effect at 12 weeks was close, but not statistically significant (difference = 2.0 in favour of donepezil, 95% CI -0.1 to 4.5, p > 0.05). There were no statistically significant treatment effects on any of the five neuropsychological tests, except for digit span at 12 weeks (higher by 0.5 digits in favour of placebo, 95% CI 0.1 to 0.9). 
Gastrointestinal side effects occurred most frequently, affecting 13/18 subjects (72%), and were the cause of study discontinuation in one subject. Nightmares and vivid dreams occurred in 8/18 subjects (44%), and were statistically more frequent during treatment with donepezil. In this small study, there was no statistically significant treatment effect of donepezil on the primary outcome measure (MMSE score at 12 weeks) in PCA patients, who appear to be particularly susceptible to the development of nightmares and vivid dreams when treated. Trial registration: Current Controlled Trials ISRCTN22636071 . Retrospectively registered 19 May 2010.

  12. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    PubMed Central

    Cunningham, Michael R.; Baumeister, Roy F.

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272

  13. Age and gender differences in conviction and crash occurrence subsequent to being directed to Iowa's driver improvement program.

    PubMed

    Zhang, Wei; Gkritza, Konstantina; Keren, Nir; Nambisan, Shashi

    2011-10-01

    This paper investigates potential gender and age differences in conviction and crash occurrence subsequent to being directed to attend Iowa's Driver Improvement Program (DIP). Binary logit models were developed to investigate the factors that influence conviction occurrence after DIP by gender and age. Because of the low crash occurrence subsequent to DIP, association rules were applied, in lieu of econometric models, to investigate the factors that influence crash occurrence subsequent to DIP. There were statistically significant differences by driver gender, age, and conviction history in the likelihood of subsequent convictions. However, this paper found no association between DIP outcome, crash history, and crash occurrence. Evaluating the differences in conviction and crash occurrence subsequent to DIP between female and male drivers, and among different age groups, can lead to improvements in the effectiveness of DIPs and help to identify low-cost intervention measures, customized to drivers' gender and age, for improving driving behaviors. Copyright © 2011 National Safety Council and Elsevier Ltd. All rights reserved.

  14. A Longitudinal Analysis of Treatment Optimism and HIV Acquisition and Transmission Risk Behaviors Among Black Men Who Have Sex with Men in HPTN 061.

    PubMed

    Levy, Matthew E; Phillips, Gregory; Magnus, Manya; Kuo, Irene; Beauchamp, Geetha; Emel, Lynda; Hucks-Ortiz, Christopher; Hamilton, Erica L; Wilton, Leo; Chen, Iris; Mannheimer, Sharon; Tieu, Hong-Van; Scott, Hyman; Fields, Sheldon D; Del Rio, Carlos; Shoptaw, Steven; Mayer, Kenneth

    2017-10-01

    Little is known about HIV treatment optimism and risk behaviors among Black men who have sex with men (BMSM). Using longitudinal data from BMSM in the HPTN 061 study, we examined participants' self-reported comfort with having condomless sex due to optimistic beliefs regarding HIV treatment. We assessed correlates of treatment optimism and its association with subsequent risk behaviors for HIV acquisition or transmission using multivariable logistic regression with generalized estimating equations. Independent correlates of treatment optimism included age ≥35 years, annual household income <$20,000, depressive symptoms, high HIV conspiracy beliefs, problematic alcohol use, and previous HIV diagnosis. Treatment optimism was independently associated with subsequent condomless anal sex with a male partner of serodiscordant/unknown HIV status among HIV-infected men, but this association was not statistically significant among HIV-uninfected men. HIV providers should engage men in counseling conversations to assess and minimize willingness to have condomless sex that is rooted in optimistic treatment beliefs without knowledge of viral suppression.

  15. Does Body Mass Index Reduction by Bariatric Surgery Affect Laryngoscopy Difficulty During Subsequent Anesthesia?

    PubMed

    Shimonov, Mordechai; Schechter, Pinhas; Boaz, Mona; Waintrob, Ronen; Ezri, Tiberiu

    2017-03-01

    The effect of body mass index (BMI) reduction following bariatric surgery on subsequent airway management has not been investigated. This study aimed to investigate the association between BMI reduction and airway assessment and management as measured by Mallampati class (MC) and laryngoscopy grade (LG). We conducted a retrospective study over 6 years to compare BMI changes, MC and LG in patients having weight reduction bariatric surgery followed by subsequent surgery. Data were extracted from the anesthesia records of patients undergoing laparoscopic band insertion (LBI) and laparoscopic sleeve gastrectomy (LSG). Difficult airway was defined as Mallampati class 3 or 4 on a 1-4 difficulty scale, or laryngoscopy grade >2 on a 1-4 difficulty scale and need for unplanned fiberoptic intubation. Changes in these variables were correlated with weight reduction. Statistical analysis included t test and univariate and multivariate logistic regression. Five hundred forty-six patients underwent LSG and 83 patients had LBI during the study period. Of those patients, 65 had subsequent surgical procedures after the bariatric procedure, of whom 62 were eligible. BMI decreased by approximately 13 kg/m² (p = 0.000), roughly a 30% reduction between the two surgical procedures. Mallampati class decreased significantly (p = 0.000) while laryngoscopy grade did not (p = 0.419). Our study revealed that a significant reduction in BMI was associated with a significant decrease in Mallampati class. There was no significant decrease in laryngoscopy grade, and there was no case of unplanned fiberoptic intubation.

  16. Study of Staphylococcus aureus N315 Pathogenic Genes by Text Mining and Enrichment Analysis of Pathways and Operons.

    PubMed

    Yang, Chun-Feng; Gou, Wei-Hui; Dai, Xin-Lun; Li, Yu-Mei

    2018-06-01

    Staphylococcus aureus (S. aureus) is a versatile pathogen found in many environments and can cause nosocomial infections in the community and hospitals. S. aureus infection is an increasingly serious threat to global public health that requires action across many government bodies, medical and health sectors, and scientific research institutions. In the present study, S. aureus N315 genes that have been shown in the literature to be pathogenic were extracted using a bibliometric method for functional enrichment analysis of pathways and operons, in order to statistically discover novel pathogenic genes associated with S. aureus N315. A total of 383 pathogenic genes were mined from the literature using bibliometrics, and subsequently several new pathogenic genes of S. aureus N315 were identified by functional enrichment analysis of pathways and operons. The discovery of these novel S. aureus N315 pathogenic genes is of great significance for treating S. aureus-induced diseases and identifying potential diagnostic markers, thus providing a theoretical foundation for epidemiological prevention.

  17. Diagnosis by Volatile Organic Compounds in Exhaled Breath from Lung Cancer Patients Using Support Vector Machine Algorithm

    PubMed Central

    Sakumura, Yuichi; Koyama, Yutaro; Tokutake, Hiroaki; Hida, Toyoaki; Sato, Kazuo; Itoh, Toshio; Akamatsu, Takafumi; Shin, Woosuck

    2017-01-01

    Monitoring exhaled breath is a very attractive, noninvasive screening technique for early diagnosis of diseases, especially lung cancer. However, the technique provides insufficient accuracy because the exhaled air has many crucial volatile organic compounds (VOCs) at very low concentrations (ppb level). We analyzed the breath exhaled by lung cancer patients and healthy subjects (controls) using gas chromatography/mass spectrometry (GC/MS), and performed a subsequent statistical analysis to diagnose lung cancer based on the combination of multiple lung cancer-related VOCs. We detected 68 VOCs as marker species using GC/MS analysis. We reduced the number of VOCs and used support vector machine (SVM) algorithm to classify the samples. We observed that a combination of five VOCs (CHN, methanol, CH3CN, isoprene, 1-propanol) is sufficient for 89.0% screening accuracy, and hence, it can be used for the design and development of a desktop GC-sensor analysis system for lung cancer. PMID:28165388

  18. Diagnosis by Volatile Organic Compounds in Exhaled Breath from Lung Cancer Patients Using Support Vector Machine Algorithm.

    PubMed

    Sakumura, Yuichi; Koyama, Yutaro; Tokutake, Hiroaki; Hida, Toyoaki; Sato, Kazuo; Itoh, Toshio; Akamatsu, Takafumi; Shin, Woosuck

    2017-02-04

    Monitoring exhaled breath is a very attractive, noninvasive screening technique for early diagnosis of diseases, especially lung cancer. However, the technique provides insufficient accuracy because the exhaled air has many crucial volatile organic compounds (VOCs) at very low concentrations (ppb level). We analyzed the breath exhaled by lung cancer patients and healthy subjects (controls) using gas chromatography/mass spectrometry (GC/MS), and performed a subsequent statistical analysis to diagnose lung cancer based on the combination of multiple lung cancer-related VOCs. We detected 68 VOCs as marker species using GC/MS analysis. We reduced the number of VOCs and used support vector machine (SVM) algorithm to classify the samples. We observed that a combination of five VOCs (CHN, methanol, CH₃CN, isoprene, 1-propanol) is sufficient for 89.0% screening accuracy, and hence, it can be used for the design and development of a desktop GC-sensor analysis system for lung cancer.
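The classification step described in this record can be reproduced in outline with scikit-learn; the five feature columns below are synthetic stand-ins for the reported VOC concentrations (CHN, methanol, CH3CN, isoprene, 1-propanol), not the study's measurements:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 60
# Synthetic "concentration" vectors for 5 hypothetical VOC channels;
# class means are chosen only to make the groups separable.
healthy = rng.normal(loc=[1.0, 2.0, 0.5, 3.0, 1.5], scale=0.5, size=(n, 5))
cancer = rng.normal(loc=[1.6, 2.8, 1.1, 2.2, 2.3], scale=0.5, size=(n, 5))
X = np.vstack([healthy, cancer])
y = np.array([0] * n + [1] * n)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print("mean CV accuracy:", scores.mean().round(3))
```

Cross-validated accuracy, rather than training accuracy, is the quantity comparable to the 89.0% screening accuracy reported above.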

  19. Lack of effectiveness of laser therapy applied to the nerve course and the correspondent medullary roots

    PubMed Central

    Sousa, Fausto Fernandes de Almeida; Ribeiro, Thaís Lopes; Fazan, Valéria Paula Sassoli; Barbieri, Claudio Henrique

    2013-01-01

    OBJECTIVE: To investigate the influence of low intensity laser irradiation on the regeneration of the fibular nerve of rats after crush injury. METHODS: Twenty-five rats were used, divided into three groups: 1) intact nerve, no treatment; 2) crushed nerve, no treatment; 3) crush injury, laser irradiation applied on the medullary region corresponding to the roots of the sciatic nerve and subsequently on the course of the damaged nerve. Laser irradiation was carried out for 14 consecutive days. RESULTS: Animals were evaluated by functional gait analysis with the peroneal functional index and by histomorphometric analysis using the total number of myelinated nerve fibers and their density, total number of Schwann cells, total number of blood vessels and the occupied area, minimum diameter of the fiber diameter and G-quotient. CONCLUSION: According to the statistical analysis there was no significant difference among groups and the authors conclude that low intensity laser irradiation has little or no influence on nerve regeneration and functional recovery. Laboratory investigation. PMID:24453650

  20. Folic acid supplements and colorectal cancer risk: meta-analysis of randomized controlled trials

    NASA Astrophysics Data System (ADS)

    Qin, Tingting; Du, Mulong; Du, Haina; Shu, Yongqian; Wang, Meilin; Zhu, Lingjun

    2015-07-01

    Numerous studies have investigated the effects of folic acid supplementation on colorectal cancer risk, but conflicting results were reported. We herein performed a meta-analysis based on relevant studies to reach a more definitive conclusion. The PubMed and Embase databases were searched for quality randomized controlled trials (RCTs) published before October 2014. Eight articles met the inclusion criteria and were subsequently analyzed. The results suggested that folic acid treatment was not associated with colorectal cancer risk in the total population (relative risk [RR] = 1.00, 95% confidence interval [CI] = 0.82-1.22, P = 0.974). Moreover, no statistical effect was identified in further subgroup analyses stratified by ethnicity, gender, body mass index (BMI) and potential confounding factors. No significant heterogeneity or publication bias was observed. In conclusion, our meta-analysis demonstrated that folic acid supplementation had no effect on colorectal cancer risk. However, this finding must be validated by further large studies.
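Pooled relative risks of the kind reported here are commonly computed with a random-effects model; a DerSimonian-Laird sketch on hypothetical trial-level values (not the eight included RCTs) is:

```python
import numpy as np

def dersimonian_laird(logrr, se):
    """Random-effects pooling of log relative risks (DerSimonian-Laird)."""
    w = 1.0 / se**2
    fixed = np.sum(w * logrr) / np.sum(w)
    q = np.sum(w * (logrr - fixed) ** 2)          # Cochran's Q
    df = len(logrr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (se**2 + tau2)                 # random-effects weights
    pooled = np.sum(w_star * logrr) / np.sum(w_star)
    se_p = np.sqrt(1.0 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_p),
            np.exp(pooled + 1.96 * se_p))

# Hypothetical per-trial RRs and standard errors of log(RR)
rr = np.array([0.95, 1.10, 1.02, 0.88, 1.15, 1.00, 0.97, 1.05])
se = np.array([0.20, 0.15, 0.25, 0.30, 0.18, 0.22, 0.12, 0.28])
pooled, lo, hi = dersimonian_laird(np.log(rr), se)
print(f"pooled RR = {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A pooled RR whose confidence interval spans 1.00, as in the abstract's result (RR = 1.00, CI 0.82-1.22), indicates no detectable association.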

  1. A basic analysis toolkit for biological sequences

    PubMed Central

    Giancarlo, Raffaele; Siragusa, Alessandro; Siragusa, Enrico; Utro, Filippo

    2007-01-01

    This paper presents a software library, nicknamed BATS, for some basic sequence analysis tasks. Namely, local alignments, via approximate string matching, and global alignments, via longest common subsequence and alignments with affine and concave gap cost functions. Moreover, it also supports filtering operations to select strings from a set and establish their statistical significance, via z-score computation. None of the algorithms is new, but although they are generally regarded as fundamental for sequence analysis, they have not been implemented in a single and consistent software package, as we do here. Therefore, our main contribution is to fill this gap between algorithmic theory and practice by providing an extensible and easy to use software library that includes algorithms for the mentioned string matching and alignment problems. The library consists of C/C++ library functions as well as Perl library functions. It can be interfaced with Bioperl and can also be used as a stand-alone system with a GUI. The software is available at under the GNU GPL. PMID:17877802

  2. Having been bullied in childhood: relationship to aggressive behaviour in adulthood.

    PubMed

    Sansone, Randy A; Leung, Justin S; Wiederman, Michael W

    2013-12-01

    Victimization through being bullied in childhood is traditionally associated with subsequent internalizing symptoms, but some literature suggests otherwise. In this study, we examined a history of being bullied in relationship to 21 externalized aggressive behaviours in adulthood. Using a cross-sectional approach and a self-report survey methodology, we examined a history of being bullied in childhood in relation to 21 aggression variables in a consecutive sample of 342 internal medicine outpatients. In comparison with the not bullied, participants who reported having been bullied in childhood had a statistically significantly greater overall number of self-reported aggressive behaviours. Longer duration of being bullied was statistically significantly correlated with a greater number of reported aggressive behaviours. With regard to individual behaviours, four were statistically significantly associated with being bullied: hitting walls; intentionally breaking things; getting into fist fights; and pushing/shoving a partner. While relationships between bullying in childhood and subsequent internalizing symptoms have been well established, the present study indicates that bullying in childhood is also associated with externalizing/aggressive behaviours in adulthood.

  3. A novel microfluidic platform for size and deformability based separation and the subsequent molecular characterization of viable circulating tumor cells.

    PubMed

    Hvichia, G E; Parveen, Z; Wagner, C; Janning, M; Quidde, J; Stein, A; Müller, V; Loges, S; Neves, R P L; Stoecklein, N H; Wikman, H; Riethdorf, S; Pantel, K; Gorges, T M

    2016-06-15

    Circulating tumor cells (CTCs) were introduced as biomarkers more than 10 years ago, but capture of viable CTCs at high purity from peripheral blood of cancer patients is still a major technical challenge. Here, we report a novel microfluidic platform designed for marker independent capture of CTCs. The Parsortix™ cell separation system provides size and deformability-based enrichment with automated staining for cell identification, and subsequent recovery (harvesting) of cells from the device. Using the Parsortix™ system, average cell capture inside the device ranged between 42% and 70%. Subsequent harvest of cells from the device ranged between 54% and 69% of cells captured. Most importantly, 99% of the isolated tumor cells were viable after processing in spiking experiments as well as after harvesting from patient samples and still functional for downstream molecular analysis as demonstrated by mRNA characterization and array-based comparative genomic hybridization. Analyzing clinical blood samples from metastatic (n = 20) and nonmetastatic (n = 6) cancer patients in parallel with CellSearch(®) system, we found that there was no statistically significant difference between the quantitative behavior of the two systems in this set of twenty six paired separations. In conclusion, the epitope independent Parsortix™ system enables the isolation of viable CTCs at a very high purity. Using this system, viable tumor cells are easily accessible and ready for molecular and functional analysis. The system's ability for enumeration and molecular characterization of EpCAM-negative CTCs will help to broaden research into the mechanisms of cancer as well as facilitating the use of CTCs as "liquid biopsies." © 2016 The Authors International Journal of Cancer published by John Wiley & Sons Ltd on behalf of UICC.

  4. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in robotics. It is also used in biometric and multimedia information retrieval systems. This technology stems from ongoing research on audio feature extraction. The Probability Distribution Function (PDF) is a statistical method usually used as one step within complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed that uses the PDF alone as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction. Subsequently, the PDF values for each frame of sampled voice signals obtained from a number of individuals are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
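The simplest realization of a per-frame PDF feature is a normalized amplitude histogram; a sketch on two synthetic signals (the frame length, bin count, and test tones are illustrative choices, not the paper's settings):

```python
import numpy as np

def pdf_features(signal, frame_len=256, bins=32):
    """Histogram-based PDF estimate of amplitude values, one row per frame."""
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    feats = []
    for frame in frames:
        hist, _ = np.histogram(frame, bins=bins, range=(-1.0, 1.0), density=True)
        feats.append(hist)
    return np.array(feats)  # shape: (n_frames, bins)

# Two synthetic "speakers": same pitch, different amplitude spread
t = np.linspace(0, 1, 8000)
voice_a = 0.3 * np.sin(2 * np.pi * 120 * t)
voice_b = 0.8 * np.sin(2 * np.pi * 120 * t)
fa, fb = pdf_features(voice_a), pdf_features(voice_b)
print(fa.shape, fb.shape)
```

Each row integrates to one (a density over amplitude), and signals with different amplitude distributions yield visibly different rows, which is the property the paper exploits for speaker comparison.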

  5. Effects of Sequences of Cognitions on Group Performance Over Time

    PubMed Central

    Molenaar, Inge; Chiu, Ming Ming

    2017-01-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions. PMID:28490854

  6. Effects of Sequences of Cognitions on Group Performance Over Time.

    PubMed

    Molenaar, Inge; Chiu, Ming Ming

    2017-04-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions.

  7. ISRNA: an integrative online toolkit for short reads from high-throughput sequencing data.

    PubMed

    Luo, Guan-Zheng; Yang, Wei; Ma, Ying-Ke; Wang, Xiu-Jie

    2014-02-01

    Integrative Short Reads NAvigator (ISRNA) is an online toolkit for analyzing high-throughput small RNA sequencing data. Besides the high-speed genome mapping function, ISRNA provides statistics for genomic location, length distribution and nucleotide composition bias analysis of sequence reads. Number of reads mapped to known microRNAs and other classes of short non-coding RNAs, coverage of short reads on genes, expression abundance of sequence reads as well as some other analysis functions are also supported. The versatile search functions enable users to select sequence reads according to their sub-sequences, expression abundance, genomic location, relationship to genes, etc. A specialized genome browser is integrated to visualize the genomic distribution of short reads. ISRNA also supports management and comparison among multiple datasets. ISRNA is implemented in Java/C++/Perl/MySQL and can be freely accessed at http://omicslab.genetics.ac.cn/ISRNA/.

  8. Classification of passive auditory event-related potentials using discriminant analysis and self-organizing feature maps.

    PubMed

    Schönweiler, R; Wübbelt, P; Tolloczko, R; Rose, C; Ptok, M

    2000-01-01

    Discriminant analysis (DA) and self-organizing feature maps (SOFM) were used to classify passively evoked auditory event-related potentials (ERP) P(1), N(1), P(2) and N(2). Responses from 16 children with severe behavioral auditory perception deficits, 16 children with marked behavioral auditory perception deficits, and 14 controls were examined. Eighteen ERP amplitude parameters were selected for examination of statistical differences between the groups. Different DA methods and SOFM configurations were trained to the values. SOFM had better classification results than DA methods. Subsequently, measures on another 37 subjects that were unknown for the trained SOFM were used to test the reliability of the system. With 10-dimensional vectors, reliable classifications were obtained that matched behavioral auditory perception deficits in 96%, implying central auditory processing disorder (CAPD). The results also support the assumption that CAPD includes a 'non-peripheral' auditory processing deficit. Copyright 2000 S. Karger AG, Basel.
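The discriminant-analysis half of the comparison can be sketched with scikit-learn on synthetic 18-parameter amplitude vectors; the study used groups of 16, 16 and 14 subjects, equalized to 16 here for simplicity, and the group shifts are invented:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 16
X_parts, labels = [], []
# Three groups (severe deficit, marked deficit, control) with a
# progressive mean shift across all 18 ERP amplitude parameters.
for g, shift in enumerate([0.0, 0.6, 1.2]):
    X_parts.append(rng.normal(shift, 1.0, size=(n, 18)))
    labels += [g] * n
X = np.vstack(X_parts)
y = np.array(labels)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("training accuracy:", lda.score(X, y))
```

The paper found that SOFMs outperformed DA variants; a fair comparison would score both on held-out subjects, as the authors did with their 37 additional cases, rather than on training accuracy as printed here.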

  9. The use of natural infochemicals for sustainable and efficient harvesting of the microalgae Scenedesmus spp. for biotechnology: insights from a meta-analysis.

    PubMed

    Roccuzzo, Sebastiana; Beckerman, Andrew P; Pandhal, Jagroop

    2016-12-01

    Open raceway ponds are regarded as the most economically viable option for large-scale cultivation of microalgae for low to mid-value bio-products, such as biodiesel. However, improvements are required including reducing the costs associated with harvesting biomass. There is now a growing interest in exploiting natural ecological processes within biotechnology. We review how chemical cues produced by algal grazers induce colony formation in algal cells, which subsequently leads to their sedimentation. A statistical meta-analysis of more than 80 studies reveals that Daphnia grazers can induce high levels of colony formation and sedimentation in Scenedesmus obliquus and that these natural, infochemical induced sedimentation rates are comparable to using commercial chemical equivalents. These data suggest that natural ecological interactions can be co-opted in biotechnology as part of a promising, low energy and clean harvesting method for use in large raceway systems.

  10. Fingerprinting Breast Cancer vs. Normal Mammary Cells by Mass Spectrometric Analysis of Volatiles

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Sinues, Pablo Martinez-Lozano; Hollmén, Maija; Li, Xue; Detmar, Michael; Zenobi, Renato

    2014-06-01

    There is increasing interest in the development of noninvasive diagnostic methods for early cancer detection, to improve the survival rate and quality of life of cancer patients. Identification of volatile metabolic compounds may provide an approach for noninvasive early diagnosis of malignant diseases. Here we analyzed the volatile metabolic signature of human breast cancer cell lines versus normal human mammary cells. Volatile compounds in the headspace of conditioned culture medium were directly fingerprinted by secondary electrospray ionization-mass spectrometry. The mass spectra were subsequently analyzed statistically to identify discriminating features between normal vs. cancerous cell types. We were able to classify different samples by using feature selection followed by principal component analysis (PCA). Additionally, high-resolution mass spectrometry allowed us to propose chemical structures for some of the most discriminating molecules. We conclude that cancerous cells can release a characteristic odor whose constituents may be used as disease markers.
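The feature-selection-then-PCA pipeline described here can be sketched on synthetic spectra; the sample counts, feature dimension, and selection score below are illustrative assumptions, not the study's protocol:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# Synthetic intensities: 20 samples x 200 m/z features, with the first
# 10 features shifted upward in the "cancer" group (invented values).
normal = rng.normal(0, 1, size=(10, 200))
cancer = rng.normal(0, 1, size=(10, 200))
cancer[:, :10] += 2.0
X = np.vstack([normal, cancer])
y = np.array([0] * 10 + [1] * 10)

# Simple feature selection: keep the 20 features with the largest
# absolute between-group mean difference.
diff = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
keep = np.argsort(diff)[-20:]

pcs = PCA(n_components=2).fit_transform(X[:, keep])
sep = pcs[y == 1, 0].mean() - pcs[y == 0, 0].mean()
print("PC1 group separation:", round(abs(sep), 2))
```

After selection, the first principal component separates the two groups, which is what a two-dimensional PCA score plot of the fingerprints would show visually.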

  11. Income and obesity: what is the direction of the relationship? A systematic review and meta-analysis

    PubMed Central

    Kim, Tae Jun; von dem Knesebeck, Olaf

    2018-01-01

    Objective It was repeatedly shown that lower income is associated with higher risks for subsequent obesity. However, the perspective of a potential reverse causality is often neglected, in which obesity is considered a cause for lower income, when obese people drift into lower-income jobs due to labour-market discrimination and public stigmatisation. This review was performed to explore the direction of the relation between income and obesity by specifically assessing the importance of social causation and reverse causality. Design Systematic review and meta-analysis. Methods A systematic literature search was conducted in January 2017. The databases Medline, PsycINFO, Sociological Abstracts, International Bibliography of Social Sciences and Sociological Index were screened to identify prospective cohort studies with quantitative data on the relation between income and obesity. Meta-analytic methods were applied using random-effect models, and the quality of studies assessed with the Newcastle-Ottawa Scale. Results In total, 21 studies were eligible for meta-analysis. All included studies originated from either the USA (n=16), the UK (n=3) or Canada (n=2). From these, 14 studies on causation and 7 studies on reverse causality were found. Meta-analyses revealed that lower income is associated with subsequent obesity (OR 1.27, 95% CI 1.10 to 1.47; risk ratio 1.52, 95% CI 1.08 to 2.13), though the statistical significance vanished once adjusted for publication bias. Studies on reverse causality indicated a more consistent relation between obesity and subsequent income, even after taking publication bias into account (standardised mean difference −0.15, 95% CI −0.30 to 0.01). Sensitivity analyses implied that the association is influenced by obesity measurement, gender, length of observation and study quality. Conclusions Findings suggest that there is more consistent evidence for reverse causality. Therefore, there is a need to examine reverse causality processes in more detail to understand the relation between income and obesity. PROSPERO registration number 42016041296. PMID:29306894

  12. A Comparative Analysis of Selected Demographic Parameters for Evaluating Parity of Women in Poland, Spain, England and Wales for the Period 1996-2011.

    PubMed

    Strama, Agnieszka; Heimrath, Jerzy; Dudek, Krzysztof

    2016-01-01

    The Central Statistical Offices in Europe indicate an increase in women's age at childbirth and in extramarital births. The aim of this study was to analyze selected demographics of parity in the European countries of Poland, Spain, and England and Wales in 1996-2011. The parameters analyzed were: women's average age at the time of their first and subsequent births, newborns' average body weight in relation to the age of the mother, and live marital and extramarital births. The age of mothers giving birth to their first and subsequent children in 1998-2011 in all of the researched countries is presented, and then compared for 1999, 2005 and 2011. An analysis of births within and outside marriage, as well as the body weight of live newborns, is presented in detail for 1996-2006, and then at 6-year intervals: 1999, 2005 and 2011. The average age of mothers giving birth to their first baby in 1996-2011 oscillated around 26-27 years in England and Wales, 28-30 years in Spain and 23-26 years in Poland. In Poland, the highest average birth weight, 3394 g, was observed among children born to mothers aged 25-29. In Spain, the highest was 3317 g, among mothers aged 20-24; in England and Wales, 3262 g, among mothers aged 30-34. The number of extramarital births relative to marital births is increasing. England and Wales have the lowest percentage of marital births, whereas Poland has the highest. In Spain and in England and Wales an increase in extramarital births can be observed, while in Poland this number is stable at around 21.3%. The age of women having their first baby, the parity of later children, and extramarital births are all increasing. In Poland, infant body weight is significantly higher than in Spain, England and Wales.

  13. What Is in the Naming? A 5-Year Longitudinal Study of Early Rapid Naming and Phonological Sensitivity in Relation to Subsequent Reading Skills in Both Native Chinese and English as a Second Language

    ERIC Educational Resources Information Center

    Pan, Jinger; McBride-Chang, Catherine; Shu, Hua; Liu, Hongyun; Zhang, Yuping; Li, Hong

    2011-01-01

    Among 262 Chinese children, syllable awareness and rapid automatized naming (RAN) at age 5 years and invented spelling of Pinyin at age 6 years independently predicted subsequent Chinese character recognition and English word reading at ages 8 years and 10 years, even with initial Chinese character reading ability statistically controlled. In…

  14. Single-cell forensic short tandem repeat typing within microfluidic droplets.

    PubMed

    Geng, Tao; Novak, Richard; Mathies, Richard A

    2014-01-07

    A short tandem repeat (STR) typing method is developed for forensic identification of individual cells. In our strategy, monodisperse 1.5 nL agarose-in-oil droplets are produced with a high frequency using a microfluidic droplet generator. Statistically dilute single cells, along with primer-functionalized microbeads, are randomly compartmentalized in the droplets. Massively parallel single-cell droplet polymerase chain reaction (PCR) is performed to transfer replicas of desired STR targets from the single-cell genomic DNA onto the coencapsulated microbeads. These DNA-conjugated beads are subsequently harvested and reamplified under statistically dilute conditions for conventional capillary electrophoresis (CE) STR fragment size analysis. The 9-plex STR profiles of single cells from both pure and mixed populations of GM09947 and GM09948 human lymphoid cells show that all alleles are correctly called and allelic drop-in/drop-out is not observed. The cell mixture study exhibits a good linear relationship between the observed and input cell ratios in the range of 1:1 to 10:1. Additionally, the STR profile of GM09947 cells could be deduced even in the presence of a high concentration of cell-free contaminating 9948 genomic DNA. Our method will be valuable for the STR analysis of samples containing mixtures of cells/DNA from multiple contributors and for low-concentration samples.

  15. Mapping the Structure-Function Relationship in Glaucoma and Healthy Patients Measured with Spectralis OCT and Humphrey Perimetry

    PubMed Central

    Muñoz–Negrete, Francisco J.; Oblanca, Noelia; Rebolleda, Gema

    2018-01-01

    Purpose To study the structure-function relationship in glaucoma and healthy patients assessed with Spectralis OCT and Humphrey perimetry using new statistical approaches. Materials and Methods Eighty-five eyes were prospectively selected and divided into 2 groups: glaucoma (44) and healthy patients (41). Three different statistical approaches were carried out: (1) factor analysis of the threshold sensitivities (dB) (automated perimetry) and the macular thickness (μm) (Spectralis OCT), subsequently applying Pearson's correlation to the obtained regions, (2) nonparametric regression analysis relating the values in each pair of regions that showed significant correlation, and (3) nonparametric spatial regressions using three models designed for the purpose of this study. Results In the glaucoma group, a map that relates structural and functional damage was drawn. The strongest correlation with visual fields was observed in the peripheral nasal region of both superior and inferior hemigrids (r = 0.602 and r = 0.458, resp.). The estimated functions obtained with the nonparametric regressions provided the mean sensitivity that corresponds to each given macular thickness. These functions allowed for accurate characterization of the structure-function relationship. Conclusions Both maps and point-to-point functions obtained linking structure and function damage contribute to a better understanding of this relationship and may help in the future to improve glaucoma diagnosis. PMID:29850196
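The first statistical approach above (factor analysis of each modality, then Pearson correlation between the resulting regions) can be sketched on synthetic data; the sample size matches the glaucoma group, but the region counts, loadings, and noise levels are invented:

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
# Synthetic data: 44 "eyes", 8 visual-field sensitivities (dB) and
# 8 macular thickness sectors (um), sharing one latent damage factor.
n = 44
damage = rng.normal(0, 1, n)
sens = 28 - 3 * damage[:, None] + rng.normal(0, 1, (n, 8))
thick = 280 - 15 * damage[:, None] + rng.normal(0, 5, (n, 8))

# Step 1: reduce each modality to a latent "region" via factor analysis
f_sens = FactorAnalysis(n_components=1, random_state=0).fit_transform(sens).ravel()
f_thick = FactorAnalysis(n_components=1, random_state=0).fit_transform(thick).ravel()

# Step 2: Pearson correlation between the paired regional factors
r, p = pearsonr(f_sens, f_thick)
print(f"|r| = {abs(r):.2f}, p = {p:.3g}")
```

Because a factor's sign is arbitrary, the magnitude of r is what corresponds to the regional correlations (e.g. r = 0.602) reported in the abstract.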

  16. Statistical detection of geographic clusters of resistant Escherichia coli in a regional network with WHONET and SaTScan.

    PubMed

    Park, Rachel; O'Brien, Thomas F; Huang, Susan S; Baker, Meghan A; Yokoe, Deborah S; Kulldorff, Martin; Barrett, Craig; Swift, Jamie; Stelling, John

    2016-11-01

    While antimicrobial resistance threatens the prevention, treatment, and control of infectious diseases, systematic analysis of routine microbiology laboratory test results worldwide can flag new threats and promote timely response. This study explores statistical algorithms for recognizing geographic clustering of multi-resistant microbes within a healthcare network and for monitoring the dissemination of new strains over time. Escherichia coli antimicrobial susceptibility data from a three-year period stored in WHONET were analyzed across ten facilities in a healthcare network utilizing SaTScan's spatial multinomial model with two models for defining geographic proximity. We explored geographic clustering of multi-resistance phenotypes within the network and changes in clustering over time. Clusters identified with the latitude/longitude and the non-parametric facility-grouping models were similar, while the latter offers greater flexibility and generalizability. Iterative application of the clustering algorithms suggested recognition of the initial appearance of invasive E. coli ST131 in the clinical database of a single hospital and its subsequent dissemination to others. Systematic analysis of routine antimicrobial susceptibility test results with WHONET and SaTScan supports the recognition of geographic clustering of microbial phenotypic subpopulations, and iterative application of these algorithms can detect a strain's initial appearance in a region and its dissemination across it, prompting early investigation, response, and containment measures.

  17. The influence of dopants on the nucleation of semiconductor nanocrystals from homogeneous solution.

    PubMed

    Bryan, J Daniel; Schwartz, Dana A; Gamelin, Daniel R

    2005-09-01

    The influence of Co2+ ions on the homogeneous nucleation of ZnO is examined. Using electronic absorption spectroscopy as a dopant-specific in-situ spectroscopic probe, Co2+ ions are found to be quantitatively excluded from the ZnO critical nuclei but incorporated nearly statistically in the subsequent growth layers, resulting in crystallites with pure ZnO cores and Zn(1-x)Co(x)O shells. Strong inhibition of ZnO nucleation by Co2+ ions is also observed. These results are explained using the classical nucleation model. Statistical analysis of nucleation inhibition data allows estimation of the critical nucleus size as 25 +/- 4 Zn2+ ions. Bulk calorimetric data allow the activation barrier for ZnO nucleation containing a single Co2+ impurity to be estimated as 5.75 kcal/mol cluster greater than that of pure ZnO, corresponding to a 1.5 x 10(4)-fold reduction in the ZnO nucleation rate constant upon introduction of a single Co2+ impurity. These data and analysis offer a rare view into the role of composition in homogeneous nucleation processes, and specifically address recent experiments targeting formation of semiconductor quantum dots containing single magnetic impurity ions at their precise centers.
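    The reported ~5.75 kcal/mol increase in the nucleation barrier and the ~1.5 × 10⁴-fold reduction in the nucleation rate constant are linked by a Boltzmann/Arrhenius factor. A quick check of that arithmetic, assuming a temperature near 298 K (the abstract does not state the synthesis temperature):

```python
# Relate the added activation barrier to the reduction in the nucleation
# rate constant via exp(ddG / RT). The temperature is our assumption.
from math import exp

R = 1.987e-3   # gas constant, kcal/(mol*K)
ddG = 5.75     # added activation barrier per Co2+ impurity, kcal/mol
T = 298.0      # assumed temperature, K

rate_reduction = exp(ddG / (R * T))
print(f"rate constant reduced ~{rate_reduction:.2e}-fold")
```

    At room temperature the factor comes out on the order of 10⁴, consistent with the 1.5 × 10⁴-fold figure quoted in the abstract.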

  18. Statistics in Japanese universities.

    PubMed Central

    Ito, P K

    1979-01-01

    The teaching of statistics in U.S. and Japanese universities is briefly reviewed. It is found that H. Hotelling's articles and subsequent relevant publications on the teaching of statistics have contributed to a considerable extent to the establishment of excellent departments of statistics in U.S. universities and colleges. Today the U.S. may be proud of many well-staffed and well-organized departments of theoretical and applied statistics with excellent undergraduate and graduate programs. In contrast, no Japanese university currently has an independent department of statistics, and the teaching of statistics has been spread among a heterogeneous group of application departments. This was mainly due to the Japanese government regulation concerning the establishment of a university. However, the regulation has recently been revised so that an independent department of statistics may be started in a Japanese university with undergraduate and graduate programs. It is hoped that discussions will be started as soon as possible among those concerned on how to organize the teaching of statistics in Japanese universities. PMID:396154

  19. Air-flow distortion and turbulence statistics near an animal facility

    NASA Astrophysics Data System (ADS)

    Prueger, J. H.; Eichinger, W. E.; Hipps, L. E.; Hatfield, J. L.; Cooper, D. I.

    The emission and dispersion of particulates and gases from concentrated animal feeding operations (CAFO) at local to regional scales is a current issue in science and society. The transport of particulates, odors and toxic chemical species from the source into the local and eventually regional atmosphere is largely determined by turbulence. Any model that attempts to simulate the dispersion of particles must either specify or assume various statistical properties of the turbulence field. Statistical properties of turbulence are well documented for idealized boundary layers above uniform surfaces. However, an animal production facility is a complex surface with structures that act as bluff bodies and distort the turbulence intensity near the buildings. As a result, the initial release and subsequent dispersion of effluents in the region near a facility will be affected by the complex nature of the surface. Previous lidar studies of plume dispersion over the facility used in this study indicated that plumes move in complex yet organized patterns that could not be explained by the properties of turbulence generally assumed in models. The objective of this study was to characterize the near-surface turbulence statistics in the flow field around an array of animal confinement buildings. Eddy covariance towers were erected upwind of, within, and downwind of the building array. Substantial changes in turbulence intensity statistics and turbulence kinetic energy (TKE) were observed as the mean wind flow encountered the building structures. Spectral analysis demonstrated a unique distribution of the spectral energy in the vertical profile above the buildings.

  20. Data analysis of the benefits of an electronic registry of information in a neonatal intensive care unit in Greece.

    PubMed

    Skouroliakou, Maria; Soloupis, George; Gounaris, Antonis; Charitou, Antonia; Papasarantopoulos, Petros; Markantonis, Sophia L; Golna, Christina; Souliotis, Kyriakos

    2008-07-28

    This study assesses the results of implementing a software program that allows for input of admission/discharge summary data (including cost) in a neonatal intensive care unit (NICU) in Greece, based on the establishment of a baseline statistical database for infants treated in a NICU and the statistical analysis of the epidemiological and resource utilization data thus collected. A software tool was designed, developed, and implemented between April 2004 and March 2005 in the NICU of the LITO private maternity hospital in Athens, Greece, to allow for the first time for step-by-step collection and management of summary treatment data. Data collected over this period were subsequently analyzed using defined indicators as a basis to extract results related to treatment options, treatment duration, and relative resource utilization. Data for 499 babies were entered into the tool and processed. Information on medical costs (e.g., mean total cost ± SD of treatment was €310.44 ± 249.17 and €6704.27 ± 4079.53 for babies weighing more than 2500 g and 1000-1500 g, respectively), incidence of complications or disease (e.g., 4.3 percent and 14.3 percent of study babies weighing 1000-1500 g suffered from cerebral bleeding [grade I] and bronchopulmonary dysplasia, respectively, while overall 6.0 percent had microbial infections), and medical statistics (e.g., perinatal mortality was 6.8 percent) was obtained in a quick and robust manner. The software tool allowed for collection and analysis of data traditionally maintained in paper medical records in the NICU with greater ease and accuracy. Data codification and analysis led to significant findings at the epidemiological, medical resource utilization, and respective hospital cost levels that allowed comparisons with literature findings for the first time in Greece. The tool thus contributed to a clearer understanding of treatment practices in the NICU and set the baseline for the assessment of the impact of future interventions at the policy or hospital level.

  1. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses, we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods, and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework in order to assess whether a variable is directly relevant or whether its association is only mediated. With frequentist methods, one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43(1.2–1.8); p = 3×10−4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and in human asthmatics. In the BN-BMLA analysis, altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step, a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035
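    The frequentist result quoted for rs3751464 (OR = 1.43, p ≈ 3×10⁻⁴) is the kind of estimate obtained from a 2×2 allele-by-status table with the Woolf (log) method. A sketch with invented allele counts, not the study's genotype data:

```python
# Odds ratio with a Woolf 95% confidence interval from a 2x2 table.
# The counts are hypothetical stand-ins for a case-control allele table.
from math import exp, log, sqrt

#                         risk allele  other allele
a, b = 300, 572         # cases (hypothetical counts)
c, d = 420, 1110        # controls (hypothetical counts)

or_ = (a * d) / (b * c)                    # cross-product odds ratio
se = sqrt(1/a + 1/b + 1/c + 1/d)           # SE of log(OR), Woolf method
lo = exp(log(or_) - 1.96 * se)
hi = exp(log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

    BN-BMLA goes beyond this single-SNP view by estimating posteriors over the dependency structure, but the 2×2 odds ratio remains the baseline frequentist comparison.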

  2. Professional development in statistics, technology, and cognitively demanding tasks: classroom implementation and obstacles

    NASA Astrophysics Data System (ADS)

    Foley, Gregory D.; Bakr Khoshaim, Heba; Alsaeed, Maha; Nihan Er, S.

    2012-03-01

    Attending professional development programmes can support teachers in applying new strategies for teaching mathematics and statistics. This study investigated (a) the extent to which the participants in a professional development programme subsequently used the techniques they had learned when teaching mathematics and statistics and (b) the obstacles they encountered in enacting cognitively demanding instructional tasks in their classrooms. The programme created an intellectual learning community among the participants and helped them gain confidence as teachers of statistics, and the students of participating teachers became actively engaged in deep mathematical thinking. The participants indicated, however, that time, availability of resources and students' prior achievement critically affected the implementation of cognitively demanding instructional activities.

  3. Polish Adaptation of Wrist Evaluation Questionnaires.

    PubMed

    Czarnecki, Piotr; Wawrzyniak-Bielęda, Anna; Romanowski, Leszek

    2015-01-01

    Questionnaires evaluating hand and wrist function are a very useful tool allowing for objective and systematic recording of symptoms reported by the patients. Most questionnaires generally accepted in clinical practice are available in English and need to be appropriately adapted in translation and undergo subsequent validation before they can be used in another culture and language. The process of translation of the questionnaires was based on the generally accepted guidelines of the International Quality of Life Assessment Project (IQOLA). First, the questionnaires were translated from English into Polish by two independent translators. Then, a joint version of the translation was prepared collectively and translated back into English. Each stage was followed by a written report. The translated questionnaires were then evaluated by a group of patients. We selected 31 patients with wrist problems and asked them to complete the PRWE, Mayo, Michigan and DASH questionnaires twice at intervals of 3-10 days. The results were submitted for statistical analysis. We found a statistically significant (p<0.05) correlation for the two completions of the questionnaires. A comparison of the PRWE and Mayo questionnaires with the DASH questionnaire also showed a statistically significant correlation (p<0.05). Our results indicate that the cultural adaptation of the translated questionnaires was successful and that the questionnaires may be used in clinical practice.

  4. 9 CFR 114.13 - Expiration date determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... presented in support of licensure shall be tested for potency at release and at or after the dating requested. (c) Subsequent changes in the dating period for a product may be granted, based on statistically...

  5. Measurement of self-evaluative motives: a shopping scenario.

    PubMed

    Wajda, Theresa A; Kolbe, Richard; Hu, Michael Y; Cui, Annie Peng

    2008-08-01

    To develop measures of consumers' self-evaluative motives of Self-verification, Self-enhancement, and Self-improvement within the context of a mall shopping environment, an initial set of 49 items was generated by conducting three focus-group sessions. These items were subsequently converted into shopping-dependent motive statements. 250 undergraduate college students responded on a 7-point scale to each statement as it related to their recent acquisition of personal shopping goods. An exploratory factor analysis yielded five factors, accounting for 57.7% of the variance, three of which corresponded to the Self-verification motive (five items), Self-enhancement motive (three items), and Self-improvement motive (six items). These 14 items, along with 9 reconstructed items, yielded 23 items that were retained and subjected to additional testing. In a final round of data collection, 169 college students provided data for exploratory factor analysis, and 11 items were used in confirmatory factor analysis. The analysis indicated that the 11-item scale adequately captured measures of the three self-evaluative motives. However, further data reduction produced a 9-item scale with marked improvement in statistical fit over the 11-item scale.

  6. [Influence of surgeon specialization upon the results of colon cancer surgery. Usefulness of propensity scores].

    PubMed

    Martínez-Ramos, D; Escrig-Sos, J; Miralles-Tena, J M; Rivadulla-Serrano, M I; Daroca-José, J M; Salvador Sanchís, J L

    2008-07-01

    Surgeon influence on colorectal cancer surgery outcomes has been studied repeatedly in the scientific literature, but conclusions have been contradictory. Here we study whether surgeon specialization is a determinant factor for outcome in these patients. The importance of propensity scores (PS) in surgical research is also examined. A retrospective study was performed and medical records were reviewed for 236 patients who underwent surgery for colon cancer at Castellón General Hospital (Spain). Cases were divided into two groups (specialist and non-specialist surgeons), and both 5-year overall survival and disease-free survival were compared. Comparisons were first made with no adjustment, and then using PS analysis. The initial (non-adjusted) analysis was clearly favourable to the specialist surgeon group (5-year survival, 64.3 vs. 79.3%; p = 0.028). After adjusting with PS, no statistically significant difference remained. Surgeon specialization had no significant impact on patient outcome after colon cancer surgery. Propensity score analysis is an important tool in the analysis of non-randomized surgical studies, particularly when the events under scrutiny are rare.

  7. Antiviral activity of Quercus persica L.: High efficacy and low toxicity

    PubMed Central

    Karimi, Ali; Moradi, Mohammad-Taghi; Saeedi, Mojtaba; Asgari, Sedigheh; Rafieian-kopaei, Mahmoud

    2013-01-01

    Background: Drug-resistant strains of herpes simplex virus type 1 (HSV-1) have increased the interest in the use of natural substances. Aims: This study aimed to determine the minimum inhibitory concentration of a hydroalcoholic extract of a traditionally used herbal plant, Quercus persica L., on HSV-1 replication in baby hamster kidney (BHK) cells. Setting: The study was conducted in Shahrekord University of Medical Sciences, Iran. Design: This was an experimental study. Materials and Methods: BHK cells were grown in monolayer culture with Dulbecco's modified Eagle's medium (DMEM) supplemented with 5% fetal calf serum and plated onto 48-well culture plates. The 50% cytotoxic concentration (CC50) of Q. persica L. on BHK cells was determined. Subsequently, the 50% inhibitory concentration (IC50) of the extract on replication of HSV-1, in both intracellular and extracellular cases, was assessed. Statistical Analysis: A probit model was used for statistical analysis. The dose-dependent effect of the antiviral activity of the extracts was determined by linear regression. Results: Q. persica L. had no cytotoxic effect on this cell line. There was a significant relationship between the concentration of the extract and cell death (P<0.01). The IC50s of Q. persica L. on HSV-1, before and after attachment to BHK cells, were 1.02 and 0.257 μg/mL, respectively. There was a significant relationship between the concentration of this extract and inhibition of the cytopathic effect (CPE) (P<0.05). The antioxidant capacity of the extract was 67.5%. Conclusions: The hydroalcoholic extract of Q. persica L. is potentially an appropriate and promising antiherpetic herbal medicine. PMID:24516836

  8. The role of different PI-RADS versions in prostate multiparametric magnetic resonance tomography assessment.

    PubMed

    Aliukonis, Paulius; Letauta, Tadas; Briedienė, Rūta; Naruševičiūtė, Ieva; Letautienė, Simona

    2017-01-01

    Background. Standardised Prostate Imaging Reporting and Data System (PI-RADS) guidelines were designed for the assessment of prostate pathology. Published by the ESUR in 2012, PI-RADS v1 was based on the total score of different MRI sequences with subsequent calculation. PI-RADS v2 was published by the American College of Radiology in 2015 and featured different assessment criteria for the prostate peripheral and transition zones. Aim. To assess the correlations of PI-RADS v1 and PI-RADS v2 with Gleason score values and to define their predictive values for the diagnosis of prostate cancer. Materials and methods. A retrospective analysis of 66 patients. The prostate specific antigen (PSA) value and the Gleason score (GS) were assessed. The single most malignant focal lesion was selected in the peripheral zone of each lobe of the prostate (91 in total). Statistical analysis was carried out using SPSS software, v.23, with p < 0.05 considered significant. Results. Focal lesions received PI-RADS v1 scores of 1, 2, 3, 4, and 5 in 10%, 12%, 41%, 23%, and 14% of cases, respectively; with PI-RADS v2 the corresponding proportions were 20%, 7.5%, 26%, 29.5%, and 17%. A statistically significant correlation was found only between GS and PI-RADS (p = 0.033). The positive predictive value of both versions of PI-RADS was 75%; the negative predictive value was 46% for PI-RADS v1 and 43% for PI-RADS v2. Conclusions. PI-RADS v1 was more statistically relevant in assessing the grade of the tumour. Predictive values were similar in both versions.

  9. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

    PubMed

    González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

    2007-06-07

    In this paper, we used statistical experimental design to investigate the effect of several factors on the coating process of lidocaine hydrochloride (LID) liposomes by a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution onto the liposome colloidal dispersion, the stirring rate, the time elapsed between liposome production and coating, and the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial (2(5-1)) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%) owing to its property of forming molecular complexes with CH. The graphic analysis of the effects allowed the identification of the main formulation and technological factors through the analysis of the selected responses and permitted the determination of the proper level of these factors for response improvement. Moreover, the fractional design allowed the interactions between the factors to be quantified, which will be considered in subsequent experiments. The results showed that the LID amount was the predominant factor increasing the drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors were statistically significant.
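    A 2^(5-1) screening matrix like the one used here can be generated from a full 2^4 factorial in four factors plus a generator for the fifth, e.g. E = ABCD (the standard half-fraction, resolution V). A sketch; the factor names merely echo the five variables listed above, and the generator choice is our assumption:

```python
# Generate a 2^(5-1) fractional factorial design: a full 2^4 design in
# four factors, with the fifth factor set by the generator E = ABCD.
from itertools import product

factors = ["CH conc", "drip rate", "stir rate", "elapsed time", "LID amount"]

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d               # generator: E = ABCD
    runs.append((a, b, c, d, e))

print(f"{len(runs)} runs for {len(factors)} factors")   # 16 instead of 32
for run in runs[:4]:
    print(run)
```

    Halving the run count is what makes five factors screenable cheaply; the cost is that each effect is aliased with one high-order interaction, which a resolution V design keeps away from main effects and two-factor interactions.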

  10. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  11. Stress hyperglycaemia in critically ill patients and the subsequent risk of diabetes: a systematic review and meta-analysis.

    PubMed

    Ali Abdelhamid, Yasmine; Kar, Palash; Finnis, Mark E; Phillips, Liza K; Plummer, Mark P; Shaw, Jonathan E; Horowitz, Michael; Deane, Adam M

    2016-09-27

    Hyperglycaemia occurs frequently in critically ill patients without diabetes. We conducted a systematic review and meta-analysis to evaluate whether this 'stress hyperglycaemia' identifies survivors of critical illness at increased risk of subsequently developing diabetes. We searched the MEDLINE and Embase databases from their inception to February 2016. We included observational studies evaluating adults admitted to the intensive care unit (ICU) who developed stress hyperglycaemia if the researchers reported incident diabetes or prediabetes diagnosed ≥3 months after hospital discharge. Two reviewers independently screened the titles and abstracts of identified studies and evaluated the full text of relevant studies. Data were extracted using pre-defined data fields, and risk of bias was assessed using the Newcastle-Ottawa Scale. Pooled ORs with 95% CIs for the occurrence of diabetes were calculated using a random-effects model. Four cohort studies provided 2923 participants, including 698 with stress hyperglycaemia and 131 cases of newly diagnosed diabetes. Stress hyperglycaemia was associated with increased risk of incident diabetes (OR 3.48; 95% CI 2.02-5.98; I² = 36.5%). Studies differed with regard to definitions of stress hyperglycaemia, follow-up and cohorts studied. Stress hyperglycaemia during ICU admission is associated with increased risk of incident diabetes. The strength of this association remains uncertain because of statistical and clinical heterogeneity among the included studies.
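    A random-effects pooling of study-level odds ratios, as used for the OR 3.48 (95% CI 2.02-5.98) above, is commonly done with the DerSimonian-Laird estimator. A self-contained sketch; the four (OR, SE of log OR) pairs are invented stand-ins, not the four included cohort studies:

```python
# DerSimonian-Laird random-effects meta-analysis of odds ratios.
# Study inputs are hypothetical (OR, SE of log OR) pairs.
from math import exp, log, sqrt

studies = [(2.9, 0.35), (4.1, 0.40), (2.4, 0.30), (5.0, 0.45)]

y = [log(or_) for or_, _ in studies]     # log odds ratios
v = [se ** 2 for _, se in studies]       # within-study variances

w = [1 / vi for vi in v]                 # fixed-effect weights
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(studies) - 1)) / c)   # between-study variance

w_re = [1 / (vi + tau2) for vi in v]     # random-effects weights
mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_mu = sqrt(1 / sum(w_re))
print(f"pooled OR = {exp(mu):.2f}, "
      f"95% CI {exp(mu - 1.96 * se_mu):.2f}-{exp(mu + 1.96 * se_mu):.2f}")
```

    The Q statistic computed along the way also feeds the I² heterogeneity measure reported in the abstract, via I² = max(0, (Q − df)/Q).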

  12. Application of the denaturing gradient gel electrophoresis (DGGE) technique as an efficient diagnostic tool for ciliate communities in soil.

    PubMed

    Jousset, Alexandre; Lara, Enrique; Nikolausz, Marcell; Harms, Hauke; Chatzinotas, Antonis

    2010-02-01

    Ciliates (or Ciliophora) are ubiquitous organisms which can be widely used as bioindicators in ecosystems exposed to anthropogenic and industrial influences. The evaluation of the environmental impact on soil ciliate communities with methods relying on morphology-based identification may be hampered by the large number of samples usually required for a statistically supported, reliable conclusion. Cultivation-independent molecular-biological diagnostic tools are a promising alternative to greatly simplify and accelerate such studies. In the present work, a ciliate-specific fingerprint method based on the amplification of a phylogenetic marker gene (the 18S ribosomal RNA gene) with subsequent analysis by denaturing gradient gel electrophoresis (DGGE) was developed and used to monitor community shifts in a polycyclic aromatic hydrocarbon (PAH) polluted soil. The semi-nested approach generated ciliate-specific amplification products from all soil samples and made it possible to distinguish the community profiles of a PAH-polluted soil from those of a non-polluted control soil. Subsequent sequence analysis of excised bands provided evidence that the polluted soil samples were dominated by organisms belonging to the class Colpodea. The general DGGE approach presented in this study might thus in principle serve as a fast and reproducible diagnostic tool, complementing and facilitating future ecological and ecotoxicological monitoring of ciliates in polluted habitats. Copyright 2009 Elsevier B.V. All rights reserved.

  13. How effective is good domestic kitchen hygiene at reducing diarrhoeal disease in developed countries? A systematic review and reanalysis of the UK IID study

    PubMed Central

    Stenberg, Anna; Macdonald, Clare; Hunter, Paul R

    2008-01-01

    Background To assess whether domestic kitchen hygiene is an important contributor to the development of diarrhoea in the developed world. Methods Electronic searches were carried out in October 2006 in EMBASE, MEDLINE, Web of Knowledge, the Cochrane central register of clinical trials and CINAHL. All publications, irrespective of study design, assessing food hygiene practices with an outcome measure of diarrhoea were included in the review. All included studies underwent data extraction and the data were subsequently analysed. The analysis was conducted by qualitative synthesis of the results. Given the substantial heterogeneity in study design and outcome measures, meta-analysis was not done. In addition, the existing dataset of the UK IID study was reanalysed to investigate possible associations between self-reported diarrhoea and variables indicative of poor domestic kitchen hygiene. Results Some 14 studies were finally included in subsequent analyses. Of the 14 studies included in this systematic review, 11 were case-control studies, 2 were cross-sectional surveys, and 1 was an RCT. Very few studies identified any significant association with good environmental kitchen hygiene. Although some of the variables in the reanalysis of the UK IID study were statistically significant, no obvious trend was seen. Conclusion The balance of the available evidence does not support the hypothesis that poor domestic kitchen hygiene practices are important risk factors for diarrhoeal disease in developed countries. PMID:18294383

  14. How effective is good domestic kitchen hygiene at reducing diarrhoeal disease in developed countries? A systematic review and reanalysis of the UK IID study.

    PubMed

    Stenberg, Anna; Macdonald, Clare; Hunter, Paul R

    2008-02-22

    To assess whether domestic kitchen hygiene is an important contributor to the development of diarrhoea in the developed world. Electronic searches were carried out in October 2006 in EMBASE, MEDLINE, Web of Knowledge, the Cochrane central register of clinical trials and CINAHL. All publications, irrespective of study design, assessing food hygiene practices with an outcome measure of diarrhoea were included in the review. All included studies underwent data extraction and the data were subsequently analysed. The analysis was conducted by qualitative synthesis of the results. Given the substantial heterogeneity in study design and outcome measures, meta-analysis was not done. In addition, the existing dataset of the UK IID study was reanalysed to investigate possible associations between self-reported diarrhoea and variables indicative of poor domestic kitchen hygiene. Some 14 studies were finally included in subsequent analyses. Of the 14 studies included in this systematic review, 11 were case-control studies, 2 were cross-sectional surveys, and 1 was an RCT. Very few studies identified any significant association with good environmental kitchen hygiene. Although some of the variables in the reanalysis of the UK IID study were statistically significant, no obvious trend was seen. The balance of the available evidence does not support the hypothesis that poor domestic kitchen hygiene practices are important risk factors for diarrhoeal disease in developed countries.

  15. Alveolar ridge preservation after tooth extraction: a Bayesian Network meta-analysis of grafting materials efficacy on prevention of bone height and width reduction.

    PubMed

    Iocca, Oreste; Farcomeni, Alessio; Pardiñas Lopez, Simon; Talib, Huzefa S

    2017-01-01

    To conduct a traditional meta-analysis and a Bayesian Network meta-analysis to synthesize the information coming from randomized controlled trials on different socket grafting materials, and to combine the resulting indirect evidence in order to make inferences on treatments that have not been compared directly. RCTs were identified for inclusion in the systematic review and subsequent statistical analysis. Bone height and width remodelling were selected as the summary measures for comparison. First, a series of pairwise meta-analyses was performed and the overall mean difference (MD) in mm with 95% CI was calculated between grafted and non-grafted sockets. Then, a Bayesian Network meta-analysis was performed to draw indirect conclusions on which grafting materials are most likely the best compared to the others. From the six included studies, seven comparisons were obtained. Traditional meta-analysis showed statistically significant results in favour of grafting the socket compared to no graft, both for height (MD 1.02, 95% CI 0.44-1.59, p < 0.001) and for width (MD 1.52, 95% CI 1.18-1.86, p < 0.000001) remodelling. Bayesian Network meta-analysis made it possible to rank the interventions by efficacy. On the basis of the results of the present analysis, socket grafting seems to be more favourable than unassisted socket healing. Moreover, the Bayesian Network meta-analysis indicates that freeze-dried bone graft plus membrane is most likely the most effective in reducing bone height remodelling, while autologous bone marrow was most likely the most effective when width remodelling was considered. Studies with larger samples and lower risk of bias should be conducted in the future in order to further strengthen the results of this analysis. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. An empirical study of statistical properties of variance partition coefficients for multi-level logistic regression models

    USGS Publications Warehouse

    Li, Ji; Gray, B.R.; Bates, D.M.

    2008-01-01

    Partitioning the variance of a response by design levels is challenging for binomial and other discrete outcomes. Goldstein (2003) proposed four definitions of variance partition coefficients (VPC) under a two-level logistic regression model. In this study, we explicitly derived formulae for the multi-level logistic regression model and subsequently studied the distributional properties of the calculated VPCs. Using simulations and a vegetation dataset, we demonstrated associations between the different VPC definitions, the importance of the method used to estimate VPCs (by comparing VPCs obtained using Laplace and penalized quasi-likelihood methods), and bivariate dependence between VPCs calculated at different levels. Such an empirical study lends immediate support to wider applications of VPCs in scientific data analysis.
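
    One of the four definitions discussed by Goldstein is the latent-variable VPC, which fixes the level-1 residual variance at π²/3 (the logistic distribution's variance) on the latent scale. A minimal sketch of that definition; the cluster-level variance used below is illustrative:

```python
import math

def vpc_latent(sigma_u2):
    """Latent-variable VPC for a two-level logistic model:
    sigma_u2 is the level-2 (cluster) variance on the logit scale,
    and the level-1 residual variance is fixed at pi^2 / 3."""
    return sigma_u2 / (sigma_u2 + math.pi ** 2 / 3)

# Illustrative cluster variance of 1.0 on the logit scale
print(round(vpc_latent(1.0), 3))  # → 0.233
```

    The other definitions (e.g. simulation-based or linearization-based) generally give different numerical values for the same fitted model, which is part of what the study compares.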

  17. Human Lymphatic Mesenteric Vessels: Morphology and Possible Function of Aminergic and NPY-ergic Nerve Fibers.

    PubMed

    D'Andrea, Vito; Panarese, Alessandra; Taurone, Samanta; Coppola, Luigi; Cavallotti, Carlo; Artico, Marco

    2015-09-01

    The lymphatic vessels have been studied in different organs from a morphological to a clinical point of view. Nevertheless, knowledge of the catecholaminergic control of the lymphatic circulation is still incomplete. The aim of this work is to study the presence and distribution of the catecholaminergic and NPY-ergic nerve fibers in the whole wall of the human mesenteric lymphatic vessels in order to obtain knowledge about their morphology and functional significance. The following experimental procedures were performed: 1) sampling of tissue containing lymphatic vessels; 2) cutting of tissue; 3) staining of tissue; 4) staining of nerve fibers; 5) histofluorescence microscopy for the staining of catecholaminergic nerve fibers; 6) staining of neuropeptide Y-like immunoreactivity; 7) biochemical assay of proteins; 8) measurement of noradrenaline; 9) quantitative analysis of images; 10) statistical analysis of data. Numerous nerve fibers run in the wall of lymphatic vessels. Many of them are catecholaminergic in nature. Some nerve fibers are NPY-positive. The biochemical results on noradrenaline amounts agree with the morphological results on catecholaminergic nerve fibers. Moreover, the morphometric results, obtained by quantitative image analysis and subsequent statistical analysis of the data, confirm all our morphological and biochemical data. Knowledge of the physiological and pathological mechanisms regulating the functions of the lymphatic system is incomplete. Nevertheless, the catecholaminergic nerve fibers of the human mesenteric lymphatic vessels come from the adrenergic periarterial plexuses of the mesenteric arterial bed. NPY-ergic nerve fibers may modulate the microcirculatory mesenteric bed in different pathological conditions.

  18. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis

    PubMed Central

    Park, Yoonah; Yong, Yuen Geng; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-01-01

    Purpose This study aimed to compare the learning curves and early postoperative outcomes of conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). Methods This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Results Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phases of SIL and CL RHC were completed between 26 and 30 cases and between 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). Conclusion The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable, but SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase. PMID:25960990
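
    A CUSUM learning-curve chart of the kind described accumulates the deviation of each consecutive case's operative time from the overall mean; the curve rises while early cases run longer than average and falls once performance improves, so a change in slope marks the end of the learning phase. A minimal sketch with hypothetical operative times:

```python
def cusum(times):
    """CUSUM learning curve: running sum of (case time - overall mean).
    By construction the final value returns to zero; the peak marks
    where case times cross from above-average to below-average."""
    mean_t = sum(times) / len(times)
    out, s = [], 0.0
    for t in times:
        s += t - mean_t
        out.append(s)
    return out

# Hypothetical operative times (minutes): long early cases, shorter later ones
times = [180, 175, 170, 150, 140, 130, 125, 120]
print([round(v, 2) for v in cusum(times)])
```

    On these toy numbers the curve peaks at the fourth case, which would be read as the end of the learning phase for that (hypothetical) series.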

  19. Autogenous Partial Bone Chip Grafting on the Exposed Inferior Alveolar Nerve After Cystic Enucleation.

    PubMed

    Seo, Mi Hyun; Eo, Mi Young; Cho, Yun Ju; Kim, Soung Min; Lee, Suk Keun

    2018-03-01

    This prospective study evaluated the clinical effectiveness of the new approach of partial autogenous bone chip grafts for the treatment of mandibular cystic lesions related to the inferior alveolar nerve (IAN). A total of 38 patients treated for mandibular cysts or benign tumors were included in this prospective study and subsequently divided into 3 groups depending on the bone grafting method used: cystic enucleation without a bone graft (group 1), partial bone chip graft covering the exposed IAN (group 2), and autogenous bone graft covering the entire defect (group 3). We evaluated the symptoms, clinical signs, and radiographic changes using dental panorama preoperatively, immediate postoperatively, and at 1, 3, 6, and 12 months postoperatively. Radiographic densities were compared using Adobe Photoshop CS5 (Adobe Systems Inc., San Jose, CA). Repeated measures analysis of variance was used for statistical evaluation with SPSS 22.0 (SPSS Inc, Chicago, IL), and P < 0.05 was considered statistically significant.Radiopacities were the most increased at 1 year postoperative in group 3; groups 2 and 3 did not show statistically significant differences, whereas groups 1 and 3 were statistically significant. In terms of radiographic bone healing with clinical regeneration of the exposed IAN, healing occurred in all patients, although the best healing was achieved in group 2.This autogenous partial bone chip grafting procedure to cover the exposed IAN is suggested as a new surgical protocol for the treatment of cystic lesions associated with the IAN.

  20. Statistical modelling for recurrent events: an application to sports injuries

    PubMed Central

    Ullah, Shahid; Gabbett, Tim J; Finch, Caroline F

    2014-01-01

    Background Injuries are often recurrent, with subsequent injuries influenced by previous occurrences and hence correlation between events needs to be taken into account when analysing such data. Objective This paper compares five different survival models (Cox proportional hazards (CoxPH) model and the following generalisations to recurrent event data: Andersen-Gill (A-G), frailty, Wei-Lin-Weissfeld total time (WLW-TT) marginal, Prentice-Williams-Peterson gap time (PWP-GT) conditional models) for the analysis of recurrent injury data. Methods Empirical evaluation and comparison of different models were performed using model selection criteria and goodness-of-fit statistics. Simulation studies assessed the size and power of each model fit. Results The modelling approach is demonstrated through direct application to Australian National Rugby League recurrent injury data collected over the 2008 playing season. Of the 35 players analysed, 14 (40%) players had more than 1 injury and 47 contact injuries were sustained over 29 matches. The CoxPH model provided the poorest fit to the recurrent sports injury data. The fit was improved with the A-G and frailty models, compared to WLW-TT and PWP-GT models. Conclusions Despite little difference in model fit between the A-G and frailty models, in the interest of fewer statistical assumptions it is recommended that, where relevant, future studies involving modelling of recurrent sports injury data use the frailty model in preference to the CoxPH model or its other generalisations. The paper provides a rationale for future statistical modelling approaches for recurrent sports injury. PMID:22872683
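
    The WLW-TT and PWP-GT generalisations named above differ mainly in the time scale assigned to each recurrent event: total time measures every event from the start of follow-up (season start), while gap time measures each event from the previous one. A minimal sketch of that data restructuring for a single player, with hypothetical injury days:

```python
def to_gap_and_total(injury_days):
    """Convert one player's injury dates (days from season start) into
    PWP-GT gap times (time since the previous event) and WLW-TT total
    times (time since season start), one entry per event."""
    gaps, totals, prev = [], [], 0
    for d in sorted(injury_days):
        gaps.append(d - prev)   # gap time: clock resets at each injury
        totals.append(d)        # total time: clock runs from season start
        prev = d
    return gaps, totals

# Hypothetical player injured on days 20, 55 and 90 of the season
gaps, totals = to_gap_and_total([20, 55, 90])
print(gaps, totals)  # → [20, 35, 35] [20, 55, 90]
```

    A frailty model instead keeps a single time scale and absorbs the within-player correlation into a player-specific random effect, which is why it needs fewer structural assumptions about event ordering.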

  1. Preserved Statistical Learning of Tonal and Linguistic Material in Congenital Amusia

    PubMed Central

    Omigie, Diana; Stewart, Lauren

    2011-01-01

    Congenital amusia is a lifelong disorder whereby individuals have pervasive difficulties in perceiving and producing music. In contrast, typical individuals display a sophisticated understanding of musical structure, even in the absence of musical training. Previous research has shown that they acquire this knowledge implicitly, through exposure to music's statistical regularities. The present study tested the hypothesis that congenital amusia may result from a failure to internalize statistical regularities – specifically, lower-order transitional probabilities. To explore the specificity of any potential deficits to the musical domain, learning was examined with both tonal and linguistic material. Participants were exposed to structured tonal and linguistic sequences and, in a subsequent test phase, were required to identify items which had been heard in the exposure phase, as distinct from foils comprising elements that had been present during exposure but presented in a different temporal order. Amusic and control individuals showed comparable learning, for both tonal and linguistic material, even when the tonal stream included pitch intervals around one semitone. However, analysis of binary confidence ratings revealed that amusic individuals have less confidence in their abilities, and that their performance on learning tasks may not be contingent on explicit knowledge formation or level of awareness to the degree shown in typical individuals. The current findings suggest that the difficulties amusic individuals have with real-world music cannot be accounted for by an inability to internalize lower-order statistical regularities but may arise from other factors. PMID:21779263
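
    The lower-order transitional probabilities referred to above are first-order conditional probabilities P(next element | current element), estimated from the frequencies of adjacent pairs in the exposure stream. A minimal sketch on a toy symbol stream (not the study's actual stimuli):

```python
from collections import Counter, defaultdict

def transition_probs(seq):
    """First-order transitional probabilities P(b | a), estimated from
    the relative frequencies of adjacent pairs (a, b) in the sequence."""
    pair_counts = defaultdict(Counter)
    for a, b in zip(seq, seq[1:]):
        pair_counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in pair_counts.items()}

# Toy tone stream: within-"word" transitions (A->B, B->C) are more
# predictable than transitions at "word" boundaries (C->A vs C->X)
stream = list("ABCABCXYZABCXYZ")
tp = transition_probs(stream)
print(tp["A"]["B"], round(tp["C"]["X"], 3))
```

    In statistical-learning paradigms, high within-word and low between-word transitional probabilities like these are exactly the cue listeners exploit to segment the stream.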

  2. The self-secondary crater population of the Hokusai crater on Mercury

    NASA Astrophysics Data System (ADS)

    Xiao, Zhiyong; Prieur, Nils C.; Werner, Stephanie C.

    2016-07-01

    Whether or not self-secondaries dominate small crater populations on the continuous ejecta deposits and floors of fresh impact craters has long been controversial. This issue potentially affects the age-determination technique based on crater statistics. Here the self-secondary crater population on the continuous ejecta deposits of the Hokusai crater on Mercury is unambiguously recognized. Superposition relationships show that this population was emplaced after both the ballistic sedimentation of excavation flows and the subsequent veneering of impact melt, but before the settlement and solidification of melt pools on the crater floor. The fragments that formed the self-secondaries were launched via impact spallation at large angles. Complex craters on the Moon, Mercury, and Mars probably all form self-secondary populations. Dating young craters using crater statistics on their continuous ejecta deposits can therefore be misleading. Impact melt pools are less affected by self-secondaries, and overprinting by subsequent crater populations reduces the predominance of self-secondaries over time.

  3. Usefulness of Dismissing and Changing the Coach in Professional Soccer

    PubMed Central

    Heuer, Andreas; Müller, Christian; Rubner, Oliver; Hagemann, Norbert; Strauss, Bernd

    2011-01-01

    Whether a coach dismissal during the mid-season has an impact on the subsequent team performance has long been a subject of controversial scientific discussion. Here we find a clear-cut answer to this question by using a recently developed statistical framework for the team fitness and by analyzing the first two moments of the effect of a coach dismissal. We can show with an unprecedented small statistical error for the German soccer league that dismissing the coach within the season has basically no effect on the subsequent performance of a team. Changing the coach between two seasons has no effect either. Furthermore, an upper bound for the actual influence of the coach on the team fitness can be estimated. Beyond the immediate relevance of this result, this study may lead the way to analogous studies for exploring the effect of managerial changes, e.g., in economic terms. PMID:21445335

  4. A computational framework for estimating statistical power and planning hypothesis-driven experiments involving one-dimensional biomechanical continua.

    PubMed

    Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos

    2018-01-03

    Statistical power assessment is an important component of hypothesis-driven research but until relatively recently (mid-1990s) no methods were available for assessing power in experiments involving continuum data and in particular those involving one-dimensional (1D) time series. The purpose of this study was to describe how continuum-level power analyses can be used to plan hypothesis-driven biomechanics experiments involving 1D data. In particular, we demonstrate how theory- and pilot-driven 1D effect modeling can be used for sample-size calculations for both single- and multi-subject experiments. For theory-driven power analysis we use the minimum jerk hypothesis and single-subject experiments involving straight-line, planar reaching. For pilot-driven power analysis we use a previously published knee kinematics dataset. Results show that powers on the order of 0.8 can be achieved with relatively small sample sizes, five and ten for within-subject minimum jerk analysis and between-subject knee kinematics, respectively. However, the appropriate sample size depends on a priori justifications of biomechanical meaning and effect size. The main advantage of the proposed technique is that it encourages a priori justification regarding the clinical and/or scientific meaning of particular 1D effects, thereby robustly structuring subsequent experimental inquiry. In short, it shifts focus from a search for significance to a search for non-rejectable hypotheses. Copyright © 2017 Elsevier Ltd. All rights reserved.
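
    Power calculations of this kind can be approximated by Monte Carlo simulation. The sketch below does this for a simple zero-dimensional (scalar) one-sample t-test, not the 1D continuum method of the paper; the effect size, sample size, and critical value are illustrative choices:

```python
import math
import random
import statistics

def mc_power(n, d, crit, sims=4000, seed=1):
    """Monte Carlo power of a one-sample t-test at standardized effect
    size d, with critical value crit = t_{1 - alpha/2, n-1} supplied by
    the caller. Counts the fraction of simulated samples rejecting H0."""
    random.seed(seed)
    hits = 0
    for _ in range(sims):
        x = [random.gauss(d, 1.0) for _ in range(n)]
        t = statistics.mean(x) / (statistics.stdev(x) / math.sqrt(n))
        hits += abs(t) > crit
    return hits / sims

# d = 1.0 (large effect), n = 10, t-crit for df = 9, alpha = 0.05 two-sided
print(round(mc_power(10, 1.0, 2.262), 2))  # power near 0.8
```

    Sweeping n upward until the estimated power crosses the target (e.g. 0.8) gives the required sample size; the paper's contribution is doing the analogous computation at the level of whole 1D trajectories rather than scalars.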

  5. Assessment of Tooth Wear Among Glass Factory Workers: WHO 2013 Oral Health Survey

    PubMed Central

    Bhat, Nagesh; Asawa, Kailash; Tak, Mridula; Bapat, Salil; Gupta, Vivek Vardhan

    2015-01-01

    Background Glass factory workers are often exposed to a hazardous environment that damages oral health and, subsequently, general health. We planned to determine the effects of the particulates present in this milieu on tooth wear among workers. Aim To assess tooth wear among glass factory workers in Jaipur, Rajasthan, India. Settings and Design A descriptive cross-sectional survey was conducted among 936 glass workers in Jaipur, Rajasthan, India from January-June 2014. Materials and Methods A survey proforma was designed for tooth wear evaluation with the help of the WHO Oral Health Assessment form 2013 (for adults). Information on oral health practices, adverse habits, dietary habits and demographic details was gathered, and clinical parameters were recorded. Statistical Analysis The Chi-square test, t-test, one-way analysis of variance and stepwise multiple linear regression analysis were used. Results The most prevalent form of erosion was enamel erosion (589, 62.93%), with few subjects showing deeper dentinal erosion; the difference was statistically significant (p=0.001). Dental erosion was higher among males than among females. Years of experience and educational status were identified as the best predictors of dental erosion. Conclusion There was considerable evidence of dental erosion among the factory workers. Given the limited awareness of its social, cultural and health aspects, a professional approach with regular dental care services for detection of early symptoms and planning of preventive strategies is warranted. PMID:26436050

  6. Network approach towards understanding the crazing in glassy amorphous polymers

    NASA Astrophysics Data System (ADS)

    Venkatesan, Sudarkodi; Vivek-Ananth, R. P.; Sreejith, R. P.; Mangalapandi, Pattulingam; Hassanali, Ali A.; Samal, Areejit

    2018-04-01

    We have used molecular dynamics to simulate an amorphous glassy polymer with long chains to study the deformation mechanism of crazing and the associated void statistics. The Van der Waals interactions and the entanglements between the chains constituting the polymer play a crucial role in crazing. We therefore reconstructed two underlying weighted networks, namely the Van der Waals network and the entanglement network, from polymer configurations extracted from the molecular dynamics simulation. Subsequently, we performed graph-theoretic analysis of the two reconstructed networks to reveal the role each plays in the crazing of polymers. Our analysis captured the various stages of crazing through specific trends in the network measures for the Van der Waals and entanglement networks. To further corroborate the effectiveness of network analysis in unraveling the underlying physics of crazing, we contrasted these trends with the stress-strain behaviour and void statistics during deformation. We find that the Van der Waals network plays a crucial role in craze initiation and growth. Although the entanglement network maintains its structure during the craze-initiation stage, it progressively weakens and undergoes dynamic changes during the hardening and failure stages of crazing. Our work demonstrates the utility of network theory in quantifying the underlying physics of polymer crazing and widens the scope of applications of network science to the characterization of deformation mechanisms in diverse polymers.

  7. Spatio-temporal pattern of sylvatic rabies in the Sultanate of Oman, 2006-2010.

    PubMed

    Hussain, Muhammad Hammad; Ward, Michael P; Body, Mohammed; Al-Rawahi, Abdulmajeed; Wadir, Ali Awlad; Al-Habsi, Saif; Saqib, Muhammad; Ahmed, Mohammed Sayed; Almaawali, Mahir Gharib

    2013-07-01

    Rabies was first reported in the Sultanate of Oman in 1990. We analysed passive surveillance data (444 samples) collected and reported between 2006 and 2010. During this period, between 45% and 75% of samples submitted from suspect animals were subsequently confirmed (by fluorescent antibody test, histopathology and reverse transcription PCR) as rabies cases; overall, 63% of submitted samples were confirmed. The spatial distributions of species-specific cases were similar (centred in north-central Oman with a northeast-southwest orientation), although fox cases had a wider distribution and an east-west orientation. Clustering of cases was detected using interpolation, local spatial autocorrelation and scan statistical analysis. Several local government areas (wilayats) in north-central Oman were identified where higher than expected numbers of laboratory-confirmed rabies cases were reported. For fox rabies, more clusters (local spatial autocorrelation analysis) and a larger clustered area (scan statistical analysis) were detected. In Oman, monthly reports of fox rabies cases were highly correlated (rSP > 0.5) with reports of camel, cattle, sheep and goat rabies. The best-fitting ARIMA model included a seasonality component, and fox rabies cases reported 6 months previously best explained rabies cases reported in other animal species. Despite likely reporting bias, the results suggest that rabies exists in a sylvatic cycle of transmission in Oman and that an opportunity still exists to prevent establishment of dog-mediated rabies. Copyright © 2013 Elsevier B.V. All rights reserved.
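
    The 6-month lead of fox cases over cases in other species is the kind of relationship a lagged Pearson correlation detects: correlate the fox series shifted forward by the lag against the livestock series. A minimal sketch with toy monthly counts constructed to be perfectly lagged (not the surveillance data):

```python
import statistics

def lagged_corr(x, y, lag):
    """Pearson correlation between x led by `lag` time steps and y,
    i.e. corr(x[t - lag], y[t]) over the overlapping months."""
    xs, ys = x[:len(x) - lag], y[lag:]
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs) *
           sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den

# Toy monthly counts: livestock cases echo fox cases exactly 6 months later
fox = [5, 2, 7, 1, 6, 3, 4, 2, 8, 1, 5, 3, 6, 2, 7, 1, 5, 4]
livestock = [0] * 6 + fox[:-6]
print(round(lagged_corr(fox, livestock, 6), 2))  # → 1.0
```

    Real surveillance series are noisier, which is why the study additionally fits an ARIMA model with a seasonal component rather than relying on correlations alone.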

  8. Application of multivariate statistical analysis in the pollution and health risk of traffic-related heavy metals.

    PubMed

    Ebqa'ai, Mohammad; Ibrahim, Bashar

    2017-12-01

    This study aims to analyse the heavy metal pollutants in Jeddah, the second largest city in the Gulf Cooperation Council, with a population exceeding 3.5 million and heavy vehicle traffic. Ninety-eight street dust samples were collected seasonally from six major roads as well as Jeddah Beach, and subsequently digested using the modified Leeds Public Analyst method. The heavy metals (Fe, Zn, Mn, Cu, Cd, and Pb) were extracted from the ash using methyl isobutyl ketone as the extraction solvent and analysed by atomic absorption spectroscopy. Multivariate statistical techniques, principal component analysis (PCA) and hierarchical cluster analysis, were applied to these data. Heavy metal concentrations ranked in the following descending order: Fe > Zn > Mn > Cu > Pb > Cd. To study the pollution and health risk from these heavy metals and to estimate their effect on the environment, pollution indices, the integrated pollution index, enrichment factors, average daily dose, hazard quotient, and hazard index were all analysed. The PCA showed high levels of Zn, Fe, and Cd on Al Kurnish road, while these elements were consistently detected on King Abdulaziz and Al Madina roads. The study indicates that high levels of Zn and Pb pollution were recorded for major roads in Jeddah; six out of seven roads had high pollution indices. This study is a first step towards further investigation of current health problems in Jeddah, such as anaemia and asthma.
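
    The enrichment factor and integrated pollution index mentioned above have standard forms: the EF normalizes a metal's concentration by a conservative reference element (commonly Fe) in both sample and background crust, and the IPI averages single-metal pollution indices (sample concentration over background). A minimal sketch; all concentrations below are hypothetical:

```python
def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background,
    with Fe as the conventional reference element."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def integrated_pollution_index(pis):
    """IPI: arithmetic mean of single-metal pollution indices,
    where each PI = C_sample / C_background."""
    return sum(pis) / len(pis)

# Hypothetical values (mg/kg): Zn and Fe in street dust vs background soil
ef_zn = enrichment_factor(c_metal=300, c_ref=20000, bg_metal=70, bg_ref=35000)
ipi = integrated_pollution_index([4.3, 2.1, 1.2])
print(round(ef_zn, 1), round(ipi, 2))  # → 7.5 2.53
```

    An EF well above 1 (often a threshold of 2 or more is used) points to an anthropogenic source such as traffic rather than crustal material.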

  9. Altered White Matter Integrity in Human Immunodeficiency Virus-Associated Neurocognitive Disorder: A Tract-Based Spatial Statistics Study.

    PubMed

    Oh, Se Won; Shin, Na-Young; Choi, Jun Yong; Lee, Seung-Koo; Bang, Mi Rim

    2018-01-01

    Human immunodeficiency virus (HIV) infection has been known to damage the microstructural integrity of white matter (WM). However, only a few studies have assessed the brain regions in HIV-associated neurocognitive disorders (HAND) with diffusion tensor imaging (DTI). Therefore, we sought to compare the DTI data between HIV patients with and without HAND using tract-based spatial statistics (TBSS). Twenty-two HIV-infected patients (10 with HAND and 12 without HAND) and 11 healthy controls (HC) were enrolled in this study. A whole-brain analysis of fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity was performed with TBSS, followed by a region-of-interest (ROI)-based analysis of 20 specific tracts to localize and compare altered WM integrity in all group contrasts. Compared with HC, patients with HAND showed decreased FA in the right frontoparietal WM including the upper corticospinal tract (CST) and increased MD and RD in the bilateral frontoparietal WM, corpus callosum, bilateral CSTs and bilateral cerebellar peduncles. The DTI values did not significantly differ between HIV patients with and without HAND or between HIV patients without HAND and HC. In the ROI-based analysis, decreased FA was observed in the right superior longitudinal fasciculus and was significantly correlated with decreased information processing speed, memory, executive function, and fine motor function in HIV patients. These results suggest that altered integrity of the frontoparietal WM contributes to cognitive dysfunction in HIV patients.

  10. IEEE International Symposium on Biomedical Imaging.

    PubMed

    2017-01-01

    The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative from the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials, and a scientific program composed of plenary talks, invited special sessions, challenges, as well as oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings published by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.

  11. A “Hybrid” Bacteriology Course: The Professor’s Design and Expectations; The Students’ Performance and Assessment

    PubMed Central

    KRAWIEC, STEVEN; SALTER, DIANE; KAY, EDWIN J.

    2005-01-01

    A basic bacteriology course was offered in two successive academic years, first in a conventional format and subsequently as a “hybrid” course. The latter combined (i) online presentation of content, (ii) an emphasis on online resources, (iii) thrice-weekly, face-to-face conversations to advance understanding, and (iv) frequent student postings on an electronic discussion board. We compared the two courses through statistical analysis of student performance on the final examinations and in the course overall, and through student assessments of teaching. The data indicated no statistical difference in performance on the final examinations or in the course overall. Responses on an evaluation instrument revealed that students in the hybrid course less strongly affirmed the following statements: (i) The amount of work was appropriate for the credit received, (ii) Interactions between students and instructor were positive, (iii) I learned a great deal in this course, and (iv) I would recommend this course to other students. We recommend clear direction about active learning tasks and relevant feedback to enhance learning in a hybrid course. PMID:23653558

  12. Using Optical Coherence Tomography to Evaluate Skin Sun Damage and Precancer

    PubMed Central

    Korde, Vrushali R.; Bonnema, Garret T.; Xu, Wei; Krishnamurthy, Chetankumar; Ranger-Moore, James; Saboda, Kathylynn; Slayton, Lisa D.; Salasche, Stuart J.; Warneke, James A.; Alberts, David S.; Barton, Jennifer K.

    2008-01-01

    Background and Objectives Optical coherence tomography (OCT) is a depth-resolved imaging modality that may aid in identifying sun damaged skin and the precancerous condition actinic keratosis (AK). Study Design/Materials and Methods OCT images were acquired from 112 patients at 2 sun protected and 2 sun exposed sites, with a subsequent biopsy. Each site received a dermatological evaluation, a histological diagnosis, and a solar elastosis (SE) score. OCT images were examined visually and statistically analyzed. Results Characteristic OCT image features were identified for sun protected, undiseased, sun damaged, and AK skin. A statistically significant difference (P < 0.0001) between the average attenuation values of skin with minimal and severe solar elastosis was observed. Significant differences (P < 0.0001) were also found between undiseased skin and AK using a gradient analysis. Using image features, AK could be distinguished from undiseased skin with 86% sensitivity and 83% specificity. Conclusion OCT has the potential to guide biopsies and provide non-invasive measures of skin sun damage and disease state, possibly increasing the efficiency of chemopreventive agent trials. PMID:17960754
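
    The reported 86% sensitivity and 83% specificity follow directly from confusion-matrix counts of the classification against histological diagnosis. A minimal sketch; the counts below are hypothetical values consistent with those rates, not the study's actual table:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 50 AK sites (43 correctly flagged) and
# 100 undiseased sites (83 correctly cleared)
se, sp = sens_spec(tp=43, fn=7, tn=83, fp=17)
print(round(se, 2), round(sp, 2))  # → 0.86 0.83
```

    Varying the decision threshold on the image-feature score trades sensitivity against specificity, which is why such classifiers are often summarized with a full ROC curve rather than a single operating point.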

  13. Relationship between histological diagnosis and evolution of 70 periapical lesions at 12 months, treated by periapical surgery.

    PubMed

    Carrillo, Celia; Peñarrocha, Miguel; Bagán, José Vicente; Vera, Francisco

    2008-08-01

    To relate the histologic diagnosis and radiographic size with the prognosis of 70 biopsies obtained via periapical surgery. Seventy biopsies obtained during periapical surgery were histologically analyzed following curettage of the tissue, establishing the diagnosis as either apical granuloma, radicular cyst, or scar tissue. The radiographic size of the lesion (area in mm(2)) was measured before surgery and after 1 year of follow-up. The evolution at 12 months after surgery was evaluated according to the criteria of von Arx and Kurt. A statistical study was performed; inter-variable relationships were examined using analysis of variance with subsequent Tukey testing and calculation of the Pearson correlation coefficient. The results indicated that 65.7% of lesions were granulomas, 25.7% scar tissue, and 8.6% cysts. Larger lesions had a worse prognosis, and cysts had the worst evolution at 12 months after surgery, a result that was statistically significant. The prognosis for a periapical lesion depended on the type of lesion and its radiographic size, with cysts and larger lesions having the worst evolution.

  14. Radiographic versus clinical extension of Class II carious lesions using an F-speed film.

    PubMed

    Kooistra, Scott; Dennison, Joseph B; Yaman, Peter; Burt, Brian A; Taylor, George W

    2005-01-01

    This study investigated the difference between the apparent radiographic and true clinical extension of Class II carious lesions. Sixty-two lesions in both maxillary and mandibular premolars and molars were radiographed using Insight bitewing film. Class II lesions were scored independently by two masked examiners using an 8-point lesion severity scale. During the restoration process the lesions were dissected in a stepwise fashion from the occlusal aspect. Intraoperative photographs (2x) of the lesions were made, utilizing a novel measurement device in the field as a point of reference. Subsequently, all lesions were given clinical scores using the same 8-point scale. Statistical analysis showed a significant difference between the true clinical extension of the lesions and the radiographic score. "Aggressive" and "Conservative" radiographic diagnoses underestimated the true clinical extent by 0.66 mm and 0.91 mm, respectively. No statistical difference was found between premolars and molars or between the maxillary and mandibular arches. The results of this study help to define the parameters for making restorative treatment decisions involving Class II carious lesions.

  15. Condenser: a statistical aggregation tool for multi-sample quantitative proteomic data from Matrix Science Mascot Distiller™.

    PubMed

    Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan

    2014-05-30

    We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies, such as metabolic (15)N and amino acid stable isotope incorporation, label-free and chemical-label quantitation are supported. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset, and ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple testing and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
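
    Multiple-testing control over many per-protein t-tests is commonly done with a step-up procedure such as Benjamini-Hochberg, which bounds the false discovery rate. The abstract does not specify which correction Condenser applies, so the sketch below shows the generic BH procedure on hypothetical p-values:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: find the largest rank k
    with p_(k) <= (k / m) * q and reject the k smallest p-values.
    Returns the indices of rejected hypotheses, in original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])

# Hypothetical per-protein p-values from t-tests on relative abundances
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(benjamini_hochberg(pvals))  # → [0, 1]
```

    Note the step-up logic: a p-value can be rejected even after a smaller one fails its own threshold, as long as some larger rank still passes, which makes BH less conservative than Bonferroni on correlated proteomic data.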

  16. Interactive semiautomatic contour delineation using statistical conditional random fields framework.

    PubMed

    Hu, Yu-Chi; Grossberg, Michael D; Wu, Abraham; Riaz, Nadeem; Perez, Carmen; Mageras, Gig S

    2012-07-01

    Contouring a normal anatomical structure during radiation treatment planning requires significant time and effort. The authors present a fast and accurate semiautomatic contour delineation method to reduce the time and effort required of expert users. Following an initial segmentation on one CT slice, the user marks the target organ and nontarget pixels with a few simple brush strokes. The algorithm calculates statistics from this information that, in turn, determines the parameters of an energy function containing both boundary and regional components. The method uses a conditional random field graphical model to define the energy function to be minimized for obtaining an estimated optimal segmentation, and a graph partition algorithm to efficiently solve the energy function minimization. Organ boundary statistics are estimated from the segmentation and propagated to subsequent images; regional statistics are estimated from the simple brush strokes that are either propagated or redrawn as needed on subsequent images. This greatly reduces the user input needed and speeds up segmentations. The proposed method can be further accelerated with graph-based interpolation of alternating slices in place of user-guided segmentation. CT images from phantom and patients were used to evaluate this method. The authors determined the sensitivity and specificity of organ segmentations using physician-drawn contours as ground truth, as well as the predicted-to-ground truth surface distances. Finally, three physicians evaluated the contours for subjective acceptability. Interobserver and intraobserver analysis was also performed and Bland-Altman plots were used to evaluate agreement. Liver and kidney segmentations in patient volumetric CT images show that boundary samples provided on a single CT slice can be reused through the entire 3D stack of images to obtain accurate segmentation. 
In liver, our method has better sensitivity and specificity (0.925 and 0.995) than region growing (0.897 and 0.995) and level set methods (0.912 and 0.985), as well as a shorter mean predicted-to-ground truth distance (2.13 mm) compared to region growing (4.58 mm) and level set methods (8.55 mm and 4.74 mm). Similar results are observed in kidney segmentation. Physician evaluation of ten liver cases showed that 83% of contours did not need any modification, while 6% of contours needed modifications as assessed by two or more evaluators. In interobserver and intraobserver analysis, Bland-Altman plots showed our method to have better repeatability than the manual method, while delineation was 15% faster on average. Our method achieves high accuracy in liver and kidney segmentation and considerably reduces the time and labor required for contour delineation. Since it extracts purely statistical information from the samples interactively specified by expert users, the method avoids heuristic assumptions commonly used by other methods. In addition, the method can be expanded to 3D directly without modification because the underlying graphical framework and graph partition optimization method fit naturally with the image grid structure.
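    The sensitivity and specificity figures quoted above are computed voxel-wise against physician-drawn ground truth. A minimal sketch, using tiny made-up binary masks rather than real CT segmentations:

    ```python
    import numpy as np

    # Toy binary masks; 1 = organ, 0 = background (hypothetical data).
    truth = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [0, 0, 0, 0]])
    pred  = np.array([[1, 1, 1, 0],
                      [1, 0, 0, 0],
                      [0, 0, 0, 0]])

    tp = np.sum((pred == 1) & (truth == 1))  # organ voxels correctly labeled
    fn = np.sum((pred == 0) & (truth == 1))  # organ voxels missed
    tn = np.sum((pred == 0) & (truth == 0))  # background correctly excluded
    fp = np.sum((pred == 1) & (truth == 0))  # background wrongly included

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(sensitivity, specificity)
    ```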

  17. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
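    The outlier-subsequence idea above rests on eigenvector analysis of an affinity matrix between statistical models of subsequences: background subsequences resemble each other and load heavily on the dominant eigenvector, while outliers barely participate. A minimal sketch with an invented affinity matrix (the paper's actual feature models and affinity computation are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical pairwise affinities between five subsequence models:
    # four mutually similar "background" subsequences and one outlier (index 4).
    A = np.array([
        [1.0, 0.9, 0.8, 0.9, 0.1],
        [0.9, 1.0, 0.9, 0.8, 0.2],
        [0.8, 0.9, 1.0, 0.9, 0.1],
        [0.9, 0.8, 0.9, 1.0, 0.2],
        [0.1, 0.2, 0.1, 0.2, 1.0],
    ])

    # Eigenvector of the largest eigenvalue of the symmetric affinity matrix.
    vals, vecs = np.linalg.eigh(A)
    v = np.abs(vecs[:, -1])

    # Weakest participation in the dominant eigenvector -> outlier candidate.
    outlier = int(np.argmin(v))
    print(outlier)
    ```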

  18. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    PubMed

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

    To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.
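    The industry-versus-investigator comparison quoted above (195/551 v 57/343, P<0.001) is a standard test of two proportions; a chi-squared test on the 2x2 table reproduces the reported significance level (the abstract does not state which test the authors used, so this is an illustration, not their exact analysis):

    ```python
    from scipy.stats import chi2_contingency

    # Counts from the abstract: trials planning subgroup analyses vs not,
    # split by sponsor type (industry 195/551, investigator 57/343).
    table = [[195, 551 - 195],
             [57, 343 - 57]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(p)
    ```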

  19. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  20. Autoregressive statistical pattern recognition algorithms for damage detection in civil structures

    NASA Astrophysics Data System (ADS)

    Yao, Ruigen; Pakzad, Shamim N.

    2012-08-01

    Statistical pattern recognition has recently emerged as a promising set of complementary methods to system identification for automatic structural damage assessment. Its essence is to use well-known concepts in statistics for boundary definition of different pattern classes, such as those for damaged and undamaged structures. In this paper, several statistical pattern recognition algorithms using autoregressive models, including statistical control charts and hypothesis testing, are reviewed as potentially competitive damage detection techniques. To enhance the performance of statistical methods, new feature extraction techniques using model spectra and residual autocorrelation, together with resampling-based threshold construction methods, are proposed. Subsequently, simulated acceleration data from a multi degree-of-freedom system is generated to test and compare the efficiency of the existing and proposed algorithms. Data from laboratory experiments conducted on a truss and a large-scale bridge slab model are then used to further validate the damage detection methods and demonstrate the superior performance of proposed algorithms.
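    The core recipe above — fit an autoregressive (AR) model to baseline data, then monitor residuals with a control chart — can be sketched as follows. The AR coefficients, simulated signals, and 3-sigma limit are stand-ins for illustration, not the paper's structures or thresholds:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_ar(x, p):
        """Least-squares fit of AR(p): x[t] = sum_k a_k * x[t-k] + e[t]."""
        X = np.column_stack([x[p - 1 - j : len(x) - 1 - j] for j in range(p)])
        coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return coef

    def residuals(coef, x, p):
        X = np.column_stack([x[p - 1 - j : len(x) - 1 - j] for j in range(p)])
        return x[p:] - X @ coef

    # Baseline ("undamaged") response: a simulated AR(2) process.
    n = 2000
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 1.2 * x[t - 1] - 0.6 * x[t - 2] + rng.normal()

    p = 2
    coef = fit_ar(x, p)
    r0 = residuals(coef, x, p)
    ucl = 3 * r0.std()  # Shewhart-style 3-sigma control limit on residuals

    # "Damaged" response: same excitation, altered dynamics (different AR coefficients).
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()
    r1 = residuals(coef, y, p)

    # Damage indicator: residual spread and limit exceedances grow after damage.
    print(r0.std(), r1.std(), (np.abs(r1) > ucl).sum())
    ```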

  1. Los Alamos National Laboratory W76 Pit Tube Lifetime Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeln, Terri G.

    2012-04-25

    A metallurgical study was requested as part of the Los Alamos National Laboratory (LANL) W76-1 life-extension program (LEP) involving a lifetime analysis of type 304 stainless steel pit tubes subject to repeat bending loads during assembly and disassembly operations at BWXT/Pantex. This initial test phase was completed during the calendar years 2004-2006, and the report was not issued until additional recommended tests could be performed. These tests have not been funded to date, and therefore this report is considered final. Tubes were reportedly fabricated according to Rocky Flats specification P14548 - Seamless Type 304 VIM/VAR Stainless Steel Tubing. Tube diameter was specified as 0.125 inches and wall thickness as 0.028 inches. A heat treat condition is not specified, and the hardness range specification can be characteristic of both 1/8 and 1/4 hard conditions. Properties of all tubes tested were within specification. Metallographic analysis could not conclusively determine a specified limit to the number of bends allowable. A statistical analysis suggests a range of 5-7 bends with a 99.95% confidence limit. See the 'Statistical Analysis' section of this report. The initial phase of this study involved two separate sets of test specimens. The first group was part of an investigation originating in ESA-GTS [now Gas Transfer Systems (W-7) Group]. After the bend cycle test parameters were chosen (all three required bends subjected to the same number of bend cycles) and the tubes bent, the investigation was transferred to Terri Abeln (Metallurgical Science and Engineering) for analysis. Subsequently, another limited quantity of tubes became available for testing and was cycled with the same bending fixture, but with different test parameters determined by T. Abeln.

  2. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
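    The pitfall described above — naive proportions versus exposure-adjusted rates under unequal follow-up — can be illustrated with a toy example. All counts and follow-up times below are invented to make the reversal visible:

    ```python
    # Hypothetical trial: the drug arm is followed for half as long on average
    # (e.g. due to treatment switching), so fewer events accumulate per patient.
    n_drug, n_ctrl = 1000, 1000
    events_drug, person_years_drug = 30, 500.0
    events_ctrl, person_years_ctrl = 50, 1000.0

    # Naive proportions ignore follow-up time: the drug looks safer.
    naive_drug = events_drug / n_drug   # 3% of patients with an event
    naive_ctrl = events_ctrl / n_ctrl   # 5% of patients with an event

    # Exposure-adjusted incidence rates reverse the conclusion.
    rate_drug = events_drug / person_years_drug  # events per person-year
    rate_ctrl = events_ctrl / person_years_ctrl

    print(naive_drug < naive_ctrl, rate_drug > rate_ctrl)
    ```

    As the abstract notes, even incidence rates can mislead with recurrent events; full survival-time methods are the appropriate tool.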

  3. Use of generalized ordered logistic regression for the analysis of multidrug resistance data.

    PubMed

    Agga, Getahun E; Scott, H Morgan

    2015-10-01

    Statistical analysis of antimicrobial resistance data largely focuses on an individual antimicrobial's binary outcome (susceptible or resistant). However, bacteria are becoming increasingly multidrug resistant (MDR). Statistical analysis of MDR data is mostly descriptive, often with tabular or graphical presentations. Here we report the applicability of the generalized ordered logistic regression model for the analysis of MDR data. A total of 1,152 Escherichia coli isolates, obtained from the feces of weaned pigs experimentally supplemented with chlortetracycline (CTC) and copper, were tested for susceptibility to 15 antimicrobials and classified as resistant or susceptible. The 15 antimicrobial agents tested were grouped into eight different antimicrobial classes. We defined MDR as the number of antimicrobial classes to which E. coli isolates were resistant, ranging from 0 to 8. The proportionality of the odds assumption of the ordinal logistic regression model was violated only for the effect of treatment period (pre-treatment, during-treatment and post-treatment), but not for the effect of CTC or copper supplementation. Subsequently, a partially constrained generalized ordinal logistic model was built that allows the effect of treatment period to vary while constraining the effects of treatment (CTC and copper supplementation) to be constant across the levels of MDR classes. Copper (Proportional Odds Ratio [Prop OR]=1.03; 95% CI=0.73-1.47) and CTC (Prop OR=1.1; 95% CI=0.78-1.56) supplementation were not significantly associated with the level of MDR adjusted for the effect of treatment period. MDR generally declined over the trial period. In conclusion, generalized ordered logistic regression can be used for the analysis of ordinal data such as MDR data when the proportionality assumptions for ordered logistic regression are violated. Published by Elsevier B.V.

  4. Evidence cross-validation and Bayesian inference of MAST plasma equilibria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nessi, G. T. von; Hole, M. J.; Svensson, J.

    2012-01-15

    In this paper, current profiles for plasma discharges on the mega-ampere spherical tokamak are directly calculated from pickup coil, flux loop, and motional-Stark effect observations via methods based in the statistical theory of Bayesian analysis. By representing toroidal plasma current as a series of axisymmetric current beams with rectangular cross-section and inferring the current for each one of these beams, flux-surface geometry and q-profiles are subsequently calculated by elementary application of Biot-Savart's law. The use of this plasma model in the context of Bayesian analysis was pioneered by Svensson and Werner on the joint European tokamak [Svensson and Werner, Plasma Phys. Controlled Fusion 50(8), 085002 (2008)]. In this framework, linear forward models are used to generate diagnostic predictions, and the probability distribution for the currents in the collection of plasma beams was subsequently calculated directly via application of Bayes' formula. In this work, we introduce a new diagnostic technique to identify and remove outlier observations associated with diagnostics falling out of calibration or suffering from an unidentified malfunction. These modifications enable good agreement between the Bayesian inference of the last-closed flux-surface and other corroborating data, such as that from force balance considerations using EFIT++ [Appel et al., "A unified approach to equilibrium reconstruction," Proceedings of the 33rd EPS Conference on Plasma Physics (Rome, Italy, 2006)]. In addition, this analysis also yields errors on the plasma current profile and flux-surface geometry, as well as directly predicting the Shafranov shift of the plasma core.

  5. Psychotic experiences and general medical conditions: a cross-national analysis based on 28 002 respondents from 16 countries in the WHO World Mental Health Surveys.

    PubMed

    Scott, Kate M; Saha, Sukanta; Lim, Carmen C W; Aguilar-Gaxiola, Sergio; Al-Hamzawi, Ali; Alonso, Jordi; Benjet, Corina; Bromet, Evelyn J; Bruffaerts, Ronny; Caldas-de-Almeida, José Miguel; de Girolamo, Giovanni; de Jonge, Peter; Degenhardt, Louisa; Florescu, Silvia; Gureje, Oye; Haro, Josep M; Hu, Chiyi; Karam, Elie G; Kovess-Masfety, Viviane; Lee, Sing; Lepine, Jean-Pierre; Mneimneh, Zeina; Navarro-Mateu, Fernando; Piazza, Marina; Posada-Villa, José; Sampson, Nancy A; Stagnaro, Juan Carlos; Kessler, Ronald C; McGrath, John J

    2018-02-26

    Previous work has identified associations between psychotic experiences (PEs) and general medical conditions (GMCs), but their temporal direction remains unclear as does the extent to which they are independent of comorbid mental disorders. In total, 28 002 adults in 16 countries from the WHO World Mental Health (WMH) Surveys were assessed for PEs, GMCs and 21 Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) mental disorders. Discrete-time survival analyses were used to estimate the associations between PEs and GMCs with various adjustments. After adjustment for comorbid mental disorders, temporally prior PEs were significantly associated with subsequent onset of 8/12 GMCs (arthritis, back or neck pain, frequent or severe headache, other chronic pain, heart disease, high blood pressure, diabetes and peptic ulcer) with odds ratios (ORs) ranging from 1.3 [95% confidence interval (CI) 1.1-1.5] to 1.9 (95% CI 1.4-2.4). In contrast, only three GMCs (frequent or severe headache, other chronic pain and asthma) were significantly associated with subsequent onset of PEs after adjustment for comorbid GMCs and mental disorders, with ORs ranging from 1.5 (95% CI 1.2-1.9) to 1.7 (95% CI 1.2-2.4). PEs were associated with the subsequent onset of a wide range of GMCs, independent of comorbid mental disorders. There were also associations between some medical conditions (particularly those involving chronic pain) and subsequent PEs. Although these findings will need to be confirmed in prospective studies, clinicians should be aware that psychotic symptoms may be risk markers for a wide range of adverse health outcomes. Whether PEs are causal risk factors will require further research.

  6. Statistical patterns in the location of natural lightning

    NASA Astrophysics Data System (ADS)

    Zoghzoghy, F. G.; Cohen, M. B.; Said, R. K.; Inan, U. S.

    2013-01-01

    Lightning discharges are nature's way of neutralizing the electrical buildup in thunderclouds. Thus, if an individual discharge destroys a substantial fraction of the cloud charge, the probability of a subsequent flash is reduced until the cloud charge separation rebuilds. The temporal pattern of lightning activity in a localized region may thus inherently be a proxy measure of the corresponding timescales for charge separation and electric field buildup processes. We present a statistical technique to bring out this effect (as well as the subsequent recovery) using lightning geo-location data, in this case with data from the National Lightning Detection Network (NLDN) and from the GLD360 Network. We use this statistical method to show that a lightning flash can remove an appreciable fraction of the built up charge, affecting the neighboring lightning activity for tens of seconds within a ~10 km radius. We find that our results correlate with timescales of electric field buildup in storms and suggest that the proposed statistical tool could be used to study the electrification of storms on a global scale. We find that this flash suppression effect is a strong function of flash type, flash polarity, cloud-to-ground flash multiplicity, the geographic location of lightning, and is proportional to NLDN model-derived peak stroke current. We characterize the spatial and temporal extent of the suppression effect as a function of these parameters and discuss various applications of our findings.

  7. Ganymede - A relationship between thermal history and crater statistics

    NASA Technical Reports Server (NTRS)

    Phillips, R. J.; Malin, M. C.

    1980-01-01

    An approach for factoring the effects of a planetary thermal history into a predicted set of crater statistics for an icy satellite is developed and forms the basis for subsequent data inversion studies. The key parameter is a thermal evolution-dependent critical time for which craters of a particular size forming earlier do not contribute to present-day statistics. An example is given for the satellite Ganymede and the effect of the thermal history is easily seen in the resulting predicted crater statistics. A preliminary comparison with the data, subject to the uncertainties in ice rheology and impact flux history, suggests a surface age of 3.8 × 10⁹ years and a radionuclide abundance of 0.3 times the chondritic value.

  8. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result showed that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
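    The additivity idea above — a composite-sediment property predicted as the mass-weighted linear sum of fraction-specific properties — reduces to a dot product. The site densities and mass fractions below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical reactive-site concentrations per grain-size fraction
    # (arbitrary units, fine -> coarse), and field-scale mass fractions.
    site_density  = np.array([4.0, 2.5, 1.2, 0.3])
    mass_fraction = np.array([0.2, 0.3, 0.4, 0.1])  # sums to 1

    # Additivity model: composite property = mass-weighted sum over fractions.
    composite = float(mass_fraction @ site_density)
    print(composite)
    ```

    The study's caveat is that rate constants, unlike capacity-type properties, are not directly additive in this way: the fraction-specific kinetics must be simulated and their desorption fluxes summed instead.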

  9. Providing peak river flow statistics and forecasting in the Niger River basin

    NASA Astrophysics Data System (ADS)

    Andersson, Jafet C. M.; Ali, Abdou; Arheimer, Berit; Gustafsson, David; Minoungou, Bernard

    2017-08-01

    Flooding is a growing concern in West Africa. Improved quantification of discharge extremes and associated uncertainties is needed to improve infrastructure design, and operational forecasting is needed to provide timely warnings. In this study, we use discharge observations, a hydrological model (Niger-HYPE) and extreme value analysis to estimate peak river flow statistics (e.g. the discharge magnitude with a 100-year return period) across the Niger River basin. To test the model's capacity of predicting peak flows, we compared 30-year maximum discharge and peak flow statistics derived from the model vs. derived from nine observation stations. The results indicate that the model simulates peak discharge reasonably well (on average + 20%). However, the peak flow statistics have a large uncertainty range, which ought to be considered in infrastructure design. We then applied the methodology to derive basin-wide maps of peak flow statistics and their associated uncertainty. The results indicate that the method is applicable across the hydrologically active part of the river basin, and that the uncertainty varies substantially depending on location. Subsequently, we used the most recent bias-corrected climate projections to analyze potential changes in peak flow statistics in a changed climate. The results are generally ambiguous, with consistent changes only in very few areas. To test the forecasting capacity, we ran Niger-HYPE with a combination of meteorological data sets for the 2008 high-flow season and compared with observations. The results indicate reasonable forecasting capacity (on average 17% deviation), but additional years should also be evaluated. We finish by presenting a strategy and pilot project which will develop an operational flood monitoring and forecasting system based on in-situ data, earth observations, modelling, and extreme statistics. 
In this way we aim to build capacity to ultimately improve resilience toward floods, protecting lives and infrastructure in the region.
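    The extreme value analysis referred to above typically fits a generalized extreme value (GEV) distribution to annual maxima and reads the T-year return level off the fitted (1 - 1/T) quantile. A sketch with a simulated stand-in series (the shape, location, and scale values are invented, not the basin's):

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(42)

    # Stand-in annual-maximum discharge series (m^3/s); a real analysis would
    # use observed or model-simulated annual maxima.
    annual_max = genextreme.rvs(c=-0.1, loc=800.0, scale=200.0,
                                size=60, random_state=rng)

    # Fit the GEV; the 100-year return level is the 0.99 quantile.
    c, loc, scale = genextreme.fit(annual_max)
    q100 = genextreme.ppf(1 - 1 / 100, c, loc, scale)
    print(q100)
    ```

    The large uncertainty range the abstract mentions would come from, e.g., bootstrapping this fit.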

  10. Resonance-assisted decay of nondispersive wave packets.

    PubMed

    Wimberger, Sandro; Schlagheck, Peter; Eltschka, Christopher; Buchleitner, Andreas

    2006-07-28

    We present a quantitative semiclassical theory for the decay of nondispersive electronic wave packets in driven, ionizing Rydberg systems. Statistically robust quantities are extracted combining resonance-assisted tunneling with subsequent transport across chaotic phase space and a final ionization step.

  11. The Dilemma in Black Higher Education: A Synthesis of Recent Statistics and Conceptual Realities.

    ERIC Educational Resources Information Center

    Lang, Marvel

    1988-01-01

    Discusses the effects of desegregation on the academic achievement of black elementary and high school students and the subsequent decline in black enrollment in and graduation from graduate and professional schools. (FMW)

  12. Publication bias in animal research presented at the 2008 Society of Critical Care Medicine Conference.

    PubMed

    Conradi, Una; Joffe, Ari R

    2017-07-07

    To obtain a direct measure of publication bias by determining subsequent full-paper publication (P) of studies reported in animal research abstracts presented at an international conference (A). We randomly selected (using a random-number generator) 100 A from the 2008 Society of Critical Care Medicine Conference. Using a data collection form and study manual, we recorded methodology and result variables from A. We searched PubMed and EMBASE to June 2015, and DOAJ and Google Scholar to May 2017, to screen for subsequent P. Methodology and result variables were recorded from P to determine changes in reporting from A. Predictors of P were examined using Fisher's Exact Test. 62% (95% CI 52-71%) of studies described in A were subsequently P, after a median of 19 [IQR 9-33.3] months from conference presentation. Reporting of studies in A was of low quality: randomized 27% (the method of randomization and allocation concealment not described), blinded 0%, sample-size calculation stated 0%, primary outcome specified 26%, numbers given with denominators 6%, and number of animals used stated 47%. Only being an orally presented (vs. poster presented) A (14/16 vs. 48/84, p = 0.025) predicted P. Reporting of studies in P was of poor quality: randomized 39% (the method of randomization and allocation concealment not described), likely blinded 6%, primary outcome specified 5%, sample size calculation stated 0%, numbers given with denominators 34%, and number of animals used stated 56%. Changes in reporting from A to P occurred: from non-randomized to randomized 19%, from non-blinded to blinded 6%, from negative to positive outcomes 8%, from having to not having a stated primary outcome 16%, and from non-statistically to statistically significant findings 37%. Post-hoc, using publication data, P was predicted by having positive outcomes (published 62/62, unpublished 33/38; p = 0.003), or statistically significant results (published 58/62, unpublished 20/38; p < 0.001). 
Only 62% (95% CI 52-71%) of animal research A are subsequently P; this was predicted by oral presentation of the A, finally having positive outcomes, and finally having statistically significant results. Publication bias is prevalent in critical care animal research.
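    The oral-versus-poster comparison above (14/16 vs. 48/84 published) is the kind of 2x2 comparison Fisher's Exact Test handles; a sketch using the counts from the abstract:

    ```python
    from scipy.stats import fisher_exact

    # Counts from the abstract: published vs unpublished, by presentation type
    # (oral: 14 of 16 published; poster: 48 of 84 published).
    table = [[14, 16 - 14],
             [48, 84 - 48]]

    odds_ratio, p = fisher_exact(table)
    print(odds_ratio, p)
    ```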

  13. Regional analysis of annual maximum rainfall using TL-moments method

    NASA Astrophysics Data System (ADS)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

Information related to distributions of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, regional frequency analysis using L-moments is revisited. Subsequently, an alternative regional frequency analysis using the TL-moments method is employed. The results from both methods were then compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and Z-test were employed in determining the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the method of TL-moments was more efficient for lower quantile estimation than the method of L-moments.
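As background for the record above: sample L-moments are computed from probability-weighted moments of the ordered data, and TL-moments generalize them by trimming extreme order statistics. A minimal sketch of the ordinary (untrimmed) case, with illustrative naming:

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments (l1, l2, l3) and L-skewness t3,
    via probability-weighted moments b0, b1, b2 of the ordered sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)  # ranks of the order statistics
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3, l3 / l2
```

For a symmetric sample such as [1, 2, 3, 4, 5], l1 equals the mean and the L-skewness is zero.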

  14. Dynamic association rules for gene expression data analysis.

    PubMed

    Chen, Shu-Chuan; Tsai, Tsung-Hsien; Chung, Cheng-Han; Li, Wen-Hsiung

    2015-10-14

The purpose of gene expression analysis is to look for associations between the regulation of gene expression levels and phenotypic variations. Such associations, based on gene expression profiles, have been used to determine whether the induction/repression of genes corresponds to phenotypic variations, including cell regulation, clinical diagnosis and drug development. Statistical analyses of microarray data have been developed to resolve the gene selection issue. However, these methods do not inform us of causality between genes and phenotypes. In this paper, we propose the dynamic association rule algorithm (DAR algorithm), which helps one efficiently select a subset of significant genes for subsequent analysis. The DAR algorithm is based on association rules from market basket analysis in marketing. We first propose a statistical way, based on constructing a one-sided confidence interval and hypothesis testing, to determine whether an association rule is meaningful. Based on the proposed statistical method, we then developed the DAR algorithm for gene expression data analysis. The method was applied to analyze four microarray datasets and one Next Generation Sequencing (NGS) dataset: the Mice Apo A1 dataset, the whole genome expression dataset of mouse embryonic stem cells, expression profiling of the bone marrow of leukemia patients, the Microarray Quality Control (MAQC) dataset, and the RNA-seq dataset of a mouse genomic imprinting study. A comparison of the proposed method with the t-test on the expression profiling of the bone marrow of leukemia patients was conducted. We developed a statistical way, based on the concept of confidence interval, to determine the minimum support and minimum confidence for mining association relationships among items. With the minimum support and minimum confidence, one can find significant rules in a single step. The DAR algorithm was then developed for gene expression data analysis.
Four gene expression datasets showed that the proposed DAR algorithm not only identified a set of differentially expressed genes that largely agreed with those identified by other methods, but also provided an efficient and accurate way to find influential genes of a disease. In this paper, the well-established association rule mining technique from marketing has been successfully modified to determine the minimum support and minimum confidence based on the concept of confidence intervals and hypothesis testing. It can be applied to gene expression data to mine significant association rules between gene regulation and phenotype. The proposed DAR algorithm provides an efficient way to find influential genes that underlie the phenotypic variance.
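The core quantities behind such association rules are support and confidence, and the paper screens rules with a one-sided confidence interval. A rough sketch using a normal-approximation lower bound (the paper's exact construction may differ; the function name and interface are illustrative):

```python
from math import sqrt

def rule_stats(transactions, antecedent, consequent, z=1.645):
    """Support and confidence of the rule antecedent -> consequent over a
    list of transactions (sets of items), plus a one-sided lower bound on
    the confidence (normal approximation; z=1.645 gives a 95% bound)."""
    n = len(transactions)
    n_a = sum(1 for t in transactions if antecedent <= t)
    n_ab = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = n_ab / n
    confidence = n_ab / n_a if n_a else 0.0
    half = z * sqrt(confidence * (1 - confidence) / n_a) if n_a else 0.0
    return support, confidence, confidence - half
```

A rule would be kept only if the lower bound exceeds a chosen minimum-confidence threshold.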

  15. Embedded DCT and wavelet methods for fine granular scalable video: analysis and comparison

    NASA Astrophysics Data System (ADS)

    van der Schaar-Mitrea, Mihaela; Chen, Yingwei; Radha, Hayder

    2000-04-01

Video transmission over bandwidth-varying networks is becoming increasingly important due to emerging applications such as streaming of video over the Internet. The fundamental obstacle in designing such systems resides in the varying characteristics of the Internet (i.e. bandwidth variations and packet-loss patterns). In MPEG-4, a new SNR scalability scheme, called Fine-Granular-Scalability (FGS), is currently under standardization; it is able to adapt in real-time (i.e. at transmission time) to Internet bandwidth variations. The FGS framework consists of a non-scalable motion-predicted base-layer and an intra-coded fine-granular scalable enhancement layer. For example, the base-layer can be coded using a DCT-based, MPEG-4 compliant, highly efficient video compression scheme. Subsequently, the difference between the original and decoded base-layer is computed, and the resulting FGS-residual signal is intra-frame coded with an embedded scalable coder. In order to achieve high coding efficiency when compressing the FGS enhancement layer, it is crucial to analyze the nature and characteristics of residual signals common to the SNR scalability framework (including FGS). In this paper, we present a thorough analysis of SNR residual signals by evaluating their statistical properties, compaction efficiency and frequency characteristics. The signal analysis revealed that the energy compaction of the DCT and wavelet transforms is limited and that the frequency characteristics of SNR residual signals decay rather slowly. Moreover, the blockiness artifacts of the low bit-rate coded base-layer result in artificial high frequencies in the residual signal. Subsequently, a variety of wavelet and embedded DCT coding techniques applicable to the FGS framework are evaluated and their results are interpreted based on the identified signal properties.
As expected from the theoretical signal analysis, the rate-distortion performances of the embedded wavelet and DCT-based coders are very similar. However, improved results can be obtained for the wavelet coder by deblocking the base-layer prior to the FGS residual computation. Based on the theoretical analysis and our measurements, we can conclude that for an optimal complexity versus coding-efficiency trade-off, only limited wavelet decomposition (e.g. 2 stages) needs to be performed for the FGS-residual signal. Also, it was observed that the good rate-distortion performance of a coding technique for a certain image type (e.g. natural still-images) does not necessarily translate into similarly good performance for signals with different visual characteristics and statistical properties.
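Energy compaction of a transform on a residual signal, as evaluated in the paper above, can be quantified as the fraction of signal energy captured by the largest coefficients. A sketch for an orthonormal DCT-II built explicitly from its basis (an illustrative metric, not the authors' exact measure):

```python
import numpy as np

def dct_ortho(x):
    """Orthonormal DCT-II of a 1-D signal, constructed from its cosine basis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    basis = np.cos(np.pi * k * (2 * m + 1) / (2 * n)) * np.sqrt(2.0 / n)
    basis[0] /= np.sqrt(2.0)  # DC row scaling for orthonormality
    return basis @ x

def compaction(signal, k):
    """Fraction of energy captured by the k largest-magnitude DCT coefficients."""
    energy = np.sort(dct_ortho(signal) ** 2)[::-1]
    return energy[:k].sum() / energy.sum()
```

Because the transform is orthonormal, total energy is preserved (Parseval), so the ratio is well defined; for a smooth signal most energy lands in a few low-frequency coefficients, whereas the paper found residual signals compact far less well.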

  16. The more total cognitive load is reduced by cues, the better retention and transfer of multimedia learning: A meta-analysis and two meta-regression analyses.

    PubMed

    Xie, Heping; Wang, Fuxing; Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan

    2017-01-01

Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between the cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while there were 29 articles containing 3,204 participants included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = -0.11, 95% CI = [-0.19, -0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that d_SCL for cueing significantly predicted d_retention for cueing (β = -0.70, 95% CI = [-1.02, -0.38], p < 0.001), as well as d_transfer for cueing (β = -0.60, 95% CI = [-0.92, -0.28], p < 0.001). Thus in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better retention and transfer of multimedia learning.
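The pooled effect sizes reported above are standardized mean differences. For a single study, Cohen's d with a large-sample confidence interval can be sketched as follows (illustrative naming; a full meta-analysis would pool many such estimates with inverse-variance weights):

```python
from math import sqrt

def cohens_d_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Cohen's d for two independent groups, with an approximate 95% CI
    using the standard large-sample formula for the standard error of d."""
    s_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, d - z * se, d + z * se
```

With equal means the estimate is 0 and the interval is symmetric about 0.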

  17. The more total cognitive load is reduced by cues, the better retention and transfer of multimedia learning: A meta-analysis and two meta-regression analyses

    PubMed Central

    Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan

    2017-01-01

    Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between the cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while there were 29 articles containing 3,204 participants included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = −0.11, 95% CI = [−0.19, −0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that dSCL for cueing significantly predicted dretention for cueing (β = −0.70, 95% CI = [−1.02, −0.38], p < 0.001), as well as dtransfer for cueing (β = −0.60, 95% CI = [−0.92, −0.28], p < 0.001). Thus in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better retention and transfer of multimedia learning. PMID:28854205

  18. Indigenous Mortality (Revealed): The Invisible Illuminated

    PubMed Central

    Ring, Ian; Arambula Solomon, Teshia G.; Gachupin, Francine C.; Smylie, Janet; Cutler, Tessa Louise; Waldon, John A.

    2015-01-01

    Inaccuracies in the identification of Indigenous status and the collection of and access to vital statistics data impede the strategic implementation of evidence-based public health initiatives to reduce avoidable deaths. The impact of colonization and subsequent government initiatives has been commonly observed among the Indigenous peoples of Australia, Canada, New Zealand, and the United States. The quality of Indigenous data that informs mortality statistics are similarly connected to these distal processes, which began with colonization. We discuss the methodological and technical challenges in measuring mortality for Indigenous populations within a historical and political context, and identify strategies for the accurate ascertainment and inclusion of Indigenous people in mortality statistics. PMID:25211754

  19. Antenatal perineal massage and subsequent perineal outcomes: a randomised controlled trial.

    PubMed

    Shipman, M K; Boniface, D R; Tefft, M E; McCloghry, F

    1997-07-01

    To study the effects of antenatal perineal massage on subsequent perineal outcomes at delivery. A randomised, single-blind prospective study. Department of Obstetrics and Gynaecology, Watford General Hospital. Eight hundred and sixty-one nulliparous women with singleton pregnancy and fulfilling criteria for entry to the trial between June 1994 and October 1995. Comparison of the group assigned to massage with the group assigned to no massage showed a reduction of 6.1% in second or third degree tears or episiotomies. This corresponded to tear rates of 75.1% in the no-massage group and 69.0% in the massage group (P = 0.073). There was a corresponding reduction in instrumental deliveries from 40.9% to 34.6% (P = 0.094). After adjustment for mother's age and infant's birthweight these reductions achieved statistical significance (P = 0.024 and P = 0.034, respectively). Analysis by mother's age showed a much larger benefit due to massage in those aged 30 and over and a smaller benefit in those under 30. Antenatal perineal massage appears to have some benefit in reducing second or third degree tears or episiotomies and instrumental deliveries. This effect was stronger in the age group 30 years and above.
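The unadjusted comparisons of tear and instrumental-delivery rates in the trial above are comparisons of two independent proportions. A textbook pooled z-test can be sketched as follows (illustrative only; the trial's adjusted analyses also controlled for maternal age and birthweight):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for the difference of two independent proportions,
    using the pooled proportion under the null of no difference."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

Equal observed proportions give z = 0 and p = 1; a higher rate in the first group gives a positive z.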

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guiochon, Georges A; Shalliker, R. Andrew

An algorithm was developed for 2DHPLC that automated the process of recognizing peaks, measuring their retention times, and subsequently plotting the information in a two-dimensional retention plane. Following the recognition of peaks, the software performed a series of statistical assessments of the separation performance, measuring, for example, correlation between dimensions, peak capacity and the percentage of usage of the separation space. Peak recognition was achieved by interpreting the first and second derivatives of each respective one-dimensional chromatogram to determine the 1D retention times of each solute and then compiling these retention times for each respective fraction 'cut'. Due to the nature of comprehensive 2DHPLC, adjacent cut fractions may contain peaks common to more than one cut fraction. The algorithm determined which components were common in adjacent cuts and subsequently calculated the peak maximum profile by interpolating the space between adjacent peaks. This algorithm was applied to the analysis of a two-dimensional separation of an apple flesh extract, separated in a first dimension comprising a cyano stationary phase with an aqueous/THF mobile phase and a second dimension comprising C18-Hydro with an aqueous/MeOH mobile phase. A total of 187 peaks were detected.
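Derivative-based peak recognition of the kind described can be sketched as follows: a maximum is flagged where the first derivative changes sign from positive to negative and the second derivative is negative. This is a simplified version without the cross-cut matching step; names are illustrative:

```python
import numpy as np

def find_peaks_derivative(y):
    """Indices of peak maxima in a 1-D trace, flagged where the first
    derivative goes from positive to negative with negative curvature."""
    dy = np.gradient(y)
    d2y = np.gradient(dy)
    peaks = []
    for i in range(1, len(y) - 1):
        if dy[i - 1] > 0 and dy[i + 1] < 0 and d2y[i] < 0:
            peaks.append(i)
    return peaks
```

On a synthetic chromatogram of two Gaussian peaks, the detected indices fall at the two peak maxima; real data would first need smoothing to suppress noise-induced sign changes.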

  1. Patient satisfaction as a predictor of return-to-provider behavior: analysis and assessment of financial implications.

    PubMed

    Garman, Andrew N; Garcia, Joanne; Hargreaves, Marcia

    2004-01-01

    Although customer loyalty is frequently cited as a benefit of patient satisfaction, an empirical link between the two has not, to our knowledge, ever been statistically established in a hospital setting. The goal of the present study was to evaluate the relationship between self-reported patient satisfaction measures and subsequent return to the provider for care at a large academic medical center. Data from all adult medical patients responding to a patient satisfaction survey distributed by a large midwestern academic medical center during fiscal year 1997 (n = 1485) were analyzed. Response patterns were examined as they related to whether patients returned to the provider during the subsequent 2-year period. Analyses suggested that return-to-provider was associated with only a minority of the satisfaction items (approx. 11%). All items showing a significant relationship measured perceptions of how well physicians and nurses attended to, and provided information to, patients and their families. Although the size of these relationships was generally small, the estimated financial implications are substantial. Other implications of these findings for planning effective service improvement initiatives as well as improving patient survey design are discussed.

  2. Medical management of deliberate drug overdose: a neglected area for suicide prevention?

    PubMed

    Gunnell, D; Ho, D; Murray, V

    2004-01-01

Overdoses account for a quarter of all suicides in England. The number of people who survive the immediate effects of their overdose long enough to reach medical attention, but who subsequently die in hospital, is unknown. The aim of this study was to determine the proportion of overdose suicides dying in hospital and describe their sociodemographic characteristics. Cross-sectional analysis of routinely collected Hospital Episode Statistics data for England (1997 to 1999) was used to identify hospital admissions for overdose among people aged 12+ and the outcome of these admissions. Between 1997 and 1999 there were 233 756 hospital admissions for overdose; 1149 (0.5%) of these ended in the death of the patient, and such deaths accounted for 28% [corrected] of all overdose suicides and 8% [corrected] of total suicides. The median time between admission and death was three days (interquartile range one to nine days). The most commonly identified drugs taken in fatal overdose were paracetamol compounds, benzodiazepines, and tricyclic/tetracyclic antidepressants. Around a quarter of all overdose suicide deaths occur subsequent to hospital admission. Further, more detailed research is required to discover whether better pre-admission and in-hospital medical management of those taking serious overdoses may prevent some of these deaths.

  3. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access. The Statistics Data Analysis application covers various topics in basic statistics along with a parametric statistical data analysis application. The output of this application system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.

  4. Nursing students' intentions to use research as a predictor of use one year post graduation: a prospective study.

    PubMed

    Forsman, Henrietta; Wallin, Lars; Gustavsson, Petter; Rudman, Ann

    2012-09-01

    Graduating nursing students are expected to have acquired the necessary skills to provide research-based care to patients. However, recent studies have shown that new graduate nurses report their extent of research use as relatively low. Because behavior intention is a well-known predictor of subsequent behavior, this gives reasons to further investigate graduating nursing students' intentions to use research in clinical practice after undergraduate study. To investigate graduating nursing students' intentions to use research in clinical practice and, furthermore, to investigate whether intention in itself and as a mediating variable can predict subsequent research use behavior in clinical practice one year post graduation. A follow-up study was performed of graduating nursing students in their final semester of undergraduate study (2006) and at one year post graduation (2008). Data were collected within the larger national survey LANE (Longitudinal Analysis of Nursing Education). A sample of 1319 respondents was prospectively followed. Graduating nursing students' intentions to use research instrumentally were studied as a predictor of their subsequent instrumental research use one year post graduation. A statistical full mediation model was tested to evaluate the effects of intention and factors from undergraduate study on subsequent research use in daily care. Thirty-four percent of the nursing students intended to use research on more than half or almost every working shift in their future clinical practice. Intention showed a direct effect on research use behavior. In addition, significant indirect effects on research use were shown for capability beliefs (regarding practicing the principles of evidence-based practice) and perceived support for research use (from campus and clinical education), where intention acted as a mediating factor for those effects. Students rated a modest level of intention to use research evidence. 
Intentions close to graduation acted as an essential predictor of subsequent research use behavior, both through a direct effect and as a mediating variable. These findings give support for designing future interventions aiming at influencing students' intention to use research to improve subsequent behavior. Focusing on strengthening students' capability beliefs and providing support for research use appear as promising target activities. Copyright © 2012 Elsevier Ltd. All rights reserved.
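The mediation structure tested in the study above (intention mediating the effect of capability beliefs and support on research use) follows the standard product-of-coefficients logic, which can be sketched with two least-squares regressions. This is a toy single-mediator version, not the study's full model; names are illustrative:

```python
import numpy as np

def mediation_effects(x, m, y):
    """Product-of-coefficients mediation sketch: regress the mediator m on
    x (path a), then y on both x and m (paths c' and b). The indirect
    effect of x on y through m is a * b; c' is the direct effect."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
    X2 = np.column_stack([np.ones_like(x), x, m])
    c_prime, b = np.linalg.lstsq(X2, y, rcond=None)[0][1:]
    return a * b, c_prime  # (indirect effect, direct effect)
```

On noise-free synthetic data built from known paths, the direct effect is recovered exactly and the indirect effect equals the slope of m on x times the m coefficient.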

  5. Connectomic correlates of response to treatment in first-episode psychosis

    PubMed Central

    Crossley, Nicolas A; Marques, Tiago Reis; Taylor, Heather; Chaddock, Chris; Dell’Acqua, Flavio; Reinders, Antje A T S; Mondelli, Valeria; DiForti, Marta; Simmons, Andrew; David, Anthony S; Kapur, Shitij; Pariante, Carmine M; Murray, Robin M; Dazzan, Paola

    2017-01-01

Connectomic approaches using diffusion tensor imaging have contributed to our understanding of brain changes in psychosis, and could provide further insights into the neural mechanisms underlying response to antipsychotic treatment. We here studied the brain network organization in patients at their first episode of psychosis, evaluating whether connectome-based descriptions of brain networks predict response to treatment, and whether they change after treatment. Seventy-six patients with a first episode of psychosis and 74 healthy controls were included. Thirty-three patients were classified as responders after 12 weeks of antipsychotic treatment. Baseline brain structural networks were built using whole-brain diffusion tensor imaging tractography, and analysed using graph analysis and network-based statistics to explore baseline characteristics of patients who subsequently responded to treatment. A subgroup of 43 patients was rescanned at the 12-week follow-up, to study connectomic changes over time in relation to treatment response. At baseline, those subjects who subsequently responded to treatment, compared to those that did not, showed higher global efficiency in their structural connectomes, a network configuration that theoretically facilitates the flow of information. We did not find specific connectomic changes related to treatment response after 12 weeks of treatment. Our data suggest that patients who have an efficiently-wired connectome at first onset of psychosis show a better subsequent response to antipsychotics. However, response is not accompanied by specific structural changes over time detectable with this method. PMID:28007987
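Global efficiency, the graph measure that distinguished responders at baseline in the study above, is the average inverse shortest-path length over all ordered node pairs. A minimal BFS-based sketch for an unweighted adjacency matrix (the study itself used weighted tractography networks):

```python
from collections import deque

def global_efficiency(adj):
    """Global efficiency of an unweighted graph given as an adjacency
    matrix: mean of 1/d(i, j) over ordered pairs i != j, where d is the
    BFS shortest-path length (disconnected pairs contribute 0)."""
    n = len(adj)
    total = 0.0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:  # breadth-first search from s
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))
```

A complete graph has efficiency 1; a 3-node path scores 5/6 because the end-to-end pair has distance 2.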

  6. Factors affecting the inter-annual to centennial timescale variability of Indian summer monsoon rainfall

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan

    2017-09-01

The Modes of Ocean Variability (MOV), namely the Atlantic Multidecadal Oscillation (AMO), Pacific Decadal Oscillation (PDO), and El Niño Southern Oscillation (ENSO), can have significant impacts on Indian Summer Monsoon Rainfall (ISMR) on different timescales. The timescales at which these MOV interact with ISMR, and the factors that may perturb their relationship with ISMR, need to be investigated. We employ De-trended Cross-Correlation Analysis (DCCA) and De-trended Partial-Cross-Correlation Analysis (DPCCA) to study the timescales of interaction of ISMR with AMO, PDO, and ENSO using an observational dataset (AD 1854-1999) and atmosphere-ocean-chemistry climate model simulations with SOCOL-MPIOM (AD 1600-1999). Further, this study uses De-trended Semi-Partial Cross-Correlation Analysis (DSPCCA) to address the relation between solar variability and the ISMR. We find statistically significant evidence of intrinsic correlations of ISMR with AMO, PDO, and ENSO on different timescales, consistent between model simulations and observations. However, the model fails to capture the modulation of the intrinsic relationship between ISMR and MOV by external signals. Our analysis indicates that AMO is a potential source of the non-stationary relationship between ISMR and ENSO. Furthermore, the pattern of correlation between ISMR and Total Solar Irradiance (TSI) is inconsistent between observations and model simulations. The observational dataset indicates a statistically insignificant negative intrinsic correlation between ISMR and TSI on decadal-to-centennial timescales. This statistically insignificant negative intrinsic correlation is transformed into a statistically significant positive extrinsic correlation by AMO on the 61-86-year timescale. We propose a new mechanism for the Sun-monsoon connection, which operates through AMO via changes in the summer (June-September; JJAS) meridional gradient of tropospheric temperatures (ΔTT_JJAS). 
There is a negative (positive) intrinsic correlation between ΔTT_JJAS (AMO) and TSI. The negative intrinsic correlation between ΔTT_JJAS and TSI indicates that high (low) solar activity weakens (strengthens) the meridional gradient of tropospheric temperature during the summer monsoon season, and subsequently the weak (strong) ΔTT_JJAS decreases (increases) the ISMR. However, the presence of AMO transforms the negative intrinsic relation between ΔTT_JJAS and TSI into a positive extrinsic one and strengthens the ISMR. We conclude that the positive relation between ISMR and solar activity, as found by other authors, is mainly due to the effect of AMO on ISMR.
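The DCCA coefficient used above can be sketched at a single window scale: integrate both series, detrend each non-overlapping window linearly, and form a correlation from the detrended covariances. This is a simplified version of Zebende's rho_DCCA with illustrative naming; the study also used partial and semi-partial variants:

```python
import numpy as np

def dcca_coefficient(x, y, window):
    """Detrended cross-correlation coefficient at one window size:
    cumulative sums of the centred series are detrended linearly in
    non-overlapping windows, and the residual (co)variances are combined
    into a correlation-like coefficient in [-1, 1]."""
    X = np.cumsum(np.asarray(x, float) - np.mean(x))
    Y = np.cumsum(np.asarray(y, float) - np.mean(y))
    t = np.arange(window)
    f_xy = f_xx = f_yy = 0.0
    for start in range(0, len(X) - window + 1, window):
        xs, ys = X[start:start + window], Y[start:start + window]
        # remove a linear trend within the window
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)
```

By construction the coefficient is 1 for a series against itself and -1 against its negation.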

  7. Factors affecting the inter-annual to centennial timescale variability of Indian summer monsoon rainfall

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan

    2018-06-01

The Modes of Ocean Variability (MOV), namely the Atlantic Multidecadal Oscillation (AMO), Pacific Decadal Oscillation (PDO), and El Niño Southern Oscillation (ENSO), can have significant impacts on Indian Summer Monsoon Rainfall (ISMR) on different timescales. The timescales at which these MOV interact with ISMR, and the factors that may perturb their relationship with ISMR, need to be investigated. We employ De-trended Cross-Correlation Analysis (DCCA) and De-trended Partial-Cross-Correlation Analysis (DPCCA) to study the timescales of interaction of ISMR with AMO, PDO, and ENSO using an observational dataset (AD 1854-1999) and atmosphere-ocean-chemistry climate model simulations with SOCOL-MPIOM (AD 1600-1999). Further, this study uses De-trended Semi-Partial Cross-Correlation Analysis (DSPCCA) to address the relation between solar variability and the ISMR. We find statistically significant evidence of intrinsic correlations of ISMR with AMO, PDO, and ENSO on different timescales, consistent between model simulations and observations. However, the model fails to capture the modulation of the intrinsic relationship between ISMR and MOV by external signals. Our analysis indicates that AMO is a potential source of the non-stationary relationship between ISMR and ENSO. Furthermore, the pattern of correlation between ISMR and Total Solar Irradiance (TSI) is inconsistent between observations and model simulations. The observational dataset indicates a statistically insignificant negative intrinsic correlation between ISMR and TSI on decadal-to-centennial timescales. This statistically insignificant negative intrinsic correlation is transformed into a statistically significant positive extrinsic correlation by AMO on the 61-86-year timescale. We propose a new mechanism for the Sun-monsoon connection, which operates through AMO via changes in the summer (June-September; JJAS) meridional gradient of tropospheric temperatures (ΔTT_JJAS). 
There is a negative (positive) intrinsic correlation between ΔTT_JJAS (AMO) and TSI. The negative intrinsic correlation between ΔTT_JJAS and TSI indicates that high (low) solar activity weakens (strengthens) the meridional gradient of tropospheric temperature during the summer monsoon season, and subsequently the weak (strong) ΔTT_JJAS decreases (increases) the ISMR. However, the presence of AMO transforms the negative intrinsic relation between ΔTT_JJAS and TSI into a positive extrinsic one and strengthens the ISMR. We conclude that the positive relation between ISMR and solar activity, as found by other authors, is mainly due to the effect of AMO on ISMR.

  8. German aircraft accident statistics, 1930

    NASA Technical Reports Server (NTRS)

    Weitzmann, Ludwig

    1932-01-01

    The investigation of all serious accidents, involving technical defects in the airplane or engine, is undertaken by the D.V.L. in conjunction with the imperial traffic minister and other interested parties. All accidents not clearly explained in the reports are subsequently cleared up.

  9. Investigating the association of alerts from a national mortality surveillance system with subsequent hospital mortality in England: an interrupted time series analysis.

    PubMed

    Cecil, Elizabeth; Bottle, Alex; Esmail, Aneez; Wilkinson, Samantha; Vincent, Charles; Aylin, Paul P

    2018-05-04

    To investigate the association between alerts from a national hospital mortality surveillance system and subsequent trends in relative risk of mortality. There is increasing interest in performance monitoring in the NHS. Since 2007, Imperial College London has generated monthly mortality alerts, based on statistical process control charts and using routinely collected hospital administrative data, for all English acute NHS hospital trusts. The impact of this system has not yet been studied. We investigated alerts sent to acute NHS hospital trusts in England in 2011-2013. We examined risk-adjusted mortality (relative risk) for all monitored diagnosis and procedure groups at a hospital trust level for 12 months prior to an alert and 23 months post alert. We used an interrupted time series design with a 9-month lag to estimate a trend prior to a mortality alert and the change in trend after, using generalised estimating equations. On average there was a 5% monthly increase in relative risk of mortality during the 12 months prior to an alert (95% CI 4% to 5%). Mortality risk fell, on average, by 61% (95% CI 56% to 65%) during the 9-month period immediately following an alert, then levelled to a slow decline, reaching on average the level of expected mortality within 18 months of the alert. Our results suggest an association between an alert notification and a reduction in the risk of mortality, although with less lag time than expected. It is difficult to determine any causal association. A proportion of alerts may be triggered by random variation alone and subsequent falls could simply reflect regression to the mean. Findings could also indicate that some hospitals are monitoring their own mortality statistics or other performance information, taking action prior to alert notification. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Neurophysiological correlates of depressive symptoms in young adults: A quantitative EEG study.

    PubMed

    Lee, Poh Foong; Kan, Donica Pei Xin; Croarkin, Paul; Phang, Cheng Kar; Doruk, Deniz

    2018-01-01

    There is an unmet need for practical and reliable biomarkers for mood disorders in young adults. Identifying the brain activity associated with the early signs of depressive disorders could have important diagnostic and therapeutic implications. In this study we sought to investigate the EEG characteristics of young adults with newly identified depressive symptoms. Based on the initial screening, a total of 100 participants (n = 50 euthymic, n = 50 depressive) underwent 32-channel EEG acquisition. Simple logistic regression and the C-statistic were used to explore whether EEG power could discriminate between the groups. The strongest EEG predictors of mood were then identified using multivariate logistic regression models. Simple logistic regression analysis with subsequent C-statistics revealed that only high-alpha and beta power originating from the left central cortex (C3) have a reliable discriminative value (ROC curve >0.7 (70%)) for differentiating the depressive group from the euthymic group. Multivariate regression analysis showed that the single most significant predictor of group (depressive vs. euthymic) is the high-alpha power over C3 (p = 0.03). The present findings suggest that EEG is a useful tool in the identification of neurophysiological correlates of depressive symptoms in young adults with no previous psychiatric history. Our results could guide future studies investigating the early neurophysiological changes and surrogate outcomes in depression. Copyright © 2017 Elsevier Ltd. All rights reserved.
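    The C-statistic used above is the area under the ROC curve. A minimal sketch, with hypothetical C3 high-alpha power values rather than the study's data:

```python
def c_statistic(scores_pos, scores_neg):
    """C-statistic (ROC AUC): the probability that a randomly chosen case
    scores higher than a randomly chosen control (ties count as 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical C3 high-alpha power values for the two groups
depressive = [4.1, 3.8, 4.5, 3.9, 4.2]
euthymic = [3.2, 3.6, 3.0, 3.9, 3.4]
auc = c_statistic(depressive, euthymic)
print(round(auc, 2))  # 0.94, above the 0.7 threshold used in the study
```

A C-statistic of 0.5 means no discrimination; values above roughly 0.7, as reported here, are conventionally considered acceptable discrimination.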

  11. Differences in genotoxic activity of alpha-Ni3S2 on human lymphocytes from nickel-hypersensitized and nickel-unsensitized donors.

    PubMed

    Arrouijal, F Z; Marzin, D; Hildebrand, H F; Pestel, J; Haguenoer, J M

    1992-05-01

    The genotoxic activity of alpha-Ni3S2 was assessed on human lymphocytes from nickel-hypersensitized (SSL) and nickel-unsensitized (USL) subjects. Three genotoxicity tests were performed: the sister chromatid exchange (SCE) test, the metaphase analysis test and the micronucleus test. (i) The SCE test (3-100 micrograms/ml) showed a weak but statistically significant increase in the number of SCE in both lymphocyte types with respect to controls, USL presenting a slightly higher SCE incidence but only at one concentration. (ii) The metaphase analysis test demonstrated a high dose-dependent clastogenic activity of alpha-Ni3S2 in both lymphocyte types. The frequency of chromosomal anomalies was significantly higher in USL than in SSL for all concentrations applied. (iii) The micronucleus test confirmed the dose-dependent clastogenic activity of alpha-Ni3S2 and the differences already observed between USL and SSL, i.e. the number of cells with micronuclei was statistically higher in USL. Finally, the incorporation study with alpha-63Ni3S2 showed a higher uptake of its solubilized fraction by USL. This allows an explanation of the different genotoxic action of nickel on the two cell types. In this study we demonstrated that hypersensitivity has an influence on the incorporation of alpha-Ni3S2 and subsequently on the different induction of chromosomal aberrations in human lymphocytes.

  12. D-Light on promoters: a client-server system for the analysis and visualization of cis-regulatory elements

    PubMed Central

    2013-01-01

    Background The binding of transcription factors to DNA plays an essential role in the regulation of gene expression. Numerous experiments have elucidated binding sequences which subsequently have been used to derive statistical models for predicting potential transcription factor binding sites (TFBS). The rapidly increasing amount of genome sequence data requires sophisticated computational approaches to manage and query experimental and predicted TFBS data in the context of other epigenetic factors and across different organisms. Results We have developed D-Light, a novel client-server software package to store and query large amounts of TFBS data for any number of genomes. Users can add small-scale data to the server database and query them in a large-scale, genome-wide promoter context. The client is implemented in Java and provides simple graphical user interfaces and data visualization. Here we also performed a statistical analysis showing what a user can expect for certain parameter settings, and we illustrate the usage of D-Light with the help of a microarray data set. Conclusions D-Light is an easy-to-use software tool to integrate, store and query annotation data for promoters. A public D-Light server, the client and server software for local installation and the source code under the GNU GPL license are available at http://biwww.che.sbg.ac.at/dlight. PMID:23617301
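    Statistical TFBS prediction of the kind mentioned above typically scores genomic windows against a log-odds position weight matrix (PWM). A minimal sketch with a hypothetical 3-bp motif (not D-Light's actual model; the counts and sequence are invented):

```python
import math

# Hypothetical per-position base counts (pseudocounts included) for a 3-bp
# motif aligned from 10 example binding sequences; the consensus is "ACG".
counts = {'A': [7, 1, 1], 'C': [1, 7, 1], 'G': [1, 1, 7], 'T': [1, 1, 1]}
TOTAL, BACKGROUND = 10.0, 0.25  # sequences per column; uniform base frequency

def pwm_score(site):
    """Log-odds score of a candidate site against the PWM."""
    return sum(math.log2((counts[b][i] / TOTAL) / BACKGROUND)
               for i, b in enumerate(site))

def scan(seq, motif_len=3):
    """Index of the highest-scoring window in a sequence."""
    return max(range(len(seq) - motif_len + 1),
               key=lambda i: pwm_score(seq[i:i + motif_len]))

best = scan("TTACGTT")
print(best)  # 2: the window "ACG" matches the consensus
```

Real pipelines additionally calibrate a score threshold against a background model, which is essentially the "what a user can expect for certain parameter settings" analysis mentioned above.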

  13. Statistical detection of geographic clusters of resistant Escherichia coli in a regional network with WHONET and SaTScan

    PubMed Central

    Park, Rachel; O'Brien, Thomas F.; Huang, Susan S.; Baker, Meghan A.; Yokoe, Deborah S.; Kulldorff, Martin; Barrett, Craig; Swift, Jamie; Stelling, John

    2016-01-01

    Objectives While antimicrobial resistance threatens the prevention, treatment, and control of infectious diseases, systematic analysis of routine microbiology laboratory test results worldwide can provide alerts to new threats and promote timely responses. This study explores statistical algorithms for recognizing geographic clustering of multi-resistant microbes within a healthcare network and monitoring the dissemination of new strains over time. Methods Escherichia coli antimicrobial susceptibility data from a three-year period stored in WHONET were analyzed across ten facilities in a healthcare network utilizing SaTScan's spatial multinomial model with two models for defining geographic proximity. We explored geographic clustering of multi-resistance phenotypes within the network and changes in clustering over time. Results Geographic clusters identified by the latitude/longitude and non-parametric facility-grouping models were similar, while the latter offers greater flexibility and generalizability. Iterative application of the clustering algorithms suggested the possible recognition of the initial appearance of invasive E. coli ST131 in the clinical database of a single hospital and its subsequent dissemination to others. Conclusion Systematic analysis of routine antimicrobial susceptibility test results supports the recognition of geographic clustering of microbial phenotypic subpopulations with WHONET and SaTScan, and iterative application of these algorithms can detect the initial appearance of a strain in a region and its subsequent dissemination, prompting early investigation, response, and containment measures. PMID:27530311
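    SaTScan's scan statistics compare candidate clusters against the network-wide rate. As a minimal sketch, the log-likelihood ratio of the simpler Bernoulli model (the study uses the multinomial model, and SaTScan assesses significance by Monte Carlo simulation, both omitted here) with hypothetical counts:

```python
import math

def xlogx(k, p):
    """k * log(p), with the 0 * log(0) = 0 convention."""
    return k * math.log(p) if k > 0 else 0.0

def scan_llr(c, n, C, N):
    """Bernoulli scan-statistic log-likelihood ratio for one candidate
    cluster: c resistant of n isolates inside it, C of N network-wide."""
    p_in, p_out = c / n, (C - c) / (N - n)
    if p_in <= p_out:  # only elevated clusters are of interest
        return 0.0
    ll_alt = (xlogx(c, p_in) + xlogx(n - c, 1 - p_in)
              + xlogx(C - c, p_out) + xlogx(N - n - (C - c), 1 - p_out))
    ll_null = xlogx(C, C / N) + xlogx(N - C, 1 - C / N)
    return ll_alt - ll_null

# Hypothetical (resistant, total) E. coli counts at three facilities
facilities = {"A": (30, 100), "B": (5, 100), "C": (6, 100)}
C_all = sum(c for c, _ in facilities.values())
N_all = sum(n for _, n in facilities.values())
best = max(facilities, key=lambda f: scan_llr(*facilities[f], C_all, N_all))
print(best)  # facility A stands out against the network-wide rate
```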

  14. Cortical mechanisms for the segregation and representation of acoustic textures.

    PubMed

    Overath, Tobias; Kumar, Sukhbinder; Stewart, Lauren; von Kriegstein, Katharina; Cusack, Rhodri; Rees, Adrian; Griffiths, Timothy D

    2010-02-10

    Auditory object analysis requires two fundamental perceptual processes: the definition of the boundaries between objects, and the abstraction and maintenance of an object's characteristic features. Although it is intuitive to assume that the detection of the discontinuities at an object's boundaries precedes the subsequent precise representation of the object, the specific underlying cortical mechanisms for segregating and representing auditory objects within the auditory scene are unknown. We investigated the cortical bases of these two processes for one type of auditory object, an "acoustic texture," composed of multiple frequency-modulated ramps. In these stimuli, we independently manipulated the statistical rules governing (1) the frequency-time space within individual textures (comprising ramps with a given spectrotemporal coherence) and (2) the boundaries between textures (adjacent textures with different spectrotemporal coherences). Using functional magnetic resonance imaging, we show mechanisms defining boundaries between textures with different coherences in primary and association auditory cortices, whereas texture coherence is represented only in association cortex. Furthermore, participants' superior detection of boundaries across which texture coherence increased (as opposed to decreased) was reflected in a greater neural response in auditory association cortex at these boundaries. The results suggest a hierarchical mechanism for processing acoustic textures that is relevant to auditory object analysis: boundaries between objects are first detected as a change in statistical rules over frequency-time space, before a representation that corresponds to the characteristics of the perceived object is formed.

  15. Tract-based spatial statistics analysis of white matter changes in children with anisometropic amblyopia.

    PubMed

    Li, Qian; Zhai, Liying; Jiang, Qinying; Qin, Wen; Li, Qingji; Yin, Xiaohui; Guo, Mingxia

    2015-06-15

    Amblyopia is a neurological disorder of vision that follows abnormal binocular interaction or visual deprivation during early life. Previous studies have reported multiple functional or structural cortical alterations. Although white matter has also been studied, it remains unclear which fasciculi are affected by amblyopia. In the present study, tract-based spatial statistics analysis was applied to diffusion tensor imaging (DTI) to investigate potential diffusion changes of neural tracts in anisometropic amblyopia. Fractional anisotropy (FA) values were calculated and compared between 20 amblyopic children and 18 healthy age-matched controls. In contrast to the controls, significant decreases in FA values were found in the right optic radiation (OR), left inferior longitudinal fasciculus/inferior fronto-occipital fasciculus (ILF/IFO) and right superior longitudinal fasciculus (SLF) in the amblyopic group. Furthermore, FA values of these identified tracts showed a positive correlation with visual acuity. It can be inferred that abnormal visual input not only hinders the OR from developing properly, but also impairs fasciculi associated with the dorsal and ventral visual pathways, which may be responsible for the amblyopic deficiency in object discrimination and stereopsis. Increased FA was detected in the right posterior part of the corpus callosum (CC) with a medium effect size, which may be due to a compensation effect. DTI with subsequent measurement of FA is a useful tool for investigating neuronal tract involvement in amblyopia. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Methodological and Statistical Quality in Research Evaluating Nutritional Attitudes in Sports.

    PubMed

    Kouvelioti, Rozalia; Vagenas, George

    2015-12-01

    The assessment of dietary attitudes and behaviors provides information of interest to sports nutritionists. Although there has been little analysis of the quality of research undertaken in this field, there is evidence of a number of flaws and methodological concerns in some of the studies in the available literature. This review undertook a systematic assessment of the attributes of research assessing the nutritional knowledge and attitudes of athletes and coaches. Sixty questionnaire-based studies were identified by a search of official databases using specific key terms with subsequent analysis by certain inclusion-exclusion criteria. These studies were then analyzed using 33 research quality criteria related to the methods, questionnaires, and statistics used. We found that many studies did not provide information on critical issues such as research hypotheses (92%), the gaining of ethics approval (50%) or informed consent (35%), or acknowledgment of limitations in the implementation of studies or interpretation of data (72%). Many of the samples were nonprobabilistic (85%) and rather small (42%). Many questionnaires were of unknown origin (30%), validity (72%), and reliability (70%) and resulted in low (≤ 60%) response rates (38%). Pilot testing was not undertaken in 67% of the studies. Few studies dealt with sample size (2%), power (3%), assumptions (7%), confidence intervals (3%), or effect sizes (3%). Improving some of these problems and deficits may enhance future research in this field.

  17. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    PubMed Central

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202
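    As a toy illustration of combining sequences and collection dates (the dates and 4-base "genomes" below are invented, and this greedy nearest-sequence rule is far simpler than outbreaker's Bayesian model), one can link each case to the most genetically similar earlier case:

```python
# Hypothetical cases: (collection date, pathogen sequence)
cases = {
    "case1": ("2003-02-25", "AAAA"),
    "case2": ("2003-03-01", "AAAT"),
    "case3": ("2003-03-05", "AATT"),
}

def hamming(a, b):
    """Number of differing positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def infer_ancestor(case):
    """Earlier case with the fewest mutations, or None for an index case."""
    date, seq = cases[case]
    earlier = [c for c in cases if cases[c][0] < date]  # ISO dates sort lexically
    return min(earlier, key=lambda c: hamming(cases[c][1], seq)) if earlier else None

print(infer_ancestor("case3"))  # case2: one mutation away, versus two from case1
```

outbreaker instead samples full transmission trees, which is what lets it detect unobserved cases and multiple introductions as described above.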

  18. Landslide and flood hazard assessment in urban areas of Levoča region (Eastern Slovakia)

    NASA Astrophysics Data System (ADS)

    Magulova, Barbora; Caporali, Enrica; Bednarik, Martin

    2010-05-01

    The case study presents the use of statistical methods and analysis tools for hazard assessment of "urbanization units", implemented in a Geographic Information Systems (GIS) environment. As a case study, the Levoča region (Slovakia) is selected. The region, with a total area of about 351 km2, is widely affected by landslides and floods. The problem, for small urbanization areas, is nowadays particularly significant from the socio-economic point of view. It is also considered an increasing problem, mainly because of climate change and more frequent extreme rainfall events. The geo-hazards are evaluated using a multivariate analysis. The landslide hazard assessment is based on the comparison and subsequent statistical elaboration of territorial dependence among different input factors influencing the instability of the slopes. In particular, five factors influencing slope stability are evaluated, i.e. lithology, slope aspect, slope angle, hypsographic level and present land use. As a result, a new landslide susceptibility map is compiled and different zones of stable, dormant and non-stable areas are defined. For the flood hazard map, a detailed digital elevation model is created. A composite index of flood hazard is derived from topography, land cover and pedology related data. To estimate flood discharge, time series of stream flow and precipitation measurements are used. The assessment results are prognostic maps of landslide hazard and flood hazard, which present an optimal basis for urbanization planning.
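    There is no edit body here; see separate replace.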

  19. Infinite von Mises-Fisher Mixture Modeling of Whole Brain fMRI Data.

    PubMed

    Røge, Rasmus E; Madsen, Kristoffer H; Schmidt, Mikkel N; Mørup, Morten

    2017-10-01

    Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises-Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians subsequently normalized. Thus, when performing model selection, the two models are not in agreement. Analyzing multisubject whole brain resting-state fMRI data from healthy adult subjects, we find that the vMF mixture model is considerably more reliable than the gaussian mixture model when comparing solutions across models trained on different groups of subjects, and again we find that the two models disagree on the optimal number of components. The analysis indicates that the fMRI data support more than a thousand clusters, and we confirm this is not a result of overfitting by demonstrating better prediction on data from held-out subjects. Our results highlight the utility of using directional statistics to model standardized fMRI data and demonstrate that whole brain segmentation of fMRI data requires a very large number of functional units in order to adequately account for the discernible statistical patterns in the data.
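    Standardizing a time series to zero mean and unit norm places it on the unit hypersphere, which is where the vMF distribution lives. For the 2-sphere (p = 3) the vMF normalizer has a closed form, kappa / (4*pi*sinh(kappa)); a minimal sketch with hypothetical unit vectors (the paper's collapsed MCMC inference is not shown):

```python
import math

def normalize(v):
    """Project a vector onto the unit hypersphere."""
    n = math.sqrt(sum(a * a for a in v))
    return [a / n for a in v]

def vmf_logpdf_3d(x, mu, kappa):
    """Log-density of a von Mises-Fisher distribution on the 2-sphere (p = 3):
    the normalizer is kappa / (4*pi*sinh(kappa)), and density grows with the
    dot product between x and the mean direction mu."""
    log_c = math.log(kappa) - math.log(4 * math.pi * math.sinh(kappa))
    return log_c + kappa * sum(a * b for a, b in zip(x, mu))

mu = normalize([1.0, 1.0, 0.0])    # hypothetical cluster mean direction
near = normalize([0.9, 1.1, 0.1])  # close to the mean direction
far = normalize([-1.0, 0.0, 0.2])  # nearly opposite
print(vmf_logpdf_3d(near, mu, 10.0) > vmf_logpdf_3d(far, mu, 10.0))  # True
```

The concentration parameter kappa plays the role that an (isotropic) inverse variance plays in a Gaussian mixture; the letter's argument is that fitting Gaussians to such sphere-bound data misrepresents it.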

  20. Bioclimatic Classification of Northeast Asia for climate change response

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Jeon, S. W.; Lim, C. H.

    2016-12-01

    As climate change intensifies, we should monitor changes in biodiversity and the distribution of species in order to handle the crisis and, where possible, take advantage of it. The development of a bioclimatic map, which classifies land into homogeneous zones with similar environmental properties, is the first step in establishing a strategy. Statistically derived classifications of land provide useful spatial frameworks to support ecosystem research, monitoring and policy decisions. Many countries are trying to make this kind of map and actively utilize it for ecosystem conservation and management. However, Northeast Asia, including North Korea, lacks detailed environmental information and has not built an environmental classification map. Therefore, this study presents a bioclimatic map of Northeast Asia based on statistical clustering of bioclimate data. Bioclim data ver. 1.4, provided by WorldClim, were considered for inclusion in the model. Eight of the most relevant climate variables were selected by correlation analysis, based on previous studies. Principal Components Analysis (PCA) was used to capture 86% of the variation in three independent dimensions, which were subsequently clustered using the ISODATA algorithm. Classifications of Northeast Asia with 29, 35, and 50 bioclimatic zones were produced at a 30' resolution. To assess accuracy, the correlation coefficient was calculated between the first principal component of the classification variables and a vegetation index, Gross Primary Production (GPP); the Pearson correlation coefficient was about 0.5. This study constructed a high-resolution bioclimatic map of Northeast Asia by statistical methods, but to better reflect reality, a wider variety of climate variables should be considered. Further studies should also perform more quantitative and qualitative validation in various ways. This map could then be used more effectively to support decision making on climate change adaptation.
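    The PCA-then-cluster workflow above can be illustrated with a stdlib k-means sketch on hypothetical standardized climate scores (ISODATA additionally splits and merges clusters as it iterates, which plain k-means omits):

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign points to nearest center, then recompute centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[j].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical per-cell scores on the first two principal components,
# forming two obvious bioclimatic zones
pts = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (2.0, 2.1), (2.1, 1.9), (1.9, 2.0)]
centers, groups = kmeans(pts, 2)
print(sorted(len(g) for g in groups))  # [3, 3]
```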

  1. The impact of clinical use on the torsional behavior of Reciproc and WaveOne instruments.

    PubMed

    Magalhães, Rafael Rodrigues Soares de; Braga, Lígia Carolina Moreira; Pereira, Érika Sales Joviano; Peixoto, Isabella Faria da Cunha; Buono, Vicente Tadeu Lopes; Bahia, Maria Guiomar de Azevedo

    2016-01-01

    The aim of this study was to assess the influence of clinical use, in vivo, on the torsional behavior of Reciproc and WaveOne instruments, considering the possibility that they degrade with use. Diameter at each millimeter, pitch length, and area at 3 mm from the tip were determined for both types of instruments. Twenty-four instruments, size 25, 0.08 taper, of each system were divided into two groups (n=12 each): Control Group (CG), in which new Reciproc (RC) and WaveOne Primary (WO) instruments were tested in torsion until rupture based on ISO 3630-1; and Experimental Group (EG), in which each new instrument was clinically used to clean and shape the root canals of one molar. After clinical use, the instruments were analyzed using optical and scanning electron microscopy and subsequently tested in torsion until fracture. Data were analyzed using one-way analysis of variance at α = .05. WO instruments showed significantly higher mean values of cross-sectional area A3 (P=0.000) and smaller pitch lengths than RC instruments, with no statistically significant difference in the diameter at D3 (P=0.521). No significant difference in torsional resistance between the new RC and WO instruments (P=0.134) was found. Clinical use resulted in a tendency toward reduction in the maximum torque of the analyzed instruments, but no statistically significant difference was observed between them (P=0.327). During root canal preparation two RC instruments fractured, and SEM analysis revealed longitudinal and transversal cracks in both RC and WO instruments. After clinical use, no statistically significant reduction in the torsional resistance was observed.
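    The one-way ANOVA used above reduces to a simple F ratio; a stdlib sketch with hypothetical torque values (with two groups, F is just the square of the two-sample t statistic):

```python
from statistics import mean

def anova_f(*groups):
    """One-way ANOVA F statistic: between-group MS over within-group MS."""
    grand = mean(x for g in groups for x in g)
    k, n = len(groups), sum(len(g) for g in groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical maximum-torque values (N·cm) for new vs clinically used files
new_files  = [1.20, 1.15, 1.25, 1.18, 1.22]
used_files = [1.10, 1.05, 1.12, 1.08, 1.15]
f_val = anova_f(new_files, used_files)
print(f_val > 1)  # large F suggests a real group difference
```

The computed F would then be compared against the critical value, roughly 5.32 for F(1, 8) at α = .05, to decide significance.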

  2. Additive scales in degenerative disease--calculation of effect sizes and clinical judgment.

    PubMed

    Riepe, Matthias W; Wilkinson, David; Förstl, Hans; Brieden, Andreas

    2011-12-16

    The therapeutic efficacy of an intervention is often assessed in clinical trials by scales measuring multiple diverse activities that are added to produce a cumulative global score. Medical communities and health care systems subsequently use these data to calculate pooled effect sizes to compare treatments. This is done because major doubt has been cast over the clinical relevance of statistically significant findings that rely on p values, which can reflect chance findings. Hence, in an aim to overcome this, pooling the results of clinical studies into a meta-analysis with a statistical calculus has been assumed to be a more definitive way of deciding efficacy. We simulate the therapeutic effects as measured with additive scales in patient cohorts with different disease severity and assess the limitations of effect size calculations for additive scales, which we prove mathematically. We demonstrate that the major problem, which cannot be overcome by current numerical methods, is the complex nature and neurobiological foundation of clinical psychiatric endpoints in particular and additive scales in general. This is particularly relevant for endpoints used in dementia research. 'Cognition' is composed of functions such as memory, attention, orientation and many more. These individual functions decline in varied and non-linear ways. Here we demonstrate that with progressive diseases cumulative values from multidimensional scales are subject to distortion by the limitations of the additive scale. The non-linearity of the decline of function impedes the calculation of effect sizes based on cumulative values from these multidimensional scales. Statistical analysis needs to be guided by boundaries of the biological condition. Alternatively, we suggest a different approach avoiding the error imposed by over-analysis of cumulative global scores from additive scales.
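    The distortion argument can be illustrated numerically: the same slowing of progression yields different effect sizes depending on disease severity, purely because the summed subscales decline non-linearly. All functions and numbers below are hypothetical, not the paper's simulation:

```python
import math
from statistics import mean, stdev

def cumulative_score(t):
    """Two hypothetical sub-functions declining non-linearly at different
    rates, summed into one 'global' score as in multidimensional scales."""
    memory = 30 * math.exp(-0.30 * t)
    attention = 30 * math.exp(-0.05 * t)
    return memory + attention

def effect_size(t_treated, t_untreated, noise=(-1.0, 0.0, 1.0)):
    """Cohen's d between treated and untreated cohorts that differ only in
    how far disease progression t has advanced."""
    a = [cumulative_score(t_treated) + e for e in noise]
    b = [cumulative_score(t_untreated) + e for e in noise]
    sp = ((stdev(a) ** 2 + stdev(b) ** 2) / 2) ** 0.5
    return (mean(a) - mean(b)) / sp

# The same half-unit slowing of progression gives very different effect
# sizes in a mild (t = 2) versus a severe (t = 10) cohort.
mild = effect_size(2.0, 2.5)
severe = effect_size(10.0, 10.5)
print(mild > severe)  # True
```

Identical clinical benefit, different effect size: exactly the distortion the abstract warns arises when cumulative additive scores are pooled across severities.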

  3. MUC4: a novel prognostic factor of oral squamous cell carcinoma.

    PubMed

    Hamada, Tomofumi; Wakamatsu, Tsunenobu; Miyahara, Mayumi; Nagata, Satoshi; Nomura, Masahiro; Kamikawa, Yoshiaki; Yamada, Norishige; Batra, Surinder K; Yonezawa, Suguru; Sugihara, Kazumasa

    2012-04-15

    MUC4 mucin is now known to be expressed in various normal and cancer tissues. We have previously reported that MUC4 expression is a novel prognostic factor in several malignant tumors; however, it has not been investigated in oral squamous cell carcinoma (OSCC). The aim of our study is to evaluate the prognostic significance of MUC4 expression in OSCC. We examined the expression profile of MUC4 in OSCC tissues from 150 patients using immunohistochemistry. Its prognostic significance in OSCC was statistically analyzed. MUC4 was expressed in 61 of the 150 patients with OSCC. MUC4 expression was significantly correlated with higher T classification (p = 0.0004), positive nodal metastasis (p = 0.049), advanced tumor stage (p = 0.002), diffuse invasion of cancer cells (p = 0.004) and patient's death (p = 0.004) in OSCC. Multivariate analysis showed that MUC4 expression (p = 0.011), tumor location (p = 0.032) and diffuse invasion (p = 0.009) were statistically significant risk factors. Backward stepwise multivariate analysis demonstrated MUC4 expression (p = 0.0015) and diffuse invasion (p = 0.018) to be statistically significant independent risk factors of poor survival in OSCC. The disease-free and overall survival of patients with MUC4 expression was significantly worse than that of patients without MUC4 expression (p < 0.0001 and p = 0.0001). In addition, MUC4 expression was a significant risk factor for local recurrence and subsequent nodal metastasis in OSCC (p = 0.017 and p = 0.0001). We report for the first time that MUC4 overexpression is an independent factor for poor prognosis of patients with OSCC; therefore, patients with OSCC showing positive MUC4 expression should be followed up carefully. Copyright © 2011 UICC.

  4. Just add water: Accuracy of analysis of diluted human milk samples using mid-infrared spectroscopy.

    PubMed

    Smith, R W; Adamkin, D H; Farris, A; Radmacher, P G

    2017-01-01

    To determine the maximum dilution of human milk (HM) that yields reliable results for protein, fat and lactose when analyzed by mid-infrared spectroscopy. De-identified samples of frozen HM were obtained. Milk was thawed and warmed (40°C) prior to analysis. Undiluted (native) HM was analyzed by mid-infrared spectroscopy for macronutrient composition: total protein (P), fat (F), carbohydrate (C); Energy (E) was calculated from the macronutrient results. Subsequent analyses were done with 1 : 2, 1 : 3, 1 : 5 and 1 : 10 dilutions of each sample with distilled water. Additional samples were sent to a certified lab for external validation. Quantitatively, F and P showed statistically significant but clinically non-critical differences in 1 : 2 and 1 : 3 dilutions. Differences at higher dilutions were statistically significant and deviated from native values enough to render those dilutions unreliable. External validation studies also showed statistically significant but clinically unimportant differences at 1 : 2 and 1 : 3 dilutions. The Calais Human Milk Analyzer can be used with HM samples diluted 1 : 2 and 1 : 3 and return results within 5% of values from undiluted HM. At a 1 : 5 or 1 : 10 dilution, however, results vary as much as 10%, especially with P and F. At the 1 : 2 and 1 : 3 dilutions these differences appear to be insignificant in the context of nutritional management. However, the accuracy and reliability of the 1 : 5 and 1 : 10 dilutions are questionable.
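    The dilution check itself is simple arithmetic: scale the diluted reading back by its dilution factor and compare with the native value. A sketch with hypothetical protein readings (not the study's measurements):

```python
def within_tolerance(native, diluted_reading, factor, tol=0.05):
    """Scale a diluted reading back by its dilution factor and check it
    against the native (undiluted) value within a relative tolerance."""
    estimate = diluted_reading * factor
    return abs(estimate - native) / native <= tol

# Hypothetical protein readings (g/dL): native milk vs 1:3 and 1:10 dilutions
print(within_tolerance(1.20, 0.41, 3))    # True: 1.23 is within 5% of 1.20
print(within_tolerance(1.20, 0.105, 10))  # False: 1.05 is 12.5% off
```

This mirrors the study's finding: at 1:2 and 1:3 the scaled-back values stay within about 5% of native, while 1:5 and 1:10 drift too far to be reliable.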

  5. Applications of LANDSAT data to the integrated economic development of Mindoro, Philippines

    NASA Technical Reports Server (NTRS)

    Wagner, T. W.; Fernandez, J. C.

    1977-01-01

    LANDSAT data is seen as providing essential up-to-date resource information for the planning process. LANDSAT data of Mindoro Island in the Philippines was processed to provide thematic maps showing patterns of agriculture, forest cover, terrain, wetlands and water turbidity. A hybrid approach using both supervised and unsupervised classification techniques resulted in 30 different scene classes, which were subsequently color-coded and mapped at a scale of 1:250,000. In addition, intensive image analysis is being carried out to evaluate the images. The images, maps, and aerial statistics are being used to provide data to seven technical departments in planning the economic development of Mindoro. Multispectral aircraft imagery was collected to complement the application of LANDSAT data and validate the classification results.

  6. Validation of a Deterministic Vibroacoustic Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Caimi, Raoul E.; Margasahayam, Ravi

    1997-01-01

    This report documents the recently completed effort involving validation of a deterministic theory for the random vibration problem of predicting the response of launch pad structures in the low-frequency range (0 to 50 hertz). Statistical Energy Analysis (SEA) methods are not suitable in this range. Measurements of launch-induced acoustic loads and subsequent structural response were made on a cantilever beam structure placed in close proximity (200 feet) to the launch pad. Innovative ways of characterizing random, nonstationary, non-Gaussian acoustics are used for the development of a structure's excitation model. Extremely good correlation was obtained between analytically computed responses and those measured on the cantilever beam. Additional tests are recommended to bound the problem to account for variations in launch trajectory and inclination.

  7. Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.

    PubMed

    Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin

    2015-01-01

    Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in classification accuracy in both experiments, namely motor imagery and emotion recognition.
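    The discrete wavelet transform step of wavelet-ICA can be illustrated with a one-level Haar decomposition (a sketch only; the paper's pipeline combines multi-level wavelet decomposition with ICA and a priori artifact templates):

```python
def haar_step(signal):
    """One level of the discrete Haar wavelet transform: split a signal into
    approximation (low-pass) and detail (high-pass) coefficients."""
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]
    detail = [(a - b) / 2 for a, b in pairs]
    return approx, detail

def haar_inverse(approx, detail):
    """Perfect reconstruction from one decomposition level."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

sig = [1.0, 1.2, 0.9, 1.1, 5.0, 4.8, 1.0, 0.8]  # spike as a crude "artifact"
approx, detail = haar_step(sig)
rec = haar_inverse(approx, detail)
print(max(abs(x - y) for x, y in zip(sig, rec)) < 1e-12)  # True
```

In the full method, artifact-dominated components are identified in this transformed space (via ICA and the a priori templates) and zeroed out before reconstruction, rather than reconstructing everything as done here.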

  9. A spreadsheet template compatible with Microsoft Excel and iWork Numbers that returns the simultaneous confidence intervals for all pairwise differences between multiple sample means.

    PubMed

    Brown, Angus M

    2010-04-01

    The objective of the method described in this paper is to develop a spreadsheet template for comparing multiple sample means. An initial analysis of variance (ANOVA) test on the data returns F, the test statistic. If F is larger than the critical F value drawn from the F distribution at the appropriate degrees of freedom, convention dictates rejection of the null hypothesis and allows subsequent multiple comparison testing to determine where the inequalities between the sample means lie. A variety of multiple comparison methods are described that return the 95% confidence intervals for differences between means using an inclusive pairwise comparison of the sample means. 2009 Elsevier Ireland Ltd. All rights reserved.
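The F statistic that such a template computes can be reproduced in a few lines. The sketch below is an illustrative stdlib-only computation, not the spreadsheet formulas from the paper; looking up the critical F value (or the studentized range quantile for the subsequent pairwise intervals) still requires tables or a statistics library.

```python
from statistics import mean

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of samples (one per group)."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = mean(x for g in groups for x in g)
    # Between-group sum of squares, k - 1 degrees of freedom
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares, n - k degrees of freedom
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))
```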

  10. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.

  11. Volumetric Analysis of Regional Variability in the Cerebellum of Children with Dyslexia

    PubMed Central

    Stuebing, Karla; Juranek, Jenifer; Fletcher, Jack M.

    2013-01-01

    Cerebellar deficits and subsequent impairment in procedural learning may contribute to both motor difficulties and reading impairment in dyslexia. We used quantitative magnetic resonance imaging to investigate the role of regional variation in cerebellar anatomy in children with single-word decoding impairments (N=23), children with impairment in fluency alone (N=8), and typically developing children (N=16). Children with decoding impairments (dyslexia) demonstrated no statistically significant differences in overall grey and white matter volumes or cerebellar asymmetry; however, reduced volume in the anterior lobe of the cerebellum relative to typically developing children was observed. These results implicate cerebellar involvement in dyslexia and establish an important foundation for future research on the connectivity of the cerebellum and cortical regions typically associated with reading impairment. PMID:23828023

  12. Volumetric analysis of regional variability in the cerebellum of children with dyslexia.

    PubMed

    Fernandez, Vindia G; Stuebing, Karla; Juranek, Jenifer; Fletcher, Jack M

    2013-12-01

    Cerebellar deficits and subsequent impairment in procedural learning may contribute to both motor difficulties and reading impairment in dyslexia. We used quantitative magnetic resonance imaging to investigate the role of regional variation in cerebellar anatomy in children with single-word decoding impairments (N = 23), children with impairment in fluency alone (N = 8), and typically developing children (N = 16). Children with decoding impairments (dyslexia) demonstrated no statistically significant differences in overall grey and white matter volumes or cerebellar asymmetry; however, reduced volume in the anterior lobe of the cerebellum relative to typically developing children was observed. These results implicate cerebellar involvement in dyslexia and establish an important foundation for future research on the connectivity of the cerebellum and cortical regions typically associated with reading impairment.

  13. Fast and accurate automated cell boundary determination for fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.

  14. Estimation of streamflow for selected sites on the Carson and Truckee rivers in California and Nevada, 1944-80

    USGS Publications Warehouse

    Blodgett, J.C.; Oltmann, R.N.; Poeschel, K.R.

    1984-01-01

    Daily mean and monthly discharges were estimated for 10 sites on the Carson and Truckee Rivers for periods of incomplete records and for tributary sites affected by reservoir regulation. On the basis of the hydrologic characteristics, stream-flow data for a water year were grouped by month or season for subsequent regression analysis. In most cases, simple linear regressions adequately defined a relation of streamflow between gaging stations, but in some instances a nonlinear relation for several months of the water year was derived. Statistical data are presented to indicate the reliability of the estimated streamflow data. Records of discharges including historical and estimated data for the gaging stations for the water years 1944-80 are presented. (USGS)

  15. Estimation of Apple Intake for the Exposure Assessment of Residual Chemicals Using Korea National Health and Nutrition Examination Survey Database

    PubMed Central

    2016-01-01

    The aims of this study were to develop strategies and algorithms of calculating food commodity intake suitable for exposure assessment of residual chemicals by using the food intake database of Korea National Health and Nutrition Examination Survey (KNHANES). In this study, apples and their processed food products were chosen as a model food for accurate calculation of food commodity intakes through the recently developed Korea food commodity intake calculation (KFCIC) software. The average daily intakes of total apples in Korea Health Statistics were 29.60 g in 2008, 32.40 g in 2009, 34.30 g in 2010, 28.10 g in 2011, and 24.60 g in 2012. The average daily intake of apples estimated by the KFCIC software was 2.65 g higher than that in Korea Health Statistics. The food intake data in Korea Health Statistics may reflect the intake of apples from mixed and processed foods less completely than the KFCIC software does. These results can affect the outcome of risk assessment for residual chemicals in foods. Therefore, the accurate estimation of the average daily intake of food commodities is very important, and more data for food intakes and recipes have to be applied to improve the quality of data. Nevertheless, this study can contribute to the predictive estimation of exposure to possible residual chemicals and subsequent analysis for their potential risks. PMID:27152299

  16. The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.

    PubMed

    Tukiendorf, Andrzej; Kaźmierski, Radosław; Michalak, Sławomir

    2013-01-01

    In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of both methods revealed strong agreement. Direct agreement between M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of Marczewski-Steinhaus' taxonomic approach as a tool for the detection of novel patterns of data in ischemic stroke patients and the prediction of disease outcome. Using rough patient characteristics, namely age, National Institutes of Health Stroke Scale (NIHSS) score, and diabetes mellitus (DM) status, four fairly frequent types of stroke patients were recognized that cannot be identified by means of routine clinical methods. Based on the taxonomic outcomes, a strong correlation was established between health status at the moment of admission to the emergency department (ED) and the subsequent recovery of patients. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.

  17. Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.

    PubMed

    Banks, N C; Hodda, M; Singh, S K; Matveeva, E M

    2012-06-01

    Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.
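The rate-of-spread estimates in this record come from regressing distance from first detection on time since first detection. A minimal ordinary-least-squares sketch is shown below; the year/distance pairs are hypothetical values invented for illustration, not data from the study.

```python
def ols_slope_intercept(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx              # slope = rate of spread (km/year)
    return my - b * mx, b

# Hypothetical detections: years since first detection vs. distance spread (km)
years = [0, 5, 12, 20, 31, 40]
dist_km = [0.0, 29.0, 70.0, 112.0, 178.0, 226.0]
intercept, rate_km_per_year = ols_slope_intercept(years, dist_km)
```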

  18. Head injury as a PTSD predictor among Oklahoma City bombing survivors.

    PubMed

    Walilko, Timothy; North, Carol; Young, Lee Ann; Lux, Warren E; Warden, Deborah L; Jaffee, Michael S; Moore, David F

    2009-12-01

    The aim of the Oklahoma City (OKC) bombing retrospective review was to investigate the relationship between physical injury, environmental contributors, and psychiatric disorders such as posttraumatic stress disorder (PTSD) in an event-based, matched design study focused on injury. The 182 selected participants were a random subset of the 1,092 direct survivors from the OKC bombing. Only 124 of these 182 cases had a full complement of medical/clinical data in the OKC database. These 124 cases were assessed to explore relationships among PTSD diagnoses, levels of blast exposure, and physical injuries. Associations among variables were statistically tested using contingency analysis and logistic regression. Comparison of the PTSD cases to symptoms/diagnoses reported in the medical records reveals a statistically significant association between PTSD and head/brain injuries associated with head acceleration. PTSD was not highly correlated with other injuries. Although blast pressure and impulse were highly correlated with head injuries, the correlation with PTSD was not statistically significant. Thus, a correlation between blast pressure and PTSD may exist, but higher fidelity pressure calculations are required to elucidate this potential relationship. This study provides clear evidence that head injury is associated with subsequent PTSD, giving caregivers information on which physical injuries may signal the development of psychological disorders, to aid them in developing a profile for the identification of future survivors of terrorist attacks and Warfighters with brain injuries and potential PTSD.

  19. The association of 83 plasma proteins with CHD mortality, BMI, HDL-, and total-cholesterol in men: applying multivariate statistics to identify proteins with prognostic value and biological relevance.

    PubMed

    Heidema, A Geert; Thissen, Uwe; Boer, Jolanda M A; Bouwman, Freek G; Feskens, Edith J M; Mariman, Edwin C M

    2009-06-01

    In this study, we applied the multivariate statistical tool Partial Least Squares (PLS) to analyze the relative importance of 83 plasma proteins in relation to coronary heart disease (CHD) mortality and the intermediate end points body mass index, HDL-cholesterol and total cholesterol. From a Dutch monitoring project for cardiovascular disease risk factors, men who died of CHD between initial participation (1987-1991) and end of follow-up (January 1, 2000) (N = 44) and matched controls (N = 44) were selected. Baseline plasma concentrations of proteins were measured by a multiplex immunoassay. With the use of PLS, we identified 15 proteins with prognostic value for CHD mortality and sets of proteins associated with the intermediate end points. Subsequently, sets of proteins and intermediate end points were analyzed together by Principal Components Analysis, indicating that proteins involved in inflammation explained most of the variance, followed by proteins involved in metabolism and proteins associated with total-C. This study is one of the first in which the association of a large number of plasma proteins with CHD mortality and intermediate end points is investigated by applying multivariate statistics, providing insight into the relationships among proteins, intermediate end points, and CHD mortality, and yielding a set of proteins with prognostic value.

  20. Identification of microRNAs with regulatory potential using a matched microRNA-mRNA time-course data.

    PubMed

    Jayaswal, Vivek; Lutherborrow, Mark; Ma, David D F; Hwa Yang, Yee

    2009-05-01

    Over the past decade, a class of small RNA molecules called microRNAs (miRNAs) has been shown to regulate gene expression at the post-transcription stage. While early work focused on the identification of miRNAs using a combination of experimental and computational techniques, subsequent studies have focused on identification of miRNA-target mRNA pairs as each miRNA can have hundreds of mRNA targets. The experimental validation of some miRNAs as oncogenic has provided further motivation for research in this area. In this article we propose an odds-ratio (OR) statistic for identification of regulatory miRNAs. It is based on integrative analysis of matched miRNA and mRNA time-course microarray data. The OR-statistic was used for (i) identification of miRNAs with regulatory potential, (ii) identification of miRNA-target mRNA pairs and (iii) identification of time lags between changes in miRNA expression and those of its target mRNAs. We applied the OR-statistic to a cancer data set and identified a small set of miRNAs that were negatively correlated to mRNAs. A literature survey revealed that some of the miRNAs that were predicted to be regulatory, were indeed oncogenic or tumor suppressors. Finally, some of the predicted miRNA targets have been shown to be experimentally valid.
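The OR-statistic itself is not reproduced here, but the core screening idea, flagging miRNA-target mRNA pairs whose expression profiles move in opposite directions over time, can be illustrated with a plain Pearson-correlation filter. All names and the cutoff below are hypothetical.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def negatively_correlated_pairs(mirna, mrna, cutoff=-0.8):
    """Flag (miRNA, mRNA) pairs whose time courses are strongly anti-correlated."""
    return [(mi, mj)
            for mi, xs in mirna.items()
            for mj, ys in mrna.items()
            if pearson(xs, ys) <= cutoff]
```

A fuller treatment would also shift one series against the other to detect the time lags between miRNA changes and target mRNA changes that the article describes.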

  1. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
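A compact way to reproduce the full-spectrum workflow (normalize whole spectra, then run PCA) is sketched below. Total-ion-current normalization is one common whole-spectrum technique and is an assumption here; the abstract does not name the two techniques the paper compares, and this sketch is independent of the ms-alone/multiMS-toolbox utilities.

```python
import numpy as np

def tic_normalize(spectra):
    """Scale each spectrum (row) to unit total ion current."""
    spectra = np.asarray(spectra, dtype=float)
    return spectra / spectra.sum(axis=1, keepdims=True)

def pca_scores(X, n_components=2):
    """Scores on the leading principal components of mean-centered X, via SVD.

    Returns (scores, per-component variances); variances are sorted descending
    because the singular values of the SVD are.
    """
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, S ** 2 / (len(X) - 1)
```

Plotting the first two score columns per cultivation condition is then the usual way to look for the separated metabolic patterns.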

  2. The Influence of Statistical versus Exemplar Appeals on Indian Adults' Health Intentions: An Investigation of Direct Effects and Intervening Persuasion Processes.

    PubMed

    McKinley, Christopher J; Limbu, Yam; Jayachandran, C N

    2017-04-01

    In two separate investigations, we examined the persuasive effectiveness of statistical versus exemplar appeals on Indian adults' smoking cessation and mammography screening intentions. To more comprehensively address persuasion processes, we explored whether message response and perceived message effectiveness functioned as antecedents to persuasive effects. Results showed that statistical appeals led to higher levels of health intentions than exemplar appeals. In addition, findings from both studies indicated that statistical appeals stimulated more attention and were perceived as more effective than anecdotal accounts. Among male smokers, statistical appeals also generated greater cognitive processing than exemplar appeals. Subsequent mediation analyses revealed that message response and perceived message effectiveness fully carried the influence of appeal format on health intentions. Given these findings, future public health initiatives conducted among similar populations should design messages that include substantive factual information while ensuring that this content is perceived as credible and valuable.

  3. Infant Statistical Learning

    PubMed Central

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  4. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
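As a concrete starting point for the descriptive-statistics step this series introduces, Python's standard library already covers the basics. The sample values below are hypothetical.

```python
from statistics import mean, median, stdev, quantiles

scores = [68, 72, 75, 75, 80, 83, 85, 90, 91, 95]  # hypothetical sample

summary = {
    "n": len(scores),
    "mean": mean(scores),
    "median": median(scores),
    "sd": stdev(scores),                  # sample standard deviation (n - 1)
    "quartiles": quantiles(scores, n=4),  # Q1, Q2, Q3
}
```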

  5. Spanking by parents and subsequent antisocial behavior of children.

    PubMed

    Straus, M A; Sugarman, D B; Giles-Sims, J

    1997-08-01

    To deal with the causal relationship between corporal punishment and antisocial behavior (ASB) by considering the level of ASB of the child at the start of the study. Data from interviews with a national sample of 807 mothers of children aged 6 to 9 years in the National Longitudinal Survey of Youth-Child Supplement. Analysis of variance was used to test the hypothesis that when parents use corporal punishment to correct ASB, it increases subsequent ASB. The analysis controlled for the level of ASB at the start of the study, family socio-economic status, sex of the child, and the extent to which the home provided emotional support and cognitive stimulation. Forty-four percent of the mothers reported spanking their children during the week prior to the study and they spanked them an average of 2.1 times that week. The more spanking at the start of the period, the higher the level of ASB 2 years later. The change is unlikely to be owing to the child's tendency toward ASB or to confounding with demographic characteristics or with parental deficiency in other key aspects of socialization because those variables were statistically controlled. When parents use corporal punishment to reduce ASB, the long-term effect tends to be the opposite. The findings suggest that if parents replace corporal punishment by nonviolent modes of discipline, it could reduce the risk of ASB among children and reduce the level of violence in American society.

  6. Risk factors for the development of active methicillin-resistant Staphylococcus aureus (MRSA) infection in patients colonized with MRSA at hospital admission.

    PubMed

    Cadena, Jose; Thinwa, Josephine; Walter, Elizabeth A; Frei, Christopher R

    2016-12-01

    Patients who present to Veterans Affairs hospitals are screened for methicillin-resistant Staphylococcus aureus (MRSA) colonization. Those who test positive are isolated during their hospital stay. However, it is unknown which of these patients are most likely to subsequently develop active MRSA infections. This retrospective case-control study characterized risk factors for active MRSA infection among patients colonized with MRSA at hospital admission. Potential demographic and clinical risk factors were identified using electronic queries and manual chart abstraction; data were compared by standard statistical tests, and variables with P ≤ .05 in bivariable analysis were entered into a multivariable logistic regression model. There were 71 cases and 213 controls. Risk factors associated with MRSA infection included diabetes mellitus with or without end organ damage (26% vs 14%, P = .02), hemiplegia (9% vs 2%, P = .01), chronic kidney disease (33% vs 20%, P = .03), postcolonization inpatient admission within 90 days (44% vs 29%, P = .03), surgery (41% vs 9%, P < .01), and dialysis (10% vs 3%, P = .02). On multivariable analysis, surgery during follow-up, dialysis during follow-up, and hemiplegia remained significant. Among patients with MRSA colonization, surgery or dialysis during follow-up and history of hemiplegia were associated with subsequent MRSA infection. Knowledge of these risk factors may allow for future targeted interventions to prevent MRSA infections among colonized patients. Published by Elsevier Inc.
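For the bivariable step described above, an odds ratio with a Woolf 95% confidence interval can be computed directly from a 2x2 table. The counts below are rough reconstructions from the reported surgery percentages (41% of 71 cases, 9% of 213 controls) and are illustrative only, not the study's actual data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, (lo, hi)

# Approximate counts for surgery exposure (hypothetical reconstruction)
or_surgery, ci_surgery = odds_ratio_ci(29, 42, 19, 194)
```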

  7. Importance of Hydrophobic Cavities in Allosteric Regulation of Formylglycinamide Synthetase: Insight from Xenon Trapping and Statistical Coupling Analysis

    PubMed Central

    Choudhary, Deepanshu; Panjikar, Santosh; Anand, Ruchi

    2013-01-01

    Formylglycinamide ribonucleotide amidotransferase (FGAR-AT) is a 140 kDa bi-functional enzyme involved in a coupled reaction, where the glutaminase active site produces ammonia that is subsequently utilized to convert FGAR to its corresponding amidine in an ATP assisted fashion. The structure of FGAR-AT has been previously determined in an inactive state and the mechanism of activation remains largely unknown. In the current study, hydrophobic cavities were used as markers to identify regions involved in domain movements that facilitate catalytic coupling and subsequent activation of the enzyme. Three internal hydrophobic cavities were located by xenon trapping experiments on FGAR-AT crystals and further, these cavities were perturbed via site-directed mutagenesis. Biophysical characterization of the mutants demonstrated that two of these three voids are crucial for stability and function of the protein, although being ∼20 Å from the active centers. Interestingly, correlation analysis corroborated the experimental findings, and revealed that amino acids lining the functionally important cavities form correlated sets (co-evolving residues) that connect these regions to the amidotransferase active center. It was further proposed that the first cavity is transient and allows for breathing motion to occur and thereby serves as an allosteric hotspot. In contrast, the third cavity which lacks correlated residues was found to be highly plastic and accommodated steric congestion by local adjustment of the structure without affecting either stability or activity. PMID:24223728

  8. MALDI-TOF Mass Spectrometry Enables a Comprehensive and Fast Analysis of Dynamics and Qualities of Stress Responses of Lactobacillus paracasei subsp. paracasei F19

    PubMed Central

    Schott, Ann-Sophie; Behr, Jürgen; Quinn, Jennifer; Vogel, Rudi F.

    2016-01-01

    Lactic acid bacteria (LAB) are widely used as starter cultures in the manufacture of foods. Upon preparation, these cultures undergo various stresses resulting in losses of survival and fitness. In order to find conditions for the subsequent identification of proteomic biomarkers and their exploitation for preconditioning of strains, we subjected Lactobacillus (Lb.) paracasei subsp. paracasei TMW 1.1434 (F19) to different stress qualities (osmotic stress, oxidative stress, temperature stress, pH stress and starvation stress). We analysed the dynamics of its stress responses based on the expression of stress proteins using MALDI-TOF mass spectrometry (MS), which has so far been used for species identification. Exploiting the methodology of accumulating protein expression profiles by MALDI-TOF MS followed by the statistical evaluation with cluster analysis and discriminant analysis of principal components (DAPC), it was possible to monitor the expression of low molecular weight stress proteins, identify a specific time point when the expression of stress proteins reached its maximum, and statistically differentiate types of adaptive responses into groups. Beyond the specific result for F19 and its stress response, these results demonstrate the discriminatory power of MALDI-TOF MS to characterize even dynamics of stress responses of bacteria and enable a knowledge-based focus on the laborious identification of biomarkers and stress proteins. To our knowledge, the implementation of MALDI-TOF MS protein profiling for the fast and comprehensive analysis of various stress responses is new to the field of bacterial stress responses. Consequently, we generally propose MALDI-TOF MS as an easy and quick method to characterize responses of microbes to different environmental conditions, and to focus efforts of more elaborate approaches on time points and dynamics of stress responses. PMID:27783652

  9. Implantable cardioverter defibrillators for primary prevention in patients with nonischemic cardiomyopathy: A systematic review and meta-analysis.

    PubMed

    Akel, Tamer; Lafferty, James

    2017-06-01

    Implantable cardioverter defibrillators (ICDs) have proved their favorable outcomes on survival in selected patients with cardiomyopathy. Although previous meta-analyses have shown benefit for their use in primary prevention, the evidence remains less robust for patients with nonischemic cardiomyopathy (NICM) in comparison to patients with coronary artery disease (CAD). To evaluate the effect of ICD therapy on reducing all-cause mortality and sudden cardiac death (SCD) in patients with NICM. PubMed (1993-2016), the Cochrane Central Register of Controlled Trials (2000-2016), reference lists of relevant articles, and previous meta-analyses. Search terms included defibrillator, heart failure, cardiomyopathy, randomized controlled trials, and clinical trials. Eligible trials were randomized controlled trials that included at least one ICD arm and one medical-therapy arm and enrolled patients with NICM. The primary endpoint in the trials had to include all-cause mortality or mortality from SCD. Hazard ratios (HRs) for all-cause mortality and mortality from SCD were either extracted or calculated along with their standard errors. Of the 1047 abstracts retained by the initial screen, eight randomized controlled trials were identified. Five of these trials reported relevant data regarding patients with NICM and were subsequently included in this meta-analysis. Pooled analysis of HRs suggested a statistically significant reduction in all-cause mortality among a total of 2573 patients randomized to ICD vs medical therapy (HR 0.80; 95% CI, 0.67-0.96; P=.02). Pooled analysis of HRs for mortality from SCD was also statistically significant (n=1677) (HR 0.51; 95% CI, 0.34-0.76; P=.001). ICD implantation is beneficial in terms of all-cause mortality and mortality from SCD in certain subgroups of patients with NICM. © 2017 John Wiley & Sons Ltd.
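Pooled hazard ratios of this kind can be reproduced mechanically once each trial's HR and 95% CI are known, because the standard error of log(HR) is recoverable from the CI width. The sketch below implements fixed-effect inverse-variance pooling; that model choice is an assumption, as the meta-analysis may have used a different (e.g. random-effects) model, and the test numbers reuse the abstract's pooled figures purely for illustration.

```python
from math import exp, log, sqrt

def pooled_hazard_ratio(trials, z=1.96):
    """Fixed-effect inverse-variance pooling of (HR, CI_low, CI_high) triples."""
    num = den = 0.0
    for hr, lo, hi in trials:
        se = (log(hi) - log(lo)) / (2 * z)  # SE of log(HR) from the CI width
        w = 1.0 / se ** 2                   # inverse-variance weight
        num += w * log(hr)
        den += w
    log_pooled = num / den
    se_pooled = sqrt(1.0 / den)
    return exp(log_pooled), (exp(log_pooled - z * se_pooled),
                             exp(log_pooled + z * se_pooled))
```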

  10. Gene expression analysis of rheumatoid arthritis synovial lining regions by cDNA microarray combined with laser microdissection: up-regulation of inflammation-associated STAT1, IRF1, CXCL9, CXCL10, and CCL5

    PubMed Central

    Yoshida, S; Arakawa, F; Higuchi, F; Ishibashi, Y; Goto, M; Sugita, Y; Nomura, Y; Niino, D; Shimizu, K; Aoki, R; Hashikawa, K; Kimura, Y; Yasuda, K; Tashiro, K; Kuhara, S; Nagata, K; Ohshima, K

    2012-01-01

    Objectives The main histological change in rheumatoid arthritis (RA) is the villous proliferation of synovial lining cells, an important source of cytokines and chemokines, which are associated with inflammation. The aim of this study was to evaluate gene expression in the microdissected synovial lining cells of RA patients, using those of osteoarthritis (OA) patients as the control. Methods Samples were obtained during total joint replacement from 11 RA and five OA patients. Total RNA from the synovial lining cells was derived from selected specimens by laser microdissection (LMD) for subsequent cDNA microarray analysis. In addition, the expression of significant genes was confirmed immunohistochemically. Results The 14 519 genes detected by cDNA microarray were used to compare gene expression levels in synovial lining cells from RA with those from OA patients. Cluster analysis indicated that RA cells, including low- and high-expression subgroups, and OA cells were sorted into two main clusters. The molecular activity of RA was statistically consistent with its clinical and histological activity. Expression levels of signal transducer and activator of transcription 1 (STAT1), interferon regulatory factor 1 (IRF1), and the chemokines CXCL9, CXCL10, and CCL5 were statistically significantly higher in the synovium of RA than in that of OA. Immunohistochemically, the lining synovium of RA, but not that of OA, clearly expressed STAT1, IRF1, and chemokines, as was seen in microarray analysis combined with LMD. Conclusions Our findings indicate an important role for lining synovial cells in the inflammatory and proliferative processes of RA. Further understanding of the local signalling in structural components is important in rheumatology. PMID:22401175

  11. A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.

    PubMed

    Xue, Xiaoming; Zhou, Jianzhong

    2017-01-01

    To further improve diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends statistical analysis and artificial intelligence techniques, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., preliminary fault detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment is made by a statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For these two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault characteristics under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method is employed to obtain the multi-scale features. Furthermore, due to information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases with 12 working conditions each were employed to evaluate the performance of the proposed method, with vibration signals measured from an experimental rolling element bearing test bench. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic approach is well suited for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
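
The preliminary fault detection above relies on permutation entropy; a minimal sketch of the Bandt-Pompe statistic follows (the embedding order and signals are illustrative, since the abstract does not give the paper's actual settings):

```python
import math

# Hedged sketch of permutation entropy: count ordinal patterns of
# consecutive windows, then compute the normalized Shannon entropy of
# their distribution. Parameters and signals are illustrative.
def permutation_entropy(signal, order=3, normalize=True):
    patterns = {}
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        # ordinal pattern: indices of the window sorted by value
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] = patterns.get(pattern, 0) + 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(order)) if normalize else h

pe_ramp = permutation_entropy([1, 2, 3, 4, 5, 6])   # one pattern -> 0.0
pe_mixed = permutation_entropy([1, 3, 2, 4, 3, 5])  # two patterns -> ~0.39
```

Low entropy indicates a regular (possibly faulty, periodic-impact) signal, while entropy near 1 indicates noise-like behavior; a threshold on this value gives the health-status judgment of the first step.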

  12. Major bleeding after percutaneous coronary intervention and risk of subsequent mortality: a systematic review and meta-analysis

    PubMed Central

    Kwok, Chun Shing; Rao, Sunil V; Myint, Phyo K; Keavney, Bernard; Nolan, James; Ludman, Peter F; de Belder, Mark A; Loke, Yoon K; Mamas, Mamas A

    2014-01-01

    Objectives To examine the relationship between periprocedural bleeding complications and major adverse cardiovascular events (MACEs) and mortality outcomes following percutaneous coronary intervention (PCI) and study differences in the prognostic impact of different bleeding definitions. Methods We conducted a systematic review and meta-analysis of PCI studies that evaluated periprocedural bleeding complications and their impact on MACEs and mortality outcomes. A systematic search of MEDLINE and EMBASE was conducted to identify relevant studies. Data from relevant studies were extracted and random effects meta-analysis was used to estimate the risk of adverse outcomes with periprocedural bleeding. Statistical heterogeneity was assessed by considering the I2 statistic. Results 42 relevant studies were identified including 533 333 patients. Meta-analysis demonstrated that periprocedural major bleeding complications were independently associated with increased risk of mortality (OR 3.31 (2.86 to 3.82), I2=80%) and MACEs (OR 3.89 (3.26 to 4.64), I2=42%). A differential impact of major bleeding as defined by different bleeding definitions on mortality outcomes was observed, in which REPLACE-2 (OR 6.69, 95% CI 2.26 to 19.81), STEEPLE (OR 6.59, 95% CI 3.89 to 11.16) and BARC (OR 5.40, 95% CI 1.74 to 16.74) had the greatest prognostic impact while HORIZONS-AMI (OR 1.51, 95% CI 1.11 to 2.05) had the least impact on mortality outcomes. Conclusions Major bleeding after PCI is independently associated with a threefold increase in mortality and MACEs outcomes. Different contemporary bleeding definitions have differential impacts on mortality outcomes, with 1.5–6.7-fold increases in mortality observed depending on the definition of major bleeding used. PMID:25332786
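
The random-effects pooling with the I² heterogeneity statistic used above can be sketched as follows (DerSimonian-Laird method-of-moments estimator; the odds ratios and standard errors are illustrative, not the study data):

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects pooling with I^2.
# Inputs are log odds ratios and their standard errors; values are invented.
def random_effects(log_ors, ses):
    w = [1.0 / se**2 for se in ses]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_ors))   # Cochran's Q
    df = len(log_ors) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0       # I^2 in %
    # between-study variance tau^2 (method of moments)
    tau2 = max(0.0, (q - df) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))
    w_star = [1.0 / (se**2 + tau2) for se in ses]               # re-weight
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    return math.exp(pooled), i2

or_pooled, i2 = random_effects(
    [math.log(3.3), math.log(6.7), math.log(1.5)], [0.2, 0.5, 0.15])
```

A high I² (as with the 80% reported for mortality) signals substantial between-study heterogeneity, which is why the random-effects model rather than a fixed-effect model is appropriate here.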

  13. Detecting concerted demographic response across community assemblages using hierarchical approximate Bayesian computation.

    PubMed

    Chan, Yvonne L; Schanzenbach, David; Hickerson, Michael J

    2014-09-01

    Methods that integrate population-level sampling from multiple taxa into a single community-level analysis are an essential addition to the comparative phylogeographic toolkit. Detecting how species within communities have demographically tracked each other in space and time is important for understanding the effects of future climate and landscape changes and the resulting acceleration of extinctions, biological invasions, and potential surges in adaptive evolution. Here, we present a statistical framework for such an analysis based on hierarchical approximate Bayesian computation (hABC) with the goal of detecting concerted demographic histories across an ecological assemblage. Our method combines population genetic data sets from multiple taxa into a single analysis to estimate: 1) the proportion of a community sample that demographically expanded in a temporally clustered pulse and 2) when the pulse occurred. To validate the accuracy and utility of this new approach, we use simulation cross-validation experiments and subsequently analyze an empirical data set of 32 avian populations from Australia that are hypothesized to have expanded from smaller refugia populations in the late Pleistocene. The method can accommodate data set heterogeneity such as variability in effective population size, mutation rates, and sample sizes across species and exploits the statistical strength from the simultaneous analysis of multiple species. This hABC framework used in a multitaxa demographic context can increase our understanding of the impact of historical climate change by determining what proportion of the community responded in concert or independently and can be used with a wide variety of comparative phylogeographic data sets as biota-wide DNA barcoding data sets accumulate. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
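
At its core, approximate Bayesian computation is a simulate-and-reject loop; a toy sketch under a deliberately simple model (normal mean estimation, nothing like the coalescent simulations the hABC framework actually uses) illustrates the idea:

```python
import random
import statistics

# Hedged toy sketch of ABC rejection sampling: draw a parameter from the
# prior, simulate a summary statistic, keep draws whose statistic is close
# to the observed one. Model, prior and tolerance are all illustrative.
random.seed(42)
observed = [random.gauss(2.0, 1.0) for _ in range(100)]
obs_stat = statistics.mean(observed)          # observed summary statistic

accepted = []
for _ in range(2000):
    mu = random.uniform(-5, 5)                # draw from a flat prior
    sim = [random.gauss(mu, 1.0) for _ in range(100)]
    if abs(statistics.mean(sim) - obs_stat) < 0.1:   # tolerance epsilon
        accepted.append(mu)

posterior_mean = statistics.mean(accepted)    # approximate posterior mean
```

The hierarchical version layers this idea: hyperparameters (e.g., the proportion of taxa expanding in a pulse, and the pulse time) are drawn first, taxon-specific parameters are drawn conditional on them, and acceptance is based on summary statistics aggregated across all species.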

  14. Risk assessment of vector-borne diseases for public health governance.

    PubMed

    Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J

    2014-12-01

    In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), together with the literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In terms of current practice, a series of different models and analyses are often applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  15. Gambling, games of skill and human ecology: a pilot study by a multidimensional analysis approach.

    PubMed

    Valera, Luca; Giuliani, Alessandro; Gizzi, Alessio; Tartaglia, Francesco; Tambone, Vittoradolfo

    2015-01-01

    The present pilot study aims at analyzing the human activity of playing in the light of an indicator of human ecology (HE). We highlighted the four essential anthropological dimensions (FEAD), starting from the analysis of questionnaires administered to actual gamers. The coherence between the theoretical construct and the observational data is a remarkable proof-of-concept of the possibility of establishing an experimentally motivated link between a philosophical construct (coming from Huizinga's Homo ludens definition) and actual gamers' motivation patterns. The starting hypothesis is that the activity of playing becomes ecological (and thus not harmful) when it achieves harmony between the FEAD, thus realizing HE; conversely, it becomes at risk of creating some form of addiction when it destroys the FEAD balance. We analyzed the data by means of variable clustering (oblique principal components) so as to experimentally verify the existence of the hypothesized dimensions. The subsequent projection of the statistical units (gamers) onto the orthogonal space spanned by the principal components allowed us to generate a meaningful, albeit preliminary, clustering of gamer profiles.

  16. Comparison of current Shuttle and pre-Challenger flight suit reach capability during launch accelerations

    NASA Technical Reports Server (NTRS)

    Bagian, James P.; Schafer, Lauren E.

    1992-01-01

    The Challenger accident prompted the creation of a crew escape system which replaced the former Launch Entry Helmet (LEH) ensemble with the current Launch Entry Suit (LES). However, questions were raised regarding the impact of this change on crew reach capability. This study addressed the question of reach capability and its effects on realistic ground-based training for Space Shuttle missions. Eleven subjects performed reach sweeps in both the LEH and LES suits during 1 and 3 Gx acceleration trials in the Brooks AFB centrifuge. These reach sweeps were recorded on videotape and subsequently analyzed using a 3D motion analysis system. The ANOVA procedure of the Statistical Analysis System program was used to evaluate differences in forward and overhead reach. The results showed that the LES provided less reach capability than its predecessor, the LEH. This study also demonstrated that, since there was no substantial difference between 1 and 3 Gx reach sweeps in the LES, realistic Shuttle launch training may be accomplished in ground based simulators.

  17. Non-invasive brain stimulation to investigate language production in healthy speakers: A meta-analysis.

    PubMed

    Klaus, Jana; Schutter, Dennis J L G

    2018-06-01

    Non-invasive brain stimulation (NIBS) has become a common method to study the interrelations between the brain and language functioning. This meta-analysis examined the efficacy of transcranial magnetic stimulation (TMS) and transcranial direct current stimulation (tDCS) in the study of language production in healthy volunteers. Forty-five effect sizes from 30 studies which investigated the effects of NIBS on picture naming or verbal fluency in healthy participants were meta-analysed. Further sub-analyses investigated potential influences of stimulation type, control, target site, task, online vs. offline application, and current density at the target electrode. Random effects modelling showed a small but reliable effect of NIBS on language production. Subsequent analyses indicated larger weighted mean effect sizes for TMS as compared to tDCS studies. No statistical differences were observed for the other sub-analyses. We conclude that NIBS is a useful method for neuroscientific studies of language production in healthy volunteers. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Early-childhood housing mobility and subsequent PTSD in adolescence: a Moving to Opportunity reanalysis.

    PubMed

    Norris, David C; Wilson, Andrew

    2016-01-01

    In a 2014 report on adolescent mental health outcomes in the Moving to Opportunity for Fair Housing Demonstration (MTO), Kessler et al. reported that, at 10- to 15-year follow-up, boys from households randomized to an experimental housing voucher intervention experienced 12-month prevalence of post-traumatic stress disorder (PTSD) at several times the rate of boys from control households. We reanalyze this finding here, bringing to light a PTSD outcome imputation procedure used in the original analysis, but not described in the study report. By bootstrapping with repeated draws from the frequentist sampling distribution of the imputation model used by Kessler et al., and by varying two pseudorandom number generator seeds that fed their analysis, we account for several purely statistical components of the uncertainty inherent in their imputation procedure. We also discuss other sources of uncertainty in this procedure that were not accessible to a formal reanalysis.

  19. Twenty-five years of sport performance research in the Journal of Sports Sciences.

    PubMed

    Nevill, Alan; Atkinson, Greg; Hughes, Mike

    2008-02-15

    In this historical review covering the past 25 years, we reflect on the content of manuscripts relevant to the Sport Performance section of the Journal of Sports Sciences. Due to the wide diversity of sport performance research, the remit of the Sport Performance section has been broad and includes mathematical and statistical evaluation of competitive sports performances, match- and notation-analysis, talent identification, training and selection or team organization. In addition, due to the academic interests of its section editors, they adopted a quality-assurance role for the Sport Performance section, invariably communicated through key editorials that subsequently shaped the editorial policy of the Journal. Key high-impact manuscripts are discussed, providing readers with some insight into what might lead an article to become a citation "classic". Finally, landmark articles in the areas of "science and football" and "notation analysis" are highlighted, providing further insight into how such articles have contributed to the development of sport performance research in general and the Journal of Sports Sciences in particular.

  20. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
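
The two quantities the analysis rests on, the empirical probability of launching on a given attempt and the cumulative probability of launch within k attempts, can be sketched in a few lines (the attempt counts below are invented, not the Shuttle record of 250 attempts for 135 launches):

```python
# Hedged toy sketch of the empirical launch statistics described above.
# Each entry is the number of countdown attempts a launch required;
# the list is illustrative, not the actual Shuttle data.
attempts_per_launch = [1, 1, 2, 1, 3, 2, 1, 1, 4, 2]

total_attempts = sum(attempts_per_launch)
launches = len(attempts_per_launch)
p_launch_per_attempt = launches / total_attempts   # empirical per-attempt probability

def cumulative_p(k, p):
    """P(launched within k attempts), assuming attempts are independent."""
    return 1 - (1 - p) ** k
```

For the historical record of 135 launches in 250 attempts, the same calculation gives a per-attempt probability of roughly 0.54; the independence assumption in `cumulative_p` is a simplification, since scrub causes (weather, hardware) can persist across consecutive attempts.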

  1. Neuroimaging with functional near infrared spectroscopy: From formation to interpretation

    NASA Astrophysics Data System (ADS)

    Herrera-Vega, Javier; Treviño-Palacios, Carlos G.; Orihuela-Espina, Felipe

    2017-09-01

    Functional Near Infrared Spectroscopy (fNIRS) is gaining momentum as a functional neuroimaging modality for investigating the cerebral hemodynamics subsequent to neural metabolism. Like other neuroimaging modalities, it is a tool for neuroscience to understand brain system function at the behavioural and cognitive levels. To extract useful knowledge from functional neuroimages it is critical to understand the series of transformations applied during information retrieval and how they bound the interpretation. This process starts with the irradiation of the head tissues with infrared light to obtain the raw neuroimage, proceeds with computational and statistical analysis revealing hidden associations between pixel intensities and the encoded neural activity, and ends with the explanation of some particular aspect of brain function. Although there is extensive literature addressing each individual step separately, this paper overviews the complete transformation sequence through image formation, reconstruction and analysis to provide insight into the final functional interpretation.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de

    Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and subsequent multiplexed-capability Surface-Enhanced InfraRed Absorption (SEIRA) assay formats fit these purposes well. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates, chemisorbed onto low-cost biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron radiation based FTIR/SEIRA spectro-microscopy and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were examined to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening, not only in biodiagnostics but also in in vitro biochemical imaging.
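
A minimal sketch of the PCA step applied to hyperspectral data: project mean-centred spectra onto the leading principal components and check that the classes separate. The two-class synthetic "spectra" below are illustrative, not the SEIRA measurements:

```python
import numpy as np

# Hedged sketch of PCA-based spectral discrimination. Two synthetic
# classes share a common band shape; class B adds a small spectral shift.
rng = np.random.default_rng(0)
n, bands = 40, 100
base = np.sin(np.linspace(0, 6, bands))
class_a = base + rng.normal(0, 0.05, (n, bands))
class_b = base + 0.5 * np.cos(np.linspace(0, 6, bands)) + rng.normal(0, 0.05, (n, bands))
X = np.vstack([class_a, class_b])              # (80 spectra, 100 bands)

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                         # project onto first two PCs

# the first PC should separate the two classes
sep = scores[:n, 0].mean() - scores[n:, 0].mean()
```

In the automated pipeline described above, clusters in this low-dimensional score space are what "spectral discrimination capability" refers to.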

  3. Incidence of cerebral infarction after radiotherapy for pituitary adenoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flickinger, J.C.; Nelson, P.B.; Taylor, F.H.

    1989-06-15

    The incidence of cerebral infarction was studied in 156 patients irradiated for treatment of pituitary adenomas. Seven patients experienced strokes at intervals of 3.2 to 14.6 years after irradiation. The observed incidence was not significantly greater than the expected value of 3.5 strokes (P = 0.078). Six strokes occurred in patients receiving equivalent doses (ED) of 1070 ret or more (observed to expected ratio 3.87, significantly elevated; P less than 0.001). Univariate log-rank analysis showed that the risk of stroke was significantly higher (P = 0.010) in patients receiving an ED of 1070 ret or more (4180 cGy/22 fractions) than in those receiving lower doses. Multivariate analysis, however, demonstrated that the increased risk of stroke was associated only with increasing age (P less than 0.0001), not ED (P = 0.148). Due to these inconsistent statistical results, no definitive conclusions could be reached about the relationship between radiation dose to the pituitary and subsequent cerebral infarction.

  4. Histological study of the effect of some irrigating solutions on bacterial endotoxin in dogs.

    PubMed

    Silva, Léa Assed Bezerra da; Leonardo, Mario Roberto; Assed, Sada; Tanomaru Filho, Mário

    2004-01-01

    The aim of this study was to evaluate, histopathologically, the effectiveness of mechanical preparation of root canals using different irrigating solutions in dog teeth filled with LPS after pulpectomy. A total of 120 root canals of 6 mongrel dogs were filled with a solution of LPS after pulpectomy. The irrigating solutions used were saline, 1, 2.5, and 5% sodium hypochlorite, and 2% chlorhexidine. No irrigation was used in the control group. The animals were sacrificed after 60 days and the teeth were fixed and demineralized. Subsequently, serial 6-microm sections were stained with hematoxylin and eosin and Mallory's trichrome for histopathological analysis and Brown-Brenn for verification of bacterial contamination. Analysis showed that the inflammatory infiltrate was statistically less intense in the groups in which the root canals were irrigated with 5% sodium hypochlorite and 2% chlorhexidine. However, none of the irrigating solutions completely inactivated the harmful effects of LPS. Mechanical preparation associated with different irrigating solutions did not completely inactivate LPS.

  5. Forensic identification science evidence since Daubert: Part II--judicial reasoning in decisions to exclude forensic identification evidence on grounds of reliability.

    PubMed

    Page, Mark; Taylor, Jane; Blenkin, Matt

    2011-07-01

    Many studies regarding the legal status of forensic science have relied on the U.S. Supreme Court's mandate in Daubert v. Merrell Dow Pharmaceuticals Inc., and its progeny, in order to make subsequent recommendations or rebuttals. This paper focuses on a more pragmatic approach to analyzing forensic science's immediate deficiencies by considering a qualitative analysis of actual judicial reasoning where forensic identification evidence has been excluded on reliability grounds since the Daubert precedent. Reliance on general acceptance is becoming insufficient as proof of the admissibility of forensic evidence. The citation of unfounded statistics, error rates and certainties, a failure to document the analytical process or follow standardized procedures, and the existence of observer bias represent some of the concerns that have led to the exclusion or limitation of forensic identification evidence. Analysis of these reasons may serve to refocus forensic practitioners' testimony, resources, and research toward rectifying shortfalls in these areas. © 2011 American Academy of Forensic Sciences.

  6. Evaluating blood-brain barrier permeability in delayed cerebral infarction after aneurysmal subarachnoid hemorrhage.

    PubMed

    Ivanidze, J; Kesavabhotla, K; Kallas, O N; Mir, D; Baradaran, H; Gupta, A; Segal, A Z; Claassen, J; Sanelli, P C

    2015-05-01

    Patients with SAH are at increased risk of delayed infarction. Early detection and treatment of delayed infarction remain challenging. We assessed blood-brain barrier permeability, measured as permeability surface area product, by using CTP in patients with SAH with delayed infarction. We performed a retrospective study of patients with SAH with delayed infarction on follow-up NCCT. CTP was performed before the development of delayed infarction. CTP data were postprocessed into permeability surface area product, CBF, and MTT maps. Coregistration was performed to align the infarcted region on the follow-up NCCT with the corresponding location on the CTP maps obtained before infarction. Permeability surface area product, CBF, and MTT values were then obtained in the location of the subsequent infarction. The contralateral noninfarcted region was compared with the affected side in each patient. Wilcoxon signed rank tests were performed to determine statistical significance. Clinical data were collected at the time of CTP and at the time of follow-up NCCT. Twenty-one patients with SAH were included in the study. There was a statistically significant increase in permeability surface area product in the regions of subsequent infarction compared with the contralateral control regions (P < .0001). However, CBF and MTT values were not significantly different in these 2 regions. Subsequent follow-up NCCT demonstrated new delayed infarction in all 21 patients, at which time 38% of patients had new focal neurologic deficits. Our study reveals a statistically significant increase in permeability surface area product preceding delayed infarction in patients with SAH. Further investigation of early permeability changes in SAH may provide new insights into the prediction of delayed infarction. © 2015 by American Journal of Neuroradiology.
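
The paired comparison described above, affected region vs. contralateral control in each patient, maps onto a Wilcoxon signed rank test; a minimal sketch with synthetic permeability values (the group size of 21 is taken from the abstract, but the numbers are invented, not the CTP measurements):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hedged sketch of the paired Wilcoxon signed rank comparison:
# permeability surface area product in the region of subsequent
# infarction vs. the contralateral control region. Data are synthetic.
rng = np.random.default_rng(7)
n = 21
control = rng.normal(1.0, 0.2, n)              # contralateral PS values
affected = control + rng.normal(0.5, 0.2, n)   # systematically elevated PS

stat, p = wilcoxon(affected, control)          # paired, non-parametric
```

The test uses only the signs and ranks of within-patient differences, so it does not assume normally distributed permeability values, which is presumably why it was chosen for this small sample.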

  7. Interrupted time series regression for the evaluation of public health interventions: a tutorial.

    PubMed

    Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio

    2017-02-01

    Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
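
The main segmented regression model of such a tutorial takes the form y = β0 + β1·time + β2·level + β3·trend-change; a minimal sketch on simulated monthly data with an intervention at t = 24 (all coefficients and the series itself are illustrative, not the tutorial's worked example):

```python
import numpy as np

# Hedged sketch of basic segmented regression for an interrupted time
# series: fit an intercept, a pre-intervention trend, a level change at
# the intervention, and a post-intervention slope change.
t = np.arange(48)                                   # 48 monthly points
intervention = (t >= 24).astype(float)              # step at t = 24
time_since = np.where(t >= 24, t - 24, 0.0)         # post-intervention clock
rng = np.random.default_rng(1)
y = 10 + 0.5 * t - 4 * intervention - 0.3 * time_since + rng.normal(0, 0.5, 48)

X = np.column_stack([np.ones(48), t, intervention, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # OLS fit
level_change, slope_change = beta[2], beta[3]
```

This ordinary least squares sketch omits the complications the tutorial goes on to discuss (over-dispersion, autocorrelation, seasonality, time-varying confounders), which in practice call for segmented Poisson/quasi-Poisson models or Newey-West style standard errors.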

  8. Interrupted time series regression for the evaluation of public health interventions: a tutorial

    PubMed Central

    Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio

    2017-01-01

    Abstract Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design. PMID:27283160

  9. Nuclear forensics investigation of morphological signatures in the thermal decomposition of uranyl peroxide.

    PubMed

    Schwerdt, Ian J; Olsen, Adam; Lusk, Robert; Heffernan, Sean; Klosterman, Michael; Collins, Bryce; Martinson, Sean; Kirkham, Trenton; McDonald, Luther W

    2018-01-01

    The analytical techniques typically utilized in a nuclear forensic investigation often provide limited information regarding the process history and production conditions of interdicted nuclear material. In this study, scanning electron microscopy (SEM) analysis of the surface morphology of amorphous UO3 samples calcined at 250, 300, 350, 400, and 450°C from uranyl peroxide was performed to determine whether the morphology was indicative of the synthesis route and thermal history of the samples. Thermogravimetric analysis-mass spectrometry (TGA-MS) and differential scanning calorimetry (DSC) were used to correlate transitions in the calcined material to morphological transformations. The high-resolution SEM images were processed using the Morphological Analysis for Material Attribution (MAMA) software. Morphological attributes, particle area and circularity, showed significant trends as a result of calcination temperature. The quantitative morphological analysis was able to track the process of particle fragmentation and subsequent sintering as the calcination temperature was increased. At the 90% confidence interval, with 1000 segmented particles, the use of Kolmogorov-Smirnov statistical comparisons allowed discernment between all calcination temperatures for the uranyl peroxide route. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
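
The Kolmogorov-Smirnov comparison between calcination temperatures amounts to computing the maximum distance between two empirical CDFs of a morphological attribute; a minimal sketch on synthetic particle-area distributions (not the MAMA measurements):

```python
import random

# Hedged sketch of the two-sample Kolmogorov-Smirnov statistic used to
# discriminate particle-attribute distributions. The lognormal "areas"
# below are synthetic stand-ins for two calcination temperatures.
def ks_statistic(a, b):
    """Maximum distance between the two empirical CDFs."""
    a, b = sorted(a), sorted(b)
    grid = sorted(set(a) | set(b))
    def ecdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

random.seed(0)
low_t = [random.lognormvariate(0.0, 0.3) for _ in range(300)]   # e.g. lower temperature
high_t = [random.lognormvariate(0.6, 0.3) for _ in range(300)]  # e.g. higher temperature
d_between = ks_statistic(low_t, high_t)
d_within = ks_statistic(low_t, [random.lognormvariate(0.0, 0.3) for _ in range(300)])
```

A large D between temperature groups, compared against the critical value for the chosen confidence level and sample size (1000 segmented particles, 90% in the study), is what permits discernment between calcination temperatures.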

  10. Construction and Analysis of Functional Networks in the Gut Microbiome of Type 2 Diabetes Patients.

    PubMed

    Li, Lianshuo; Wang, Zicheng; He, Peng; Ma, Shining; Du, Jie; Jiang, Rui

    2016-10-01

    Although networks of microbial species have been widely used in the analysis of 16S rRNA sequencing data of a microbiome, the construction and analysis of a complete microbial gene network are in general problematic because of the large number of microbial genes in metagenomics studies. To overcome this limitation, we propose to map microbial genes to functional units, including KEGG orthologous groups and the evolutionary genealogy of genes: Non-supervised Orthologous Groups (eggNOG) orthologous groups, to enable the construction and analysis of a microbial functional network. We devised two statistical methods to infer pairwise relationships between microbial functional units based on a deep sequencing dataset of gut microbiome from type 2 diabetes (T2D) patients as well as healthy controls. Networks containing such functional units and their significant interactions were constructed subsequently. We conducted a variety of analyses of global properties, local properties, and functional modules in the resulting functional networks. Our data indicate that besides the observations consistent with the current knowledge, this study provides novel biological insights into the gut microbiome associated with T2D. Copyright © 2016. Production and hosting by Elsevier Ltd.
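
A toy sketch of constructing such a functional-unit network: compute pairwise Pearson correlations of unit abundances across samples and keep edges above a threshold. The unit names and data are invented, and the paper's actual statistical tests for pairwise relationships are more involved than a plain correlation cutoff:

```python
import math
import random

# Hedged toy sketch: build a network of functional units (e.g. KEGG
# orthologous groups) from co-varying abundances across samples.
random.seed(3)
samples = 50
base = [random.random() for _ in range(samples)]
abundance = {
    "KO_A": [b + random.gauss(0, 0.05) for b in base],   # co-varies with KO_B
    "KO_B": [b + random.gauss(0, 0.05) for b in base],
    "KO_C": [random.random() for _ in range(samples)],   # independent unit
}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

units = list(abundance)
edges = [(u, v) for i, u in enumerate(units) for v in units[i + 1:]
         if abs(pearson(abundance[u], abundance[v])) > 0.6]
```

Mapping genes to orthologous groups first, as the abstract describes, is what shrinks the node set enough to make this kind of all-pairs construction tractable on metagenomic data.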

  11. Chemical and toxicological evaluation of underground coal gasification (UCG) effluents. The coal rank effect.

    PubMed

    Kapusta, Krzysztof; Stańczyk, Krzysztof

    2015-02-01

    The effect of coal rank on the composition and toxicity of water effluents resulting from two underground coal gasification experiments with distinct coal samples (lignite and hard coal) was investigated. A broad range of organic and inorganic parameters was determined in the sampled condensates. The physicochemical tests were supplemented by toxicity bioassays based on the luminescent bacteria Vibrio fischeri as the test organism. The principal component analysis and Pearson correlation analysis were adopted to assist in the interpretation of the raw experimental data, and the multiple regression statistical method was subsequently employed to enable predictions of the toxicity based on the values of the selected parameters. Significant differences in the qualitative and quantitative description of the contamination profiles were identified for both types of coal under study. Independent of the coal rank, the most characteristic organic components of the studied condensates were phenols, naphthalene and benzene. Among the inorganic parameters, ammonia, sulphates and selected heavy metals and metalloids were identified as the dominant constituents. Except for benzene with its alkyl homologues (BTEX), selected polycyclic aromatic hydrocarbons (PAHs), zinc and selenium, the values of the remaining parameters were considerably greater for the hard coal condensates. The studies revealed that all of the tested UCG condensates were extremely toxic to V. fischeri; however, the average toxicity level for the hard coal condensates was approximately 56% higher than that obtained for the lignite. The statistical analysis indicated that the toxicity of the condensates was most strongly positively correlated with the concentrations of free ammonia, phenols and certain heavy metals. Copyright © 2014 Elsevier Inc. All rights reserved.
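
    A hedged sketch of the two-step analysis (principal component analysis for exploration, then multiple linear regression to predict toxicity) on made-up condensate data; the predictor names and effect sizes are hypothetical, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical condensate dataset: columns = phenols, free ammonia, and a
# heavy-metal concentration; response = a V. fischeri toxicity index.
X = rng.random((30, 3))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.1, 30)

# PCA via SVD on the centred data (exploratory step) ...
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component

# ... followed by ordinary multiple linear regression to predict toxicity.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```
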

  12. Gender differences in knee morphology and the prospects for implant design in total knee replacement.

    PubMed

    Asseln, Malte; Hänisch, Christoph; Schick, Fabian; Radermacher, Klaus

    2018-05-14

    Morphological differences between female and male knees have been reported in the literature, which led to the development of so-called gender-specific implants. However, detailed morphological descriptions covering the entire joint are rare and little is known regarding whether gender differences are real sexual dimorphisms or can be explained by overall differences in size. We comprehensively analysed knee morphology using 33 features of the femur and 21 features of the tibia to quantify knee shape. The landmark recognition and feature extraction based on three-dimensional surface data were fully automatically applied to 412 pathological (248 female and 164 male) knees undergoing total knee arthroplasty. Subsequently, an exploratory statistical analysis was performed and linear correlation analysis was used to investigate normalization factors and gender-specific differences. Statistically significant differences between genders were observed. These were pronounced for distance measurements and negligible for angular (relative) measurements. Female knees were significantly narrower at the same depth compared to male knees. The correlation analysis showed that linear correlations were higher for distance measurements defined in the same direction. After normalizing the distance features according to overall dimensions in the direction of their definition, gender-specific differences disappeared or were smaller than the related confidence intervals. Implants should not be linearly scaled according to one dimension. Instead, features in medial/lateral and anterior/posterior directions should be normalized separately (non-isotropic scaling). However, large inter-individual variations of the features remain after normalization, suggesting that patient-specific design solutions are required for an improved implant design, regardless of gender. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Engineering diverse changes in beta-turn propensities in the N-terminal beta-hairpin of ubiquitin reveals significant effects on stability and kinetics but a robust folding transition state.

    PubMed

    Simpson, Emma R; Meldrum, Jill K; Searle, Mark S

    2006-04-04

    Using the N-terminal 17-residue beta-hairpin of ubiquitin as a "host" for mutational studies, we have investigated the influence of the beta-turn sequence on protein stability and folding kinetics by replacing the native G-bulged turn (TLTGK) with more flexible analogues (TG3K and TG5K) and a series of four-residue type I' beta-turn sequences, commonly found in beta-hairpins. Although a statistical analysis of type I' turns demonstrates residue preferences at specific sites, the frequency of occurrence appears to only broadly correlate with experimentally determined protein stabilities. The subsequent engineering of context-dependent non-native tertiary contacts involving turn residues is shown to produce large changes in stability. Relatively few point mutations have been described that probe secondary structure formation in ubiquitin in a manner that is independent of tertiary contacts. To this end, we have used the more rigorous rate-equilibrium free energy relationship (Leffler analysis), rather than the two-point phi value analysis, to show for a family of engineered beta-turn mutants that stability (range of approximately 20 kJ/mol) and folding kinetics (190-fold variation in refolding rate) are linearly correlated (alpha(f) = 0.74 +/- 0.08). The data are consistent with a transition state that is robust with regard to a wide range of statistically favored and disfavored beta-turn mutations and implicate a loosely assembled beta-hairpin as a key template in transition state stabilization with the beta-turn playing a central role.

  14. 3D QSAR models built on structure-based alignments of Abl tyrosine kinase inhibitors.

    PubMed

    Falchi, Federico; Manetti, Fabrizio; Carraro, Fabio; Naldini, Antonella; Maga, Giovanni; Crespan, Emmanuele; Schenone, Silvia; Bruno, Olga; Brullo, Chiara; Botta, Maurizio

    2009-06-01

    Quality QSAR: A combination of docking calculations and a statistical approach toward Abl inhibitors resulted in a 3D QSAR model, the analysis of which led to the identification of ligand portions important for affinity. New compounds designed on the basis of the model were found to have very good affinity for the target, providing further validation of the model itself. The X-ray crystallographic coordinates of the Abl tyrosine kinase domain in its active, inactive, and Src-like inactive conformations were used as targets to simulate the binding mode of a large series of pyrazolo[3,4-d]pyrimidines (known Abl inhibitors) by means of GOLD software. Receptor-based alignments provided by molecular docking calculations were submitted to a GRID-GOLPE protocol to generate 3D QSAR models. Analysis of the results showed that the models based on the inactive and Src-like inactive conformations had very poor statistical parameters, whereas the sole model based on the active conformation of Abl was characterized by significant internal and external predictive ability. Subsequent analysis of GOLPE PLS pseudo-coefficient contour plots of this model gave us a better understanding of the relationships between structure and affinity, providing suggestions for the next optimization process. On the basis of these results, new compounds were designed according to the hydrophobic and hydrogen bond donor and acceptor contours, and were found to have improved enzymatic and cellular activity with respect to parent compounds. Additional biological assays confirmed the important role of the selected compounds as inhibitors of cell proliferation in leukemia cells.

  15. Only pick the right grains: Modelling the bias due to subjective grain-size interval selection for chronometric and fingerprinting approaches.

    NASA Astrophysics Data System (ADS)

    Dietze, Michael; Fuchs, Margret; Kreutzer, Sebastian

    2016-04-01

    Many modern approaches of radiometric dating or geochemical fingerprinting rely on sampling sedimentary deposits. A key assumption of most concepts is that the extracted grain-size fraction of the sampled sediment adequately represents the actual process to be dated or the source area to be fingerprinted. However, these assumptions are not always well constrained. Rather, they have to align with arbitrary, method-determined size intervals, such as "coarse grain" or "fine grain", whose definitions moreover differ between methods. Such arbitrary intervals violate principal process-based concepts of sediment transport and can thus introduce significant bias to the analysis outcome (i.e., a deviation of the measured from the true value). We present a flexible numerical framework (numOlum) for the statistical programming language R that allows the bias due to any given analysis size interval to be quantified for different types of sediment deposits. This framework is applied to synthetic samples from the realms of luminescence dating and geochemical fingerprinting, i.e. a virtual reworked loess section. We show independent validation data from artificially dosed and subsequently mixed grain-size proportions and we present a statistical approach (end-member modelling analysis, EMMA) that accounts for the effect of measuring the compound dosimetric history or geochemical composition of a sample. EMMA separates polymodal grain-size distributions into the underlying transport process-related distributions and their contribution to each sample. These underlying distributions can then be used to adjust grain-size preparation intervals to minimise the incorporation of "undesired" grain-size fractions.
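
    The authors' numOlum framework is written in R and their unmixing method is EMMA; as a rough Python stand-in for the same idea, non-negative matrix factorisation can separate synthetic polymodal grain-size distributions into underlying end members and per-sample contributions.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)

# Two hypothetical end-member grain-size distributions (e.g. a fine mode
# and a reworked coarse mode) over 64 size classes, each summing to 1.
x = np.linspace(0, 1, 64)
em1 = np.exp(-((x - 0.3) ** 2) / 0.005)
em2 = np.exp(-((x - 0.7) ** 2) / 0.005)
em1, em2 = em1 / em1.sum(), em2 / em2.sum()

# Samples are random mixtures of the end members plus a little noise.
w = rng.random((50, 1))
samples = w @ em1[None, :] + (1 - w) @ em2[None, :]
samples = np.clip(samples + rng.normal(0, 1e-4, samples.shape), 0, None)

# Non-negative matrix factorisation as a simple stand-in for EMMA:
# recover two underlying distributions and per-sample contributions.
model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=2000)
loadings = model.fit_transform(samples)      # contributions per sample
end_members = model.components_              # recovered distributions
peaks = np.sort(x[end_members.argmax(axis=1)])  # modes of recovered modes
```

    On this clean synthetic case the recovered modes sit near the true end-member modes at 0.3 and 0.7; real grain-size data would need the process-aware treatment EMMA provides.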

  16. Relationship between gait initiation and disability in individuals affected by multiple sclerosis.

    PubMed

    Galli, Manuela; Coghe, Giancarlo; Sanna, Paola; Cocco, Eleonora; Marrosu, Maria Giovanna; Pau, Massimiliano

    2015-11-01

    This study analyzes how multiple sclerosis (MS) affects one of the most common voluntary activities in life: gait initiation (GI). The main aim of the work is to characterize the execution of this task by measuring and comparing relevant parameters based on center of pressure (COP) patterns and to study the relationship between these parameters and the level of the expanded disability status scale (EDSS). To this aim, 95 MS subjects with an average EDSS score of 2.4 and 35 healthy subjects were tested using a force platform during the transition from standing posture to gait. COP time-series were acquired and processed to extract a number of parameters related to the trajectory followed by the COP. The statistical analysis revealed that only a few measurements were statistically different between the two groups and only these were subsequently correlated with EDSS score. The correlation analysis showed that progressive alteration of task execution is directly related to increasing EDSS score. These findings suggest that most of the impairment found in people with MS comes from the first part of the COP pattern, the anticipatory postural adjustments (APAs). The central nervous system performs APAs before every voluntary movement to minimize balance perturbation due to the movement itself. Gait initiation APAs consist of ankle muscle contractions that induce a backward COP shift toward the swing limb. The analysis performed here highlighted that MS-affected patients have a reduced posterior COP shift, revealing that the anticipatory mechanism is impaired. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Forging a link between mentoring and collaboration: a new training model for implementation science.

    PubMed

    Luke, Douglas A; Baumann, Ana A; Carothers, Bobbi J; Landsverk, John; Proctor, Enola K

    2016-10-13

    Training investigators for the rapidly developing field of implementation science requires both mentoring and scientific collaboration. Using social network descriptive analyses, visualization, and modeling, this paper presents results of an evaluation of the mentoring and collaborations fostered over time through the National Institute of Mental Health (NIMH)-supported Implementation Research Institute (IRI). Data comprised IRI participants' self-reported collaborations and mentoring relationships, measured in three annual surveys from 2012 to 2014. Network descriptive statistics, visualizations, and network statistical modeling were conducted to examine patterns of mentoring and collaboration among IRI participants and to model the relationship between mentoring and subsequent collaboration. Findings suggest that IRI is successful in forming mentoring relationships among its participants, and that these mentoring relationships are related to future scientific collaborations. Exponential random graph network models demonstrated that mentoring received in 2012 was positively and significantly related to the likelihood of having a scientific collaboration 2 years later in 2014 (p = 0.001). More specifically, mentoring was significantly related to future collaborations focusing on new research (p = 0.009), grant submissions (p = 0.003), and publications (p = 0.017). Predictions based on the network model suggest that for every additional mentoring relationship established in 2012, the likelihood of a scientific collaboration 2 years later is increased by almost 7%. These results support the importance of mentoring in implementation science specifically and team science more generally. Mentoring relationships were established quickly and early by the IRI core faculty. 
IRI fellows reported increasing scientific collaboration of all types over time, including starting new research, submitting new grants, presenting research results, and publishing peer-reviewed papers. Statistical network models demonstrated that mentoring was strongly and significantly related to subsequent scientific collaboration, which supported a core design principle of the IRI. Future work should establish the link between mentoring and scientific productivity. These results may be of interest to team science, as they suggest the importance of mentoring for future team collaborations, as well as illustrate the utility of network analysis for studying team characteristics and activities.

  18. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    PubMed Central

    Hallgren, Kevin A.

    2012-01-01

    Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR. PMID:22833776
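
    The paper's computational examples use SPSS and R; an equivalent Python sketch (synthetic ratings, illustrative only) computes Cohen's kappa for categorical codes and a two-way ICC(2,1) for continuous ratings from ANOVA mean squares.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical categorical ratings from two coders on 12 observations.
coder1 = [0, 1, 1, 2, 2, 0, 1, 2, 0, 1, 2, 2]
coder2 = [0, 1, 1, 2, 1, 0, 1, 2, 0, 1, 2, 2]
kappa = cohen_kappa_score(coder1, coder2)   # ≈ 0.87 (one disagreement)

# Two-way ICC(2,1) (absolute agreement, single rater) for continuous
# ratings, computed from ANOVA mean squares with plain NumPy.
ratings = np.array([[9.0, 8.5], [7.0, 6.5], [5.0, 5.5],
                    [8.0, 8.0], [4.0, 4.5], [6.0, 6.0]])
n, k = ratings.shape
grand = ratings.mean()
ms_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2) / (n - 1)
ms_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2) / (k - 1)
ss_err = np.sum((ratings - ratings.mean(axis=1, keepdims=True)
                 - ratings.mean(axis=0) + grand) ** 2)
ms_err = ss_err / ((n - 1) * (k - 1))
icc_2_1 = (ms_rows - ms_err) / (
    ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)  # ≈ 0.97
```

    Both statistics lie near 1 here because the hypothetical coders agree closely; in practice the obtained IRR should be reported alongside the design (number of coders, fully crossed or not) as the tutorial recommends.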

  19. Teaching and Learning with Individually Unique Exercises

    ERIC Educational Resources Information Center

    Joerding, Wayne

    2010-01-01

    In this article, the author describes the pedagogical benefits of giving students individually unique homework exercises from an exercise template. Evidence from a test of this approach shows statistically significant improvements in subsequent exam performance by students receiving unique problems compared with students who received traditional…

  20. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package covers both statistical situations, confirmatory analysis and exploratory analysis, and is designed to capture statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594

  1. Risk factors for early cytologic abnormalities after loop electrosurgical excision procedure.

    PubMed

    Dietrich, Charles S; Yancey, Michael K; Miyazawa, Kunio; Williams, David L; Farley, John

    2002-02-01

    To evaluate risk factors for early cytologic abnormalities and recurrent cervical dysplasia after loop electrosurgical excision procedure (LEEP). A retrospective analysis was performed of all pathology records for LEEPs performed at our institution from January 1996 through July 1998. Follow-up cytology from 2 through 12 months after LEEP was reviewed. Patients with abnormal cytology were referred for further colposcopic evaluation. Statistical analyses using the chi-square test for trend, proportional hazards models, Fisher exact tests, and life table analysis were performed to identify risk factors for early cytologic abnormalities after LEEP and to determine relative risk of recurrent dysplasia. A total of 298 women underwent LEEP during the study period, and 29% of these had cytologic abnormalities after LEEP. Grade of dysplasia, ectocervical marginal status, endocervical marginal status, and glandular involvement with dysplasia were not found to be independent risk factors for early cytologic abnormalities. However, when risk factors were analyzed cumulatively, the abnormal cytology rate increased from 24% with no risk factors to 67% with three risk factors present (P =.037). Of patients with abnormal cytology after LEEP, 40% developed subsequent dysplasia, and the mean time to diagnosis was approximately 6 months. The relative risk of subsequent dysplasia ranged from a 20% increase to twice the risk if post-LEEP cytology was low-grade squamous intraepithelial lesion or high-grade squamous intraepithelial lesion, respectively. Based on these results, consideration should be given for early colposcopic examination of patients who have evidence of marginal involvement or endocervical glandular involvement with dysplasia. These patients are at increased risk for abnormal cytology and recurrent dysplasia. This initial visit should occur at 6 months, as the mean time to recurrence of dysplasia was 6.5 months.

  2. Super-delta: a new differential gene expression analysis procedure with robust data normalization.

    PubMed

    Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing

    2017-12-21

    Normalization is an important data preparation step in gene expression analyses, designed to remove various systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which will inevitably introduce some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of the global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap of the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigations on the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than other methods. 
As a new pipeline, super-delta provides new insights into the area of differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Implementation on real and simulated datasets demonstrates its decent performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and/or more general between-group comparison problems.
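
    A simplified sketch of the pipeline's core idea, robust (median-based) global normalization followed by gene-wise t-tests, on synthetic data. This is not the authors' exact super-delta algorithm or its modified t-test; it only illustrates why a robust offset estimate resists bias from DEGs and outliers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical expression matrix: rows = genes, columns = samples
# (two groups of 10); the first 20 genes are truly differential.
n_genes = 500
g1 = rng.normal(0, 1, (n_genes, 10))
g2 = rng.normal(0, 1, (n_genes, 10))
g2[:20] += 3.0                                   # true DEGs
g2 += rng.normal(0.5, 0.1, 10)                   # per-sample technical shift

# Robust global normalization: subtract a per-sample median rather than
# the mean, so DEGs and outliers bias the offset estimate less.
g1n = g1 - np.median(g1, axis=0)
g2n = g2 - np.median(g2, axis=0)

# Gene-wise two-sample t-tests on the normalized data.
t, p = stats.ttest_ind(g1n, g2n, axis=1)
n_sig = np.sum(p < 0.001)
```
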

  3. A Generalized Approach for the Interpretation of Geophysical Well Logs in Ground-Water Studies - Theory and Application

    USGS Publications Warehouse

    Paillet, Frederick L.; Crowder, R.E.

    1996-01-01

    Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach bases a decision as to whether quantitative inversion is statistically warranted by formulating an over-determined inversion. 
If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria such as the statistical significance of regressions are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.

  4. Geospatial Characterization of Fluvial Wood Arrangement in a Semi-confined Alluvial River

    NASA Astrophysics Data System (ADS)

    Martin, D. J.; Harden, C. P.; Pavlowsky, R. T.

    2014-12-01

    Large woody debris (LWD) has become universally recognized as an integral component of fluvial systems, and as a result, has become increasingly common as a river restoration tool. However, "natural" processes of wood recruitment and the subsequent arrangement of LWD within the river network are poorly understood. This research used a suite of spatial statistics to investigate longitudinal arrangement patterns of LWD in a low-gradient, Midwestern river. First, a large-scale GPS inventory of LWD, performed on the Big River in the eastern Missouri Ozarks, resulted in over 4,000 logged positions of LWD along seven river segments that covered nearly 100 km of the 237 km river system. A global Moran's I analysis indicates that LWD density is spatially autocorrelated and displays a clustering tendency within all seven river segments (P-value range = 0.000 to 0.054). A local Moran's I analysis identified specific locations along the segments where clustering occurs and revealed that, on average, clusters of LWD density (high or low) spanned 400 m. Spectral analyses revealed that, in some segments, LWD density is spatially periodic. Two segments displayed strong periodicity, while the remaining segments displayed varying degrees of noisiness. Periodicity showed a positive association with gravel bar spacing and meander wavelength, although there were insufficient data to statistically confirm the relationship. A wavelet analysis was then performed to investigate periodicity relative to location along the segment. The wavelet analysis identified significant (α = 0.05) periodicity at discrete locations along each of the segments. Those reaches yielding strong periodicity showed stronger relationships between LWD density and the geomorphic/riparian independent variables tested. Analyses consistently identified valley width and sinuosity as being associated with LWD density. 
The results of these analyses contribute a new perspective on the longitudinal distribution of LWD in a river system, which should help identify physical and/or riparian control mechanisms of LWD arrangement and support the development of models of LWD arrangement. Additionally, the spatial statistical tools presented here have shown to be valuable for identifying longitudinal patterns in river system components.
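
    Global Moran's I, the clustering statistic used above, is straightforward to compute directly; the reach densities and adjacency weights below are synthetic stand-ins for the Big River LWD inventory, with clustering simulated by smoothing random noise along the channel.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a sequence of values with a spatial weight matrix."""
    z = values - values.mean()
    num = len(values) * np.sum(weights * np.outer(z, z))
    den = np.sum(weights) * np.sum(z**2)
    return num / den

rng = np.random.default_rng(5)

# Hypothetical LWD densities for 50 consecutive reaches; spatial
# autocorrelation is induced by a moving average along the channel.
raw = rng.random(60)
density = np.convolve(raw, np.ones(11) / 11, mode="valid")  # 50 reaches

# Rook-style adjacency along the channel: each reach neighbours the next.
n = len(density)
w = np.zeros((n, n))
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0

i_obs = morans_i(density, w)   # > 0 indicates clustering of LWD density
```

    Values of Moran's I near +1 indicate strong clustering, near 0 spatial randomness, and negative values dispersion; significance is usually assessed by permutation of the reach values.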

  5. An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.

    2016-12-01

    QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder, and institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows applied to uniformly resampled array data with a variety of lengths and overlap. Data variance (a proxy for energy) is calculated on each window and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.) and the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse-counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15 km radius. 
Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
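
    The pulse-isolation step, an STA/LTA filter on a windowed-variance series with threshold flagging, can be sketched as follows. The series is synthetic and the function name, window lengths, and threshold are illustrative choices, not QuakeFinder's implementation.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Short-term average / long-term average ratio on a non-negative series."""
    sta = np.convolve(x, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(x, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)   # guard against division by zero

rng = np.random.default_rng(2)

# Hypothetical windowed-variance series from one magnetometer channel,
# with an injected pulse of elevated energy at samples 600-609.
variance = rng.random(1000) * 0.1
variance[600:610] += 5.0

ratio = sta_lta(variance, n_sta=5, n_lta=100)
flagged = np.flatnonzero(ratio > 4.0)   # candidate pulse intervals
```

    Flagged intervals would then go to the feature-extraction and PCA clustering stages described in the abstract.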

  6. Risk factors associated with refractoriness to esophageal dilatation for benign dysphagia.

    PubMed

    Rodrigues-Pinto, Eduardo; Pereira, Pedro; Ribeiro, Armando; Lopes, Susana; Moutinho-Ribeiro, Pedro; Silva, Marco; Peixoto, Armando; Gaspar, Rui; Macedo, Guilherme

    2016-06-01

    Benign esophageal strictures need repeated dilatations to relieve dysphagia. Literature is scarce on the risk factors for refractoriness of these strictures. This study aimed to assess the risk factors associated with refractory strictures. This is a retrospective study of patients with benign esophageal strictures who were referred for esophageal dilatation over a period of 3 years. A total of 327 esophageal dilatations were performed in 103 patients; 53% of the patients reported dysphagia for liquids. Clinical success was achieved in 77% of the patients. There was a need for further dilatations in 54% of patients, being more frequent in patients with dysphagia for liquids [78 vs. 64%, P=0.008, odds ratio (OR) 1.930], in those with caustic strictures (89 vs. 70%, P=0.007, OR 3.487), and in those with complex strictures (83 vs. 70%, P=0.047, OR 2.132). Caustic strictures, peptic strictures, and complex strictures showed statistical significance in the multivariate analysis. Time until subsequent dilatations was less in patients with dysphagia for liquids (49 vs. 182 days, P<0.001), in those with peptic strictures (49 vs. 98 days, P=0.004), in those with caustic strictures (49 vs. 78 days, P=0.005), and in patients with complex strictures (47 vs. 80 days P=0.009). In multivariate analysis, further dilatations occurred earlier in patients with dysphagia for liquids [hazard ratio (HR) 1.506, P=0.004], in those with peptic strictures (HR 1.644, P=0.002), in those with caustic strictures (HR 1.581, P=0.016), and in patients with complex strictures (HR 1.408, P=0.046). Caustic, peptic, and complex strictures were associated with a greater need for subsequent dilatations. Time until subsequent dilatations was less in patients with dysphagia for liquids and in those with caustic, peptic, and complex strictures.

  7. First-cycle blood counts and subsequent neutropenia, dose reduction, or delay in early-stage breast cancer therapy.

    PubMed

    Silber, J H; Fridman, M; DiPaola, R S; Erder, M H; Pauly, M V; Fox, K R

    1998-07-01

    If patients could be ranked according to their projected need for supportive care therapy, then more efficient and less costly treatment algorithms might be developed. This work reports on the construction of a model of neutropenia, dose reduction, or delay that rank-orders patients according to their need for costly supportive care such as granulocyte growth factors. A case series and consecutive sample of patients treated for breast cancer were studied. Patients had received standard-dose adjuvant chemotherapy for early-stage nonmetastatic breast cancer and were treated by four medical oncologists. Using 95 patients and validated with 80 additional patients, development models were constructed to predict one or more of the following events: neutropenia (absolute neutrophil count [ANC] < or = 250/microL), dose reduction > or = 15% of that scheduled, or treatment delay > or = 7 days. Two approaches to modeling were attempted. The pretreatment approach used only pretreatment predictors such as chemotherapy regimen and radiation history; the conditional approach included, in addition, blood count information obtained in the first cycle of treatment. The pretreatment model was unsuccessful at predicting neutropenia, dose reduction, or delay (c-statistic = 0.63). Conditional models were good predictors of subsequent events after cycle 1 (c-statistic = 0.87 and 0.78 for development and validation samples, respectively). The depth of the first-cycle ANC was an excellent predictor of events in subsequent cycles (P = .0001 to .004). Chemotherapy plus radiation also increased the risk of subsequent events (P = .0011 to .0901). Decline in hemoglobin (HGB) level during the first cycle of therapy was a significant predictor of events in the development study (P = .0074 and .0015), and although the trend was similar in the validation study, HGB decline failed to reach statistical significance. 
It is possible to rank patients according to their need for supportive care based on blood counts observed in the first cycle of therapy. Such rankings may aid in the choice of appropriate supportive care for patients with early-stage breast cancer.
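A c-statistic like the 0.87 and 0.78 reported above is the probability that the model assigns a higher predicted risk to a patient who had an event than to a patient who did not. A minimal sketch of the computation, using hypothetical risk scores rather than the study's data:

```python
from itertools import product

def c_statistic(event_scores, nonevent_scores):
    """Concordance (c-statistic): fraction of event/non-event pairs in
    which the event patient received the higher predicted risk;
    tied scores count as half-concordant."""
    pairs = concordant = 0.0
    for e, n in product(event_scores, nonevent_scores):
        pairs += 1
        if e > n:
            concordant += 1.0
        elif e == n:
            concordant += 0.5
    return concordant / pairs

# Hypothetical predicted risks, not data from the study
with_event = [0.90, 0.80, 0.60, 0.55]
without_event = [0.50, 0.45, 0.40, 0.30, 0.20]
auc = c_statistic(with_event, without_event)  # 1.0: perfect separation here
```

A value of 0.5 corresponds to chance-level discrimination and 1.0 to perfect ranking; the conditional model's 0.87 sits well toward the latter.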

  8. Pitfalls in chronobiology: a suggested analysis using intrathecal bupivacaine analgesia as an example.

    PubMed

    Shafer, Steven L; Lemmer, Bjoern; Boselli, Emmanuel; Boiste, Fabienne; Bouvet, Lionel; Allaouchiche, Bernard; Chassard, Dominique

    2010-10-01

    The duration of analgesia from epidural administration of local anesthetics to parturients has been shown to follow a rhythmic pattern according to the time of drug administration. We studied whether there was a similar pattern after intrathecal administration of bupivacaine in parturients. In the course of the analysis, we came to believe that some data points coincident with provider shift changes were influenced by nonbiological, health care system factors, thus incorrectly suggesting a periodic signal in duration of labor analgesia. We developed graphical and analytical tools to help assess the influence of individual points on the chronobiological analysis. Women with singleton term pregnancies in vertex presentation, cervical dilation 3 to 5 cm, pain score >50 mm (of 100 mm), and requesting labor analgesia were enrolled in this study. Patients received 2.5 mg of intrathecal bupivacaine in 2 mL using a combined spinal-epidural technique. Analgesia duration was the time from intrathecal injection until the first request for additional analgesia. The duration of analgesia was analyzed by visual inspection of the data, application of smoothing functions (Supersmoother; LOWESS and LOESS [locally weighted scatterplot smoothing functions]), analysis of variance, Cosinor (Chronos-Fit), Excel, and NONMEM (nonlinear mixed effect modeling). Confidence intervals (CIs) were determined by bootstrap analysis (1000 replications with replacement) using PLT Tools. Eighty-two women were included in the study. Examination of the raw data using 3 smoothing functions revealed a bimodal pattern, with a peak at approximately 0630 and a subsequent peak in the afternoon or evening, depending on the smoother. Analysis of variance did not identify any statistically significant difference between the duration of analgesia when intrathecal injection was given from midnight to 0600 compared with the duration of analgesia after intrathecal injection at other times. 
Chronos-Fit, Excel, and NONMEM produced identical results, with a mean duration of analgesia of 38.4 minutes (95% CI: 35.4-41.6 minutes), an 8-hour periodic waveform with an amplitude of 5.8 minutes (95% CI: 2.1-10.7 minutes), and a phase offset of 6.5 hours (95% CI: 5.4-8.0 hours) relative to midnight. The 8-hour periodic model did not reach statistical significance in 40% of bootstrap analyses, implying that statistical significance of the 8-hour periodic model was dependent on a subset of the data. Two data points before the change of shift at 0700 contributed most strongly to the statistical significance of the periodic waveform. Without these data points, there was no evidence of an 8-hour periodic waveform for intrathecal bupivacaine analgesia. Chronobiology includes the influence of external daily rhythms in the environment (e.g., nursing shifts) as well as human biological rhythms. We were able to distinguish the influence of an external rhythm by combining several novel analyses: (1) graphical presentation superimposing the raw data, external rhythms (e.g., nursing and anesthesia provider shifts), and smoothing functions; (2) graphical display of the contribution of each data point to the statistical significance; and (3) bootstrap analysis to identify whether the statistical significance was highly dependent on a data subset. These approaches suggested that 2 data points were likely artifacts of the change in nursing and anesthesia shifts. When these points were removed, there was no suggestion of biological rhythm in the duration of intrathecal bupivacaine analgesia.
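A cosinor fit of fixed period, as performed here by Chronos-Fit, Excel, and NONMEM, reduces to ordinary least squares because y = M + A·cos(w(t - phi)) is linear in cos(wt) and sin(wt). A sketch of that idea on synthetic data built from the reported values (38.4 min mesor, 5.8 min amplitude, 6.5 h offset); this is not the study's dataset or its exact software pipeline:

```python
import math
import numpy as np

def fit_cosinor(t_hours, y, period=8.0):
    """Least-squares cosinor fit of y = M + a*cos(w t) + b*sin(w t),
    with w = 2*pi/period. Returns mesor M, amplitude, and phase
    offset (hours) of the cosine peak."""
    w = 2 * math.pi / period
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours),
                         np.sin(w * t_hours)])
    M, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a) / w % period
    return M, amplitude, phase

# Synthetic analgesia durations with a known 8-hour rhythm (illustrative only)
t = np.arange(0.0, 24.0, 0.5)
y = 38.4 + 5.8 * np.cos(2 * math.pi * (t - 6.5) / 8.0)
mesor, amp, phase = fit_cosinor(t, y)  # recovers 38.4, 5.8, 6.5
```

Bootstrapping such a fit, as the authors did, simply repeats this estimation on resampled data and inspects how often the amplitude term survives.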

  9. Analysis of Climatic and Environmental Changes Using CLEARS Web-GIS Information-Computational System: Siberia Case Study

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.

    2011-12-01

    Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using a specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, a computational kernel, a specialized web portal implementing web mapping application logic, and a graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing, and visualization. At present a number of georeferenced datasets are available for processing, including two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 and ERA Interim Reanalyses, meteorological observation data for the territory of the former USSR, and others. First, using the computational kernel's approved statistical methods, it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of the 20th and beginning of the 21st centuries are provided by the ERA-40/ERA Interim Reanalysis and the APHRODITE JMA Reanalysis, respectively; these reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed with the developed information-computational system reveals fine spatial and temporal details in heterogeneous patterns obtained for the region earlier. Dynamics of bioclimatic indices determining climate change impact on the structure and functioning of regional vegetation cover was investigated as well. 
Analysis shows significant positive trends of growing season length accompanied by statistically significant increase of sum of growing degree days and total annual precipitation over the south of Western Siberia. In particular, we conclude that analysis of trends of growing season length, sum of growing degree-days and total precipitation during the growing season reveals a tendency to an increase of vegetation ecosystems productivity across the south of Western Siberia (55°-60°N, 59°-84°E) in the past several decades. The developed system functionality providing instruments for comparison of modeling and observational data and for reliable climatological analysis allowed us to obtain new results characterizing regional manifestations of global change. It should be added that each analysis performed using the system leads also to generation of the archive of spatio-temporal data fields ready for subsequent usage by other specialists. In particular, the archive of bioclimatic indices obtained will allow performing further detailed studies of interrelations between local climate and vegetation cover changes, including changes of carbon uptake related to variations of types and amount of vegetation and spatial shift of vegetation zones. This work is partially supported by RFBR grants #10-07-00547 and #11-05-01190-a, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7.

  10. Damage localization by statistical evaluation of signal-processed mode shapes

    NASA Astrophysics Data System (ADS)

    Ulriksen, M. D.; Damkilde, L.

    2015-07-01

    Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the spatial mode shape signals, hereby potentially facilitating damage detection and/or localization. However, by being based on distinguishing damage-induced discontinuities from other signal irregularities, an intrinsic deficiency in these methods is the high sensitivity towards measurement noise. The present article introduces a damage localization method which, compared to the conventional mode shape-based methods, has greatly enhanced robustness towards measurement noise. The method is based on signal processing of spatial mode shapes by means of continuous wavelet transformation (CWT) and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.

  11. Analysis of data on large explosive eruptions of stratovolcanoes to constrain under-recording and eruption rates

    NASA Astrophysics Data System (ADS)

    Rougier, Jonty; Cashman, Kathy; Sparks, Stephen

    2016-04-01

    We have analysed the Large Magnitude Explosive Volcanic Eruptions database (LaMEVE) for volcanoes that classify as stratovolcanoes. A non-parametric statistical approach is used to assess the global recording rate for large (M4+) eruptions. The approach imposes minimal structure on the shape of the recording rate through time. We find that the recording rates have declined rapidly, going backwards in time. Prior to 1600 they are below 50%, and prior to 1100 they are below 20%. Even in the recent past, e.g. the 1800s, they are likely to be appreciably less than 100%. The assessment for very large (M5+) eruptions is more uncertain, due to the scarcity of events. Having taken under-recording into account, the large-eruption rates of stratovolcanoes are modelled exchangeably, in order to derive an informative prior distribution as an input into a subsequent volcano-by-volcano hazard assessment. The statistical model implies that volcano-by-volcano predictions can be grouped by the number of recorded large eruptions. Further, it is possible to combine all volcanoes together into a global large-eruption prediction, with an M4+ rate computed from the LaMEVE database of 0.57/yr.
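The effect of an under-recording correction on a rate estimate can be seen with a much cruder calculation than the paper's non-parametric model: if only a fraction r of eruptions in an era is recorded, the expected true count is the recorded count divided by r. A sketch with hypothetical counts and assumed recording rates loosely echoing the percentages quoted above (not the LaMEVE data or the authors' estimator):

```python
def corrected_eruption_rate(periods):
    """periods: (recorded_count, recording_rate, n_years) per era.
    Scales each era's recorded count by 1/recording_rate, then divides
    the summed true counts by the total number of years."""
    true_count = sum(c / r for c, r, _ in periods)
    years = sum(n for _, _, n in periods)
    return true_count / years

# Hypothetical recorded M4+ counts with assumed recording rates per era
periods = [
    (20, 0.15, 500),   # 1100-1600: recording well below 50%
    (60, 0.45, 300),   # 1600-1900: recording below 50%
    (55, 0.90, 115),   # 1900-2015: high, but below 100%
]
rate = corrected_eruption_rate(periods)  # eruptions per year, ~0.36
```

The correction matters most for the oldest eras, where dividing by a small recording rate inflates the sparse recorded counts substantially.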

  12. Statistical analysis of environmental variability within the CELSS breadboard project's biomass production chamber

    NASA Technical Reports Server (NTRS)

    Stutte, G. W.; Chetirkin, P. V.; Mackowiak, C. L.; Fortson, R. E.

    1993-01-01

    Variability in the aerial and root environments of NASA's Breadboard Project's Biomass Production Chamber (BPC) was determined. Data from two lettuce and two potato growouts were utilized. One growout of each crop was conducted prior to separating the upper and lower chambers; the other was subsequent to separation. There were little or no differences in pH, EC, or solution temperature between the upper and lower chamber or within a chamber. Variation in the aerial environment within a chamber was two to three times greater than variation between chambers for air temperature, relative humidity, and PPF. High variability in air velocity, relative to tray position, was observed. Separating the BPC had no effect on PPF, air velocity, solution temperature, pH, or EC. Separation reduced the gradient in air temperature and relative humidity between the upper and lower chambers, but increased the variability within a chamber. Variation between upper and lower chambers was within 5 percent of environmental set-points and of little or no physiological significance. In contrast, the variability within a chamber limits the capability of the BPC to generate statistically reliable data from individual tray treatments at this time.

  13. Sample size determination in combinatorial chemistry.

    PubMed Central

    Zhao, P L; Zambias, R; Bolognese, J A; Boulton, D; Chapman, K

    1995-01-01

    Combinatorial chemistry is gaining wide appeal as a technique for generating molecular diversity. Among the many combinatorial protocols, the split/recombine method is quite popular and particularly efficient at generating large libraries of compounds. In this process, polymer beads are equally divided into a series of pools and each pool is treated with a unique fragment; then the beads are recombined, mixed to uniformity, and redivided equally into a new series of pools for the subsequent couplings. The deviation from the ideal equimolar distribution of the final products is assessed by a special overall relative error, which is shown to be related to the Pearson statistic. Although the split/recombine sampling scheme is quite different from those used in analysis of categorical data, the Pearson statistic is shown to still follow a χ2 distribution. This result allows us to derive the required number of beads such that, with 99% confidence, the overall relative error is controlled to be less than a pre-specified tolerable limit L1. In this paper, we also discuss another criterion, which determines the required number of beads so that, with 99% confidence, all individual relative errors are controlled to be less than a pre-specified tolerable limit L2 (0 < L2 < 1). PMID:11607586
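The Pearson statistic referred to above compares observed pool counts against the ideal equimolar split. A small simulation sketch (illustrative bead and pool numbers, not the paper's derivation of the required bead count):

```python
import random

def pearson_statistic(counts):
    """Pearson chi-square statistic for k pool counts against the
    ideal equimolar split of n = sum(counts) beads into k pools."""
    n, k = sum(counts), len(counts)
    expected = n / k
    return sum((c - expected) ** 2 / expected for c in counts)

# Simulate one random division of n beads into k pools
rng = random.Random(0)
n_beads, k_pools = 10_000, 20
counts = [0] * k_pools
for _ in range(n_beads):
    counts[rng.randrange(k_pools)] += 1

stat = pearson_statistic(counts)
# Under uniform mixing, stat follows a chi-square distribution with
# k - 1 = 19 degrees of freedom (99th percentile ~36.2), so values far
# beyond that would signal systematically unequal pools.
```

The paper's sample-size result runs this logic in reverse: choose n large enough that, with 99% confidence, the relative error implied by the chi-square quantile stays below the tolerable limit.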

  14. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    PubMed

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
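The core RANSAC idea is to repeatedly fit a model to a minimal random sample of the data and keep the model that attracts the most inliers, treating everything else as noise. A toy line-fitting sketch of that loop (not the authors' QSAR workflow):

```python
import random

def ransac_line(points, n_iter=200, tol=1.0, seed=0):
    """Minimal RANSAC sketch: fit y = m*x + c to 2-D points while
    ignoring gross outliers. Each iteration fits an exact line through
    two random points and counts how many points fall within tol."""
    rng = random.Random(seed)
    best_inliers, best_model = [], None
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair, cannot express as y = m*x + c
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        inliers = [(x, y) for x, y in points if abs(y - (m * x + c)) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (m, c)
    return best_model, best_inliers

# Ten points exactly on y = 2x + 1, plus two gross outliers
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -25)]
model, inliers = ransac_line(pts)  # recovers (2.0, 1.0) and the 10 true points
```

In the QSAR setting described above, the "model" would be a multivariate regression and the inlier set would define both the cleaned training data and the applicability domain.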

  15. Prognostic value of heart rate turbulence for risk assessment in patients with unstable angina and non-ST elevation myocardial infarction

    PubMed Central

    Harris, Patricia RE; Stein, Phyllis K; Fung, Gordon L; Drew, Barbara J

    2013-01-01

    Background We sought to examine the prognostic value of heart rate turbulence derived from electrocardiographic recordings initiated in the emergency department for patients with non-ST elevation myocardial infarction (NSTEMI) or unstable angina. Methods Twenty-four-hour Holter recordings were started in patients with cardiac symptoms approximately 45 minutes after arrival in the emergency department. Patients subsequently diagnosed with NSTEMI or unstable angina who had recordings with ≥18 hours of sinus rhythm and sufficient data to compute Thrombolysis In Myocardial Infarction (TIMI) risk scores were chosen for analysis (n = 166). Endpoints were emergent re-entry to the cardiac emergency department and/or death at 30 days and one year. Results In Cox regression models, heart rate turbulence and TIMI risk scores together were significant predictors of 30-day (model chi square 13.200, P = 0.001, C-statistic 0.725) and one-year (model chi square 31.160, P < 0.001, C-statistic 0.695) endpoints, outperforming either measure alone. Conclusion Measurement of heart rate turbulence, initiated upon arrival at the emergency department, may provide additional incremental value in the risk assessment for patients with NSTEMI or unstable angina. PMID:23976860

  16. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Hun C.; Fang, Ho T.

    1987-01-01

    The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated fractional factorial (2^5) statistically designed matrix experiments were conducted. These experiments have identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) with 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).

  17. CAPACITY BUILDING PROCESS IN ENVIRONMENTAL AND HEALTH IMPACT ASSESSMENT FOR A THAI COMMUNITY.

    PubMed

    Chaithui, Suthat; Sithisarankul, Pornchai; Hengpraprom, Sarunya

    2017-03-01

    This research aimed at exploring the development of the capacity-building process in environmental and health impact assessment, including the consideration of subsequent capacity-building achievements. Data were gathered through questionnaires, participatory observations, in-depth interviews, focus group discussions, and capacity-building checklist forms. These data were analyzed using content analysis, descriptive statistics, and inferential statistics. Our study used the components of the final draft for capacity-building processes, consisting of ten steps that were formulated by synthesis from each respective process. Additionally, the evaluation of capacity-building levels was performed using 10-item evaluation criteria for nine communities. The results indicated that the communities performed well under these criteria. Finally, exploration of the factors influencing capacity building in environmental and health impact assessment indicated that learning by community members through knowledge exchange via activities and study visits was the most influential factor in the capacity-building process. The final revised version of the capacity-building process in environmental and health impact assessment could serve as a basis for the consideration of interventions in similar areas, so as to increase capacity in environmental and health impact assessment.

  18. A statistical metadata model for clinical trials' data management.

    PubMed

    Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos

    2009-08-01

    We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate during the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is that we focus on presenting the structural inter-relation of these classes when used during datasets manipulation by proposing certain transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human errors and allows for the tracking of all changes made during datasets lifecycle. The explicit modeling of processing steps improves data quality and assists in the problem of handling data collected in different clinical trials. The case study illustrates the applicability of the proposed framework demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we provide the main considerations for implementing the proposed framework into a modern Metadata-enabled Information System.

  19. Approach toward enhancement of halophilic protease production by Halobacterium sp. strain LBU50301 using statistical design response surface methodology.

    PubMed

    Chuprom, Julalak; Bovornreungroj, Preeyanuch; Ahmad, Mehraj; Kantachote, Duangporn; Dueramae, Sawitree

    2016-06-01

    A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on the Plackett-Burman (PB) experimental design, gelatin, MgSO4·7H2O, NaCl and pH significantly influenced halophilic protease production. Central composite design (CCD) determined the optimum level of medium components. Subsequently, an 8.78-fold increase in the corresponding halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model, and the results showed the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield was achieved using a 3 L laboratory fermenter and the optimized medium (231.33 U/mL).

  20. Pre-hemorrhage statin use and the risk of vasospasm following aneurysmal subarachnoid hemorrhage

    PubMed Central

    Moskowitz, Shaye I.; Ahrens, Christine; Provencio, J Javier; Chow, Michael; Rasmussen, Peter A

    2010-01-01

    Background and Purpose Aneurysmal subarachnoid hemorrhage (SAH) is often followed by delayed ischemic deficits attributable to cerebral vasospasm. Recent studies suggest a positive impact of statin therapy on the incidence of vasospasm. This study was designed to assess whether a history of prior use of statin therapy was associated with a lower risk of vasospasm in patients with SAH. Methods We performed a comprehensive retrospective review of patients with aneurysmal SAH between 1997 and 2004. Clinical demographics and imaging data for all patients were reviewed and a logistic regression analysis was performed to identify the predictors of cerebral vasospasm, defined as a combination of clinical signs with radiographic confirmation. Results 308 patients were included. Mean age was higher in the group receiving statins (64 +/- 12 versus 54 +/- 12 years). Hunt and Hess scores and treatment modality were not significantly different between the groups. Vasospasm was observed in 31% of patients not taking a statin (n=282) versus 23% taking a statin (n=26), without achieving statistical significance. Discontinuation of the statin did not affect risk of vasospasm. Conclusions Use of a statin prior to aneurysmal SAH showed a trend toward a lower incidence of subsequent vasospasm, without reaching statistical significance. PMID:18423529

  1. Using transportation accident databases to investigate ignition and explosion probabilities of flammable spills.

    PubMed

    Ronza, A; Vílchez, J A; Casal, J

    2007-07-19

    Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, make use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails the definition of probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of the survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations are proposed based on statistical data that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results with maritime scenarios assessment. Results are discussed and compared with previous technical literature.

  2. Skin injury model classification based on shape vector analysis

    PubMed Central

    2012-01-01

    Background: Skin injuries can be crucial in judicial decision making. Forensic experts base their classification on subjective opinions. This study investigates whether known classes of simulated skin injuries are correctly classified statistically based on 3D surface models and derived numerical shape descriptors. Methods: Skin injury surface characteristics are simulated with plasticine. Six injury classes (abrasions, incised wounds, gunshot entry wounds, smooth and textured strangulation marks, and patterned injuries) with 18 instances each are used for a k-fold cross-validation with six partitions. Deformed plasticine models are captured with a 3D surface scanner. Mean curvature is estimated for each polygon surface vertex. Subsequently, distance distributions and derived aspect ratios, convex hulls, concentric spheres, hyperbolic points and Fourier transforms are used to generate 1284-dimensional shape vectors. Subsequent descriptor reduction maximizing SNR (signal-to-noise ratio) results in an average of 41 descriptors (varying across k-folds). With a non-normal multivariate distribution of heteroskedastic data, the requirements for LDA (linear discriminant analysis) are not met. Thus, the shrinkage parameters of RDA (regularized discriminant analysis) are optimized, yielding the best performance with λ = 0.99 and γ = 0.001. Results: Receiver Operating Characteristic analysis of a descriptive RDA yields an ideal Area Under the Curve of 1.0 for all six categories. Predictive RDA results in an average CRR (correct recognition rate) of 97.22% under a six-partition k-fold. Adding uniform noise within the range of one standard deviation degrades the average CRR to 71.3%. Conclusions: Digitized 3D surface shape data can be used to automatically classify idealized shape models of simulated skin injuries. 
Deriving some well-established descriptors such as histograms, the saddle shape of hyperbolic points, or convex hulls, with subsequent reduction of dimensionality while maximizing SNR, seems to work well for the data at hand, as predictive RDA results in a CRR of 97.22%. An objective basis for discriminating non-overlapping hypotheses or categories is a major issue in medicolegal skin injury analysis, and that is where this method appears to be strong. Technical surface quality is important in that adding noise clearly degrades the CRR. Trial registration: This study does not cover the results of a controlled health care intervention as only plasticine was used. Thus, there was no trial registration. PMID:23497357
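The correct recognition rate under k-fold cross-validation is simply the fraction of held-out samples classified correctly, pooled over folds. A minimal sketch of that bookkeeping (the RDA classifier itself is omitted; the sample counts mirror the study's 6 classes x 18 instances and six partitions):

```python
import random

def kfold_indices(n, k, seed=0):
    """Shuffle n sample indices and deal them round-robin into k
    disjoint folds, so fold sizes differ by at most one."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def correct_recognition_rate(y_true, y_pred):
    """CRR: fraction of samples whose predicted class matches the truth."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# 6 classes x 18 instances = 108 samples split into 6 folds of 18
folds = kfold_indices(108, 6)
sizes = [len(f) for f in folds]

# Toy labels: 5 of 6 held-out predictions correct
crr = correct_recognition_rate(list("abcabc"), list("abcabb"))
```

In the study, training and predicting within each fold and averaging this quantity yields the reported 97.22%.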

  3. Statistical word learning in children with autism spectrum disorder and specific language impairment.

    PubMed

    Haebig, Eileen; Saffran, Jenny R; Ellis Weismer, Susan

    2017-11-01

    Word learning is an important component of language development that influences child outcomes across multiple domains. Despite the importance of word knowledge, word-learning mechanisms are poorly understood in children with specific language impairment (SLI) and children with autism spectrum disorder (ASD). This study examined underlying mechanisms of word learning, specifically, statistical learning and fast-mapping, in school-aged children with typical and atypical development. Statistical learning was assessed through a word segmentation task and fast-mapping was examined in an object-label association task. We also examined children's ability to map meaning onto newly segmented words in a third task that combined exposure to an artificial language and a fast-mapping task. Children with SLI had poorer performance on the word segmentation and fast-mapping tasks relative to the typically developing and ASD groups, who did not differ from one another. However, when children with SLI were exposed to an artificial language with phonemes used in the subsequent fast-mapping task, they successfully learned more words than in the isolated fast-mapping task. There was some evidence that word segmentation abilities are associated with word learning in school-aged children with typical development and ASD, but not SLI. Follow-up analyses also examined performance in children with ASD who did and did not have a language impairment. Children with ASD with language impairment evidenced intact statistical learning abilities, but subtle weaknesses in fast-mapping abilities. As the Procedural Deficit Hypothesis (PDH) predicts, children with SLI have impairments in statistical learning. However, children with SLI also have impairments in fast-mapping. Nonetheless, they are able to take advantage of additional phonological exposure to boost subsequent word-learning performance. 
In contrast to the PDH, children with ASD appear to have intact statistical learning, regardless of language status; however, fast-mapping abilities differ according to broader language skills. © 2017 Association for Child and Adolescent Mental Health.

  4. Statistical Analysis of Research Data | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data.  The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.

  5. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied and the use of statistical analysis programmes help determine research activity profile and trends in the country. In this descriptive study, all original articles published by Journal of Pakistan Medical Association (JPMA) and Journal of the College of Physicians and Surgeons Pakistan (JCPSP), in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and use of statistical software programme SPSS to be the most common study design, inferential statistical analysis, and statistical analysis software programmes, respectively. These results echo previously published assessment of these two journals for the year 2014.

  6. Trimethoprim prescription and subsequent resistance in childhood urinary infection: multilevel modelling analysis

    PubMed Central

    Duffy, Mary A; Hernandez-Santiago, Virginia; Orange, Gillian; Davey, Peter G; Guthrie, Bruce

    2013-01-01

    Background Antibiotic resistance is a growing concern and antibiotic usage the main contributing factor, but there are few studies examining antibiotic use and resistance in children. Aim To investigate the association between previous trimethoprim prescribing and resistance in urinary Escherichia coli (E. coli) isolates in children. Design and setting Retrospective, population cohort study in Tayside, Scotland. Method Multilevel modelling of linked microbiology and dispensed prescribing data for 1373 ≤16-year-olds with E. coli urinary isolates in 2004–2009, examining the association between prior trimethoprim prescription and subsequent trimethoprim resistance in people with urinary E. coli isolates. Results Trimethoprim resistance was common (26.6%, 95% confidence interval [CI] = 24.6 to 28.6). Previous trimethoprim prescription was associated with subsequent culture of trimethoprim-resistant E. coli, with more recent prescription being more strongly associated with resistance. After adjusting for the number of previous E. coli isolates and sample year, trimethoprim prescribing in the previous 84 days remained significantly associated with culturing trimethoprim-resistant E. coli (adjusted OR 4.71, 95% CI = 1.83 to 12.16 for the previous 15–28 days versus never prescribed; adjusted OR 3.16, 95% CI = 1.63 to 6.13 for the previous 29–84 days); however, associations were not statistically significant for longer periods since prior exposure. Conclusion Trimethoprim prescription has implications for future resistance in individual children, as well as at population level. Clinicians must ensure appropriateness of treatment choice and duration, and alternative antibiotics should be considered for childhood urinary tract infections if trimethoprim has been prescribed in the preceding 3 months. PMID:23540479
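    The unadjusted association in a study like this can be summarized from a 2x2 exposure-by-resistance table with a Wald odds ratio and confidence interval; a minimal sketch with invented counts (the paper's adjusted ORs come from multilevel models, which this does not attempt to reproduce):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and CI from a 2x2 table:
    a = exposed & resistant,   b = exposed & susceptible,
    c = unexposed & resistant, d = unexposed & susceptible."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts, for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(20, 30, 40, 240)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 4.0 2.07 7.72
```

    An interval excluding 1 corresponds to a statistically significant association, as in the abstract's 15-28-day exposure window.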

  7. Trimethoprim prescription and subsequent resistance in childhood urinary infection: multilevel modelling analysis.

    PubMed

    Duffy, Mary A; Hernandez-Santiago, Virginia; Orange, Gillian; Davey, Peter G; Guthrie, Bruce

    2013-04-01

    Antibiotic resistance is a growing concern and antibiotic usage the main contributing factor, but there are few studies examining antibiotic use and resistance in children. To investigate the association between previous trimethoprim prescribing and resistance in urinary Escherichia coli (E. coli) isolates in children. Retrospective, population cohort study in Tayside, Scotland. Multilevel modelling of linked microbiology and dispensed prescribing data for 1373 ≤16-year-olds with E. coli urinary isolates in 2004-2009, examining the association between prior trimethoprim prescription and subsequent trimethoprim resistance in people with urinary E. coli isolates. Trimethoprim resistance was common (26.6%, 95% confidence interval [CI] = 24.6 to 28.6). Previous trimethoprim prescription was associated with subsequent culture of trimethoprim-resistant E. coli, with more recent prescription being more strongly associated with resistance. After adjusting for the number of previous E. coli isolates and sample year, trimethoprim prescribing in the previous 84 days remained significantly associated with culturing trimethoprim-resistant E. coli (adjusted OR 4.71, 95% CI = 1.83 to 12.16 for the previous 15-28 days versus never prescribed; adjusted OR 3.16, 95% CI = 1.63 to 6.13 for the previous 29-84 days); however, associations were not statistically significant for longer periods since prior exposure. Trimethoprim prescription has implications for future resistance in individual children, as well as at population level. Clinicians must ensure appropriateness of treatment choice and duration, and alternative antibiotics should be considered for childhood urinary tract infections if trimethoprim has been prescribed in the preceding 3 months.

  8. Deformation-induced spatiotemporal fluctuation, evolution and localization of strain fields in a bulk metallic glass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yuan; Bei, Hongbin; Wang, Yanli

    Deformation behavior and local strain evolutions upon loading and unloading of a bulk metallic glass (BMG) were systematically investigated by in situ digital image correlation (DIC). Distinct fluctuations and irreversible local strains were observed before the onset of macroscopic yielding. Statistical analysis shows that these fluctuations might be related to intrinsic structural heterogeneities, and that the evolution history and characteristics of local strain fields play an important role in the subsequent initiation of shear bands. Effects of sample size, pre-strain, and loading conditions were systematically analyzed in terms of the probability distributions of the resulting local strain fields. It is found that a higher degree of local shear strain heterogeneity corresponds to a more ductile stress-strain curve. Implications of these findings are discussed for the design of new materials.

  9. Regularly arranged indium islands on glass/molybdenum substrates upon femtosecond laser and physical vapor deposition processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringleb, F.; Eylers, K.; Teubner, Th.

    2016-03-14

    A bottom-up approach is presented for the production of arrays of indium islands on a molybdenum layer on glass, which can serve as micro-sized precursors for indium compounds such as copper-indium-gallium-diselenide used in photovoltaics. Femtosecond laser ablation of glass and a subsequent deposition of a molybdenum film, or direct laser processing of the molybdenum film, both allow the preferential nucleation and growth of indium islands at the predefined locations in a following indium-based physical vapor deposition (PVD) process. A proper choice of laser and deposition parameters ensures the controlled growth of indium islands exclusively at the laser-ablated spots. Based on a statistical analysis, these results are compared to the non-structured molybdenum surface, which leads to randomly grown indium islands after PVD.

  10. Reliability of excess-flow check-valves in turbine lubrication systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dundas, R.E.

    1996-12-31

    Reliability studies on excess-flow check valves installed in a gas turbine lubrication system for prevention of spray fires subsequent to fracture or separation of lube lines were conducted. Fault-tree analyses are presented for the case of failure of a valve to close when called upon by separation of a downstream line, as well as for the case of accidental closure during normal operation, leading to interruption of lubricating oil flow to a bearing. The probabilities of either of these occurrences are evaluated. The results of a statistical analysis of accidental closure of excess-flow check valves in commercial airplanes in the period 1986-91 are also given, as well as a summary of reliability studies on the use of these valves in residential gas installations, conducted under the sponsorship of the Gas Research Institute.

  11. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  12. Multichannel Spectrometer of Time Distribution

    NASA Astrophysics Data System (ADS)

    Akindinova, E. V.; Babenko, A. G.; Vakhtel, V. M.; Evseev, N. A.; Rabotkin, V. A.; Kharitonova, D. D.

    2015-06-01

    For research on and control of the characteristics of radiation fluxes from radioactive sources (see, for example, paper [1]), a spectrometer and associated methods of data measurement and processing were created, based on the MC-2A multichannel counter (SPC "ASPECT") of the time intervals between randomly occurring events (pulses from a particle detector). The spectrometer has four independent channels for registering the time intervals between pulses, with corresponding amplitude-spectrometric channels that monitor, via the energy spectra, the stationarity of operation of each path from detector to amplifier. Alpha radiation is registered by semiconductor detectors with an energy resolution of 16-30 keV. Using the spectrometer, measurements were made of oscillations in the intensity of the 239Pu alpha-radiation flux, with subsequent autocorrelation statistical analysis of the time series of readings.

  13. Deformation-induced spatiotemporal fluctuation, evolution and localization of strain fields in a bulk metallic glass

    DOE PAGES

    Wu, Yuan; Bei, Hongbin; Wang, Yanli; ...

    2015-05-16

    Deformation behavior and local strain evolutions upon loading and unloading of a bulk metallic glass (BMG) were systematically investigated by in situ digital image correlation (DIC). Distinct fluctuations and irreversible local strains were observed before the onset of macroscopic yielding. Statistical analysis shows that these fluctuations might be related to intrinsic structural heterogeneities, and that the evolution history and characteristics of local strain fields play an important role in the subsequent initiation of shear bands. Effects of sample size, pre-strain, and loading conditions were systematically analyzed in terms of the probability distributions of the resulting local strain fields. It is found that a higher degree of local shear strain heterogeneity corresponds to a more ductile stress-strain curve. Implications of these findings are discussed for the design of new materials.

  14. Search for periodicities near 59 s in the COS-B gamma-ray data of 2CG195+04 (Geminga)

    NASA Technical Reports Server (NTRS)

    Buccheri, R.; Pollock, A. M. T.; Bennett, K.; Bignami, G. F.; Caraveo, P. A.; Hermsen, W.; Mayer-Hasselwander, H. A.; Sacco, B.

    1985-01-01

    The COS-B data relating to five observations in the general direction of Geminga, spanning 6.7 years, were searched for pulsation near 59 s. The SAS-2 indication is not confirmed. An indication of a 59 s pulsation in the gamma-ray emission from 2CG195+04 (Geminga) had been reported. Early analysis of COS-B data supported the result, while later, improved statistics did not confirm it. Subsequently, detection of a 59 s pulsation in the emission from the direction of Geminga at ultra-high gamma-ray and X-ray energies was reported. Geminga was identified with the X-ray source 1E0630+128. The final COS-B data on Geminga, which was observed five times for a total of 214 days, are reported.

  15. Density-dependent blood stage Plasmodium falciparum suppresses malaria super-infection in a malaria holoendemic population.

    PubMed

    Pinkevych, Mykola; Petravic, Janka; Chelimo, Kiprotich; Vulule, John; Kazura, James W; Moormann, Ann M; Davenport, Miles P

    2013-11-01

    Recent studies of Plasmodium berghei malaria in mice show that high blood-stage parasitemia levels inhibit the development of subsequent liver-stage infections. Whether a similar inhibitory effect on liver-stage Plasmodium falciparum by blood-stage infection occurs in humans is unknown. We have analyzed data from a treatment-time-to-infection cohort of children < 10 years of age residing in a malaria holoendemic area of Kenya where people experience a new blood-stage infection approximately every 2 weeks. We hypothesized that if high parasitemia blocked the liver stage, then high levels of parasitemia should be followed by a "skipped" peak of parasitemia. Statistical analysis of "natural infection" field data and stochastic simulation of infection dynamics show that the data are consistent with high P. falciparum parasitemia inhibiting liver-stage parasite development in humans.
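    The skipped-peak hypothesis can be illustrated with a toy deterministic sketch (the study itself used stochastic simulation fitted to field data; the function, threshold, delay, and parasitemia numbers below are all invented for illustration):

```python
def emerged_peaks(inoculation_days, parasitemia, threshold, delay=7):
    """Which liver-stage infections emerge as blood-stage peaks?
    An inoculation at day t is blocked if blood parasitemia at day t
    exceeds `threshold` (the hypothesised density-dependent effect);
    otherwise it produces a peak `delay` days later."""
    peaks = []
    for t in inoculation_days:
        if parasitemia.get(t, 0) <= threshold:
            peaks.append(t + delay)
    return peaks

# invented numbers: new infections every 14 days, one high-parasitemia episode
parasitemia = {0: 100, 14: 50000, 28: 200}   # parasites/uL at inoculation
print(emerged_peaks([0, 14, 28], parasitemia, threshold=10000))
# → [7, 35]  (the day-14 inoculation is "skipped")
```

    Under the hypothesis, high peaks should thus be followed by a gap where an expected peak is missing, which is the pattern the statistical analysis looked for.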

  16. What's Hot and What's Not: Multivariate Statistical Analysis of Ten Labile Trace Elements in H-Chondrite Population Pairs

    NASA Astrophysics Data System (ADS)

    Wolf, S. F.; Lipschutz, M. E.

    1993-07-01

    Dodd et al. [1] found that, from their circumstances of fall, 17 H chondrites ("H Cluster 1") which fell in May, from 1855 to 1895, are distinguishable from other H chondrite falls and apparently derive from a co-orbital stream of meteoroids. From data for 10 moderately to highly labile trace elements (Rb, Ag, Se, Cs, Te, Zn, Cd, Bi, Tl, In), they used two multivariate statistical techniques--linear discriminant analysis and logistic regression--to demonstrate that (1) 13 H Cluster 1 chondrites are compositionally distinguishable from 45 other H chondrite falls, probably because of differences in thermal histories of the meteorites' parent materials; (2) the reality of the compositional differences between the populations of falls is beyond any reasonable statistical doubt; and (3) the compositional differences are inconsistent with the notion that the results reflect analytical bias. We have used these techniques to assess analogous data for various H chondrite populations [2-4], with results listed in Table 1. These data indicate that (1) there is no statistical reason to believe that random populations from Victoria Land, Antarctica, differ compositionally from each other; (2) there is significant statistical reason to believe that the H chondrite population recovered from Victoria Land, Antarctica, differs compositionally from that from Queen Maud Land, Antarctica, and from falls; (3) there is no reason to believe that the H chondrite population recovered from Queen Maud Land, Antarctica, differs compositionally from falls; and (4) these observations can be made from data obtained either by one analyst or by several. These results, coupled with earlier ones [5], demonstrate that trivial explanations cannot account for compositional differences involving labile trace elements in pairs of H chondrite populations. These differences must then reflect differences in the preterrestrial thermal histories of the meteorites' parent materials.
Acceptance of these differences as preterrestrial has led to predictions subsequently verified by others (meteoroid and asteroid stream discoveries, differences in thermoluminescence, or TL). We predict that a TL difference will be seen between the populations of falls defined by Dodd et al. [1]. References: [1] Dodd R. T. et al. (1993) JGR, submitted. [2] Lingner D. W. et al. (1987) GCA, 51, 727-739. [3] Dennison J. E. and Lipschutz M. E. (1987) GCA, 51, 741-754. [4] Wolf S. F. and Lipschutz M. E. (1993) in Advances in Analytical Geochemistry (M. Hyman and M. Rowe, eds.), in press. [5] Wang M.-S. et al. (1992) Meteoritics, 27, 303. [6] Lipschutz M. E. and Samuels S. M. (1991) GCA, 55, 19-47. Table 1, which appears in the hard copy, shows a multivariate statistical analysis of H chondrite population pairs using 10 labile trace elements (number of meteorites in each population in parentheses).
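    A two-class linear discriminant of the kind applied to these trace-element data can be sketched with Fisher's criterion, w = Sw^-1 (m1 - m0); the two-feature implementation and the "concentrations" below are illustrative assumptions, not the authors' ten-element analysis:

```python
def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def fisher_direction(class0, class1):
    """Fisher discriminant direction w = Sw^-1 (m1 - m0) for 2-D samples."""
    m0, m1 = mean(class0), mean(class1)
    s = [[0.0, 0.0], [0.0, 0.0]]          # pooled within-class scatter Sw
    for data, m in ((class0, m0), (class1, m1)):
        for x in data:
            d0, d1 = x[0] - m[0], x[1] - m[1]
            s[0][0] += d0 * d0
            s[0][1] += d0 * d1
            s[1][0] += d1 * d0
            s[1][1] += d1 * d1
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],   # 2x2 inverse of Sw
           [-s[1][0] / det, s[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

def project(w, x):
    return w[0] * x[0] + w[1] * x[1]

# invented "concentrations" of two labile elements in two populations
falls = [(1.0, 2.0), (1.2, 1.8), (0.8, 2.2), (1.1, 2.1)]
finds = [(3.0, 4.0), (3.2, 3.9), (2.9, 4.2), (3.1, 4.1)]
w = fisher_direction(falls, finds)
# the 1-D projection separates the two (made-up) populations completely
print(max(project(w, x) for x in falls) < min(project(w, x) for x in finds))
# → True
```

    Real analyses of this kind also test whether the separation could arise by chance, which is where the "beyond any reasonable statistical doubt" claim comes from.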

  17. A scoping review of spatial cluster analysis techniques for point-event data.

    PubMed

    Fritz, Charles E; Schuurman, Nadine; Robertson, Colin; Lear, Scott

    2013-05-01

    Spatial cluster analysis is a uniquely interdisciplinary endeavour, and so it is important to communicate and disseminate ideas, innovations, best practices and challenges across practitioners, applied epidemiology researchers and spatial statisticians. In this research we conducted a scoping review to systematically search peer-reviewed journal databases for research that has employed spatial cluster analysis methods on individual-level, address location, or x and y coordinate derived data. To illustrate the thematic issues raised by our results, methods were tested using a dataset in which known clusters existed. Point pattern methods, spatial clustering and cluster detection tests, and a locally weighted spatial regression model were most commonly used for individual-level, address location data (n = 29). The spatial scan statistic was the most popular method for address location data (n = 19). Six themes were identified relating to the application of spatial cluster analysis methods and subsequent analyses, which we recommend researchers consider: exploratory analysis, visualization, spatial resolution, aetiology, scale and spatial weights. It is our intention that researchers seeking direction for using spatial cluster analysis methods consider the caveats and strengths of each approach, but also explore the numerous other methods available for this type of analysis. Applied spatial epidemiology researchers and practitioners should give special consideration to applying multiple tests to a dataset. Future research should focus on developing frameworks for selecting appropriate methods and the corresponding spatial weighting schemes.

  18. Making it the Hard Way: Adolescents in the 1980s.

    ERIC Educational Resources Information Center

    Lipsitz, Joan Scheff

    This testimony, prepared for presentation to the Crisis Intervention Task Force of the House Select Committee on Children, Youth, and Families, begins with an overview of statistics on education, employment, and marriage among adolescents. Subsequently, it reviews data on indicators of youth crises: pregnancy, sexuality, substance abuse, smoking,…

  19. "Transfer Shock" or "Transfer Ecstasy?"

    ERIC Educational Resources Information Center

    Nickens, John M.

    The alleged characteristic drop in grade point average (GPA) of transfer students and the subsequent rise in GPA was investigated in this study. No statistically significant difference was found in first term junior year GPA between junior college transfers and native Florida State University students after the variance accounted for by the…

  20. A Model to predict the impact of specification changes on the chloride-induced service life of Virginia bridge decks.

    DOT National Transportation Integrated Search

    2002-01-01

    A model to determine the time to first repair and subsequent rehabilitation of concrete bridge decks exposed to chloride deicer salts that recognizes and incorporates the statistical nature of factors affecting the corrosion process is developed. The...

  1. Innovation & Risk Management Result in Energy and Life-Cycle Savings.

    ERIC Educational Resources Information Center

    Anstrand, David E.; Singh, J. B.

    1999-01-01

    Examines a Pennsylvania school's successful planning, design, and bidding process for acquiring a geothermal heat pump (GHP) system whose efficiency subsequently won an award for environmental excellence. Charts and statistical tables describe the GHP's energy savings. Concluding comments review the lessons learned from the process. (GR)

  2. The Influence of Level of Discrepancy on the Identification of Students with Learning Disabilities.

    ERIC Educational Resources Information Center

    McLeskey, James

    1989-01-01

    Investigation of the relationship between a statistically determined severe discrepancy between expected and actual achievement levels and the subsequent labeling of 733 students as learning disabled found only a slight majority of labeled students manifesting a severe discrepancy, suggesting that this criterion is inconsistently applied in making…

  3. Alfalfa stand length and subsequent crop patterns in the upper Midwestern United States

    USDA-ARS?s Scientific Manuscript database

    To gain perspective on alfalfa (Medicago sativa L.), annual crop rotations in the upper midwestern United States, USDA-National Agricultural Statistics Service (NASS) cropland data layers (CDLs) and USDA-NRCS soil survey layers were combined for six states (North Dakota, South Dakota, Nebraska, Minn...

  4. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

    With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equal technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks.
Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.

  5. Statistical forecasting of repetitious dome failures during the waning eruption of Redoubt Volcano, Alaska, February-April 1990

    USGS Publications Warehouse

    Page, R.A.; Lahr, J.C.; Chouet, B.A.; Power, J.A.; Stephens, C.D.

    1994-01-01

    The waning phase of the 1989-1990 eruption of Redoubt Volcano in the Cook Inlet region of south-central Alaska comprised a quasi-regular pattern of repetitious dome growth and destruction that lasted from February 15 to late April 1990. The dome failures produced ash plumes hazardous to airline traffic. In response to this hazard, the Alaska Volcano Observatory sought to forecast these ash-producing events using two approaches. One approach built on early successes in issuing warnings before major eruptions on December 14, 1989 and January 2, 1990. These warnings were based largely on changes in seismic activity related to the occurrence of precursory swarms of long-period seismic events. The search for precursory swarms of long-period seismicity was continued through the waning phase of the eruption and led to warnings before tephra eruptions on March 23 and April 6. The observed regularity of dome failures after February 15 suggested that a statistical forecasting method based on a constant-rate failure model might also be successful. The first statistical forecast was issued on March 16 after seven events had occurred, at an average interval of 4.5 days. At this time, the interval between dome failures abruptly lengthened. Accordingly, the forecast was unsuccessful and further forecasting was suspended until the regularity of subsequent failures could be confirmed. Statistical forecasting resumed on April 12, after four dome failure episodes separated by an average of 7.8 days. One dome failure (April 15) was successfully forecast using a 70% confidence window, and a second event (April 21) was narrowly missed before the end of the activity. The cessation of dome failures after April 21 resulted in a concluding false alarm. 
Although forecasting success during the eruption was limited, retrospective analysis shows that early and consistent application of the statistical method using a constant-rate failure model and a 90% confidence window could have yielded five successful forecasts and two false alarms; no events would have been missed. On closer examination, the intervals between successive dome failures are not uniform but tend to increase with time. This increase attests to the continuous, slowly decreasing supply of magma to the surface vent during the waning phase of the eruption. The domes formed in a precarious position in a breach in the summit crater rim where they were susceptible to gravitational collapse. The instability of the February 15-April 21 domes relative to the earlier domes is attributed to reaming the lip of the vent by a laterally directed explosion during the major dome-destroying eruption of February 15, a process which would leave a less secure foundation for subsequent domes. © 1994.
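    One simple reading of a constant-rate failure model with a confidence window uses exponential inter-event quantiles centred on the desired probability mass; a sketch with invented repose intervals (the paper's exact windowing procedure may differ):

```python
import math

def forecast_window(intervals, conf=0.90):
    """Forecast window for the next event under a constant-rate
    (exponential inter-event time) model: the next interval falls in
    [F^-1((1-conf)/2), F^-1((1+conf)/2)] with probability `conf`."""
    rate = len(intervals) / sum(intervals)   # MLE of the event rate
    lo_p = (1 - conf) / 2
    hi_p = (1 + conf) / 2
    lo = -math.log(1 - lo_p) / rate          # exponential quantiles
    hi = -math.log(1 - hi_p) / rate
    return lo, hi

# hypothetical inter-failure intervals (days), not the Redoubt data
lo, hi = forecast_window([7.0, 8.0, 8.5, 7.5], conf=0.90)
print(round(lo, 2), round(hi, 2))  # → 0.4 23.22
```

    The width of such a window illustrates the abstract's point: a memoryless constant-rate model spreads its forecast broadly, so quasi-regular repose intervals that lengthen over time will eventually fall outside it.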

  6. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  7. Learning Object Names at Different Hierarchical Levels Using Cross-Situational Statistics.

    PubMed

    Chen, Chi-Hsin; Zhang, Yayun; Yu, Chen

    2018-05-01

    Objects in the world usually have names at different hierarchical levels (e.g., beagle, dog, animal). This research investigates adults' ability to use cross-situational statistics to simultaneously learn object labels at individual and category levels. The results revealed that adults were able to use co-occurrence information to learn hierarchical labels in contexts where the labels for individual objects and labels for categories were presented in completely separated blocks, in interleaved blocks, or mixed in the same trial. Temporal presentation schedules significantly affected the learning of individual object labels, but not the learning of category labels. Learners' subsequent generalization of category labels indicated sensitivity to the structure of statistical input. Copyright © 2017 Cognitive Science Society, Inc.
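    Cross-situational learning of this kind reduces, in its simplest associative form, to accumulating word-object co-occurrence counts across individually ambiguous trials; a minimal sketch with invented stimuli (not the study's materials):

```python
from collections import defaultdict
from itertools import product

def cooccurrence_counts(trials):
    """Each trial presents some words alongside some candidate objects;
    ambiguity is resolved by aggregating co-occurrences across trials."""
    counts = defaultdict(int)
    for words, objects in trials:
        for w, o in product(words, objects):
            counts[(w, o)] += 1
    return counts

def best_referent(counts, word):
    """Map a word to its most frequently co-occurring object."""
    pairs = [(o, c) for (w, o), c in counts.items() if w == word]
    return max(pairs, key=lambda p: p[1])[0]

# invented trials: each pairs one novel label with two visible objects
trials = [
    (["bosa"], ["dog", "cat"]),
    (["bosa"], ["dog", "bird"]),
    (["gake"], ["cat", "dog"]),
    (["gake"], ["cat", "bird"]),
]
counts = cooccurrence_counts(trials)
print(best_referent(counts, "bosa"), best_referent(counts, "gake"))
# → dog cat
```

    Hierarchical labels fit the same machinery if category labels co-occur with many exemplars of a category while individual labels co-occur with one object.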

  8. Basal cell skin cancer and the risk of second primary cancers: a cancer registry-based study in Lithuania.

    PubMed

    Krilaviciute, Agne; Vincerzevskiene, Ieva; Smailyte, Giedre

    2016-07-01

    The aim of this population-based cohort study was to determine the risk of second primary cancer in basal cell carcinoma (BCC) patients in Lithuania. This analysis was based on patients diagnosed with BCC in Lithuania between 1998 and 2007 and followed until 2011. Standardized incidence ratios for subsequent cancers were calculated as the ratio of the observed number of cancer cases in people with a previous BCC diagnosis to the expected number of cancer cases in the underlying general population. After diagnosis of BCC, 1442 new cases of selected cancers were diagnosed. Compared with the general population, the incidence of all new primaries combined after BCC was very close to that expected. A statistically significant increase in the risk of developing a subsequent cancer was observed for Hodgkin's lymphoma, prostate cancer, and leukemia in men, and for cancers of the lip, lung, and breast in women. Risk of melanoma and thyroid cancer was significantly elevated in both sexes. Relative risk of cancer of the eye was increased, although not significantly. In our study, we found increased risk for cancers related to sun exposure. In addition, increased risks were identified for Hodgkin's lymphoma, thyroid cancer, leukemia, prostate cancer, and breast cancer in BCC patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Cephalometric norms for orthognathic surgery for North India (Eastern Uttar Pradesh).

    PubMed

    Gulati, Rajeev; Jain, Shikha

    2011-01-01

    The present study was aimed at developing cephalometric norms for orthognathic surgery for the population of eastern Uttar Pradesh in North India. This study was conducted at a dental college. The study sample consisted of 50 males and 50 females. Each lateral cephalogram was taken in occlusion and subsequently traced. All reference points, landmarks, and measurements were made according to the cephalometrics for orthognathic surgery (COGS) system. The statistical analysis involved calculation of the mean and standard deviation for each of the 23 parameters assessed for each subject. The data were subsequently compared with the COGS study using the normal (Z) test. The norms derived for the Purvanchal population of North India were found to be quite distinct from those obtained in the COGS study with respect to specific parameters. Male subjects indicated greater prominence of the chin relative to the face, decreased posterior divergence, infraeruption of the upper and lower molars as well as the lower incisors, decreased total effective length of the maxilla, a tendency towards Class III occlusion, and procumbent lower incisors. Female subjects, however, indicated increased anterior cranial base length, greater prominence of the chin relative to the face, prognathic maxilla and mandible, increased middle third facial height, infraerupted lower incisors, increased mandibular body length, and procumbent lower incisors.

  10. Factors related to low birth rate among married women in Korea

    PubMed Central

    Song, Ju-Eun; Lee, Sun-Kyoung; Roh, Eun Ha

    2018-01-01

    The purpose of this study was to explore the factors influencing low birth rate among married women using the National Survey data in Korea. We compared the different influences on women’s first and subsequent childbirths. This study was a secondary analysis using the “National Survey on Fertility and Family Health and Welfare”, which was a nationally representative survey conducted by the Korea Institute for Health and Social Affairs. We analyzed the data of 3,482 married women (aged between 19 and 39 years) using SPSS 20.0 program for descriptive statistics, t-test, one-way ANOVA, and binary and ordinal logistic regression models. The factors influencing women’s first childbirth included perceptions about the value of marriage and children and their education level. The factors influencing their subsequent childbirths included multifaceted variables of maternal age during the first childbirth, residential area, religion, monthly household income, perceptions about the value of marriage and children, and social media. It is necessary to improve women’s awareness and positive perceptions about marriage and children in order to increase the birth rate in Korea. Moreover, consistently providing financial and political support for maternal and childcare concerns and using social media to foster more positive attitudes toward having children may enhance birth rates in the future. PMID:29558506

  11. Factors related to low birth rate among married women in Korea.

    PubMed

    Song, Ju-Eun; Ahn, Jeong-Ah; Lee, Sun-Kyoung; Roh, Eun Ha

    2018-01-01

    The purpose of this study was to explore the factors influencing low birth rate among married women using the National Survey data in Korea. We compared the different influences on women's first and subsequent childbirths. This study was a secondary analysis using the "National Survey on Fertility and Family Health and Welfare", which was a nationally representative survey conducted by the Korea Institute for Health and Social Affairs. We analyzed the data of 3,482 married women (aged between 19 and 39 years) using SPSS 20.0 program for descriptive statistics, t-test, one-way ANOVA, and binary and ordinal logistic regression models. The factors influencing women's first childbirth included perceptions about the value of marriage and children and their education level. The factors influencing their subsequent childbirths included multifaceted variables of maternal age during the first childbirth, residential area, religion, monthly household income, perceptions about the value of marriage and children, and social media. It is necessary to improve women's awareness and positive perceptions about marriage and children in order to increase the birth rate in Korea. Moreover, consistently providing financial and political support for maternal and childcare concerns and using social media to foster more positive attitudes toward having children may enhance birth rates in the future.
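    For a single binary predictor, the binary logistic regression used above reduces to an odds ratio computed from a 2x2 table. A small sketch with entirely hypothetical counts (the survey's actual cell counts are not reported in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    rows = exposed/unexposed, columns = event/no event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: positive perception of marriage/children (rows)
# vs. having had a first childbirth (columns)
or_, lo, hi = odds_ratio_ci(120, 80, 60, 100)
```

    A confidence interval that excludes 1 corresponds to the predictor being statistically significant in the single-covariate logistic model.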

  12. Using Dynamic Time Warping and Data Forensics to Examine Tradeoffs among Land-Energy-Water Networks Across the Conterminous United States

    NASA Astrophysics Data System (ADS)

    McManamay, R.; Allen, M. R.; Piburn, J.; Sanyal, J.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Characterizing interdependencies among land-energy-water sectors, their vulnerabilities, and tipping points, is challenging, especially if all sectors are simultaneously considered. Because such holistic system behavior is uncertain, largely unmodeled, and in need of testable hypotheses of system drivers, these dynamics are conducive to exploratory analytics of spatiotemporal patterns, powered by tools such as Dynamic Time Warping (DTW). Here, we conduct a retrospective analysis (1950-2010) of temporal trends in land use, energy use, and water use within US counties to identify commonalities in resource consumption and adaptation strategies to resource limitations. We combine existing and derived data from statistical downscaling to synthesize a temporally comprehensive land-energy-water dataset at the US county level and apply DTW and subsequent hierarchical clustering to examine similar temporal trends in resource typologies for land, energy, and water sectors. As expected, we observed tradeoffs among water uses (e.g., public supply vs. irrigation) and land uses (e.g., urban vs. agricultural). Strong associations between clusters amongst sectors reveal tight system interdependencies, whereas weak associations suggest unique behaviors and potential for human adaptations towards disruptive technologies and less resource-dependent population growth. Our framework is useful for exploring complex human-environmental system dynamics and generating hypotheses to guide subsequent energy-water-nexus research.
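    The DTW distance underlying the clustering step above can be sketched in a few lines. This is the textbook dynamic-programming formulation, not the authors' implementation, applied to toy trend series:

```python
def dtw(a, b):
    """Dynamic Time Warping alignment cost between two sequences,
    O(len(a) * len(b)) dynamic programming."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Identical trends align at zero cost; a time-shifted trend is also
# absorbed by the warping path, which is why DTW suits trend comparison
d0 = dtw([1, 2, 3, 4], [1, 2, 3, 4])     # 0.0
d1 = dtw([1, 2, 3, 4], [1, 1, 2, 3, 4])  # 0.0
```

    The resulting pairwise distance matrix between county trend series is what hierarchical clustering would operate on.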

  13. Peak picking and the assessment of separation performance in two-dimensional high performance liquid chromatography.

    PubMed

    Stevenson, Paul G; Mnatsakanyan, Mariam; Guiochon, Georges; Shalliker, R Andrew

    2010-07-01

    An algorithm was developed for 2DHPLC that automated peak recognition, measurement of retention times, and subsequent plotting of this information in a two-dimensional retention plane. Following the recognition of peaks, the software then performed a series of statistical assessments of the separation performance, measuring, for example, the correlation between dimensions, the peak capacity, and the percentage of usage of the separation space. Peak recognition was achieved by interpreting the first and second derivatives of each respective one-dimensional chromatogram to determine the 1D retention times of each solute and then compiling these retention times for each respective fraction 'cut'. Due to the nature of comprehensive 2DHPLC, adjacent cut fractions may contain peaks common to more than one cut fraction. The algorithm determined which components were common in adjacent cuts and subsequently calculated the peak maximum profile by interpolating the space between adjacent peaks. This algorithm was applied to the analysis of a two-dimensional separation of an apple flesh extract, with a first dimension comprising a cyano stationary phase and an aqueous/THF mobile phase and a second dimension comprising C18-Hydro with an aqueous/MeOH mobile phase. A total of 187 peaks were detected.
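    The derivative-based peak recognition described above can be illustrated with a much-simplified sketch: a local maximum is flagged where the discrete first derivative changes sign from positive to non-positive. The paper's algorithm additionally uses the second derivative and merges peaks across adjacent cut fractions; those steps are omitted here, and the signal is synthetic:

```python
def find_peaks(y, min_height=0.0):
    """Locate local maxima via a sign change in the discrete first derivative."""
    peaks = []
    for i in range(1, len(y) - 1):
        d_prev = y[i] - y[i - 1]   # first derivative entering point i
        d_next = y[i + 1] - y[i]   # first derivative leaving point i
        if d_prev > 0 and d_next <= 0 and y[i] >= min_height:
            peaks.append(i)
    return peaks

# Synthetic one-dimensional chromatogram with maxima at indices 3 and 8
signal = [0, 1, 3, 7, 3, 1, 2, 5, 9, 4, 1]
peaks = find_peaks(signal, min_height=2)  # [3, 8]
```

    In a comprehensive 2D workflow, the indices found per fraction would be converted to 1D retention times and compiled cut by cut into the retention plane.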

  14. Influence of food consumption patterns and Galician lifestyle on human gut microbiota.

    PubMed

    Castro-Penalonga, María; Roca-Saavedra, Paula; Miranda, Jose Manuel; Porto-Arias, Jose Julio; Nebot, Carolina; Cardelle-Cobas, Alejandra; Franco, Carlos Manuel; Cepeda, Alberto

    2018-02-01

    The proportion of different microbial populations in the human gut is an important factor that in recent years has been linked to obesity and numerous metabolic diseases. Because there are many factors that can affect the composition of human gut microbiota, it is of interest to have information about the composition of the gut microbiota in different populations in order to better understand the possibilities for improving nutritional management. A group of 31 volunteers was selected according to established inclusion and exclusion criteria and asked about their diet history, lifestyle patterns, and adherence to the Southern European Atlantic Diet. Fecal samples were taken and subsequently analyzed by real-time PCR. Compared to national Spanish consumption data, the results indicated a dietary pattern with higher consumption of fruits, vegetables, legumes, and fish and lower consumption of bakery foods, precooked foods, and snacks. Most participants showed intermediate or high adherence to the Southern European Atlantic Diet, and an analysis of gut microbiota showed high numbers of total bacteria and Actinobacteria, as well as high amounts of bacteria belonging to the genera Lactobacillus and Bifidobacterium. A subsequent statistical comparison also revealed differences in gut microbiota depending on the subject's body weight, age, or degree of adherence to the Southern European Atlantic Diet.

  15. Oral Lactobacillus Counts Predict Weight Gain Susceptibility: A 6-Year Follow-Up Study

    PubMed Central

    Rosing, Johanne Aviaja; Walker, Karen Christina; Jensen, Benjamin A.H.; Heitmann, Berit L.

    2017-01-01

    Background: Recent studies have shown an association between weight change and the makeup of the intestinal microbiota in humans. Specifically, Lactobacillus, a part of the entire gastrointestinal tract's microbiota, has been shown to contribute to weight regulation. Aim: We examined the association between the level of oral Lactobacillus and the subsequent 6-year weight change in a healthy population of 322 Danish adults aged 35–65 years at baseline. Design: Prospective observational study. Results: In unadjusted analysis the level of oral Lactobacillus was inversely associated with subsequent 6-year change in BMI. A statistically significant interaction between the baseline level of oral Lactobacillus and the consumption of complex carbohydrates was found, e.g. a high oral Lactobacillus count predicted weight loss for those with a low intake of complex carbohydrates and diminished weight gain for those with a medium intake. A closer examination of these relations showed that BMI change and Lactobacillus level were unrelated for those with high complex carbohydrate consumption. Conclusion: A high level of oral Lactobacillus seems related to weight loss among those with medium and low intakes of complex carbohydrates. Absence, or a low level, of oral Lactobacillus may potentially be a novel marker to identify those at increased risk of weight gain. PMID:29020671

  16. Oral Lactobacillus Counts Predict Weight Gain Susceptibility: A 6-Year Follow-Up Study.

    PubMed

    Rosing, Johanne Aviaja; Walker, Karen Christina; Jensen, Benjamin A H; Heitmann, Berit L

    2017-01-01

    Recent studies have shown an association between weight change and the makeup of the intestinal microbiota in humans. Specifically, Lactobacillus, a part of the entire gastrointestinal tract's microbiota, has been shown to contribute to weight regulation. We examined the association between the level of oral Lactobacillus and the subsequent 6-year weight change in a healthy population of 322 Danish adults aged 35-65 years at baseline. Prospective observational study. In unadjusted analysis the level of oral Lactobacillus was inversely associated with subsequent 6-year change in BMI. A statistically significant interaction between the baseline level of oral Lactobacillus and the consumption of complex carbohydrates was found, e.g. a high oral Lactobacillus count predicted weight loss for those with a low intake of complex carbohydrates and diminished weight gain for those with a medium intake. A closer examination of these relations showed that BMI change and Lactobacillus level were unrelated for those with high complex carbohydrate consumption. A high level of oral Lactobacillus seems related to weight loss among those with medium and low intakes of complex carbohydrates. Absence, or a low level, of oral Lactobacillus may potentially be a novel marker to identify those at increased risk of weight gain. © 2017 The Author(s) Published by S. Karger GmbH, Freiburg.

  17. Weight change later in life and colon and rectal cancer risk in participants in the EPIC-PANACEA study.

    PubMed

    Steins Bisschop, Charlotte N; van Gils, Carla H; Emaus, Marleen J; Bueno-de-Mesquita, H Bas; Monninkhof, Evelyn M; Boeing, Heiner; Aleksandrova, Krasmira; Jenab, Mazda; Norat, Teresa; Riboli, Elio; Boutron-Rualt, Marie-Christine; Fagherazzi, Guy; Racine, Antoine; Palli, Domenico; Krogh, Vittorio; Tumino, Rosario; Naccarati, Alessio; Mattiello, Amalia; Argüelles, Marcial Vicente; Sanchez, Maria José; Tormo, Maria José; Ardanaz, Eva; Dorronsoro, Miren; Bonet, Catalina; Khaw, Kay-Tee; Key, Tim; Trichopoulou, Antonia; Orfanos, Philippos; Naska, Androniki; Kaaks, Rudolph R; Lukanova, Annekatrin; Pischon, Tobias; Ljuslinder, Ingrid; Jirström, Karin; Ohlsson, Bodil; Overvad, Kim; Landsvig Berentzen, Tina; Halkjaer, Jytte; Tjonneland, Anne; Weiderpass, Elisabete; Skeie, Guri; Braaten, Tonje; Siersema, Peter D; Freisling, Heinz; Ferrari, Pietro; Peeters, Petra H M; May, Anne M

    2014-01-01

    A moderate association exists between body mass index (BMI) and colorectal cancer. Less is known about the effect of weight change. We investigated the relation between BMI and weight change and subsequent colon and rectal cancer risk. This was studied among 328,781 participants in the prospective EPIC-PANACEA study (European Prospective Investigation into Cancer-Physical Activity, Nutrition, Alcohol, Cessation of Smoking, Eating; mean age: 50 y). Body weight was assessed at recruitment and on average 5 y later. Self-reported weight change (kg/y) was categorized in sex-specific quintiles, with quintiles 2 and 3 combined as the reference category (men: -0.6 to 0.3 kg/y; women: -0.4 to 0.4 kg/y). In the subsequent years, participants were followed for the occurrence of colon and rectal cancer (median period: 6.8 y). Multivariable Cox proportional hazards regression analyses were used to study the association. A total of 1261 incident colon cancer and 747 rectal cancer cases were identified. BMI at recruitment was statistically significantly associated with colon cancer risk in men (HR: 1.04; 95% CI: 1.02, 1.07). Moderate weight gain (quintile 4) in men increased risk further (HR: 1.32; 95% CI: 1.04, 1.68), but this relation did not show a clear trend. In women, neither BMI nor weight gain was related to subsequent risk of colon cancer. No statistically significant associations were found for weight loss and colon cancer, or for BMI and weight change and rectal cancer. BMI attained at adulthood was associated with colon cancer risk. Subsequent weight gain or loss was not related to colon or rectal cancer risk in men or women.

  18. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, so that data analysis results can be obtained in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections back to the database; and generates export sheets that can be loaded directly into R, SAS, and SPSS. The information system for air pollution and health impact monitoring thus implements statistical analysis online and can provide real-time analysis results to its users.
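    SQL-generated descriptive summary tables of the kind described above can be sketched with Python's built-in sqlite3 module. The table name, columns, and values below are hypothetical stand-ins for the monitoring database, not its actual schema:

```python
import sqlite3

# In-memory stand-in for the monitoring database (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pm25 (city TEXT, day TEXT, value REAL)")
conn.executemany("INSERT INTO pm25 VALUES (?, ?, ?)", [
    ("A", "2018-01-01", 35.0), ("A", "2018-01-02", 45.0),
    ("B", "2018-01-01", 80.0), ("B", "2018-01-02", 60.0),
])

# Descriptive summary table produced directly in SQL, as an online system would
rows = conn.execute(
    "SELECT city, COUNT(*), AVG(value), MIN(value), MAX(value) "
    "FROM pm25 GROUP BY city ORDER BY city"
).fetchall()
```

    The same query pattern, parameterized by pollutant and time window, is the building block for on-demand summary tables and trend charts.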

  19. A Complete Color Normalization Approach to Histopathology Images Using Color Cues Computed From Saturation-Weighted Statistics.

    PubMed

    Li, Xingyu; Plataniotis, Konstantinos N

    2015-07-01

    In digital histopathology, tasks of segmentation and disease diagnosis are achieved by quantitative analysis of image content. However, color variation in image samples makes it challenging to produce reliable results. This paper introduces a complete normalization scheme to address the problem of color variation in histopathology images jointly caused by inconsistent biopsy staining and nonstandard imaging conditions. Method: Unlike existing normalization methods that either address a partial cause of color variation or lump the causes together, our method identifies the causes of color variation based on a microscopic imaging model and addresses inconsistency in biopsy imaging and staining by an illuminant normalization module and a spectral normalization module, respectively. In evaluation, we use two public datasets that are representative of histopathology images commonly received in clinics to examine the proposed method from the aspects of robustness to system settings, performance consistency against achromatic pixels, and normalization effectiveness in terms of histological information preservation. As the saturation-weighted statistics proposed in this study generate stable and reliable color cues for stain normalization, our scheme is robust to system parameters and insensitive to image content and achromatic colors. Extensive experimentation suggests that our approach outperforms state-of-the-art normalization methods, as the proposed method is the only approach that succeeds in preserving histological information after normalization. The proposed color normalization solution should be useful for mitigating the effects of color variation in pathology images on subsequent quantitative analysis.
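    The core idea of saturation weighting can be illustrated with a circular, saturation-weighted mean of hue: achromatic pixels (saturation near zero) contribute almost nothing to the color cue. This is a simplified illustration of the weighting principle only, not the paper's estimator, and the pixel values are invented:

```python
import math

def saturation_weighted_mean_hue(hues_deg, sats):
    """Circular mean of hue weighted by saturation, so achromatic
    pixels (saturation ~ 0) barely influence the color cue."""
    x = sum(s * math.cos(math.radians(h)) for h, s in zip(hues_deg, sats))
    y = sum(s * math.sin(math.radians(h)) for h, s in zip(hues_deg, sats))
    return math.degrees(math.atan2(y, x)) % 360

# Chromatic pixels near 300 deg (stain-like purple) plus one gray pixel
# at 90 deg whose tiny saturation weight keeps it from biasing the cue
hues = [300, 310, 290, 90]
sats = [0.8, 0.9, 0.7, 0.01]
hue = saturation_weighted_mean_hue(hues, sats)
```

    An unweighted mean over the same pixels would be pulled noticeably toward the gray pixel's arbitrary hue; the weighting is what makes the color cue stable against achromatic content.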

  20. Statistical shape modeling of human cochlea: alignment and principal component analysis

    NASA Astrophysics Data System (ADS)

    Poznyakovskiy, Anton A.; Zahnert, Thomas; Fischer, Björn; Lasurashvili, Nikoloz; Kalaidzidis, Yannis; Mürbe, Dirk

    2013-02-01

    The modeling of the cochlear labyrinth in living subjects is hampered by the insufficient resolution of available clinical imaging methods, which is usually coarser than 125 μm. This is too crude to record the position of the basilar membrane and, as a result, to distinguish even the scala tympani from the other scalae. This problem could be avoided by means of atlas-based segmentation. Specimens can endure higher radiation loads and, consequently, provide better-resolved images; the resulting surface can be used as the seed for atlas-based segmentation. To serve this purpose, we have developed a statistical shape model (SSM) of the human scala tympani based on segmentations obtained from 10 μCT image stacks. After segmentation, we aligned the resulting surfaces using Procrustes alignment. This algorithm was slightly modified to accommodate individual models with nodes that do not necessarily correspond to salient features and vary in number between models; correspondence was established by mutual proximity between nodes. Rather than using the standard Euclidean norm, we applied an alternative logarithmic norm to improve outlier treatment. The minimization was done using the BFGS method. We also split the surface nodes along an octree to reduce computation cost. Subsequently, we performed the principal component analysis of the training set with the Jacobi eigenvalue algorithm. We expect the resulting method to help not only in acquiring a better understanding of interindividual variations of cochlear anatomy, but also as a step towards individual models for pre-operative diagnostics prior to cochlear implant insertions.
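    The standard Procrustes alignment that the authors modify can be sketched as follows: remove translation and scale, then find the optimal rotation via an SVD. The paper's logarithmic norm, proximity-based correspondence, and octree splitting are omitted; this is the textbook orthogonal-Procrustes baseline on random test points:

```python
import numpy as np

def procrustes_align(A, B):
    """Align point set B to A (translation, scale, rotation); return the
    aligned copy of B and the residual sum of squares."""
    A0 = A - A.mean(axis=0)
    B0 = B - B.mean(axis=0)
    A0 = A0 / np.linalg.norm(A0)
    B0 = B0 / np.linalg.norm(B0)
    U, S, Vt = np.linalg.svd(B0.T @ A0)  # optimal rotation from SVD
    R = U @ Vt
    B_aligned = B0 @ R * S.sum()
    residual = float(((A0 - B_aligned) ** 2).sum())
    return B_aligned, residual

# A rotated, shifted copy of the same shape aligns with ~zero residual
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 2))
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
_, residual = procrustes_align(A, A @ rot + 5.0)
```

    After all training surfaces are aligned this way, the principal component analysis is performed on the stacked, aligned node coordinates.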

  1. Treatment Utilization and Unmet Treatment Need among Hispanics Following Brief Intervention

    PubMed Central

    Cochran, Gerald; Caetano, Raul

    2012-01-01

    Background: In a large randomized trial examining ethnic differences in response to a brief alcohol intervention following an alcohol-related injury, we showed that Hispanics, but not non-Hispanics, were more likely to reduce alcohol intake in comparison to treatment as usual (Field et al., 2010). The current study evaluates whether the observed improvements in drinking outcomes previously reported among Hispanics following brief intervention might be related to prior or subsequent treatment utilization. Methods: The present study is a secondary analysis of data collected in a randomized clinical trial that evaluated ethnic differences in the effect of a brief motivational intervention (BMI) on alcohol use among medical inpatients admitted for alcohol-related injury. For the current study, statistical analyses were carried out to compare alcohol use, alcohol problems, treatment utilization, and unmet treatment need between Hispanic (n=539) and White, non-Hispanic (n=667) participants. In addition, we examined the relationship between prior treatment utilization and unmet treatment need and alcohol use outcomes following brief intervention, and the impact of brief intervention on subsequent treatment utilization and unmet treatment need. Results: In comparison to White, non-Hispanics, Hispanics at baseline reported heavier drinking, more alcohol problems, greater unmet treatment need, and lower rates of treatment utilization. Among Hispanics, multilevel analyses showed that prior treatment utilization or unmet treatment need did not moderate the effect of BMI on alcohol outcomes. Furthermore, BMI did not significantly impact subsequent treatment utilization or unmet treatment need among Hispanics. Finally, treatment utilization and unmet treatment need at six months were not significant mediators between BMI and alcohol use outcomes at follow-up.
Conclusion: The benefits of brief intervention among Hispanics do not appear to be better explained by subsequent engagement in mutual help groups or formal substance abuse treatment. Prior history of treatment, regardless of the severity of alcohol problems, does not appear to influence the impact of brief intervention on alcohol use among Hispanics. These findings support prior results reporting the benefits of brief intervention among Hispanics and demonstrate that these improvements are not related to prior or subsequent treatment utilization. PMID:22823528

  2. Intelligent Monitoring? Assessing the ability of the Care Quality Commission's statistical surveillance tool to predict quality and prioritise NHS hospital inspections.

    PubMed

    Griffiths, Alex; Beaussier, Anne-Laure; Demeritt, David; Rothstein, Henry

    2017-02-01

    The Care Quality Commission (CQC) is responsible for ensuring the quality of the health and social care delivered by more than 30 000 registered providers in England. With only limited resources for conducting on-site inspections, the CQC has used statistical surveillance tools to help it identify which providers it should prioritise for inspection. In the face of planned funding cuts, the CQC plans to put more reliance on statistical surveillance tools to assess risks to quality and prioritise inspections accordingly. To evaluate the ability of the CQC's latest surveillance tool, Intelligent Monitoring (IM), to predict the quality of care provided by National Health Service (NHS) hospital trusts so that those at greatest risk of providing poor-quality care can be identified and targeted for inspection. The predictive ability of the IM tool is evaluated through regression analyses and χ² testing of the relationship between the quantitative risk score generated by the IM tool and the subsequent quality rating awarded following detailed on-site inspection by large expert teams of inspectors. First, the continuous risk scores generated by the CQC's IM statistical surveillance tool cannot predict inspection-based quality ratings of NHS hospital trusts (OR 0.38 (0.14 to 1.05) for Outstanding/Good, OR 0.94 (0.80 to 1.10) for Good/Requires improvement, and OR 0.90 (0.76 to 1.07) for Requires improvement/Inadequate). Second, the risk scores cannot be used more simply to distinguish the trusts performing poorly-those subsequently rated either 'Requires improvement' or 'Inadequate'-from the trusts performing well-those subsequently rated either 'Good' or 'Outstanding' (OR 1.07 (0.91 to 1.26)). Classifying CQC's risk bandings 1-3 as high risk and 4-6 as low risk, 11 of the high risk trusts were performing well and 43 of the low risk trusts were performing poorly, resulting in an overall accuracy rate of 47.6%.
Third, the risk scores cannot be used even more simply to distinguish the worst performing trusts-those subsequently rated 'Inadequate'-from the remaining, better performing trusts (OR 1.11 (0.94 to 1.32)). Classifying CQC's risk banding 1 as high risk and 2-6 as low risk, the highest overall accuracy rate of 72.8% was achieved, but still only 6 of the 13 Inadequate trusts were correctly classified as being high risk. Since the IM statistical surveillance tool cannot predict the outcome of NHS hospital trust inspections, it cannot be used for prioritisation. A new approach to inspection planning is therefore required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
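    The banding evaluation above is a binary classification exercise: treat "high risk" as a prediction of poor performance and score it against the inspection outcome. A sketch of the accuracy and sensitivity calculation; the abstract reports only partial counts, so the confusion-matrix cells below are hypothetical, chosen merely to reproduce the "6 of 13 Inadequate trusts flagged" sensitivity:

```python
def classification_summary(tp, fp, fn, tn):
    """Overall accuracy and sensitivity for a binary risk flag
    (high risk = predicted poor performance)."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    sensitivity = tp / (tp + fn)
    return accuracy, sensitivity

# Hypothetical counts: 6 of 13 poorly performing trusts flagged high risk
acc, sens = classification_summary(tp=6, fp=1, fn=7, tn=89)
```

    Note that when poorly performing trusts are rare, a high overall accuracy can coexist with a sensitivity below 50%, which is exactly the pattern the study reports for the banding-1 cutoff.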

  3. Randomized trials are frequently fragmented in multiple secondary publications.

    PubMed

    Ebrahim, Shanil; Montoya, Luis; Kamal El Din, Mostafa; Sohani, Zahra N; Agarwal, Arnav; Bance, Sheena; Saquib, Juliann; Saquib, Nazmus; Ioannidis, John P A

    2016-11-01

    To assess the frequency and features of secondary publications of randomized controlled trials (RCTs). For 191 RCTs published in high-impact journals in 2009, we searched for secondary publications coauthored by at least one author of the primary trial publication. We evaluated the probability of having secondary publications, characteristics of the primary trial publication that predict having secondary publications, types of secondary analyses conducted, and statistical significance of those analyses. Of 191 primary trials, 88 (46%) had a total of 475 secondary publications by February 2014. Eight trials had >10 (up to 51) secondary publications each. In multivariable modeling, the risk of having subsequent secondary publications increased 1.32-fold (95% CI 1.05-1.68) per 10-fold increase in sample size, and 1.71-fold (95% CI 1.19-2.45) in the presence of a design article. In a sample of 197 secondary publications examined in depth, 193 tested different hypotheses than the primary publication. Of the 193, 43 tested differences between subgroups, 85 assessed predictive factors associated with an outcome of interest, 118 evaluated different outcomes than the original article, 71 had differences in eligibility criteria, and 21 assessed different durations of follow-up; 176 (91%) presented at least one analysis with statistically significant results. Approximately half of randomized trials in high-impact journals have secondary publications, with a few trials followed by numerous secondary publications. Almost all of these publications report some statistically significant results. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.

    PubMed

    Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C

    2010-08-06

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. 
Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in protein spectral counts reflect the underlying biology of the samples.
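    The Fisher exact test, one of the strategies compared above, can be sketched directly from the hypergeometric distribution. This toy version is one-sided and operates on a 2x2 table of hypothetical detection counts, a simplification of how spectral counts would actually be tabulated:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(count >= a) under the hypergeometric null with fixed margins."""
    r1, r2 = a + b, c + d          # row totals
    c1, n = a + c, a + b + c + d   # first column total, grand total
    denom = comb(n, c1)
    p = 0.0
    for x in range(a, min(r1, c1) + 1):
        if c1 - x <= r2:
            p += comb(r1, x) * comb(r2, c1 - x) / denom
    return p

# Hypothetical: protein detected in 3 of 4 tumor runs vs. 1 of 4 normal runs
p = fisher_exact_one_sided(3, 1, 1, 3)  # 17/70, about 0.243
```

    With counts this small the test is far from significant, which illustrates why the study emphasizes sampling depth when comparing proteomes by spectral counting.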

  5. Comparative Shotgun Proteomics Using Spectral Count Data and Quasi-Likelihood Modeling

    PubMed Central

    2010-01-01

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography−tandem mass spectrometry (LC−MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher’s Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography−multiple reaction monitoring mass spectrometry (LC−MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. 
Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in protein spectral counts reflect the underlying biology of the samples. PMID:20586475

  6. Regional intensity-duration-frequency analysis in the Eastern Black Sea Basin, Turkey, by using L-moments and regression analysis

    NASA Astrophysics Data System (ADS)

    Ghiaei, Farhad; Kankal, Murat; Anilan, Tugce; Yuksek, Omer

    2018-01-01

    The analysis of rainfall frequency is an important step in hydrology and water resources engineering. However, a lack of measuring stations, short statistical record lengths, and unreliable outliers are among the most important problems when designing hydrology projects. In this study, regional rainfall analysis based on L-moments was used to overcome these problems in the Eastern Black Sea Basin (EBSB) of Turkey. The L-moments technique was applied at all stages of the regional analysis, including determining homogeneous regions, in addition to fitting and estimating parameters of appropriate distribution functions in each homogeneous region. We studied annual maximum rainfall heights of various durations (5 min to 24 h) from seven rain gauge stations located in the EBSB in Turkey, with gauging periods of 39 to 70 years. Homogeneity of the region was evaluated by using L-moments. The goodness-of-fit criterion for each distribution was defined as the ZDIST statistic, computed for several candidate distributions: generalized logistic (GLO), generalized extreme value (GEV), generalized normal (GNO), Pearson type 3 (PE3), and generalized Pareto (GPA). GLO and GEV were determined to be the best distributions for short-duration (5 to 30 min) and long-duration (1 to 24 h) data, respectively. Based on these distribution functions, governing equations were extracted for the calculation of intensities for return periods (T) of 2, 5, 25, 50, 100, 250, and 500 years. Subsequently, the T values for different rainfall intensities were estimated using data quantifying the maximum amount of rainfall over different durations. Using these T values, duration, altitude, latitude, and longitude were used as independent variables in a regression model of the data.
The coefficient of determination (R²) indicated that the model yields suitable results for the regional intensity-duration-frequency (IDF) relationship, which is necessary for the design of hydraulic structures in small and medium-sized catchments.
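    The first two sample L-moments that underpin the homogeneity screening and distribution fitting above can be computed from probability-weighted moments. A minimal sketch on a tiny sample (real analyses would use the full annual-maximum series per station):

```python
def first_two_l_moments(sample):
    """Sample L-moments l1 (L-location) and l2 (L-scale) from the
    probability-weighted moments b0 and b1 of the ordered sample."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over order statistics of ((i-1)/(n-1)) * x_(i), 1-based
    b1 = sum((i / (n - 1)) * xi for i, xi in enumerate(x)) / n
    l1 = b0
    l2 = 2 * b1 - b0
    return l1, l2

l1, l2 = first_two_l_moments([1.0, 2.0, 3.0, 4.0])
lcv = l2 / l1  # L-coefficient of variation, used in homogeneity measures
```

    Higher-order ratios (L-skewness, L-kurtosis) built the same way from b2 and b3 are what the ZDIST goodness-of-fit statistic compares against each candidate distribution.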

  7. A Handbook of Sound and Vibration Parameters

    DTIC Science & Technology

    1978-09-18

    fixed in space. (Reference 1.) No motion at any node. Static Divergence: (See Divergence.) Statistical Energy Analysis (SEA): Statistical energy analysis is...parameters of the circuits come from statistics of the vibrational characteristics of the structure. Statistical energy analysis is uniquely successful

  8. Medical students review of formative OSCE scores, checklists, and videos improves with student-faculty debriefing meetings

    PubMed Central

    Bernard, Aaron W.; Ceccolini, Gabbriel; Feinn, Richard; Rockfeld, Jennifer; Rosenberg, Ilene; Thomas, Listy; Cassese, Todd

    2017-01-01

    Background: Performance feedback is considered essential to clinical skills development. Formative objective structured clinical exams (F-OSCEs) often include immediate feedback by standardized patients. Students can also be provided access to performance metrics including scores, checklists, and video recordings after the F-OSCE to supplement this feedback. How often students choose to review this data and how review impacts future performance have not been documented. Objective: We suspect student review of F-OSCE performance data is variable. We hypothesize that students who review this data have better performance on subsequent F-OSCEs compared to those who do not. We also suspect that frequency of data review can be improved with faculty involvement in the form of student-faculty debriefing meetings. Design: Simulation recording software tracks and time stamps student review of performance data. We investigated a cohort of first- and second-year medical students from the 2015-16 academic year. Basic descriptive statistics were used to characterize frequency of data review and a linear mixed-model analysis was used to determine relationships between data review and future F-OSCE performance. Results: Students reviewed scores (64%), checklists (42%), and videos (28%) in decreasing frequency. Frequency of review of all metrics and modalities improved when student-faculty debriefing meetings were conducted (p<.001). Among 92 first-year students, checklist review was associated with an improved performance on subsequent F-OSCEs (p = 0.038) by 1.07 percentage points on a scale of 0-100. Among 86 second-year students, no review modality was associated with improved performance on subsequent F-OSCEs. Conclusion: Medical students review F-OSCE checklists and video recordings less than 50% of the time when not prompted. Student-faculty debriefing meetings increased student data reviews.
First-year students' review of checklists on F-OSCEs was associated with increased performance on subsequent F-OSCEs; however, this outcome was not observed among second-year students. PMID:28521646

  9. Medical students review of formative OSCE scores, checklists, and videos improves with student-faculty debriefing meetings.

    PubMed

    Bernard, Aaron W; Ceccolini, Gabbriel; Feinn, Richard; Rockfeld, Jennifer; Rosenberg, Ilene; Thomas, Listy; Cassese, Todd

    2017-01-01

    Performance feedback is considered essential to clinical skills development. Formative objective structured clinical exams (F-OSCEs) often include immediate feedback by standardized patients. Students can also be provided access to performance metrics including scores, checklists, and video recordings after the F-OSCE to supplement this feedback. How often students choose to review this data and how review impacts future performance have not been documented. We suspect student review of F-OSCE performance data is variable. We hypothesize that students who review this data have better performance on subsequent F-OSCEs compared to those who do not. We also suspect that frequency of data review can be improved with faculty involvement in the form of student-faculty debriefing meetings. Simulation recording software tracks and time stamps student review of performance data. We investigated a cohort of first- and second-year medical students from the 2015-16 academic year. Basic descriptive statistics were used to characterize frequency of data review and a linear mixed-model analysis was used to determine relationships between data review and future F-OSCE performance. Students reviewed scores (64%), checklists (42%), and videos (28%) in decreasing frequency. Frequency of review of all metrics and modalities improved when student-faculty debriefing meetings were conducted (p<.001). Among 92 first-year students, checklist review was associated with an improved performance on subsequent F-OSCEs (p = 0.038) by 1.07 percentage points on a scale of 0-100. Among 86 second-year students, no review modality was associated with improved performance on subsequent F-OSCEs. Medical students review F-OSCE checklists and video recordings less than 50% of the time when not prompted. Student-faculty debriefing meetings increased student data reviews.
First-year students' review of checklists on F-OSCEs was associated with increased performance on subsequent F-OSCEs; however, this outcome was not observed among second-year students.

  10. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  11. Technology Use in Science Instruction (TUSI): Aligning the Integration of Technology in Science Instruction in Ways Supportive of Science Education Reform

    NASA Astrophysics Data System (ADS)

    Campbell, Todd; Abd-Hamid, Nor Hashidah

    2013-08-01

    This study describes the development of an instrument to investigate the extent to which technology is integrated in science instruction in ways aligned to science reform outlined in standards documents. The instrument was developed by: (a) creating items consistent with the five dimensions identified in science education literature, (b) establishing content validity with both national and international content experts, (c) refining the item pool based on content expert feedback, (d) pilot testing of the instrument, (e) checking statistical reliability and item analysis, and (f) subsequent refinement and finalization of the instrument. The TUSI was administered in a field test across eleven classrooms by three observers, with a total of 33 TUSI ratings completed. The finalized instrument was found to have acceptable inter-rater intraclass correlation reliability estimates. After the final stage of development, the TUSI instrument consisted of 26 items separated into the original five categories, which aligned with the exploratory factor analysis clustering of the items. Additionally, concurrent validity of the TUSI was established with the Reformed Teaching Observation Protocol. Finally, a subsequent set of 17 different classrooms were observed during the spring of 2011, and for the 9 classrooms where technology integration was observed, an overall Cronbach alpha reliability coefficient of 0.913 was found. Based on the analyses completed, the TUSI appears to be a useful instrument for measuring how technology is integrated into science classrooms and is seen as one mechanism for measuring the intersection of technological, pedagogical, and content knowledge in science classrooms.
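The Cronbach alpha reliability coefficient reported above has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). A minimal NumPy sketch, with a hypothetical score matrix rather than the TUSI data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (observations x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly correlated items yield alpha = 1; values around 0.9, as reported for the TUSI, indicate high internal consistency.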

  12. Influence of residual bone thickness on primary stability of hybrid self-tapping and cylindric non-self-tapping implants in vitro.

    PubMed

    Divac, Marija; Stawarczyk, Bogna; Sahrmann, Philipp; Attin, Thomas; Schmidlin, Patrick R

    2013-01-01

    To assess the primary stability of a hybrid self-tapping implant and a cylindric non-self-tapping implant in an in vitro test model using polyurethane foam. Eighty standardized blocks of cellular rigid polyurethane foam, 2 cm long and 1 cm wide, with different thicknesses of 2, 4, 6, and 9 mm (n = 10 per group), were cut. Two implant systems--a hybrid self-tapping (Tapered Effect [TE], Straumann) and a cylindric non-self-tapping (Standard Plus [SP] Wide Neck, Straumann)--were placed in the block specimens. Subsequently, resonance frequency analysis (RFA) was performed. The RFA measurements were made in triplicate on four aspects of each implant (mesial, distal, buccal, and oral), and the mean RFA value was calculated. Subsequently, the tensile load of the implants was determined by pull-out tests. The data were analyzed using one-way and two-way analysis of variance followed by a post hoc Scheffe test and a t test (α = .05). Additionally, the simple linear correlation between the RFA and tensile load values was evaluated. No statistically significant differences were found between TE and SP in terms of RFA at different bone thicknesses. Starting from a bone thickness of 4 mm, TE implants showed significantly higher tensile load compared to SP implants (P = .016 to .040). A correlation was found between the RFA measurements and tensile load. Mechanically stable placement is possible with TE and SP implants in a trabecular bone model. RFA and tensile load increased with greater bone thickness.
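A one-way analysis of variance of the kind used above can be run with SciPy's `f_oneway` (with only two groups it is equivalent to the two-sample t test; Scheffe post hoc comparisons apply when there are more groups). The tensile-load values below are invented for illustration, not the study's measurements.

```python
from scipy.stats import f_oneway

# Hypothetical pull-out tensile loads (N) for the two implant types
# at one bone thickness; values are illustrative only.
te = [310.0, 325.0, 298.0, 340.0, 315.0]  # hybrid self-tapping
sp = [250.0, 262.0, 240.0, 275.0, 255.0]  # cylindric non-self-tapping

f_stat, p_value = f_oneway(te, sp)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```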

  13. Clinical Outcomes from Androgen Signaling-directed Therapy after Treatment with Abiraterone Acetate and Prednisone in Patients with Metastatic Castration-resistant Prostate Cancer: Post Hoc Analysis of COU-AA-302.

    PubMed

    Smith, Matthew R; Saad, Fred; Rathkopf, Dana E; Mulders, Peter F A; de Bono, Johann S; Small, Eric J; Shore, Neal D; Fizazi, Karim; Kheoh, Thian; Li, Jinhui; De Porre, Peter; Todd, Mary B; Yu, Margaret K; Ryan, Charles J

    2017-07-01

    In the COU-AA-302 trial, abiraterone acetate plus prednisone significantly increased overall survival for patients with chemotherapy-naïve metastatic castration-resistant prostate cancer (mCRPC). Limited information exists regarding response to subsequent androgen signaling-directed therapies following abiraterone acetate plus prednisone in patients with mCRPC. We investigated clinical outcomes associated with subsequent abiraterone acetate plus prednisone (55 patients) and enzalutamide (33 patients) in a post hoc analysis of COU-AA-302. Prostate-specific antigen (PSA) response was assessed. Median time to PSA progression was estimated using the Kaplan-Meier method. The PSA response rate (≥50% PSA decline, unconfirmed) was 44% and 67%, respectively. The median time to PSA progression was 3.9 mo (range 2.6-not estimable) for subsequent abiraterone acetate plus prednisone and 2.8 mo (range 1.8-not estimable) for subsequent enzalutamide. The majority of patients (68%) received intervening chemotherapy before subsequent abiraterone acetate plus prednisone or enzalutamide. While acknowledging the limitations of post hoc analyses and high censoring (>75%) in both treatment groups, these results suggest that subsequent therapy with abiraterone acetate plus prednisone or enzalutamide for patients who progressed on abiraterone acetate is associated with limited clinical benefit. This analysis showed limited clinical benefit for subsequent abiraterone acetate plus prednisone or enzalutamide in patients with metastatic castration-resistant prostate cancer following initial treatment with abiraterone acetate plus prednisone. This analysis does not support prioritization of subsequent abiraterone acetate plus prednisone or enzalutamide following initial therapy with abiraterone acetate plus prednisone. Copyright © 2017 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  14. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean
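As a minimal illustration of the two-sample t test of means covered in such tutorials, using SciPy (the group data here are made up):

```python
from scipy.stats import ttest_ind

# Illustrative measurements from two hypothetical treatment groups.
group_a = [5.1, 4.9, 5.3, 5.0, 5.2]
group_b = [4.4, 4.6, 4.5, 4.3, 4.7]

# Two-sided test of equal means, assuming equal variances by default.
t_stat, p_value = ttest_ind(group_a, group_b)
```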

  15. Structure-related statistical singularities along protein sequences: a correlation study.

    PubMed

    Colafranceschi, Mauro; Colosimo, Alfredo; Zbilut, Joseph P; Uversky, Vladimir N; Giuliani, Alessandro

    2005-01-01

    A data set composed of 1141 proteins representative of all eukaryotic protein sequences in the Swiss-Prot Protein Knowledge base was coded by seven physicochemical properties of amino acid residues. The resulting numerical profiles were submitted to correlation analysis after the application of a linear (simple mean) and a nonlinear (Recurrence Quantification Analysis, RQA) filter. The main RQA variables, Recurrence and Determinism, were subsequently analyzed by Principal Component Analysis. The RQA descriptors showed that (i) specific information is embedded within protein sequences that is present neither in the codes nor in the amino acid composition and (ii) the most sensitive code for detecting ordered recurrent (deterministic) patterns of residues in protein sequences is the Miyazawa-Jernigan hydrophobicity scale. The most deterministic proteins in terms of autocorrelation properties of primary structures were found (i) to be involved in protein-protein and protein-DNA interactions and (ii) to display a significantly higher proportion of structural disorder with respect to the average data set. A study of the scaling behavior of the average determinism with the setting parameters of RQA (embedding dimension and radius) allows for the identification of patterns of minimal length (six residues) as possible markers of zones specifically prone to inter- and intramolecular interactions.
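The two RQA descriptors named above, Recurrence and Determinism, can be computed from first principles. The sketch below uses delay-1 time-delay embedding and, as a simplification, counts the line of identity toward determinism; the parameter defaults are ours, not the paper's settings.

```python
import numpy as np

def rqa_rr_det(series, dim=3, radius=0.5, lmin=2):
    """Recurrence rate (REC) and determinism (DET) of a 1-D series.

    Two embedded points recur when their Euclidean distance is below
    `radius`; DET is the fraction of recurrent points lying on diagonal
    lines of length >= `lmin` (line of identity included, as a
    simplification).
    """
    x = np.asarray(series, dtype=float)
    n = len(x) - dim + 1
    emb = np.stack([x[i:i + n] for i in range(dim)], axis=1)  # delay-1 embedding
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rec = dist < radius                 # recurrence matrix
    rr = rec.mean()                     # recurrence rate
    det_points = 0
    for k in range(-(n - 1), n):        # scan every diagonal of the matrix
        run = 0
        for v in list(np.diagonal(rec, offset=k)) + [False]:  # sentinel flush
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_points += run
                run = 0
    return rr, det_points / rec.sum()
```

A strongly periodic signal yields high determinism (recurrent points organize into diagonal lines), which is the property the scaling study above exploits.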

  16. Evaluation of Different Normalization and Analysis Procedures for Illumina Gene Expression Microarray Data Involving Small Changes

    PubMed Central

    Johnstone, Daniel M.; Riveros, Carlos; Heidari, Moones; Graham, Ross M.; Trinder, Debbie; Berretta, Regina; Olynyk, John K.; Scott, Rodney J.; Moscato, Pablo; Milward, Elizabeth A.

    2013-01-01

    While Illumina microarrays can be used successfully for detecting small gene expression changes due to their high degree of technical replicability, there is little information on how different normalization and differential expression analysis strategies affect outcomes. To evaluate this, we assessed concordance across gene lists generated by applying different combinations of normalization strategy and analytical approach to two Illumina datasets with modest expression changes. In addition to using traditional statistical approaches, we also tested an approach based on combinatorial optimization. We found that the choice of both normalization strategy and analytical approach considerably affected outcomes, in some cases leading to substantial differences in gene lists and subsequent pathway analysis results. Our findings suggest that important biological phenomena may be overlooked when there is a routine practice of using only one approach to investigate all microarray datasets. Analytical artefacts of this kind are likely to be especially relevant for datasets involving small fold changes, where inherent technical variation—if not adequately minimized by effective normalization—may overshadow true biological variation. This report provides some basic guidelines for optimizing outcomes when working with Illumina datasets involving small expression changes. PMID:27605185
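One widely used normalization strategy for expression arrays is quantile normalization, which forces every sample onto a common empirical distribution. A minimal NumPy sketch follows (assuming no tied values); the function name is ours and the procedure is a generic illustration, not necessarily one of the strategies this paper compared.

```python
import numpy as np

def quantile_normalize(expr):
    """Quantile-normalize a (genes x samples) expression matrix.

    Each column is replaced by the mean of the sorted columns,
    reassigned according to that column's within-column ranks.
    """
    expr = np.asarray(expr, dtype=float)
    ranks = expr.argsort(axis=0).argsort(axis=0)  # within-column ranks
    ref = np.sort(expr, axis=0).mean(axis=1)      # mean reference distribution
    return ref[ranks]
```

After normalization every sample shares the same sorted values, so remaining differences between samples are rank differences only.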

  17. Relationship Between Insertion Torque and Resonance Frequency Measurements, Performed by Resonance Frequency Analysis, in Micromobility of Dental Implants: An In Vitro Study.

    PubMed

    Brizuela-Velasco, Aritza; Álvarez-Arenal, Ángel; Gil-Mur, Francisco Javier; Herrero-Climent, Mariano; Chávarri-Prado, David; Chento-Valiente, Yelko; Dieguez-Pereira, Markel

    2015-10-01

    To evaluate the micromobility of dental implants under occlusal loading in relation to stability measurements of resonance frequency analysis and insertion torque. The sample comprised 24 implants inserted in 12 fresh cow ribs. Insertion torque and Osstell implant stability quotient (ISQ) measurements were recorded. An "ad hoc" acrylic premolar was made on a temporary abutment and screwed to each implant, and a force of 100 N was subsequently applied at an angle of 6 degrees. Implant micromotion was measured using a Questar microscope with a resolution of 2 μm and an image analysis program. Data show a statistically significant inverse correlation between the ISQ values and implant micromotion under a load of 100 N (R = 0.86, P < 0.0001). The same relationship is found between insertion torque and implant micromotion, although the relationship is linear up to 34 N·cm and becomes exponential for higher values (R = 0.78, P < 0.0001). A direct correlation is established between insertion torque and ISQ values. There is an inverse relationship between both ISQ and insertion torque values and implant micromotion under a load of 100 N.
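The inverse correlation between ISQ and micromotion reported above is a plain Pearson correlation; a sketch with SciPy, using invented ISQ/micromotion pairs rather than the study's measurements:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical ISQ and micromotion (um) pairs; illustrative values only.
isq = np.array([55.0, 60.0, 65.0, 70.0, 75.0, 80.0])
micromotion = np.array([95.0, 80.0, 62.0, 50.0, 38.0, 25.0])

r, p = pearsonr(isq, micromotion)  # r < 0 for an inverse relationship
```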

  18. Classification of Rotor Induced Shearing Events in the Near Wake of a Wind Turbine Array Boundary Layer

    NASA Astrophysics Data System (ADS)

    Smith, Sarah; Viggiano, Bianca; Ali, Naseem; Cal, Raul Bayoan

    2017-11-01

    Flow perturbation induced by a turbine rotor imposes considerable turbulence and shearing effects in the near wake of a turbine, altering the efficiency of subsequent units within a wind farm array. Previous methods have characterized near wake vorticity of a turbine and recovery distance of various turbine array configurations. This study aims to build on previous analysis with respect to a turbine rotor within an array and develop a model to examine stress events and energy contribution in the near wake due to rotational effects. Hot wire anemometry was employed downstream of a turbine centrally located in the third row of a 3x3 array. Data considered points planar to the rotor and included simultaneous streamwise and wall-normal velocities as well as concurrent streamwise and transverse velocities. Conditional analysis of Reynolds stresses induced by the rotor agree with former near wake research, and examination of stresses in terms of streamwise and transverse velocity components depicts areas of significant rotational effects. Continued analysis includes spectral decomposition and conditional statistics to further characterize shearing events at various points considering the swept area of the rotor.
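Conditional analysis of Reynolds shear stresses of the kind described above is commonly done by quadrant decomposition of the velocity fluctuations. The sketch below is a generic illustration of that technique, not the authors' exact procedure; the function name and sign conventions are ours.

```python
import numpy as np

def quadrant_contributions(u, v):
    """Quadrant decomposition of the Reynolds shear stress <u'v'>.

    Splits the instantaneous u'v' products by fluctuation sign:
    Q1 outward interaction (u'>0, v'>0), Q2 ejection (u'<0, v'>0),
    Q3 inward interaction (u'<0, v'<0), Q4 sweep (u'>0, v'<0).
    Returns each quadrant's share of the total sum of u'v'.
    """
    up = np.asarray(u, dtype=float) - np.mean(u)  # streamwise fluctuation
    vp = np.asarray(v, dtype=float) - np.mean(v)  # wall-normal fluctuation
    uv = up * vp
    total = uv.sum()  # assumed nonzero for correlated fluctuations
    masks = [(up > 0) & (vp > 0), (up < 0) & (vp > 0),
             (up < 0) & (vp < 0), (up > 0) & (vp < 0)]
    return [uv[m].sum() / total for m in masks]
```

In shear-dominated wake flow, ejections (Q2) and sweeps (Q4) typically dominate the stress budget, which is what conditional statistics of this kind quantify.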

  19. Kinematic analysis of total knee prosthesis designed for Asian population.

    PubMed

    Low, F H; Khoo, L P; Chua, C K; Lo, N N

    2000-01-01

    In designing a total knee replacement (TKR) prosthesis catering for the Asian population, 62 sets of femur were harvested and analyzed. The morphometrical data obtained were found to be in good agreement with dimensions typical of the Asian knee and reaffirmed the fact that Caucasian knees are generally larger than Asian knees. Subsequently, these data were treated with a multivariate statistical technique to establish the major design parameters for six different sizes of femoral implants. An extra-small implant size with established dimensions and geometrical shape emerged from the study. The differences between the Asian knees and the Caucasian knees are discussed. Employing the established femoral dimensions and the motion path of the knee joint, the articulating tibia profile was generated. All the implant sizes were modeled using a computer-aided software package. These models, which accurately fit the local Asian knee, were then imported into a dynamic and kinematic analysis software package. The tibiofemoral joint was modeled successfully as a slide curve joint to study intuitively the motion of the femur when articulating on the tibia surface. An optimal tibia profile could be synthesized to mimic the natural knee path motion. Details of the analysis are presented and discussed.

  20. Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century.

    NASA Astrophysics Data System (ADS)

    Pino, C.; Lionello, P.; Galati, M. B.

    2009-04-01

    The analysis of extreme Significant Wave Height (SWH) values and their trends is crucial for planning and managing coastal defences and off-shore activities. The analysis provided by this study covers a 44-year period (1958-2001). First, the WW3 (Wave Watch 3) model, forced with the REMO-Hipocas regional model wind fields, was used to hindcast extreme SWH values over the Mediterranean basin at a 0.25 deg lat-lon resolution. Subsequently, the model results were processed with ad hoc software to detect storms. GEV analysis was performed and a set of indicators for extreme SWH were computed, using the Mann-Kendall test to assess the statistical significance of trends in parameters such as the number of extreme events, their duration, and their intensity. Results suggest a transition towards weaker extremes and a milder climate over most of the Mediterranean Sea.
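The Mann-Kendall test used above for trend significance is straightforward to implement. This sketch uses the normal approximation with continuity correction but without the tie or autocorrelation corrections that a full analysis may apply; the function name is ours.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie/autocorrelation correction).

    Returns the S statistic and a two-sided p-value from the normal
    approximation with continuity correction.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2 * norm.sf(abs(z))
```

Applied to a storminess indicator series, a significantly negative S would support the reported transition towards weaker extremes.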
